Quote:
Originally Posted by dschnit1
I recently lost my CC library (I told a cleaning app to delete "junk" files without reading which apps were being targeted) and am in the process of reloading my approx. 9100 book library. I am using a wireless connection and batches of 400 each time. After loading 3000 books or so, the process slows down to where it literally takes hours to upload each batch. Is this a known phenomenon or is it my system? I am going from a Windows 10 PC to a Samsung Galaxy Tab S running Android 5, storing the library on an SD card.
Starting with version 5, CC uses Android's "Storage Access Framework" (SAF). Android requires this to permit you (a user) to put books anywhere on an SD card other than the app's private folder.
One problem with the SAF is that it requires CC to talk to another application/system service to read and write files. The performance of these operations is entirely controlled by that other service. Unfortunately, it appears that this "other service" can be very slow on some devices.
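To make "talk to another application/system service" concrete, here is a minimal sketch (not CC's actual code) of what a single SAF write looks like, assuming the user has already granted access to the SD card tree via ACTION_OPEN_DOCUMENT_TREE and the app holds that tree Uri. Every call below is answered by the system's documents provider in another process rather than by a direct file operation, which is where the device-dependent slowness comes in.
Code:
import android.content.Context
import android.net.Uri
import androidx.documentfile.provider.DocumentFile

// Hypothetical helper, for illustration only: writes one book's bytes under
// the tree the user granted. fileName and data stand in for whatever the app would pass.
fun writeBook(context: Context, treeUri: Uri, fileName: String, data: ByteArray) {
    val dir = DocumentFile.fromTreeUri(context, treeUri) ?: return
    // findFile() lists the directory through the documents provider, and
    // createFile() goes through it as well -- each call is an IPC round-trip.
    val file = dir.findFile(fileName)
        ?: dir.createFile("application/epub+zip", fileName)
        ?: return
    context.contentResolver.openOutputStream(file.uri)?.use { it.write(data) }
}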
I suggest you check the following things:
- Be sure calibre has enough memory for the library. Run 64-bit calibre if you can.
- Check that there aren't other apps running on the device, especially ones that consume network resources.
- Verify that you have a good wifi connection from your device, and from the computer running calibre if it uses wifi.
Hmmm ... something just occurred to me. Using a book path that specifies a folder hierarchy might improve things dramatically. The reason: SD cards usually use a FAT32 file system. Directories in FAT32 are not indexed, so the operating system (Android) must do a linear search (1, 2, 3, 4 ...) whenever CC creates a file in order to see if the file is already there. The file is never there, so the time required is the worst case: a search of all the entries. Clearly, the more files it must look at, the slower it will run. For example, in your case with 3,000 books already in the folder, the check takes roughly 3,000 times as long as it did when the library was empty. The open question is "3,000 times what?" If that "what" is a millisecond, then each search now takes 3 seconds, and several searches are required to store a book. Another open question is how the OS caches this information. A bigger cache will give better performance until the number of files exceeds what the cache can hold.
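The sketch below is only a toy illustration of that worst case, not real filesystem code: the existence check walks the whole entry list, the file being added is never found, so a flat 3,000-file folder means 3,000 comparisons per check, while a per-author sub-folder means only a handful.
Code:
// Toy model of an unindexed (FAT32-style) directory lookup.
fun existsInUnindexedDir(entries: List<String>, name: String): Boolean {
    for (entry in entries) {       // linear search: 1, 2, 3, 4 ...
        if (entry == name) return true
    }
    return false                   // worst case: every entry was examined
}

fun main() {
    val flatFolder = (1..3000).map { "book${it}.epub" }   // everything in one folder
    val authorFolder = (1..5).map { "book${it}.epub" }    // one author's sub-folder
    // Both checks return false, but the first does 3000 comparisons
    // and the second only 5 before the file can actually be created.
    println(existsInUnindexedDir(flatFolder, "new book.epub"))
    println(existsInUnindexedDir(authorFolder, "new book.epub"))
}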
To test this I ran an experiment on my rooted Fire 7 running CM12. I cleared CC's library, then sent 600 books to the SD card using three different file name templates. The first template stores all the books in the root folder, using no sub-folders. The second creates a folder per first author, then stores the book in it. The third creates a folder per first letter of the first author, then a sub-folder per author (all beginning with that letter), then stores the book. The results are quite surprising:
Results
In all cases the average time for the first 10 books is 0.65 seconds.
Code:
Template :: average time per book (all 600) :: average time, last 10 books
{title} - {authors} :: 2.00 secs :: 3.2 secs
{first_author}/{title} - {authors} :: 1.15 secs :: 1.2 secs
{first_author:%1.1s}/{first_author}/{title} - {authors} :: 0.65 secs :: 0.65 secs
We see that with the flat template, at the 600th book the time per book is approximately 5 times worse than for the 1st book. For the second template, the average time for the last books is approximately 2 times worse. For the third, the average time for the last ten books is the same as for the first ten.
At 3,000 books and using the first template, the time to store a book should be approximately 16 seconds (3,000 files is five times the 600 in the test, so roughly 5 × 3.2 seconds), or approximately 4 books per minute. For the last template, that time would probably still be very close to 0.65 seconds, which is approximately 90 books per minute.
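Here is the same arithmetic as a runnable sketch, assuming (as above) that the flat-template time per book grows roughly in proportion to the number of files already in the folder; the numbers are the ones measured in the table.
Code:
fun main() {
    val flatAt600 = 3.2   // secs/book near book 600, flat template (measured)
    val nested = 0.65     // secs/book, fully nested template (measured)

    // Linear-growth assumption: 5x the files in the folder => ~5x the time per book.
    val flatAt3000 = flatAt600 * (3000.0 / 600.0)

    println("flat at 3000 books:   %.0f s/book, about %.0f books/min".format(flatAt3000, 60 / flatAt3000))
    println("nested at 3000 books: %.2f s/book, about %.0f books/min".format(nested, 60 / nested))
}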
For info: Android internal memory uses a file system that does not have this problem; lookups take near-constant time regardless of the number of files in a folder. This is why we don't see these performance problems when using internal memory.