#1 Groupie
Posts: 190 | Karma: 134 | Join Date: May 2010 | Device: IREX DR1000
Very long startup time
Now that my imported ebooks are growing I face a problem.
The startup time is very long: about 1 minute. What is Calibre doing? I have only a 14MB metadata DB. Is there anything that could be done to lower the startup time?

Regards,
Giuseppe Chillemi
#2 Grand Sorcerer
Posts: 28,810 | Karma: 206879174 | Join Date: Jan 2010 | Device: Nexus 7, Kindle Fire HD
Quote:
Set it to close to the system tray.
#3 US Navy, Retired
Posts: 9,897 | Karma: 13806776 | Join Date: Feb 2009 | Location: North Carolina | Device: Icarus Illumina XL HD, Kindle PaperWhite SE 11th Gen
One minute for startup isn't bad at all. I have 4,058 books and just now my start time was 15 seconds. The startup time has improved, but I don't anticipate much more improvement. This has been discussed in depth in other threads; I'll let the technical folk explain the development philosophy behind ease and quickness of development versus the apparent slowness of the start.
#4 Wizard
Posts: 4,812 | Karma: 26912940 | Join Date: Apr 2010 | Device: sony PRS-T1 and T3, Kobo Mini and Aura HD, Tablet
Startup time is pretty random for me, and this is from a fresh boot. Since upgrading yesterday it seems pretty fast, but at times it has been as long as 5 minutes.

Now the bulk changing of metadata seems to take forever. I have a new i7 desktop coming this week to replace my current 2-year-old quad core, but I am not sure it will make much of a difference with calibre. Perhaps an SSD? As Walt seemed to imply, I would happily wait 15 minutes for it to start if it worked faster on the GUI/processing end. And of course I will wait anyway.
#5 US Navy, Retired
Posts: 9,897 | Karma: 13806776 | Join Date: Feb 2009 | Location: North Carolina | Device: Icarus Illumina XL HD, Kindle PaperWhite SE 11th Gen
Quote:
It only seems to take forever because you didn't use it much 6 months back.
#6 Groupie
Posts: 190 | Karma: 134 | Join Date: May 2010 | Device: IREX DR1000
#7 Wizard
Posts: 4,812 | Karma: 26912940 | Join Date: Apr 2010 | Device: sony PRS-T1 and T3, Kobo Mini and Aura HD, Tablet
Quote:
Helen
#8 US Navy, Retired
Posts: 9,897 | Karma: 13806776 | Join Date: Feb 2009 | Location: North Carolina | Device: Icarus Illumina XL HD, Kindle PaperWhite SE 11th Gen
It might help to document this: exactly what are you doing when it is so slow? With all of the updates, something might have slipped up along the way, and if you can identify what is going on for the developers, there is a good chance it might be corrected.
#9 Grand Sorcerer
Posts: 12,522 | Karma: 8065528 | Join Date: Jan 2010 | Location: Notts, England | Device: Kobo Libra 2
One thing I am cogitating about is the relationship between the size of the calibre database and memory. Calibre reads the book list data from the DB into memory so that searches and sorts can work reasonably well. If the size of that data is larger than main memory, the computer will likely start to use virtual memory, in which case performance can drop by orders of magnitude. The slowdown could begin with the addition of a single book and would get rapidly worse as more books were added.

As a rough guide, compare the size of metadata.db to the amount of RAM on your computer. Free RAM after boot would be better, if you can easily get that number. Please post this info if calibre on your machine seems to have slowed down dramatically at some point. I am not sure what, if anything, we can do with this information, but knowledge is power.
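If you want to grab both numbers at once, here is a rough sketch (the library path is only an example, and psutil is a third-party package you would need to install separately):

Code:
import os
import psutil  # third-party: pip install psutil

# Example path: point this at your own library's metadata.db
db_path = os.path.expanduser('~/Calibre Library/metadata.db')

db_mb = os.path.getsize(db_path) / (1024.0 * 1024)
free_mb = psutil.virtual_memory().available / (1024.0 * 1024)

print('metadata.db size: %.1f MB' % db_mb)
print('available RAM:    %.0f MB' % free_mb)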
#10 Wizard
Posts: 3,130 | Karma: 91256 | Join Date: Feb 2008 | Location: Germany | Device: Cybook Gen3
Quote:
I just had a look at my metadata.db and, while I am of course no expert on this, I'm sceptical as to your reasoning. From my database, assuming linear growth per book, I'd get to about 150,000 books (assuming 50% free memory on a 1 GB RAM system) before free RAM runs out. I know that some people have huge libraries, but still, that sounds like a really huge number to me.
#11 Grand Sorcerer
Posts: 12,522 | Karma: 8065528 | Join Date: Jan 2010 | Location: Notts, England | Device: Kobo Libra 2
Quote:
If this number is correct (and it seems to be for me), then a 20,000-book library will consume almost 1GB of memory. To get this amount for calibre, a Windows machine would need around 2GB of physical memory.

As for comparing the size of the DB to memory needs, the multiplier seems to be between 1,500 and 2,000. The multiplier value will vary tremendously depending on metadata.
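As a sanity check on that estimate (using an assumed round figure of about 50KB of process memory per book, which is in the same ballpark as the measurements posted later in this thread):

Code:
# Assumed ballpark figure, not an exact constant: ~50KB of memory per book
kb_per_book = 50
books = 20000
print('%.0f MB' % (books * kb_per_book / 1024.0))  # -> ~977 MB, i.e. almost 1GB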
#12 Wizard
Posts: 3,130 | Karma: 91256 | Join Date: Feb 2008 | Location: Germany | Device: Cybook Gen3
Quote:
I only did a very rough and sketchy order-of-magnitude calculation from the size of my metadata.db and the known number of books in the library. I totally forgot, and thus omitted in the calculation, the GUI and other stuff a book needs. Your reasoning makes sense now.

One way to circumvent the memory limitation would be to load only the relevant data into RAM and load the other data as needed. However, improvements would vary for certain values of "relevant": I'm thinking along the lines of loading only title, author, ID and whatever GUI stuff is needed, and loading the rest on access or search (a rough sketch of the idea follows below). However, that would increase search times by the number of books times the cost of loading the rest of the metadata to search, and would similarly increase the time needed to switch selection from one book to the next, so it would be no real improvement.

A thing that just came to mind: the OS should shuffle off a certain portion of the DB into swap, if your reasoning is correct. So wouldn't that imply a significant delay at a specific point if one were to scroll quickly through the library, when the rest of the DB has to be loaded from swap and the beginning dumped into swap?
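Here is a minimal sketch of that lazy-loading idea, assuming a hypothetical SQLite table (book_extras) and columns; this is not calibre's actual schema or code, just an illustration of the trade-off described above:

Code:
import sqlite3

class LazyBook(object):
    """Keep only id/title/author in RAM; fetch the rest on first access."""

    def __init__(self, conn, book_id, title, author):
        self._conn = conn
        self._rest = None                     # comments, tags, ... not loaded yet
        self.id, self.title, self.author = book_id, title, author

    def __getattr__(self, name):
        # Called only for attributes that are not already loaded.
        if self._rest is None:
            # Hypothetical table and columns, for illustration only.
            row = self._conn.execute(
                'SELECT comments, tags, series FROM book_extras WHERE book=?',
                (self.id,)).fetchone()
            self._rest = dict(zip(('comments', 'tags', 'series'),
                                  row if row else (None, None, None)))
        try:
            return self._rest[name]
        except KeyError:
            raise AttributeError(name)

# The catch, as noted above: a search over comments or tags now costs one
# database hit per book the first time each book is touched.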
#13 creator of calibre
Posts: 45,566 | Karma: 28548962 | Join Date: Oct 2006 | Location: Mumbai, India | Device: Various
@charles: When you're doing that calculation, be aware that it will be highly distorted for small libraries by cover caching. IIRC, the cover cache stores ~50 covers in memory, so results will only be meaningful for libraries larger than 50 books.

Also, you should allow a few minutes for the garbage collector to collect all objects before checking the size of a library. Comparing a 422-book lib to an 86-book lib, I get 14,355 bytes per book on a 64-bit Linux system, which would mean you'd need a 75K-book library to consume 1GB of RAM.
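For anyone following the arithmetic: the per-book figure is presumably the extra memory of the 422-book library over the 86-book one, divided by the extra 336 books, and working backwards from it gives the ~75K number:

Code:
# Working backwards from the 14,355 bytes/book figure quoted above:
print(1024 ** 3 // 14355)  # -> 74799, i.e. roughly a 75K-book library per GB of RAM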
#14 Grand Sorcerer
Posts: 12,522 | Karma: 8065528 | Join Date: Jan 2010 | Location: Notts, England | Device: Kobo Libra 2
I am measuring (well, Windows is measuring) the size of process-private memory. Program and shared data memory is not included.
The raw numbers, after waiting for a time for memory to stabilize, are:

Empty library: 64MB
24 books: 88MB (110KB/book)
980 books: 110MB (47KB/book)
1400 books: 138MB (53KB/book) (this is my production library, where almost all books have summaries, etc.)

The smallest is the 980-book library, at 47KB/book. My production library, which has the most metadata (see below), is 53KB/book.

Just for fun, I tested a 5,000-book library that has almost no extra metadata (no comments, no custom columns, no tags, etc.). It was constructed by adding a 1,000-book library 5 times, so the number of authors is small. The number comes out at 14KB/book.

It seems fair to conclude that memory usage is very sensitive to the amount of metadata in each book.
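If anyone wants to reproduce the per-book numbers for the two larger libraries, they appear to come from subtracting the 64MB empty-library baseline and dividing by the book count (the rounded MB figures account for a KB or so of drift):

Code:
empty_mb = 64
for books, total_mb in ((980, 110), (1400, 138)):
    kb_per_book = (total_mb - empty_mb) * 1024.0 / books
    print('%d books: %.0f KB/book' % (books, kb_per_book))
# -> roughly 48 and 54 KB/book, in line with the 47 and 53 quoted above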
#15 creator of calibre
Posts: 45,566 | Karma: 28548962 | Join Date: Oct 2006 | Location: Mumbai, India | Device: Various
Try measuring the memory with this:
calibre-debug -c "from calibre.library import db; db = db('/home/kovid/documents/library'); db.refresh(); raw_input('Press enter to quit: ')"

That should give pure database memory consumption numbers with no GUI cruft.

Last edited by kovidgoyal; 11-03-2010 at 01:03 PM.
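A variant that also prints the process's own memory, so you don't have to fish the number out of Task Manager or top. This is only a sketch: it assumes a Unix-like system (ru_maxrss is reported in KB on Linux and bytes on OS X), the library path is an example, and you would save it to a file and run it with calibre-debug so the calibre modules are importable:

Code:
import resource
from calibre.library import db

database = db('/path/to/your/library')  # example path: use your own library
database.refresh()                      # force the whole metadata cache to load

usage = resource.getrusage(resource.RUSAGE_SELF)
print('Peak resident memory (ru_maxrss): %d' % usage.ru_maxrss)
raw_input('Press enter to quit: ')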