Old 08-26-2008, 07:01 PM   #16
murraypaul
Interested Bystander
murraypaul ought to be getting tired of karma fortunes by now.
 
Posts: 3,726
Karma: 19728152
Join Date: Jun 2008
Device: Note 4, Kobo One
Quote:
Originally Posted by Flub View Post
It's getting closer to release day now and I'm getting excited. How many books would people say is the limit before the database rebuilding gets to be a problem?

How long am I likely to be waiting if I kept 200-300 books on there and changed something?
With 990 books in internal memory, a scan takes less than 2 minutes. The performance problems only occur with large numbers of books on memory cards. Adding a memory card with 500 books triggers a scan that takes about 4 minutes; a card with 1000 books takes about 15 minutes. Get up to 2000 books on a card and the scan will take over an hour, and 2600 books takes ~1h45m. There is some sloppy programming at work in the Reader software; such strongly non-linear behaviour suggests they never tested it with large numbers of books.
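
As a quick sanity check on "strongly non-linear": the timings above are consistent with roughly quadratic scaling in the number of books. The numbers come from this post; the quadratic model is my own guess at what the software is doing (e.g. comparing every book against every other on each scan), not anything confirmed by Sony:

```python
# Scan times reported above: (books on card, minutes to scan).
data = [(500, 4), (1000, 15), (2000, 60), (2600, 105)]

# If scan time grew linearly, minutes-per-book would be constant.
# If it grows roughly quadratically, minutes per book-squared would be
# constant instead. Print both and see which one holds steady.
for books, minutes in data:
    per_book = minutes / books
    per_book_sq = minutes / books ** 2
    print(f"{books:5d} books: {per_book:.4f} min/book, "
          f"{per_book_sq:.2e} min/book^2")
```

The min/book column keeps climbing while the min/book^2 column stays near 1.5e-05 throughout, which is what you'd expect if each scan does work proportional to the square of the number of files, and it also explains why splitting the library across several small cards is so much faster than one big card.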

This would suggest that the best way to use the reader for a complete library is to have a large number of small memory cards, and divide the library up in a logical way between them. With 500 books per card you are never more than 5 minutes away from a book.

As the time taken for the card scan seems to be related to the number of files, my current test is to combine all books in a series into a single lrf file, with a TOC jumping to the start of each book. The reader shows no problems opening or navigating a 10000+ page book, although the conversion time with txt2lrf increases dramatically as the size of the input file goes up.

Last edited by murraypaul; 08-26-2008 at 07:40 PM.