Quote:
Originally Posted by GeoffR
The problem is not large books in general, but large html files (or other objects) within the book.
The device only needs to have one or two html files loaded into memory at once, the one containing the page you are reading, and perhaps the one containing the page a link leads to if you are activating the link.
If the book is broken up into lots of small html files then limited RAM is no problem no matter how long the book. But if the book consists of a small number of very large html files then there will be problems. (Some devices will simply crash, others will be slower.)
If the publisher is competent then they will break the large book up into many small html files, but if not then the only real solution is to edit the book and fix the publisher's mistakes yourself.
(If the book has been converted from another format then it might not be the publisher's mistake but the converter's mistake.)
Geoff, that all makes sense. Maybe as an experiment I should run the book through a conversion simply to split the large files into smaller ones of 260K or less? Clicking into edit mode on the book just now, I noticed a boatload of contained XHTML files at larger sizes: many of them are 400K to 800K, though I didn't spot any over 1MB.
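
Before converting anything, it might be worth confirming exactly which files are over the limit. Here's a minimal sketch that lists the oversized XHTML files inside an EPUB, assuming the book is an EPUB and using the 260K threshold mentioned above; since an EPUB is just a zip archive, Python's standard zipfile module can read the file sizes without unpacking anything. The book.epub path is a placeholder, so substitute your own.

Code:
# Minimal sketch: list the HTML/XHTML files inside an EPUB that
# exceed a size threshold, to see which ones a device might choke on.
import zipfile

EPUB_PATH = "book.epub"   # hypothetical path, substitute your own
THRESHOLD = 260 * 1024    # 260 KB, the limit discussed in this thread

with zipfile.ZipFile(EPUB_PATH) as epub:
    for info in epub.infolist():
        # Only look at the content documents, not images or CSS.
        if info.filename.endswith((".html", ".xhtml", ".htm")):
            if info.file_size > THRESHOLD:
                print(f"{info.filename}: {info.file_size // 1024} KB")

And if memory serves, calibre's EPUB output has a "Split files larger than" setting (--flow-size in ebook-convert), which defaults to 260 KB and is probably where that number comes from, so an EPUB-to-EPUB conversion with the defaults should do the splitting for you.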