View Full Version : Max OPF filesize (can smaller text files be counter-productive)


paul-b
05-11-2013, 10:45 AM
I've split up very large text files into smaller ones in the 750 KB range to improve reader performance.

The source text is formatted in such a way that, if I chose to, I could split the text files down to as small as 2.5 KB.

If I split down to the 2.5 KB file size, my typical 7 MB EPUB would have approximately 2800 file entries in the OPF. I don't know how large that OPF would be, but it would be huge.

Overall question: can splitting text files into smaller ones reach the point of being counter-productive? That is: too many files that are too small.

Is there a size limit on the OPF?
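
For a rough sense of scale, here is a back-of-the-envelope sketch in Python; the id and filename scheme is made up, but the per-entry lengths are typical:

# Hypothetical manifest/spine entry templates; real id and href schemes
# vary, but the per-entry lengths are in the right ballpark.
manifest_item = '<item id="split{n:04d}" href="Text/split{n:04d}.xhtml" media-type="application/xhtml+xml"/>\n'
spine_item = '<itemref idref="split{n:04d}"/>\n'

n_files = 2800
per_file = len(manifest_item.format(n=0)) + len(spine_item.format(n=0))
print(f"~{n_files * per_file / 1024:.0f} KB of manifest + spine markup for {n_files} files")

At roughly 115 bytes per manifest-plus-spine pair, 2800 files works out to about 300 KB of manifest and spine markup before any metadata.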

mrmikel
05-11-2013, 03:55 PM
Some older readers start to choke on individual files larger than about 260 KB.

It is convenient and useful to simply use chapters as sections, so long as they stay below 260 KB.

If you have endnotes or links within a chapter, having to jump out to another file adds overhead, so that is another reason to keep chapters at about that size.
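
A quick way to check a book against that threshold is to read the zip directory without unpacking anything; a minimal sketch in Python, assuming the ~260 KB figure above and a placeholder filename book.epub:

import zipfile

# Flag content files above the rough older-reader threshold discussed here.
# LIMIT and "book.epub" are assumptions, not values from any spec.
LIMIT = 260 * 1024

with zipfile.ZipFile("book.epub") as epub:
    for info in epub.infolist():
        # file_size is the uncompressed size, which is what the reader parses.
        if info.filename.endswith((".xhtml", ".html")) and info.file_size > LIMIT:
            print(f"{info.filename}: {info.file_size / 1024:.0f} KB - consider splitting")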

Toxaris
05-11-2013, 04:12 PM
That 260 KB limit is not for the OPF, though; it applies to the individual XHTML files.

Agama
05-16-2013, 07:41 AM
Overall question: can splitting text files into smaller ones reach the point of being counter-productive? That is: too many files that are too small.

I would have thought it likely that a huge OPF would affect performance at some point, possibly depending on how much working memory a reader has and how much of the OPF the reader holds in its memory.

I know that my aging PRS-300 struggles with really large toc.ncx files (after about 20 seconds it just resets), but I don't know if it would do the same with the OPF.

If you care to post an ePub (respecting copyright) with 2.5 KB files and a huge OPF, then I'll certainly test it and let you know what happens. :)

In any case, as mrmikel says, it makes good sense to split a book into chapters and sections.

AlPe
05-16-2013, 10:48 AM
You can try this file (http://www.smuuks.it/downloads/ComeNonDetto.epub) (it is under CC BY-NC-ND 3.0). It has >2K XHTML pages.

Agama
05-17-2013, 04:03 PM
I've now tried it, and on my PRS-300 it is much slower than a more typical ePub. An ePub usually opens in about 1 second, whereas this one takes nearly 10 seconds. It is also much slower when paging through it.

So it certainly looks like too many small XHTML files, and the correspondingly large OPF, can be bad for performance. I expect newer readers would have less trouble, since they are likely to be more powerful than my old Sony.

JSWolf
05-17-2013, 09:56 PM
I know that my aging PRS-300 struggles with really large toc.ncx files (after about 20 seconds it just resets), but I don't know if it would do the same with the OPF.

One thing that causes the NCX ToC to take longer to load is entries that use id anchors (file.xhtml#ch01).
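
For anyone wanting to check their own books, the NCX can be scanned straight out of the zip; a small sketch (book.epub is again a placeholder, and it assumes an EPUB 2 style toc.ncx is present):

import re
import zipfile

with zipfile.ZipFile("book.epub") as epub:
    # Assumes exactly one NCX exists; raises StopIteration otherwise.
    ncx_name = next(n for n in epub.namelist() if n.endswith(".ncx"))
    ncx = epub.read(ncx_name).decode("utf-8")

# Count <content src="..."/> targets that include a #fragment anchor.
srcs = re.findall(r'<content\s+src="([^"]+)"', ncx)
anchored = [s for s in srcs if "#" in s]
print(f"{len(anchored)} of {len(srcs)} NCX entries point at #anchors")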

theducks
05-18-2013, 11:33 AM
I would have thought it likely that a huge OPF would affect performance at some point, possibly depending on how much working memory a reader has and how much of the OPF the reader holds in its memory.

I know that my aging PRS-300 struggles with really large toc.ncx files (after about 20 seconds it just resets), but I don't know if it would do the same with the OPF.

If you care to post an ePub (respecting copyright) with 2.5 KB files and a huge OPF, then I'll certainly test it and let you know what happens. :)

In any case, as mrmikel says, it makes good sense to split a book into chapters and sections.

My PEz just barfs when selecting past a certain point (title) in a large (4451 books, ~5K pages) Calibre EPUB catalog (67 MB). Just too many files and thumbnails.
No single HTML file exceeds 260 KB. :thumbsup: The OPF is 967 KB. :eek:
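
Measuring the OPF inside a book uses the same zip-directory trick as above; a sketch with catalog.epub as a placeholder name:

import zipfile

# Report the size of each .opf inside the container (usually just one).
with zipfile.ZipFile("catalog.epub") as epub:
    for info in epub.infolist():
        if info.filename.endswith(".opf"):
            print(f"{info.filename}: {info.file_size / 1024:.0f} KB")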