Love this plugin. It works fine for most of my huge Calibre library, usually processing a book in under a minute.
Unfortunately, longer books don't just take proportionately longer; they take MUCH MUCH longer. The longest books I've tried never complete, even after running for over 12 hours.
I am not out of disk space or memory, and it's still churning away using CPU.
An omnibus of 6 books should take about 6x as long as processing one book, but instead it takes many hours.
I'm doing:
Page Count Paragraphs APNX Accurate
Word Count use ICU algorithm
Gunning Fog Index
Are any of these algorithms non-linear?
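To show what I mean by non-linear: a counting pass that looks linear can go quadratic if it re-scans everything processed so far on each step. This is purely a hypothetical illustration I wrote to show the scaling difference, not code from the plugin:

```python
# Hypothetical example: two ways to count words across paragraphs.
# Neither function is from the Count Pages plugin.

def count_words_linear(paragraphs):
    # One pass: split each paragraph once. Linear in total text size.
    return sum(len(p.split()) for p in paragraphs)

def count_words_quadratic(paragraphs):
    # Anti-pattern: rebuild and re-split the entire accumulated text
    # after every paragraph. Quadratic in the number of paragraphs,
    # which is exactly the "omnibus takes hours" symptom.
    total = 0
    text = ""
    for p in paragraphs:
        text += " " + p            # buffer grows without bound
        total = len(text.split())  # re-scans everything so far
    return total

paras = ["one two three"] * 1000
print(count_words_linear(paras), count_words_quadratic(paras))
```

Both return the same count; only the quadratic one blows up as the book grows. If any of the three options above does something like the second version internally, that would explain the behavior.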
Is there a key part of the plugin that I could re-code in a compiled language to speed it up?
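Before re-coding anything, it might be worth profiling one slow run to see where the time actually goes; Calibre and its plugins are Python, so cProfile from the standard library works. A minimal sketch, assuming you can call the counting routine standalone (the `count_job` function here is a stand-in, not the plugin's real entry point):

```python
import cProfile
import pstats

def count_job(text):
    # Stand-in for the plugin's per-book work; replace with the actual
    # plugin function if you can import it outside Calibre.
    return len(text.split())

text = "word " * 1_000_000  # simulate a very large book

profiler = cProfile.Profile()
profiler.enable()
count_job(text)
profiler.disable()

# Show the top 5 functions by cumulative time; the hotspot worth
# optimizing (or rewriting in C) should dominate this list.
stats = pstats.Stats(profiler).sort_stats("cumulative")
stats.print_stats(5)
```

If one function accounts for nearly all the time and its share grows faster than the book size, that's the non-linear culprit, and fixing the algorithm will likely beat rewriting it in a compiled language.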
An easy source of large books is cheap/free classics like these from Golden Deer:
https://www.amazon.com/kindle-dbs/en...price-asc-rank
The most extreme example I have is the Complete Harvard Classics, which runs over 70,000 pages.