Quote:
Originally Posted by kovidgoyal
Optimizing one-shot additions of very large numbers of books has never really been something I have looked at. As adding them is typically a one-time operation, it should be done in smaller batches anyway; that way, if something goes wrong or crashes, you don't have to redo them all.
I prefer to do it in one batch, without any interaction. If something goes wrong, I am prepared to do it all again. I can now do this with the calibre command line tool; see:
Code:
calibredb add --help
With the GUI closed, and with the environment variable cal_lib set to the path of a new, empty library, I import the epub books on Ubuntu (with ext4):
Code:
calibredb add -r -d --dont-notify-gui --library-path=$cal_lib .
Using the command line tool is essential for finishing in a reasonable time!
Additionally, I used some performance tweaks during the import:
I placed the database metadata.db on a RAM disk, and I switched the SQLite journal mode to Write-Ahead Logging (WAL) for the duration of the import (on ext4).
Importing 106,000 epub books (with and without images) from:
Mirroring How-To - Gutenberg
http://www.gutenberg.org/wiki/Gutenb...rroring_How-To
in less than 16 hours. This includes generating all OPF backup files, so there is no need to run calibredb backup_metadata after the import.