I would provide a conversion log, but there isn't one: ebook-convert -d crashes during the rendering phase without writing anything except the README.txt. These are large (500 MB-1 GB) CBZ files containing high-resolution PNG images (3550x4950 up to 10115x14450); Humble Bundle can deliver quite high-resolution scans. The Python traceback isn't much help to me:
Code:
5% Rendered /tmp/calibre_3.39.1_tmp_ULi06y/q46NYH_comic_extract/v01/004.png
5% Rendered /tmp/calibre_3.39.1_tmp_ULi06y/q46NYH_comic_extract/v01/063.png
Traceback (most recent call last):
  File "/usr/bin/ebook-convert", line 20, in <module>
    sys.exit(main())
  File "/usr/lib/calibre/calibre/ebooks/conversion/cli.py", line 391, in main
    plumber.run()
  File "/usr/lib/calibre/calibre/ebooks/conversion/plumber.py", line 1106, in run
    accelerators, tdir)
  File "/usr/lib/calibre/calibre/customize/conversion.py", line 244, in __call__
    log, accelerators)
  File "/usr/lib/calibre/calibre/ebooks/conversion/plugins/comic_input.py", line 182, in convert
    pages = self.get_pages(fname, cdir)
  File "/usr/lib/calibre/calibre/ebooks/conversion/plugins/comic_input.py", line 146, in get_pages
    self.report_progress, tdir2)
  File "/usr/lib/calibre/calibre/ebooks/comic/input.py", line 271, in process_pages
    job.log_file.read())
Exception: Failed to process comic:
I can downscale with ImageMagick first and then feed the result into ebook-convert, but ImageMagick needs its memory, map, area, etc. limits in policy.xml raised above Debian's defaults to handle these images, so I'm wondering whether I'm hitting a similar memory allocation limit inside Calibre. This is the 64-bit version, if that matters.
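For what it's worth, here is the workaround I'm using in the meantime, sketched in Python with Pillow instead of ImageMagick (so policy.xml doesn't come into it at all). It rewrites the CBZ with every PNG shrunk to fit a maximum size before handing it to ebook-convert. The 2480x3508 target and the file names are just examples, not anything Calibre requires:

```python
# Pre-shrink the PNG pages inside a CBZ so ebook-convert never sees the
# full-resolution scans. Requires Pillow (pip install Pillow).
import io
import zipfile
from PIL import Image

MAX_W, MAX_H = 2480, 3508  # roughly A4 at 300 dpi; pick to taste

def shrink_cbz(src, dst):
    """Copy src CBZ to dst, downscaling any PNG page that exceeds MAX_W x MAX_H."""
    with zipfile.ZipFile(src) as zin, \
         zipfile.ZipFile(dst, "w", zipfile.ZIP_STORED) as zout:
        for info in zin.infolist():
            data = zin.read(info)
            if info.filename.lower().endswith(".png"):
                img = Image.open(io.BytesIO(data))
                img.thumbnail((MAX_W, MAX_H))  # preserves aspect ratio, only shrinks
                buf = io.BytesIO()
                img.save(buf, format="PNG")
                data = buf.getvalue()
            zout.writestr(info.filename, data)

# shrink_cbz("big.cbz", "small.cbz")
# then: ebook-convert small.cbz out.epub
```

The downscaled CBZ converts fine for me, which is why I suspect a size or memory limit rather than corrupt input.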