Old 04-12-2011, 04:43 PM   #1
Doug-W began at the beginning.
Posts: 18
Karma: 10
Join Date: Feb 2011
Device: Nook
Profiling Large Libraries

I have a large library (39,000 entries across 12,000 authors) that is pretty much unusable because of how long it takes to do anything. Are there any hooks to run cProfile against it and see where it's spending its time?
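If there's no built-in hook, something like the following standalone sketch is what I have in mind — profile a slow operation and dump the top offenders by cumulative time (scan_library here is just a hypothetical stand-in for whatever calibre call is slow):

```python
import cProfile
import io
import pstats

def scan_library(n):
    """Stand-in workload for a slow library operation (hypothetical)."""
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
scan_library(100_000)
profiler.disable()

# Report the ten functions with the largest cumulative time.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(10)
print(buf.getvalue())
```

The question is whether there's a sanctioned place to wrap the real calibre entry points this way.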

One immediate thought: with over 12,000 author directories, just doing an fstat on each can take a very long time. When I used to own an ISP, we wrote some custom Linux kernel hacks just to avoid the inode lookups on the news server. Is there any way to have a module specify how to convert a book entry into its path? I.e., rather than library root/Aadam Johnson/His First Book (ID), it may help on large libraries to use library root/Aa/Aadam Johnson/His First Book (ID), so that the initial lookup hits a directory with at most ~676 entries instead of the 12,000 I have now.
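To make the idea concrete, a sketch of the mapping I'm describing — shard on the first two letters of the author's name so the top level stays small (sharded_path is a hypothetical helper; calibre doesn't expose such a hook as far as I know):

```python
import os

def sharded_path(library_root, author, title, book_id):
    """Map a book to library_root/<two-letter shard>/Author/Title (ID),
    keeping the top-level directory to at most ~676 (26*26) entries."""
    shard = author[:2].title()  # "Aadam Johnson" -> "Aa"
    return os.path.join(library_root, shard, author, f"{title} ({book_id})")

print(sharded_path("/srv/books", "Aadam Johnson", "His First Book", 42))
```

Existing libraries would of course need a one-time migration pass to move each author directory under its shard.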