Quote:
Originally Posted by chaley
Its obvious that you will get comments that its obvious that you will get comments that it is obvious that (stack overflow!) 
|
Lol...
Quote:
(to my statement on cleaning out books...) This one surprises me. The only thing in calibre's startup that looks at formats is building the tag browser format category, and that is a single query on a view. Calibre doesn't look at the actual files until you ask it to do something to one. Do you have an idea how much this cleanup helped?
|
Thanks for clarifying... To your point, this particular action may not have had any impact on load speed at all; I'm afraid I didn't do a very good job of before/after comparisons for each action I took. I cleaned out ~2,000 of my 6,000 books, some individually, some through format lookup. I'll try to pay more attention to how usage patterns and optimization efforts affect performance from now on. That said, cleaning up formats *seemed* to improve search speeds while using calibre, although my perception here is extremely subjective... I suppose it could be the same part of my brain that makes me think my car runs better after it gets washed.
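For anyone wanting to do a similar cleanup, here's a rough sketch of how you might find books carrying redundant formats before deleting anything. Assumptions to be loud about: this just walks a calibre-like folder layout (Author/Title/files) directly on disk rather than using calibre's database, and the BOOK_EXTS list and the "keep epub" preference are my own illustrative choices, not anything calibre itself prescribes.

```python
import os
from collections import defaultdict

# Extensions treated as "book formats" -- an assumed list; adjust to taste.
# Covers, metadata.opf, etc. fall outside it and are ignored.
BOOK_EXTS = {".epub", ".mobi", ".azw3", ".pdf", ".lit", ".txt"}

def formats_by_folder(library_root):
    """Walk the library tree and map each folder to the set of
    book-format extensions found inside it."""
    found = defaultdict(set)
    for dirpath, _dirnames, filenames in os.walk(library_root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            if ext in BOOK_EXTS:
                found[dirpath].add(ext)
    return found

def redundant_format_candidates(library_root, keep=".epub"):
    """Folders holding the preferred format *plus* at least one other --
    the candidates worth a look when thinning out duplicate formats."""
    return {folder: exts
            for folder, exts in formats_by_folder(library_root).items()
            if keep in exts and len(exts) > 1}
```

This only produces a report; actually removing formats is still best done through calibre itself so its database stays consistent with what's on disk.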
Quote:
(to my comments about defragging...) Do you have before and after times? The reason I ask is that suggesting defragging has occasionally met with derision. It would be nice to know what level of difference it made in your case.
|
Suggesting defragging should not be met with derision out of hand, for a few reasons. There is a persistent urban myth that NTFS-based OSes don't need to be defragged, so if you're running Vista, Windows 7, or even Windows Home Server, you might not think about defragging often. I suspect this view was founded on the misconception that defragging primarily eliminates what is called "free space fragmentation", which some folks assume doesn't matter with the massive hard drives available to us today. That is quite wrong. Free space fragmentation on its own is a major cause of sluggish read/write operations on large drives, and defragging also addresses file fragmentation, directory fragmentation, and metadata/MFT fragmentation.
In my case, for instance, defragging my servers hadn't been at the top of my mind for the last 6 to 9 months, so when I checked the other day I found 3 to 6% free space fragmentation on many of the 2 TB drives in my server, but as much as 15% to 42% fragmentation in the directories and metadata/MFTs of almost all the drives. Without getting too technical: say you're moving very small files around quite a bit, AND you're changing names, tags, etc. Over time you will notice a gradual performance tax, as not only do the files themselves become fragmented, but there is the compounded issue of fragmented directories and metadata that the hard drive uses to point to those files.
That said, the noticeable impact of defragging will be like testing for low blood sugar: you only get a good reading when you test at the right time, under the right circumstances. But regardless of what you observe, the read/write heads on your HDs will thank you (and possibly live longer) for making their job easier.
DISCLAIMER: DO NOT, EVER, EVER... use the standard Microsoft defragging tool on Windows Home Server, and don't log in through remote desktop and start defragging drives that way. You need a specialized defragging tool (like Diskeeper or PerfectDisk) that knows how to work with shadow copies and the other things that make WHS different from your desktop.
Quote:
Going from 120-180 seconds to 15 seconds is good work. It is hard not to like factors of 10.
|
Which reminds me, I did one other very important thing to speed up performance: I rebuilt the search index on both my local PC and the server. Before doing that, however, I checked the indexing options and found that several file types, such as epub and mobi, were not being indexed at all. That's a problem because Windows had to find those files all over again every time a search was performed. Also, over time Windows Search builds up artifacts, like stale links to files that were deleted long ago... Rebuilding the index had a tremendous impact on how fast searches ran in Explorer. Before the rebuild, some searches were instant, but in other cases I would sit there for several minutes waiting for Windows to deliver results. Now, most searches are instant. Calibre seems a bit snappier too. My *theory* is that even though calibre doesn't touch Windows Search, Windows is indexing the directories (libraries) that calibre uses to store books and other files, which is normal; but (again, my theory) without certain file extensions added to the index, Windows 7 seems to want to rediscover them every time you browse your library. Just an observation.
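If you want to check your own library before poking at the index, a quick census of what extensions actually live under it makes a handy checklist to compare against the File Types tab in Windows Search's Indexing Options. This is just a sketch of that idea, nothing calibre- or Windows-specific; the function name is my own.

```python
from collections import Counter
from pathlib import Path

def extension_census(root):
    """Count files by extension under `root`, most common first.
    Any extension that shows up heavily here but is missing from the
    indexer's file-type list is a candidate to add before rebuilding."""
    counts = Counter(p.suffix.lower()
                     for p in Path(root).rglob("*")
                     if p.is_file())
    return counts.most_common()
```

Run it against your library folder and you'll see at a glance whether epub, mobi, azw3, etc. are prominent enough to be worth indexing.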
Quote:
Thank you for taking the time to test, and (even more) taking the time to report.
|
Absolutely my pleasure. Let me say again how much I appreciate this application and its developers.