Originally Posted by yuri_b
I remember how long it took Norton Commander to display the current directory when the number of files was over 500.
With more than 500 books, a FAT file system is not the right tool, and trust me, a database will find a record faster than the file system can find a file, simply because that is a database's job.
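To illustrate the point (a hypothetical schema, not the actual DR database): an indexed SQLite lookup walks a B-tree and touches only a few pages, instead of scanning every entry the way a flat directory listing does.

```python
import sqlite3

# Hypothetical book table: the index lets SQLite find one title among
# thousands without reading every row.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT)")
db.executemany("INSERT INTO books (title) VALUES (?)",
               [("book_%05d" % i,) for i in range(5000)])
db.execute("CREATE INDEX idx_title ON books(title)")

# Indexed lookup: a B-tree search, not a linear scan of 5000 entries.
row = db.execute("SELECT id FROM books WHERE title = ?",
                 ("book_04321",)).fetchone()
print(row)  # -> (4322,)
```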
Another aspect is the limitations of FAT32:
a) The maximum possible number of clusters on a FAT32 volume is 268,435,445. http://support.microsoft.com/kb/184006
b) The cluster size on a 32 GB FAT32 volume is 16 KB, so every thumbnail costs you no less than 16 KB. 150,000 × 4 files will therefore occupy no less than 9.6 GB on FAT32.
c) Every entry occupies 4 bytes in the FAT32 table (which can grow up to 8 MB). http://www.pcguide.com/ref/hdd/file/partFAT32-c.html
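A quick back-of-the-envelope check of point b), assuming 16 KiB clusters and 4 small files per book as above:

```python
# Each file occupies at least one whole cluster on FAT32, so small
# thumbnails waste most of their cluster.
CLUSTER = 16 * 1024          # 16 KiB clusters on a 32 GB FAT32 volume
files = 150_000 * 4          # 4 small files per book, 150k books
min_bytes = files * CLUSTER  # minimum space consumed
print(min_bytes / 2**30)     # about 9.2 GiB, in the ballpark of the
                             # ~9.6 GB figure quoted above
```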
P.S. A program I wrote for home use, a film catalog, uses SQLite to store all the data for a film, including several pictures. The db file is currently about 60 MB in size and it works like a charm (Win32).
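A minimal sketch of that approach (the schema and names here are my assumptions, not the catalog's actual code): pictures go into the database as BLOBs, right next to the film record.

```python
import sqlite3

# Hypothetical film-catalog schema: one film, several pictures as BLOBs.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE film (id INTEGER PRIMARY KEY, title TEXT NOT NULL)")
db.execute("CREATE TABLE picture (film_id INTEGER REFERENCES film(id), "
           "data BLOB NOT NULL)")

db.execute("INSERT INTO film (title) VALUES (?)", ("Metropolis",))
fake_jpeg = b"\xff\xd8\xff\xe0" + b"\x00" * 100  # stand-in image bytes
db.execute("INSERT INTO picture (film_id, data) VALUES (?, ?)",
           (1, fake_jpeg))
db.commit()

# Reading the picture back is a single query, no file-system walk.
blob = db.execute("SELECT data FROM picture WHERE film_id = ?",
                  (1,)).fetchone()[0]
print(len(blob))  # -> 104
```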
Sorry, I don't want to sound rough, and my English is far from perfect too, so please don't take this as a personal attack.
But I think 150k files is far beyond the DR devices' capacity; we can't design and write code to support numbers that huge. I don't even know of any user other than you with more than 5k files.
I'm not a professional developer in my day job, it's only a hobby for me, but I've been writing code for almost 30 years.
One of my current (halted) projects is a book manager. There I considered whether to store covers in the db (SQLite) or not. After some tests I finally preferred to keep them in the file system. I have over 55k files, and my machine is a Linux AMD64x2 with 6 GB RAM running ext3, much more powerful than the DRs.
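One common way to keep that many covers on the file system (an assumption on my part, not necessarily the book manager's actual layout) is to fan the files out into subdirectories, so no single directory ever holds tens of thousands of entries:

```python
import os
import tempfile

def cover_path(root, book_id):
    """Bucket covers by the last two digits of the book id:
    100 buckets means ~550 files per directory for 55k covers."""
    sub = "%02d" % (book_id % 100)
    return os.path.join(root, sub, "%d.jpg" % book_id)

root = tempfile.mkdtemp()
path = cover_path(root, 54321)
os.makedirs(os.path.dirname(path), exist_ok=True)
with open(path, "wb") as f:
    f.write(b"\xff\xd8fake-cover")     # stand-in image bytes
print(os.path.basename(path))          # -> 54321.jpg
```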
And here on the DRs, SQLite may well be enough to handle a large number of entries, but neither the CPU nor the RAM is as powerful as we would need.
I think in cases like yours it's better to use a plain file-system browser, not a database-based one.