Quote:
Originally Posted by BetterRed
@yonkyunior - your link gives 404 error
==============================
@dedeaux - when I went through a similar exercise I used a process of moving a batch of 50 books from my 'messy' library into a 'workbench' library, where I would fix the metadata and covers, do conversions, etc. Once satisfied, I moved the books from the 'workbench' library to a 'clean' library; on the 2nd and subsequent batches, before doing the move I resolved whatever the Find Duplicates (PI)->Find library duplicates function found.
Doing the same work in situ gave me no sense of progress and didn't encourage me to work methodically, whereas with the batch approach I got to see the 'messy' library dwindling and the 'clean' library growing - i.e. more 'job satisfaction'.
Today I import books into the 'workbench' library, where I do whatever needs doing with metadata, covers, conversion, editing, etc. before moving them to the 'clean' library. Most (> 90%) of my books are non-commercial public domain texts from government agencies, quangos, NGOs, etc., so I'm importing 10-20 new 'books' a day.
BR
Thank you for your input. Sorting out the library is such a huge task that it is almost too hard to get started.
Nonetheless, I am going to get it done!
I have already mapped out a plan of attack, similar to the way you describe. It will take a while, and I will incorporate new additions as I go.
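In case it helps anyone else doing the same kind of batch cleanup, here is a rough sketch of how the "move a batch from 'messy' to 'workbench'" step could be scripted with calibredb instead of dragging books in the GUI. This is only a minimal, untested sketch: the library paths and batch size are placeholders, the exact calibredb flags should be checked against `calibredb --help` for your calibre version, and it should be tried on a copy of the library first (with the calibre GUI closed, since calibredb can't safely write to a library the GUI has open).

```python
#!/usr/bin/env python3
"""Rough sketch: move one batch of books from a 'messy' calibre library to a
'workbench' library using calibredb. Paths, batch size, and flags are
assumptions -- verify against `calibredb --help` and test on a copy first."""

import json
import subprocess
import tempfile

MESSY = "/path/to/messy-library"          # placeholder path
WORKBENCH = "/path/to/workbench-library"  # placeholder path
BATCH_SIZE = 50                           # batch size from the post above


def calibredb(*args):
    """Run a calibredb command and return its stdout (raises on failure)."""
    result = subprocess.run(["calibredb", *args],
                            capture_output=True, text=True, check=True)
    return result.stdout


# Pick the next batch of book ids from the 'messy' library as JSON.
rows = json.loads(calibredb("list", "--with-library", MESSY,
                            "--for-machine", "--fields", "id,title",
                            "--limit", str(BATCH_SIZE)))
ids = [str(row["id"]) for row in rows]

if ids:
    with tempfile.TemporaryDirectory() as tmpdir:
        # Export the batch to a temporary directory...
        calibredb("export", "--with-library", MESSY, "--to-dir", tmpdir, *ids)
        # ...add the exported files to the 'workbench' library...
        calibredb("add", "--with-library", WORKBENCH, "--recurse", tmpdir)
        # ...and only remove them from 'messy' after the add succeeded.
        calibredb("remove", "--with-library", MESSY, ",".join(ids))
```

Run repeatedly, this shrinks the 'messy' library one batch at a time, which gives the same visible sense of progress described above; the metadata fixing, conversions, and the final move to the 'clean' library would still be done by hand (or with a similar script) once each batch in the 'workbench' is tidy.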