Old 04-07-2009, 04:19 PM   #23
pilotbob
Grand Sorcerer
 
 
Posts: 19,832
Karma: 11844413
Join Date: Jan 2007
Location: Tampa, FL USA
Device: Kindle Touch
Quote:
Originally Posted by Xenophon View Post
If python has reasonable collections libraries available (surely it does!), you can also switch that in-memory cache to be a balanced tree or other nicely optimized structure. That'd give n-log-n performance -- another factor-of-14 speedup for Student1's case. If that takes more than a couple more lines of code, I'd be really surprised.
Of course, I am not sure I want 20k book records cached in memory. I prefer a smaller memory footprint.
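The structure Xenophon suggests could be sketched in Python with the standard-library `bisect` module: keep just the sorted book ids in memory (rather than full records, which addresses the footprint concern), so each existence check is a binary search costing O(log n), and n checks cost O(n log n) instead of O(n²). This is only an illustrative sketch; the `SortedIdCache` class and its methods are hypothetical, not part of any real library.

```python
import bisect

class SortedIdCache:
    """Hypothetical in-memory cache holding only sorted book ids,
    so membership tests are O(log n) binary searches."""

    def __init__(self):
        self._ids = []  # always kept in sorted order

    def add(self, book_id):
        # Find the insertion point; skip the insert if the id is already present.
        i = bisect.bisect_left(self._ids, book_id)
        if i == len(self._ids) or self._ids[i] != book_id:
            self._ids.insert(i, book_id)

    def __contains__(self, book_id):
        # O(log n) lookup via binary search instead of a linear scan.
        i = bisect.bisect_left(self._ids, book_id)
        return i < len(self._ids) and self._ids[i] == book_id

cache = SortedIdCache()
for bid in (42, 7, 99):
    cache.add(bid)
print(7 in cache)   # True
print(13 in cache)  # False
```

Storing only the ids keeps the footprint at one integer per book (~20k ints here) rather than 20k full records, trading completeness of the cache for memory.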

BOb