Dude, I used to surf the web with Mozilla 1.0-3.0 on a Mac that had 32 MB of RAM, and it could display images and format pages about as well as the Kindle. That thing was also multitasking with an OS in the background plus extensions and other crap, and with no virtual memory. So let's be serious here.
I'm pretty sure that a correctly programmed web browser can browse and render web pages (images + text), even very long ones, as long as there's no active content (Flash, Java applets, etc.).
Now then, as far as ebooks go, you can easily load the entire book plus its index into memory (it takes a few MB for most novels), then render a bunch of pages ahead in both directions and store those too (which it probably does). I have no idea how the thing actually works. I'm not a hacker and I don't pretend to be; I'm not a script kiddie either. I do have moderate programming skills with a background in C/C++/Java/x86 assembly, though, so I sometimes have some idea of how things work (or at least think I do, though I'm often wrong).
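Just to show the kind of prerender-and-cache scheme I'm imagining, here's a rough C++ sketch. Everything in it (the Page struct, PageCache class, renderPage placeholder, the lookahead of 3 pages) is made up for illustration; I obviously have no idea what the Kindle firmware really does internally.

```cpp
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>

struct Page {
    std::string bitmap;  // stand-in for a rendered page image
};

class PageCache {
public:
    explicit PageCache(int totalPages, int lookahead = 3)
        : totalPages_(totalPages), lookahead_(lookahead) {}

    // Return the requested page, rendering it (and its neighbors) if needed.
    const Page& get(int index) {
        prerenderAround(index);
        return cache_.at(index);
    }

private:
    // Render the current page plus a few pages in both directions,
    // so a page turn only has to blit an already-rendered image.
    void prerenderAround(int index) {
        for (int i = index - lookahead_; i <= index + lookahead_; ++i) {
            if (i < 0 || i >= totalPages_) continue;
            if (cache_.count(i) == 0) cache_[i] = renderPage(i);
        }
        // Evict pages far from the current position to bound memory use.
        for (auto it = cache_.begin(); it != cache_.end();) {
            if (std::abs(it->first - index) > 2 * lookahead_) it = cache_.erase(it);
            else ++it;
        }
    }

    // Placeholder "renderer": in reality this would lay out text and images.
    Page renderPage(int index) {
        return Page{"rendered page " + std::to_string(index)};
    }

    int totalPages_;
    int lookahead_;
    std::map<int, Page> cache_;
};

int main() {
    PageCache book(200);
    std::cout << book.get(10).bitmap << "\n";  // triggers prerender of pages 7..13
    std::cout << book.get(11).bitmap << "\n";  // already cached: instant "page turn"
}
```

The point is just that even a dumb cache like this keeps maybe a dozen rendered pages around at a time, which should fit comfortably in the kind of memory we're talking about.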
Anyways, the hardware is well beyond what the thing needs IMO, so my question simply remains: why can't it do what it should be able to do?
Simple answer? Nobody has bothered to write the software to do it yet. For whatever reason it just hasn't happened. Perhaps the coders don't want to take the time, or the code is poor and uses resources it shouldn't (and if that's the case, they should revise it and fall back to a simpler or different rendering engine to make better use of the hardware they have).