Has anyone run an LLM on a jailbroken Kindle Paperwhite?
Has anyone here tried running a quantized neural language model (like a really tiny GPT-2 or RWKV) directly on a jailbroken Kindle Paperwhite (512 MB RAM, ARM Linux)? Is it actually possible to get any real LLM working locally on the hardware, even just as a proof of concept?
In theory, it seems doable with the smallest models: something like DistilGPT2, a "world's smallest" RWKV build, or maybe TinyLlama if a 4-bit quant can actually be squeezed under the memory budget. Obviously it would be painfully slow and not very smart, but technically it would be a real AI running natively on a Kindle, which would be kind of amazing.
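For anyone curious, here's the back-of-envelope math I'd start from. This is a pure napkin sketch, not a measurement: the parameter counts are the published ones, and the 4-bit weights plus a 64 MB allowance for runtime/context are assumptions I picked, ignoring whatever the Kindle's own framework is holding onto.

```python
# Napkin math: does a 4-bit quantized model fit in a Paperwhite's 512 MB of RAM?
# Assumptions (mine, not measured): 4 bits per weight, ~64 MB of overhead for
# the runtime, KV cache/state, and context. The Kindle OS itself also eats RAM.

MODELS = {
    "DistilGPT2": 82e6,        # ~82M parameters
    "RWKV 169M": 169e6,        # smallest common RWKV checkpoint
    "TinyLlama 1.1B": 1.1e9,   # probably too big for 512 MB, see output
}

RAM_MB = 512
BITS_PER_WEIGHT = 4            # assumed 4-bit quantization
OVERHEAD_MB = 64               # hypothetical allowance, not a real figure

for name, params in MODELS.items():
    weights_mb = params * BITS_PER_WEIGHT / 8 / 1e6
    total_mb = weights_mb + OVERHEAD_MB
    verdict = "fits" if total_mb < RAM_MB else "does NOT fit"
    print(f"{name}: ~{weights_mb:.0f} MB weights, ~{total_mb:.0f} MB total -> {verdict}")
```

By that math, DistilGPT2 (~41 MB of weights) and a small RWKV (~85 MB) fit comfortably, while TinyLlama at 1.1B is ~550 MB of weights alone, so I'd guess it would only limp along by mmap-ing weights off flash and paging constantly.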
Would love to hear if anyone’s managed this, or even just tried!