In his book On Intelligence, Hawkins talks about how adaptable the brain is at pattern recognition. He mentions a technology that puts patterns of vibrating nodes on the palate; in a fairly short time the brain learns to recognize the patterns as images. Something close to this might give us an auxiliary neural input that, with training, we could optimize for reading.
Imagine a reader you put in your mouth!
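The core idea is just mapping a visual pattern onto a small grid of actuators, so here is a minimal sketch of that mapping (my own illustration, not taken from Hawkins or any of the devices linked below; the 6x24 grid size and the use of NumPy are assumptions): it average-pools a grayscale image down to one vibration intensity per actuator.

```python
# Minimal sketch: downsample a grayscale image to a small grid of
# vibration intensities, one value per tactile actuator.
# (Illustrative only; actuator counts and layout are assumptions.)
import numpy as np

def image_to_tactile_grid(image, rows=6, cols=24):
    """Average-pool a 2-D grayscale image (values 0..255) down to a
    rows x cols grid of actuator intensities in the range 0.0..1.0."""
    h, w = image.shape
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = image[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            grid[r, c] = block.mean() / 255.0
    return grid

# Example: a synthetic vertical-bar "image" becomes a band of strong
# vibration in a few columns of the grid.
img = np.zeros((60, 240), dtype=np.uint8)
img[:, 100:120] = 255
print(image_to_tactile_grid(img).round(2))
```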
Edit: I tried to track this down. There is a tactile method for speech perception called "Tadoma". While following leads on visual-tactile input displays, I found this article (PDF warning), which describes two-way tactile communication using the tongue and palate. It has pictures too.
Here is a good (but long) article on this technology. If you have had the urge to put electronic devices in your mouth, you might be interested in a couple of articles on electropalatography: here and here.