Thread: iLiad Teasing :D
Old 04-11-2008, 08:41 AM   #9
mrdini
Connoisseur
mrdini doesn't litter
 
Posts: 97
Karma: 177
Join Date: Sep 2007
Device: Hanlin V2 & V6
Quote:
Originally Posted by rio View Post
You can tell the framework that you will feed it points, so I simply "replay" the points from the .irx. The Ink engine will automatically send you the recognized text as Carbon events, so you have to register for those, and that's about it!

The only problem is that you are supposed to tell the engine when phrases are complete, and that seems to impact recognition quite a bit. So far I just have a very crude way of segmenting the strokes into phrases, which kinda works, but it shouldn't be difficult to come up with a vertical-frequency method to classify the strokes, and that should be more robust.
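
[Editor's note: a rough sketch of the approach rio describes above, not his actual code — replay the stroke points into the Ink engine, tell it when a phrase ends, and pick up the recognized text from the Carbon events it posts back. The function and constant names (InkAddStrokeToCurrentPhrase, InkTerminateCurrentPhrase, kEventInkText, kInkSourceApplication, the InkPoint fields) are recalled from Apple's Carbon Ink.h and should be checked against the SDK headers; parsing the .irx and deciding where a phrase ends are assumed to happen elsewhere.]

Code:
#include <Carbon/Carbon.h>
#include <stdlib.h>

/* Carbon event handler: fires when the Ink engine posts a kEventInkText
   event carrying a recognized phrase. */
static OSStatus InkTextHandler(EventHandlerCallRef call, EventRef event,
                               void *userData)
{
    InkTextRef textRef = NULL;
    if (GetEventParameter(event, kEventParamInkTextRef, typeInkTextRef,
                          NULL, sizeof(textRef), NULL, &textRef) == noErr
        && textRef != NULL) {
        /* Alternate 0 should be the top-ranked recognition result. */
        CFStringRef best = InkTextCreateCFString(textRef, 0);
        if (best != NULL) {
            CFShow(best);          /* print the recognized text */
            CFRelease(best);
        }
    }
    return noErr;
}

/* Replay one stroke's worth of (x, y) samples taken from the .irx file. */
static void ReplayStroke(const float *xs, const float *ys, unsigned long count)
{
    InkPoint *pts = calloc(count, sizeof(InkPoint));
    unsigned long i;
    for (i = 0; i < count; i++) {
        pts[i].point.x = xs[i];
        pts[i].point.y = ys[i];
        /* Tablet data left mostly zeroed; a nominal mid-range pressure so
           the engine treats the sample as pen-down (field name from memory). */
        pts[i].tabletPointData.pressure = 32767;
    }
    InkAddStrokeToCurrentPhrase(count, pts);
    free(pts);
}

int main(void)
{
    /* Register for the ink-text events the recognizer sends back. */
    EventTypeSpec spec = { kEventClassInk, kEventInkText };
    InstallApplicationEventHandler(NewEventHandlerUPP(InkTextHandler),
                                   1, &spec, NULL, NULL);

    /* For each stroke parsed out of the .irx:
         ReplayStroke(xs, ys, n);
       and whenever the (crude) segmenter decides a phrase is finished: */
    InkTerminateCurrentPhrase(kInkSourceApplication);

    RunApplicationEventLoop();     /* let the Carbon events arrive */
    return 0;
}

The crude phrase segmentation rio mentions would sit between the ReplayStroke calls — for example, terminating the current phrase whenever the vertical gap between consecutive strokes exceeds some threshold.
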
*Reads ADC* Ah, now I get it. Quite straightforward, really!

Unfortunately, if I understand you & the Ink Services docs correctly, this would only work with actual "vector"/point data, i.e. a PNG image file wouldn't work as input. Damn.

Hopefully, one day, I'll be able to hack the V2 & obtain raw data from the touchpad...

Ah well, wish you all the luck with this!