Quote:
Originally Posted by mrdini
Quite neat! I'm interested in this as well, as my V2 also has a touchscreen (& generates notes in .png format)...
How do you hook into the library...? (I had a browse through the ADC notes, but as far as I can tell, it only works with tablets...?)
You can tell the framework that you will feed it points, so I simply "replay" the points from the .irx. The Ink engine automatically sends the recognized text back to you as Carbon events, so you just register for those and that's about it!
The only problem is that you're supposed to tell the engine when a phrase is complete, and that seems to impact the recognition quite a bit. So far I only have a very crude way of segmenting the strokes into phrases, which kinda works, but it shouldn't be difficult to come up with a vertical-frequency method to classify the strokes, and that should be more robust.