The one app where I have managed to test all three input methods is the modified Draw app mentioned before.
With Onyx's stylus, using their TouchHelper, a stroke shows up pretty much as one draws it. I'd call this fast: judging from the logs, detection is fast and so is displaying the resulting stroke.
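For context, here is roughly what a TouchHelper-based setup looks like -- a minimal sketch based on my reading of the onyxsdk-pen API rather than the actual code of the modified Draw app, so the class and method names (TouchHelper, RawInputCallback, setRawDrawingRenderEnabled, etc.) are from the SDK as I understand it and may differ between versions:

    import android.graphics.Rect
    import android.view.SurfaceView
    import com.onyx.android.sdk.pen.RawInputCallback
    import com.onyx.android.sdk.pen.TouchHelper
    import com.onyx.android.sdk.pen.data.TouchPoint
    import com.onyx.android.sdk.pen.data.TouchPointList

    // Sketch of a raw-drawing surface; call start() once the surface is created.
    class RawDrawingController(private val surfaceView: SurfaceView) {

        private val callback = object : RawInputCallback() {
            override fun onBeginRawDrawing(shortcut: Boolean, point: TouchPoint) {
                // Stylus down: from here the stroke is rendered by the EPD
                // pipeline itself, which is presumably why it keeps up with the pen.
            }
            override fun onEndRawDrawing(outLimitRegion: Boolean, point: TouchPoint) {}
            override fun onRawDrawingTouchPointMoveReceived(point: TouchPoint) {}
            override fun onRawDrawingTouchPointListReceived(points: TouchPointList) {
                // The completed stroke arrives here, so the app can redraw it onto
                // its own bitmap and keep it across a full screen refresh.
            }
            override fun onBeginRawErasing(shortcut: Boolean, point: TouchPoint) {}
            override fun onEndRawErasing(outLimitRegion: Boolean, point: TouchPoint) {}
            override fun onRawErasingTouchPointMoveReceived(point: TouchPoint) {}
            override fun onRawErasingTouchPointListReceived(points: TouchPointList) {}
        }

        private val touchHelper: TouchHelper = TouchHelper.create(surfaceView, callback)

        fun start() {
            val limit = Rect()
            surfaceView.getLocalVisibleRect(limit)
            touchHelper.setLimitRect(limit, emptyList())
                .setStrokeWidth(3.0f)
                .openRawDrawing()
            touchHelper.setRawDrawingEnabled(true)
            // The flag behind the RawDrawingRenderEnabled checkbox mentioned further down.
            touchHelper.setRawDrawingRenderEnabled(true)
        }

        fun stop() {
            touchHelper.setRawDrawingEnabled(false)
            touchHelper.closeRawDrawing()
        }
    }

My understanding is that with raw drawing enabled the stroke bypasses the normal view redraw, which would explain why the stylus path feels so much quicker than the mouse and finger paths described below.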
When using the mouse, there is a quite noticeable gap between where the pointer is displayed and where the stroke is. One thing about using the mouse: unlike in "ordinary" environments (e.g. a PC), the pointer image isn't updated often enough to look smooth when moving the mouse -- it's jumpy. (For reference, I compared the "mouse" gap with the gap when drawing with the stylus in the bundled Note app on the original Max. The experience on the original Max is superior.)
When drawing strokes with a finger tip, the gap appears bigger than when using the mouse.
As far as I can tell, detection is fast in all three cases; it's the resulting drawing that's slow for the mouse as well as for the finger tip.
So to summarize the delays, my current impression is:
Detection: stylus == mouse == finger
Drawing: stylus < mouse < finger
Hope this was clear.
(On a related note, I noticed in Onyx's PenStylusTouchHelperDemoActivity (part of their sample app named 'sample') that when the checkbox for RawDrawingRenderEnabled is unchecked (and the PEN button is then pressed), drawing with the stylus results in rectangles being drawn, with the just-drawn one erased if the stylus is moved further. The speed of this drawing reminds me of the speed of drawing with the mouse.)
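If anyone wants to poke at that, the flag behind the checkbox is, as far as I can tell, just setRawDrawingRenderEnabled; the checkbox wiring below is my guess at how the demo hooks it up, not code lifted from it. With rendering disabled, the raw points still seem to arrive in the callback, but whatever the app draws itself (here, apparently rectangles) goes through the ordinary, slower path -- hence the mouse-like speed.

    import android.widget.CheckBox
    import com.onyx.android.sdk.pen.TouchHelper

    // Illustrative wiring for a RawDrawingRenderEnabled-style checkbox.
    // Only setRawDrawingRenderEnabled comes from the SDK; the listener setup is assumed.
    fun bindRenderCheckbox(checkbox: CheckBox, touchHelper: TouchHelper) {
        checkbox.setOnCheckedChangeListener { _, isChecked ->
            touchHelper.setRawDrawingRenderEnabled(isChecked)
        }
    }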