Monday, September 13, 2010

Reading #5: Gestures without Libraries, Toolkits or Training: A $1 Recognizer for User Interface Prototypes (Wobbrock)

COMMENTS:

Crazy Chris

SUMMARY:

This paper describes the $1 gesture recognizer. The authors compare the $1 recognizer's performance to that of Rubine's classifier and Dynamic Time Warping (DTW). In developing this recognizer, the authors sought to create something that was resilient to variations in sampling, invariant to position, simple, easily implemented, fast, and able to produce results comparable to existing recognition algorithms.

The $1 algorithm involves 4 steps: Resample the Point Path, Rotate Once Based on the “Indicative Angle”, Scale and Translate, and Find the Optimal Angle for the Best Score. Testing of the $1 recognizer resulted in 97% accuracy with one loaded template and 99.5% accuracy with 3+ templates. However, one weakness of the algorithm is that it cannot distinguish gestures based on orientation or aspect ratio (e.g., it cannot tell the difference between a circle and an oval).
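The four steps can be sketched in plain Python. This is a simplified reimplementation following the paper's pseudocode, not the authors' code; the 64-point resample count, 250-unit reference square, and ±45°/2° golden-section search bounds are the defaults the paper suggests.

```python
import math

N = 64        # resample count (paper's suggested default)
SIZE = 250.0  # reference square for scaling (paper's default)

def path_length(pts):
    return sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))

def resample(pts, n=N):
    # Step 1: resample the stroke into n equidistantly spaced points
    interval = path_length(pts) / (n - 1)
    pts = list(pts)
    new_pts, D, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if D + d >= interval:
            t = (interval - D) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_pts.append(q)
            pts.insert(i, q)  # q becomes the next segment's start point
            D = 0.0
        else:
            D += d
        i += 1
    while len(new_pts) < n:  # guard against float round-off dropping a point
        new_pts.append(pts[-1])
    return new_pts

def centroid(pts):
    return (sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))

def rotate_by(pts, angle):
    cx, cy = centroid(pts)
    c, s = math.cos(angle), math.sin(angle)
    return [((x - cx) * c - (y - cy) * s + cx,
             (x - cx) * s + (y - cy) * c + cy) for x, y in pts]

def rotate_to_zero(pts):
    # Step 2: rotate so the "indicative angle" (centroid -> first point) is 0
    cx, cy = centroid(pts)
    return rotate_by(pts, -math.atan2(pts[0][1] - cy, pts[0][0] - cx))

def scale_and_translate(pts, size=SIZE):
    # Step 3: scale (non-uniformly) to a reference square, center on origin
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
    pts = [(x * size / w, y * size / h) for x, y in pts]
    cx, cy = centroid(pts)
    return [(x - cx, y - cy) for x, y in pts]

def path_distance(a, b):
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def distance_at_best_angle(pts, tmpl,
                           a=-math.radians(45), b=math.radians(45),
                           tol=math.radians(2)):
    # Step 4: golden-section search over rotation for the best match
    phi = 0.5 * (math.sqrt(5) - 1)
    x1, x2 = phi * a + (1 - phi) * b, (1 - phi) * a + phi * b
    f1 = path_distance(rotate_by(pts, x1), tmpl)
    f2 = path_distance(rotate_by(pts, x2), tmpl)
    while abs(b - a) > tol:
        if f1 < f2:
            b, x2, f2 = x2, x1, f1
            x1 = phi * a + (1 - phi) * b
            f1 = path_distance(rotate_by(pts, x1), tmpl)
        else:
            a, x1, f1 = x1, x2, f2
            x2 = (1 - phi) * a + phi * b
            f2 = path_distance(rotate_by(pts, x2), tmpl)
    return min(f1, f2)

def preprocess(pts):
    return scale_and_translate(rotate_to_zero(resample(pts)))

def recognize(pts, templates):
    # templates: dict of name -> preprocessed point list
    pts = preprocess(pts)
    best, score = None, float("inf")
    for name, tmpl in templates.items():
        d = distance_at_best_angle(pts, tmpl)
        if d < score:
            best, score = name, d
    # normalize to a [0, 1]-ish score using the square's half-diagonal
    return best, 1 - score / (0.5 * math.sqrt(2 * SIZE ** 2))
```

Note that the non-uniform scaling in step 3 is exactly why aspect ratio can't be used to separate classes (a circle and an oval normalize to the same shape), and the rotation search in step 4 is why orientation can't either.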


DISCUSSION:

Compared to most sketch recognition algorithms, the $1 recognizer gets the most bang for the buck (no pun intended... ok, maybe a little one). The paper gives a nice explanation of each of the four steps performed by the recognizer, and they correspond well to the pseudocode in the appendix. One thing that was a little confusing was the discussion of the effect gesture speed can have on recognition. The authors mention that Rubine's best results were obtained when the subjects' gestures were of medium speed. I was curious what the speed ranges were, in milliseconds, for the fast, medium, and slow categories; I don't think Table 1 in the paper was very clear about this, or maybe I misread it.
