Complex Gesture Recognition in iOS – Part 2: The iOS Implementation

Well, it took a little longer than I had hoped, but I finally got around to implementing the N-Dollar Gesture Recognizer in Objective-C for iOS. If you remember, this all started with a need for a good iOS implementation of a multi-stroke gesture (I call them glyphs) recognizer. N-Dollar was the clear winner in my research, so without further ado, I present my attempt at implementing it.

MultistrokeGestureRecognizer-iOS

Grab the source here…

To summarize how to use this library in a few bullet points:

  • Initialize a Detector and seed it with templates for the gestures you want to recognize
  • Capture user input and pass it to the Detector
  • When you’re ready to detect the gesture, ask the Detector to calculate which of its templates the user input matches the most.

Initializing the Detector

Initializing the detector is a simple affair:
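
A minimal sketch of that setup (the [WTMGlyphDetector detector] constructor and addGlyphFromJSON:name: are the library’s API; the delegate protocol name and loading the template JSON from the app bundle are assumptions on my part):

    #import "WTMGlyphDetector.h"

    // Keep the detector around so your touch handlers can feed it points later.
    glyphDetector = [WTMGlyphDetector detector];
    glyphDetector.delegate = self; // assumes self adopts the detector's delegate protocol

    // Seed it with at least one template; here the JSON ships in the app bundle.
    NSString *path = [[NSBundle mainBundle] pathForResource:@"D" ofType:@"json"];
    NSData *jsonData = [NSData dataWithContentsOfFile:path];
    [glyphDetector addGlyphFromJSON:jsonData name:@"D"];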

It’s important to note that the detector is pretty useless unless you seed it with some templates to match against. Here I add a template of a gesture with [glyphDetector addGlyphFromJSON:name:].

The JSON representation of these gestures is simply an array of X,Y coordinate points. For instance, here’s a JSON array-of-points for a gesture that resembles the letter ‘D’. You can choose to use multiple strokes, or a single stroke with all of the points combined. The detector doesn’t care!
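
For illustration, a two-stroke template for that ‘D’ might look something like this; the coordinate values are made up, and the nesting (one inner array per stroke, each stroke an array of [x,y] pairs) is an assumption based on how the strokes are captured:

    [
      [ [30,10], [30,120] ],
      [ [30,10], [80,20], [95,65], [80,110], [30,120] ]
    ]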

Capturing User Input

It’s really up to you how you want to capture the user input. The UIResponder class is a good start, but I’m personally using Cocos2d, so the following code makes use of its abstraction of touch events:
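
Something along these lines, sketched for a CCLayer subclass; only addPoint: comes from the detector, the rest is the usual Cocos2d touch idiom:

    // In a CCLayer subclass with touch handling enabled
    // (self.isTouchEnabled = YES in init). ccTouchesBegan: and
    // ccTouchesEnded: can feed points to the detector the same way.
    - (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        UITouch *touch = [touches anyObject];

        // Convert the UIKit touch location into Cocos2d's GL coordinate space.
        CGPoint location = [touch locationInView:[touch view]];
        location = [[CCDirector sharedDirector] convertToGL:location];

        // Hand the sampled point to the detector.
        [glyphDetector addPoint:location];
    }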

Note that at each firing of the touch event handler, I take the corresponding point and add it to the recognizer through the [glyphDetector addPoint:] message.

Detect the Glyph

Once you’re satisfied with the user input, you can call [glyphDetector detectGlyph]. This will use the N-Dollar/Protractor algorithm to compare the user input with the templates you defined. For each template, the detector computes a score; higher is better. It’ll return the pre-defined gesture with the highest score to its delegate, and from there it’s up to you whether to trust the match or wait for more user input!
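
On the delegate side, the callback might look something like the sketch below; the selector and the WTMGlyph name property are assumptions, so check them against the library’s headers, and the threshold is just an example value:

    // Hypothetical delegate callback; verify the exact selector in the
    // library's delegate protocol before relying on it.
    - (void)glyphDetected:(WTMGlyph *)glyph withScore:(float)score {
        // Example threshold: ignore weak matches and keep collecting input.
        if (score > 1.5) {
            NSLog(@"Matched glyph %@ (score %f)", glyph.name, score);
        }
    }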

Update: Here’s more info on interpreting the score that’s returned.

Up Next, An App

Don’t expect it any time soon, but I hope to put together an example app that demonstrates this library in its entirety. Please feel free to contact me at brit (at) this domain with any questions, or just shoot me a message on the GitHub repo!

21 thoughts on “Complex Gesture Recognition in iOS – Part 2: The iOS Implementation”

  1. Hi,
    Great work, and thanks for the tutorial!!
    Any chance to get an iPhone app implementing this library?

    Thanks

  2. Thanks! I’ve got an app in the works but it’s slow going. I hope to have it done soon, and I’ll definitely make a post about it.

  3. Hi, I’m so happy I found your site. I looked at N$ about a year ago but couldn’t find an Objective-C implementation. One question: what’s the best way to create these JSON template files? Specifically, I plan to create template files for characters “a, b, c, etc.”

    Thanks again!

  4. Hi! Thank you for your work. I’m trying to use the WTMGlyphDetector in my own Cocos2d project and I’m experiencing some difficulties: I don’t know how to correctly import your library into my project. I’ve added the .xcodeproj file to my project, added the WTMGlyph library to my target’s dependencies, and then added the header search path for the WTMGlyph files. When I import "WTMGlyphDetector.h" and declare a WTMGlyphDetector property on my class, all works fine. But when I write glyphDetector = [WTMGlyphDetector detector] in the init method as you suggest, Xcode returns a linker error:

        Undefined symbols for architecture i386:
          "_OBJC_CLASS_$_WTMGlyphDetector", referenced from:

    Can you explain what I’m doing wrong? Thank you.

  5. Looks like an old comment but…
    I print the touch location to the debug console whenever the touch begins, moves, and ends. Then I manually copy/paste it into a text editor…

    When touch begins:
        printf("[ [%d,%d]", (int)touchLocation.x, (int)touchLocation.y);

    When touch moves:
        printf(",[%d,%d]", (int)touchLocation.x, (int)touchLocation.y);

    When touch ends:
        printf(",[%d,%d] ],\n", (int)touchLocation.x, (int)touchLocation.y);

    (As you can see this is C++, but you get the idea.)

  6. Hi, thank you for the implementation file. Please send an example app; it would be very useful for everyone.

  7. Shouldn’t the line: [self addGlyphFromJSON:jsonData name:@"D"];
    really be: [glyphDetector addGlyphFromJSON:jsonData name:@"D"];?
    CCLayer doesn’t implement addGlyphFromJSON:name:.

  8. My glyphDetector is returning the same score for all possible glyphs (0.00000000). Therefore it selects the same glyph (the first one loaded) every time, regardless of what I draw on the screen. Any ideas?

  9. How exactly did you fix this?  

    I am having the same issue and can’t figure out what the problem is.

  10. I’m loading up the whole alphabet via JSON files and it takes almost 20 seconds! I’m looking for a way to speed up this process. Object serialization? Static JSON data in my code? Thanks.

  11. Hello, it’s a great tutorial. I’m a beginner. I’ve got the project itself to run, but where is the library that I can import? Is it there, or do I have to build it from the project? Thanks in advance.

  12. Thanks for the awesomeness! I’m a beginner, but I want to warn about a little problem that kept me stuck for an hour until I found it. If you import the .xcodeproj by dragging it in (the main one, not the demo), be careful: it doesn’t reference the WTMDetectionResult.h/.m files, so you have to fix that yourself, otherwise you will get an error.
