WANTED: Programmer for External Accelerometer Gesture Recognition App (EAGRA)
EAGRA is an iPhone application which recognizes and counts repeating gestures. The initial app will have four gestures and three users, plus an “open” user which uses data from all three users for a selected gesture. The data that will drive these recognition modes has been collected, and will be provided to you along with a full functional spec and detailed gesture descriptions. The app is expected to work on iPhones and use real-time accelerometer data gathered via the Node sensor platform (API available at: developer.variabletech.com).
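To illustrate the mode structure described above, here is a rough Python sketch (illustrative only; the delivered app would be native iOS code, and the user and gesture names other than “Alice” and “Golf Swing” are placeholders, not from this spec):

```python
# Placeholder names: only "Alice" and "Golf Swing" appear in this spec.
USERS = ["Alice", "User 2", "User 3"]
GESTURES = ["Golf Swing", "Gesture 2", "Gesture 3", "Gesture 4"]

def training_data_for(user, gesture, dataset):
    """Return the recordings that drive a recognition mode.

    `dataset` maps (user, gesture) -> list of recorded accelerometer
    traces. Selecting the "Open" user pools every user's data for the
    selected gesture, per the spec.
    """
    if gesture not in GESTURES:
        raise ValueError("Please Select Gesture")  # mirrors the Scenario 2 error
    if user == "Open":
        pooled = []
        for u in USERS:
            pooled.extend(dataset.get((u, gesture), []))
        return pooled
    return list(dataset.get((user, gesture), []))
```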
Scenario 1: Alice Goes to the Driving Range
User Action: Alice likes to practice her golf swing, and tapes a Node sensor platform to the shaft of her club when she goes to the range. Today, she wants to count the number of swings she takes while at the range. She opens the app and sees Screen 1. She selects “Alice” from the User Menu and “Golf Swing” from the Gesture Menu. She then hits the Start Button.
App Action: Oh No! The app shows an error message: “No Sensor Found.”
User Action: Alice looks at her Node, and realizes that it’s out of battery. She recharges the Node, then presses the Node Handshake Button. The app finds the Node via Bluetooth, and the Node Handshake Button goes from reading “Connect Node” to “Node Connected.” Alice presses the Start Button.
App Action: The app loads a gesture-recognition mode corresponding to the User and Gesture selected, and then presents Screen 2. The Mode Display reads “Alice: Golf Swing”
User Action: Alice begins hitting balls with her club.
App Action: The Gesture Count increases 1-for-1.
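As a rough sketch of the 1-for-1 counting behavior, one simple baseline is hysteresis on the acceleration magnitude (Python, illustrative only; the thresholds are made-up numbers, and the real recognition modes would be trained on the collected per-user data rather than fixed thresholds):

```python
import math

def count_gestures(samples, high=2.0, low=1.2):
    """Count repetitions in a stream of (x, y, z) accelerometer samples
    using hysteresis on the acceleration magnitude (in g).

    A repetition is counted each time the magnitude rises above `high`
    after having last fallen below `low`, so one swing produces exactly
    one count rather than several.
    """
    count = 0
    armed = True
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if armed and mag > high:
            count += 1
            armed = False      # wait for the swing to settle
        elif mag < low:
            armed = True       # ready to count the next swing
    return count
```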
User Action: Alice sets down her club, picks up her phone, and hits the Reset Button.
App Action: The app saves the raw accelerometer data and the gesture count to the SD Card on the phone. Since Alice didn’t input any text into the comment box, no text is saved.
Scenario 2: Suzie Tries out the App
User Action: Suzie sees her friend Alice practicing her swing with a widget taped to her club. “What the hell is that?” Suzie asks, “Can I try?”
Alice hands her club and phone to Suzie, saying, “Sure. Just select ‘Open’ from the User Menu and make sure the Node Handshake Button reads ‘Node Connected’.” Suzie selects “Open” from the User Menu, sees that the Node is connected, and hits the Start Button.
App Action: Oh No! The app shows an error message: “Please Select Gesture.”
User Action: Suzie selects “Golf Swing” from the Gesture Menu. She then hits the Start Button.
App Action: The app loads a gesture recognition mode drawing on golf swing data from all users, and then presents Screen 2. The Mode Display reads “Open: Golf Swing.”
User Action: Suzie hits some balls with Alice’s club. She then enters her name and a few thoughts on the application in the Comment Box. She then hits the Reset Button.
App Action: While Suzie hits balls, the Gesture Count increases 1-for-1. When Suzie hits the Reset Button, the app saves the raw accelerometer data, the gesture count, and the text in the comment box to the phone’s SD Card.
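The Reset Button save step in both scenarios could look something like the following Python sketch (the CSV layout and function name are assumptions for illustration; the spec only requires that the raw data, the count, and any non-empty comment be written to local storage):

```python
import csv

def save_session(path, samples, gesture_count, comment=""):
    """Write one session to local storage, as the Reset Button does:
    the gesture count, the raw (x, y, z) samples, and the Comment Box
    text only if it was not left empty (per Scenario 1)."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["gesture_count", gesture_count])
        if comment:                      # empty comment box -> no text saved
            w.writerow(["comment", comment])
        w.writerow(["x", "y", "z"])      # header for the raw samples
        w.writerows(samples)
    return path
```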
This is meant to be a small, initial test project for someone who has basic familiarity with gesture recognition and/or using algorithms to process data. Upon prompt, effective completion of this project, more work will follow immediately.
IMPORTANT: Please include "I read the spec" in the subject line of your email so that I know you've read this far. In addition, please include in your proposal some thoughts as to the algorithms, frameworks, or technologies you hope to use to create the gesture recognition modes. Thank you for your consideration.