January 2013
Built in 48 hours at the PennApps Spring 2013 hackathon, Social Sign is a friendly tool for learning sign language! Using the Leap Motion, our team implemented a rudimentary machine learning algorithm to track a user's hand gestures and identify American Sign Language signs. Social Sign visualizes these gestures and broadcasts them in textual and pictorial form to other signers in the signing room.
In standard chat room fashion, the interface permits written communication, but with enhanced learning in mind: it's all about learning a new way to communicate. Social Sign - handing communication to you. Built using Node.js, MongoDB, Socket.io, Three.js, and WebGL.
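For illustration, here is a minimal sketch of the broadcast step, assuming a current Socket.io API and hypothetical event names ('joinRoom', 'gesture') rather than the original project's code:

// Sketch only: a Node/Socket.io server relays a recognized sign
// to the other signers in the same signing room.
const { Server } = require('socket.io');
const io = new Server(3000);

io.on('connection', (socket) => {
  // A client joins a named signing room.
  socket.on('joinRoom', (room) => socket.join(room));

  // A client sends a classified gesture, e.g. { room: 'lobby', letter: 'A' };
  // rebroadcast it to everyone else in that room.
  socket.on('gesture', ({ room, letter }) => {
    socket.to(room).emit('gesture', { letter, from: socket.id });
  });
});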
Aspects: Leap Motion, JavaScript
Team Members: Michael L. Rivera, Natalie Gravier