I came across an article describing a CMU student's project: a glove that takes sign language and translates it into English. The link above will take you to his blog, where he describes each aspect in detail, but I will give you a basic overview.
The glove is fitted with flex sensors, which measure how much they are bent: the value each sensor returns varies with the amount of flex applied to it. One sensor is attached to each finger, and all of them plug into an Arduino Mega microcontroller. The input from the flex sensors is compared against a library of signing gestures and matched to the closest one. The matched gesture is then displayed on an LCD screen hooked up to the Arduino, so the user can see what he or she is signing. The student also hooked up a speaker to the Arduino so that, once the desired input is recognized, the user can share it aloud instead of making others read the screen.
This is a good example of how embedded and sensor systems are being used for augmented reality!
Hi, my name is Jared Smith. I studied Computer Science with a minor in Math at NC State University. I am currently working at Schneider Electric as an Application Design Engineer. There, I help design and program industrial automation machinery using high-end servo controllers that we manufacture. This blog contains information about projects from my undergrad research as well as projects I currently work on in my free time.