Spaceflight Research Goal #1: MYO Sign Language Translation

A week ago, I got in touch with a research group at Arizona State University that had managed to get their MYO armbands to recognize (some) sign language. For background on the MYOs, see Thalmic Labs and my older post.

MYO muscle-sensing armband (front view), from Thalmic Labs

A few years ago, I ordered a pair of MYOs when they had just come out. I tested them, and it turned out they weren’t sensitive enough to “read” sign language. But the ASU group proved me wrong. So I reached out to them, introduced myself, and told them about my involvement in the PHEnOM Project.

Their project, SCEPTRE, managed to interpret simple signs with 97% accuracy from a sample of ASL users. Their system paired MYOs with a smartphone linked to a server for translation (see below). They are now working on a server-free version for easier portability.
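The general shape of such a pipeline — windowing the armband's multi-channel EMG stream, extracting features, and classifying each window as a sign — can be sketched roughly as below. To be clear, the window size, RMS features, and toy nearest-centroid classifier here are my own illustrative assumptions, not details of the actual SCEPTRE system, and the EMG data is simulated rather than read from a real MYO.

```python
import numpy as np

WINDOW = 50    # samples per window (assumed; the MYO streams EMG at 200 Hz)
CHANNELS = 8   # the MYO armband has 8 EMG sensor pods

def rms_features(window):
    """Collapse a (WINDOW, CHANNELS) EMG window into one RMS value per channel."""
    return np.sqrt(np.mean(np.square(window), axis=0))

class NearestCentroid:
    """Toy gesture classifier: predict the label whose centroid is closest."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = np.array(
            [np.mean([x for x, lab in zip(X, y) if lab == label], axis=0)
             for label in self.labels])
        return self

    def predict(self, feats):
        dists = np.linalg.norm(self.centroids - feats, axis=1)
        return self.labels[int(np.argmin(dists))]

# Simulated training data: two fake "signs" with different muscle activation levels.
rng = np.random.default_rng(0)
rest = [rms_features(rng.normal(0, 0.1, (WINDOW, CHANNELS))) for _ in range(20)]
fist = [rms_features(rng.normal(0, 1.0, (WINDOW, CHANNELS))) for _ in range(20)]
clf = NearestCentroid().fit(rest + fist, ["rest"] * 20 + ["fist"] * 20)

# Classify a fresh high-activation window.
new_window = rng.normal(0, 1.0, (WINDOW, CHANNELS))
print(clf.predict(rms_features(new_window)))  # prints "fist"
```

A real system would replace the simulated windows with the armband's live EMG stream and the toy classifier with whatever model the server runs, but the window → features → label structure stays the same.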

Original SCEPTRE system, with server

We have agreed to design some experiments on this over the summer. We will find ASL signs for words and phrases used in mission control and test them under clothing, to simulate a spacesuit. At PHEnOM’s first ground school this August, I will test it inside a real spacesuit from Final Frontier Design (if the arms aren’t too tight for MYOs!).

Even though the New Shepard suborbital craft is fully computer-controlled, some communication with mission control or fellow astronaut-passengers would be beneficial. If successful, this could find its way into orbit, where communication is absolutely crucial!
