Jan 02 2014

How NASA’s Jet Propulsion Lab Is Using Microsoft’s Kinect Motion Sensor

This revolutionary technology could change the way space missions are operated.

It’s not every day that consumer technology is good enough for NASA, but Microsoft’s Kinect motion sensor recently made it into NASA’s Jet Propulsion Laboratory. Researchers were able to use the device to remotely control a robotic arm. The technology could one day allow a human to operate robots (on the surface of Mars, for example) from an orbiting spacecraft, or even from the safety of Earth’s surface.

Tech site Engadget has more on the amazing technology:

NASA's Jet Propulsion Laboratory has been on the hunt for a more natural way to maneuver robots in space for some time now, resulting in cool experiments like using a Leap Motion controller to remotely control a Mars rover and using an Oculus Rift plus a Virtuix Omni to take a virtual tour of the Red Planet. It therefore made sense for the folks at JPL to sign up for the latest Kinect for Windows developer program in order to get their hands on the newer and more precise Kinect 2 (which, incidentally, is not available as a standalone unit separate from the Xbox One) to see if it would offer yet another robotics solution.

They received their dev kit in late November, and after a few days of tinkering, were able to hook up an Oculus Rift with the Kinect 2 in order to manipulate an off-the-shelf robotic arm. According to our interview with a group of JPL engineers, the combination of the Oculus's head-mounted display and the Kinect's motion sensors has resulted in "the most immersive interface" JPL has built to date.

Alex Menzies, a human interfaces engineer at JPL, describes this combination of a head-mounted display and the Kinect motion sensor as nothing short of revolutionary. "We're able for the first time, with [a] consumer-grade sensor, [to] control the entire orientation rotation of a robotic limb. Plus we're able to really immerse someone in the environment so that it feels like an extension of your own body -- you're able to look at the scene from a human-like perspective with full stereo vision. All the visual input is properly mapped to where your limbs are in the real world." This, he says, is very different from just watching yourself on a screen, because it's very difficult to map your own body movements. "It feels very natural and immersive. I felt like you have a much better awareness of where objects are in the world."
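To give a rough sense of what "controlling the entire orientation rotation of a robotic limb" involves: skeletal-tracking sensors like the Kinect typically report each tracked joint's orientation as a unit quaternion, which a controller must convert into angles a robotic joint can actually execute. The sketch below is purely illustrative and is not JPL's actual pipeline; the function names, the quaternion convention, and the servo limits are all assumptions.

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion (the form in which skeletal-tracking
    sensors commonly report joint orientation) to roll, pitch, and yaw
    in degrees. Convention assumed here: intrinsic x-y-z rotation."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    sinp = 2 * (w * y - z * x)
    # Guard against numerical drift pushing asin out of its domain.
    pitch = math.copysign(math.pi / 2, sinp) if abs(sinp) >= 1 else math.asin(sinp)
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

def to_servo_command(angle_deg, lo=-120.0, hi=120.0):
    """Clamp a target joint angle to a (hypothetical) servo's
    mechanical range before sending it to the arm."""
    return max(lo, min(hi, angle_deg))
```

For example, a quaternion representing a 90-degree twist of the wrist about its vertical axis would come out as 90 degrees of yaw, which is then clamped to whatever the corresponding servo can physically reach. The hard part the JPL engineers describe is doing this continuously and mapping the result back into the head-mounted display so the arm feels like an extension of the operator's body.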

Check out Engadget's full story and the accompanying video to learn more.

Credit: NASA/YouTube
