The STEM system is open, modular, and completes virtual reality with positional tracking
I’m standing outside a rather pretty house in Italy. The sun is shining, I can see the calm ocean over the balcony, and everything is at peace. I notice a butterfly fluttering past, and I reach out my hand to grab it. The butterfly rests on my hand, and I bring my head closer to study the insect, turning it this way and that, craning my neck around to see every leg and the pattern on its wings.
I let go, and the butterfly falls from my hand and lands on the ground. “I think that kills them,” I hear a disembodied voice say from behind me. I pull off the helmet and headphones, and I’m once again assaulted by the noise of PAX. “What did you think?” the man asks.
The STEM system
“It’s next-generation motion tracking from Sixense, much better precision, much better range, obviously wireless, and lower latency,” Danny Woodall, the creative director of Sixense tells me. I’ve been using a prototype of the wireless controllers, and in a goofy twist one of them is actually strapped to the top of my head.
The Oculus Rift provides rotational tracking; this is what allows you to look up and down, or left and right. What it lacks, at least at the moment, is positional tracking. You can’t lean down to study the words in a book, or lean forward to peer out a window, although thanks to the Virtuix system you can now walk through VR worlds. The STEM system, added to the Rift’s strap, supplies that positional tracking, and it completely changes the feel of virtual reality.
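The difference is easiest to see in how a camera pose is built. A minimal sketch, with made-up numbers (the function and values here are illustrative, not from any Sixense or Oculus SDK): rotation-only tracking pins the camera’s position column to the origin, while positional tracking lets it move.

```python
def view_matrix(rotation, position):
    """Build a 4x4 camera matrix from a 3x3 rotation (row-major lists)
    and a position vector."""
    m = [[rotation[r][c] for c in range(3)] + [position[r]] for r in range(3)]
    m.append([0.0, 0.0, 0.0, 1.0])
    return m

# Looking straight ahead in both cases.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

# Rotation-only tracking: leaning forward in real life changes nothing
# on screen, because the reported position never leaves the origin.
orientation_only = view_matrix(identity, [0.0, 0.0, 0.0])

# With positional tracking, the same 20 cm lean translates the camera too,
# which is what lets you put your head into the virtual chimney.
leaned_forward = view_matrix(identity, [0.0, 0.0, -0.2])
```

Both cameras face the same way; only the translation column differs, and that difference is the entire effect described above.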
Now I can lean my head into the chimney, and turn to look up the flue. I can lean out of a window and then look left or right to see what’s outside. My head in the game does everything my head does in real life, and the effect is surreal. I can use all of my natural movements and instincts to explore the world.
“I’m going to give you your hands,” Woodall says, and once again I find myself dealing with that odd feeling of being in two places at once. He puts a controller in each hand, and walks me through the calibration dance. I pull my hands to my shoulders, and pull the trigger. I hold them out and pull the trigger. I’m calibrated, and I can see my hands in the game.
The Rift alone can be disconcerting: you may be holding a controller or moving your hands in real life, and your brain is telling you that you’re making these motions, but you have no visual cues for those actions.
That sense of disconnect disappears when I’m holding the STEM controllers. I can move my hands closer to my face, or further away. I can twist my wrists and see the hands move in every direction. It feels natural, and right. I pull the analog trigger under each index finger, and the hands on the screen make a fist. I let go, and my virtual muscles relax.
This is how prosthetic limbs must feel.
It takes a while to get used to these fake limbs, and the results are crazy. I have the full range of my normal movements, so everyone makes sure I have plenty of space to swing my arms and explore. I pick up a barrel, wind up, and throw it over the balcony, then watch it sail off into the distance. I feel like I have superpowers.
A flexible, powerful system
The power of the STEM system isn’t just in these wireless controllers, although they dramatically improve how you interact with the environment in virtual reality. The system is built around a modular sensor, and each base can track up to five sensors, which can be placed in or on anything. While I have a controller strapped to my head right now, the final design will allow the sensor to be placed inside a small power pack with a clip, and you could put that on the Rift or even a normal cap to add head-based positional tracking.
The possibilities are limitless. Clip a sensor on each sleeve and you can use your hands in a boxing game. Clip one on each sleeve and then one on each foot and you can create a kickboxing game. Clip one on your belt and you can add a motion for diving under cover.
There will be a full SDK so developers will be able to clip the sensor onto any existing or new controller to add positional tracking and then incorporate it into demos or games. In fact, Sixense will be releasing the files for each part of the hardware for 3D printers, allowing anyone to rapidly create their own hardware with sockets for the sensor, or you can simply clip it onto an existing controller.
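The SDK hasn’t shipped yet, so any code is speculative; still, the architecture described above suggests roughly this shape. Every name below (`StemBase`, `poll`, `SensorSample`) is invented for illustration and is not Sixense’s actual API.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One tracked sensor: position in meters relative to the base,
    orientation as a (w, x, y, z) quaternion."""
    position: tuple
    orientation: tuple

class StemBase:
    """Hypothetical stand-in for a base station tracking up to five sensors."""
    MAX_SENSORS = 5

    def __init__(self):
        # Placeholder data; a real base would report live measurements.
        self._samples = {
            i: SensorSample((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))
            for i in range(self.MAX_SENSORS)
        }

    def poll(self, sensor_id: int) -> SensorSample:
        if not 0 <= sensor_id < self.MAX_SENSORS:
            raise ValueError("a base tracks at most five sensors")
        return self._samples[sensor_id]

base = StemBase()
head = base.poll(0)       # sensor clipped to a cap or the Rift strap
left_hand = base.poll(1)  # sensor socketed into a 3D-printed controller
```

The point of the sketch is the modularity: a game doesn’t care whether sensor 0 is in a controller, a clip, or a printed socket; it just polls for a pose.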
“We’re going to make it very open, so anyone can do this. That’s our goal,” Woodall tells me. The sensor has an eight-foot range, so you can stand up, take a few steps away from your computer, and still use it. Sixense claims that since the sensors don’t use inertia to track movement, there won’t be any drift, even with quick motions.
They also stressed that this technology isn’t just for virtual reality. Clipping a STEM module to the bill of your cap will add TrackIR-like functionality to any game, and you can imagine an interface that automatically zooms images when you lean closer to the screen.
A developer could create scenes with multiple levels of parallax scrolling that use the positional tracking to adjust the image as you look at it from different angles, creating an effective, intuitive, and glasses-free 3D effect. The handsets will allow you to interact with any virtual world with a full range of motion in 3D, whether you’re looking at a monitor, television, or using the Rift.
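The parallax idea reduces to simple arithmetic: layers nearer the viewer shift more than distant ones as the tracked head moves. A rough sketch, with invented scale factors (a real scene would tune the depth model per layer):

```python
def parallax_offsets(head_x: float, layer_depths: list) -> list:
    """Horizontal offset for each scrolling layer given the head's sideways
    displacement (both in meters). Nearer layers (smaller depth) move more,
    which is what creates the illusion of depth on a flat monitor."""
    return [head_x / depth for depth in layer_depths]

# Head moves 10 cm to the right; three layers at 1 m, 2 m, and 4 m.
offsets = parallax_offsets(0.10, [1.0, 2.0, 4.0])
# The nearest layer shifts the most, the farthest the least.
```

Because the offsets fall off with depth, even two or three layers read as a convincing window into the scene when driven by real head positions.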
The system isn’t just powerful; it’s flexible and easy to use. It’s also backwards compatible with all the programs created for the Razer Hydra, so there is already a variety of demos and games to try. This new version isn’t just wireless and more precise: the ability to track up to five modular sensors placed on different parts of the body will open up an entirely new world of interaction with games, whether people are playing in virtual reality or not.
The Kickstarter has just launched, and $200 will get you a set of the hardware with two sensors. Sixense expects the hardware to ship in summer of 2014 and, after playing with the hardware at PAX Prime, I plan to be one of the first backers.