Intel Invests in Voice-Controlled Indie Game There Came an Echo

"Instead of 'Grace, hold position,' you might choose to say 'yo girl, hold up.'"

One of the Big Ideas on Intel’s plate in 2014 is RealSense, the chipmaker’s attempt to guide the future of human-computer interaction. Debuted at CES, RealSense is about letting computers see, hear and understand users’ gestures and emotions in human-like ways.

To support that interaction, the company has already partnered with Nuance, to make a special version of its Dragon Dictation software; Microsoft, to enhance Skype; 3D Systems, to make scanning 3-D printable objects from your desk more feasible; DreamWorks and Scholastic, to create content; and with seven OEMs to get RealSense cameras and microphones integrated into tablets, notebooks and desktops in the second half of the year.

Add one more to the list: Intel’s RealSense group has invested directly in Iridium Studios, the developer behind the voice-controlled indie game There Came an Echo, which raised $115,000 on Kickstarter last year. Intel representatives and Iridium CEO Jason Wishnov both declined to comment on the specifics of the investment, but it’s easy to see why it makes (real) sense.

There Came an Echo is an upcoming strategy game that turns players into generals, of a sort. Instead of clicking around a screen to move troops into position and take out enemies, players issue orders like “all units, shields up,” “switch to pistol” and “move to Bravo-3.”

Those orders all come from a whitelisted dictionary. One of the features advertised in the game’s Kickstarter, though, is the ability to teach There Came an Echo custom phrases.

“For instance, instead of ‘Grace, hold position,’ you might choose to say ‘yo girl, hold up,’ or instead of ‘weapons free,’ you might say, ‘not the gumdrop buttons!’” the Kickstarter reads. “You’ll be able to direct your troops in a style completely your own.”
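Mechanically, the custom-phrase feature amounts to mapping arbitrary spoken strings onto that small whitelist of canonical orders. Purely as a hedged illustration (the game’s actual implementation isn’t public, and every name below is invented), an alias layer of that kind might look like this:

```python
import re

# Hypothetical sketch of a phrase-to-command alias layer. None of these
# names come from There Came an Echo's code; they only illustrate remapping
# custom phrases onto a fixed whitelist of canonical orders.

CANONICAL_COMMANDS = {
    "grace hold position",
    "weapons free",
    "all units shields up",
    "switch to pistol",
    "move to bravo 3",
}

# Player-defined aliases, e.g. configured in an options menu.
ALIASES = {
    "yo girl hold up": "grace hold position",
    "not the gumdrop buttons": "weapons free",
}

def normalize(utterance: str) -> str:
    """Lowercase and strip punctuation so 'Yo girl, hold up!' matches its alias."""
    return re.sub(r"[^a-z0-9 ]", "", utterance.lower()).strip()

def resolve(utterance: str) -> str | None:
    """Map a recognized utterance to a whitelisted command, or None."""
    phrase = normalize(utterance)
    phrase = ALIASES.get(phrase, phrase)
    return phrase if phrase in CANONICAL_COMMANDS else None

print(resolve("Yo girl, hold up!"))        # -> grace hold position
print(resolve("Not the gumdrop buttons!")) # -> weapons free
print(resolve("open the pod bay doors"))   # -> None (not in the whitelist)
```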

Put another way: The game lets you talk and be understood like a human, which is part of Intel’s vision for the future of many types of programs.

“We believe real world, which is a part of you, and the virtual world can actually blend,” Intel marketing director Anil Nanduri said.

Most of Intel’s OEM partners, Nanduri added, will be shipping hardware with both the RealSense camera — which can perceive how far away objects are, in 3-D — and “multi-array microphones” for voice control.

“When you have both, you can have multi-modal interaction,” he said. “For example, you can give voice commands while playing a game to change a map, or duck, or hide, et cetera. … [The camera] can use facial analysis to understand if you’re smiling or excited.”
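Nanduri’s “multi-modal” framing boils down to merging two input streams: recognized voice commands and camera-derived cues such as depth or facial expression. As a rough sketch only (this is not the RealSense SDK; every type, queue and handler below is invented for illustration), a game loop could combine the two channels like so:

```python
# Hypothetical multi-modal input loop. The event types, queue and handler
# are invented for illustration; they do not reflect the Intel RealSense
# SDK or There Came an Echo's code.
from dataclasses import dataclass
from queue import Empty, Queue

@dataclass
class VoiceEvent:
    command: str      # e.g. "change map", "duck", "hide"

@dataclass
class CameraEvent:
    expression: str   # e.g. "smiling", "neutral"
    depth_m: float    # distance of the player's face from the camera, in meters

def handle(event) -> None:
    """React to whichever modality produced the event."""
    if isinstance(event, VoiceEvent):
        print(f"voice command -> {event.command}")
    elif isinstance(event, CameraEvent):
        print(f"camera: player looks {event.expression} at {event.depth_m:.1f} m")

def game_tick(events: Queue) -> None:
    """Drain whatever arrived this frame, from either input channel."""
    while True:
        try:
            handle(events.get_nowait())
        except Empty:
            return

# In practice the speech and camera drivers would push onto the same queue.
q: Queue = Queue()
q.put(VoiceEvent("duck"))
q.put(CameraEvent("smiling", 0.6))
game_tick(q)
```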

This is not the company’s first gaming investment. Two years ago, Intel Capital invested in Korean social game developer LIFO Interactive and Chinese computer-to-TV gaming B2B company Transmension. But the RealSense-Iridium funding is a more direct articulation of how important software is to the hardware company’s ambitions.

Intel is hardly the first company to try to shake up how we interact with our computers. Recall the Leap Motion, which tried to bring gesture controls to the PC last year; at launch, my colleague Katie Boehret found it to be “tiring after even a short time” and “challenging to integrate it into my regular computer routine.”

And that’s just it — for all the arguable limitations of a mouse and keyboard, they’re part of an automatic routine for millions of people. No matter how good its technology winds up being, Intel will need to nudge people out of their routines if it wants RealSense to click. A fun, addictive game that makes users comfortable with voice commands could go a long way toward that goal.

This article originally appeared on Recode.net.
