Project Natal and other natural user interface products Microsoft is working on are helping usher in a new generation of human–computer interaction.
Robbie Bach demonstrates how body movements control Project Natal, a natural user interface Xbox 360 add-on scheduled for release in the fall.
Take a look at the video below.
Bill Buxton first used a computer in 1971. It changed his perspective on life. Even four decades ago, Buxton could picture a future enhanced by technology. Eventually he came to dream about humans and computers having close interaction—being able to operate a computer by gesturing at it or by touching it or having a computer recognize your voice and face. “I’m excited more now than I have been since I’ve been in the business because I can taste it now,” said Buxton, a principal researcher at Microsoft since 2005. “Stuff I’ve been working toward and thinking about and dreaming about for 20 or 30 years is now at the threshold of general usage, and I’m still around to touch and play with it.”
The future has arrived, said Buxton, a leading researcher in human–computer interactions. Touch, face- and voice-recognition, movement sensors—all of these are part of an emerging field of computing often called natural user interface (NUI).
Interacting with technology in these ways is no longer limited to high-tech secret agents and “Star Trek.” Buxton said everyone can enjoy using technology in ways that are more adaptive to the person, location, task, social context, and mood.
He sees a bright future in which entire “ecosystems” of devices and appliances coexist with humans in a way that makes life better for people. Microsoft, with researchers like Buxton, is a leader in developing these more natural ways of interacting with computers. The company will showcase some of this technology at the Consumer Electronics Show (CES) in Las Vegas this week. Project Natal, which turns players into human controllers, is among the most high-profile examples of the coming shift in technology, Buxton said.
Microsoft announced at CES this week that Project Natal will be available in stores this holiday season. Project Natal is the code name for an Xbox 360 add-on that incorporates face, voice, gesture, and object recognition technology to give users a variety of ways to interact with the console, all without needing a controller.
It’s a “delightful” new way to spend time with friends and family playing games, watching TV shows and movies, and listening to music, said Robbie Bach, president of Microsoft’s Entertainment & Devices Division. Bach said Project Natal and other NUI-related products will offer more natural ways to interact with video games, computers, and other technology. “For me, when people talk about touch and voice technologies, or anything related to natural user interface, it all comes back to what’s most natural for the users,” Bach said in an interview before CES. “That’s why you’ll see a variety of user interfaces that are considered natural because each one is tuned to the environment in which it operates.”
The holiday 2010 release of Project Natal will come nearly a decade after the first Xbox console hit the shelves in the holiday season of 2001. “Natal is a next-generation experience that we’re actually delivering this generation,” said Aaron Greenberg, director of product management for Xbox 360. “And [gamers] don’t even need to buy a new console.” Project Natal and other Microsoft-focused NUI projects represent a fundamental shift in the way people can interact with technology.
Natural user interface describes a wide-ranging category of technology that is perhaps most easily identified by what it lacks: the traditional input methods of mice, keyboards, and controllers.
The goal of natural interfaces is not to make the keyboard and mouse obsolete, said August de los Reyes, principal director of user experience for Microsoft Surface. Instead, NUI is meant to remove the mental and physical barriers to technology, to make computing feel more intuitive, and to expand the palette of ways users can experience technology. Whether it’s a receptionist and patient at a doctor’s office separated by a large computer monitor, a family sitting in a living room together in silence, or parents immersed in laptops and kids texting away on cell phones, technology is increasingly creating situations de los Reyes calls “connected but alone.” “Technology today isolates people,” de los Reyes said. NUI and Microsoft Surface are “almost antitechnology solutions.”

A member of the Advanced Studies Program at Harvard University Graduate School of Design and a former visiting associate at Oxford, de los Reyes has become a leader in finding new, intuitive ways to interact with computers. He said natural interfaces are just the latest step in the long evolution of human–computer interaction.
In the past few decades, as computers became widely used, there was the command line interface (CLI) and its flashing cursor calling on users to type commands. Then, there was the graphical user interface (GUI) and its point-and-click mouse and desktop with icons and windows. Both interfaces were revolutionary in their time, and natural interfaces are the next step, de los Reyes said. Microsoft is “absolutely leading” in the category of natural interfaces, de los Reyes said. Microsoft has released, and is continually developing, a number of products that incorporate touch, gestures, speech, and more to make user–computer interaction more natural—more like the way humans interact with one another.
It’s a video game that a grandmother can play with her grandson, using intuitive body movements to compete rather than having to learn a controller, as with Project Natal for Xbox 360. Or it’s an in-car communications and entertainment system such as Microsoft Auto’s Ford SYNC, which responds to a driver’s voice commands, playing favorite songs or answering text messages so that drivers can keep their eyes on the road, hands on the wheel, and mind on their driving. It’s also a table that acts as a collaborative, massive multitouch-screen computer, such as Microsoft Surface, a voice-enabled Windows phone, or a Windows 7-based laptop that lets users navigate files or the Web with their fingers or a pen tool.

Though the human–computer relationship is becoming more personalized, it’s also becoming more personally contained. The Pixar movie “Wall•E” darkly portrayed one possible version of the future in which technology usurped even the most basic human interactions, with humans moving around in high-tech chairs (to save on walking) that meet their every need, including communication, even if the person they are communicating with is sitting right next to them. Though the movie’s message was a warning about technology surpassing humanism, it’s a future that’s not necessarily out of the question, Buxton said. “Without informed design, [technology] is far more likely to go bad than good,” Buxton said. “It’s too important and plays too large a part of our lives to leave these things to chance. The only way it’s going to come out right is if we really work hard on understanding that . . . it’s about people; it’s not about technology.”

Technology for the sake of technology doesn’t interest Buxton. What interests him is when technology takes a subordinate position: the Oscar-winning supporting actor to humankind’s starring role. Buxton has won a number of awards and honors for his work, which advocates innovation, design, and “the appropriate consideration of human values and culture in the conception, implementation, and use of new technology.” He also frequently teaches, speaks, and writes on the subject, including a book and a regular column for BusinessWeek. “It’s not about interface design; it’s about ‘out of your face’ design,” Buxton said. “How do I get the technology out of my face so I can focus on the stuff that’s of interest to me—the material I’m reading, the film I’m viewing, the person I’m talking to, the problem I’m trying to solve—and doing so in a way that brings unexpected delight.”

But creating a more natural relationship between user and technology is not merely a matter of removing mice, keyboards, buttons, and knobs, or of adding new input methods such as speech, touch, and in-air gestures.
“The days are over where a one-size-fits-all interface is appropriate, or even acceptable,” Buxton said. Some technology, although advanced, is not appropriate or natural in certain situations. For example, text messaging is widely used, but texting while driving or walking is difficult, even dangerous. Speech recognition works well while driving or walking but poorly on an airplane, where privacy is important, or in a noisy, crowded restaurant. “The trick of elegant design is making sure you do the right thing the right way for the right person at the right time. What’s right here may be wrong there,” Buxton said. Microsoft leaders say no other company is as well situated to create new user interfaces across a range of devices and contexts. “These are all pieces of a larger puzzle that we are methodically trying to solve in this emerging field,” Buxton said.
Watch this video, in which David Hufford of Xbox public relations explains Project Natal.