EdTekTalk: Can a robot look you in the eye?

By Diana Fingal 8/17/2014

ISTE reached out to inspiring people from a range of fields and asked them to share their insights in mini-keynotes at ISTE 2014. This EdTekTalk is one in a series of five videos that showcase futurists, designers and entrepreneurs.

Long before Siri could deliver a cutting response to the facetious questions we ask our iPhones, computers were little more than cold, impersonal desktops designed for computing and not much else. Few people in the 1980s could imagine having computers in their bedrooms, bathrooms and handbags. But that was before enhanced user experience transformed computers into personal devices that speak to us, know our preferences and seemingly understand us.

The next frontier in technology is applying that same type of user experience enhancement to robots, says Guy Hoffman, a researcher who studies human-robot interaction. He explores the humanity of robots — how they think, feel, act and move as they interact with humans. What he’s discovered is that programming robots to express body language and respond to human emotion inspires creativity and empathy in the humans who work with them.

“I think when it comes to robots working with us face to face and side by side, the user experience is going to be key — much more than the technology of what the robot is actually capable of doing,” Hoffman says. “We all know how to understand body language. A kid knows it, older people know it, people of all different languages know what it means to look someone in the eye.”

The EdTekTalks are returning to the ISTE Conference & Expo! Sign up for the conference mailing list to be the first to find out which innovators from beyond ed tech will speak at ISTE 2015.
