The Robotic Body Language Studies
Robots have changed quite a bit over the last 50 years. From the clunky grey metal machines of the 1950s to the sleek, more lifelike robots of today, the field has made huge advances. However, one thing that has always separated robots from humans is subtle body language. In the past, humans could only judge a robot's meaning through its words or obvious physical movements – that is, until now.
Humans give off thousands of non-verbal cues to their feelings through body language. Some people tremble when nervous, some shake their legs when speaking before a large crowd, and some unknowingly tap their fingers. Even people who train themselves to refrain from showing emotion, such as poker players, may fall victim to a tell-tale facial tic. Robots, however, have none of these human traits, so humans cannot know their intentions unless they are spoken.
The Study
Bilge Mutlu is a Ph.D. candidate at Carnegie Mellon University in Pittsburgh who is interested in designing social behaviour for lifelike robots. In his research, he has noticed how important social gaze behaviour is in conversation. He and his colleagues created robots that are able to convey non-verbal communication through subtle eye movement.
Mutlu’s study included 26 human participants. In the study, both human and humanoid robots played a guessing game. There were a dozen objects on a table and the robot was programmed to choose one of the objects. The human had to guess which object the robot would pick by asking it a series of yes or no questions.
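The guessing game above is, at its most efficient, a binary search over the objects. The following is a purely illustrative sketch – not code from the study, and the halving strategy is an assumption – of how yes/no questions can narrow down a dozen candidates:

```python
import random

# Toy model (hypothetical, not from Mutlu's study): the human asks
# "is it in this half of the remaining objects?" each turn.
OBJECTS = list(range(12))  # a dozen objects on the table

def play_round(rng):
    """Return how many yes/no questions isolate the robot's chosen
    object when each question halves the remaining candidates."""
    target = rng.choice(OBJECTS)
    candidates = list(OBJECTS)
    questions = 0
    while len(candidates) > 1:
        half = candidates[: len(candidates) // 2]
        questions += 1
        # The robot's truthful yes/no answer prunes the candidate set.
        candidates = half if target in half else candidates[len(half):]
    return questions

rng = random.Random(0)
avg = sum(play_round(rng) for _ in range(10_000)) / 10_000
print(avg)  # between 3 and 4 questions with perfect halving
```

With perfect halving the average sits near log2(12) ≈ 3.6; that the participants needed 5.5 questions on average simply reflects that people do not ask optimally partitioning questions.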
The Results
During the first trial of the study, it took the humans an average of 5.5 questions to figure out which object the robot would choose. In the second trial, Mutlu and his colleagues programmed the robot to answer the same way, but to swivel its eyes towards the object it intended to pick before answering yes or no on two of the first three questions. This “leaking” of information by the robot lowered the average from 5.5 questions to 5.0 before the correct object was identified – a statistically significant result.
About 75% of the humans in the study told Mutlu that they did not notice the glances made by the humanoid robot Geminoid, which has realistic skin. The results nonetheless show that the glances had a subliminal effect – provided the robot is lifelike. When Mutlu used the less lifelike robot Robovie, which has large glassy eyes, the results were the same whether or not the robot glanced at the object during questioning. Mutlu presented his findings recently at the Human Robot Interaction 2009 conference in La Jolla, California.
At the Swiss Federal Institute of Technology in Lausanne, Switzerland, Sylvain Calinon believes that even less subtle robot movements, such as a nod during a conversation with a human, can improve the quality of interaction between the two. However, Calinon notes that to be truly successful at social interaction, the robot will need to be able to read the human's body language as well.