A new generation of robots is shaping the way in which we perceive technology. Are we ready to work with them?
It’s Friday and I’m outside the Powerhouse Museum. At 10.00 am the glass doors finally open and I approach one of the guards. He lets me in after I show him my ID. The Articulated Head is still asleep.
Damith Herath and Christian Kroos, research engineers at MARCS Auditory Laboratories, University of Western Sydney, are ready to wake it up. They activate the system and the Articulated Head starts to… think. It opens its eyes, moves its neck and is ready to begin its day. As I approach the keyboard that allows me to communicate with the Articulated Head, I feel that something is watching my every move. Then I stop and stare; it stares back and frowns — its face looks strangely familiar.
One of us has to start the conversation, and it is then that I dare to ask: “Do you like coffee?” Its answer is: “Yes.” But there is no time for small talk; visitors approach. A group of young girls guided by one of their teachers makes a stop to interact with the Articulated Head. The adult leaves almost immediately but the girls stay longer and engage in a lively conversation with this robot described by its creators as “a conventional industrial robot arm that has been converted into an aesthetically appealing installation piece that melds the worlds of art, science and engineering.”
More than forty years ago, Japanese roboticist Masahiro Mori developed a theory based on his observations of human-robot interaction. According to Professor Mori, people were keen to interact with robots regardless of their appearance — but only to a point. As Mori’s thesis suggests, a robot that looks almost, but not quite, human leaves a person with an unpleasant impression. “Some people feel discomforted because it’s simulating things they feel shouldn’t be simulated or they feel it isn’t reacting in the right way… People are very critical, the more realistic we try to make it (robots), then any detail that suddenly shows up will cause a negative reaction,” explains Professor Chris Davis, one of the chief investigators behind the Articulated Head and a research fellow at MARCS Auditory Laboratories.
Known as the Uncanny Valley Effect, Mori’s hypothesis describes the relationship between a robot’s degree of realism, in terms of physical appearance and behaviour, and a person’s impression of the robot. However, as robots move from the industrial to the social sphere, an adaptation process will occur. Robots should be able to communicate, cooperate and coexist in any given environment; they should also be agreeable. “Resembling a human can be better for some situations and worse for others. For example, many pet robots [toys] have done their jobs without human form, and some people might find the idea of a human pet disconcerting,” explains Charlie Kemp, Assistant Professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, in the US. “In the long run, I suspect that there will be a variety of different types of robots matched to their jobs and people’s individual preferences,” says Kemp.
Social robots are autonomous artefacts that interact and communicate with people. They don’t need to have an anthropomorphic shape, but they must be able to build some sort of rapport with whoever is using them. A social robot, therefore, must have its own personality, be aesthetically pleasing (to avoid the uncanny valley effect, of course) and be easy to interact with. These characteristics can be seen in some robots that are already “working” among us. For example, ASIMO, Honda’s robot, visited Windsor Castle in 2009 to congratulate the nine finalists of a competition called Reaching for the Gold. ASIMO is a humanoid and it has been touring different cities — it visited Australia in 2007 — where it has received warm responses from the public. ASIMO looks like an astronaut. It has white bulky limbs, but it walks almost gracefully. We cannot see its eyes or mouth, but it has a set of sensors that allow it to interact in any given space. ASIMO also has its own show at Disneyland’s Tomorrowland, where it talks, walks and dances. But the technology behind “the world’s most advanced bi-pedal humanoid robot” has also helped in the creation of the U3-X, Honda’s Stride Management Assist and Bodyweight Support Assist experimental walking assist devices that in the future will help the elderly and people with weakened limbs.
In 2007 the journal IEEE Transactions on Robotics published a special issue on Human-Robot Interaction. At the time, its editors suggested that the paradigmatic shift that occurred in robotics at the end of the 1980s brought robots from factories to everyday-life environments. They also suggested that this new generation of robots would fulfil four different roles: coworkers, assistants for the disabled and elderly, surgical devices and toys. So far their hypotheses have been proven right.
Although sensors, voice and other systems have greatly improved, robots still need to be programmed to learn more about their surroundings. Researchers from the Italian Institute of Technology are working on a skin system that will enable robots to explore and learn about their environment. According to Giorgio Metta, assistant professor at LIRA-Lab, University of Genova, Italy, the skin system is based on two different technologies: capacitive (as in most consumer-electronics touch screens and pads) and piezoelectric. Metta explains that the system will allow robots to work in different social spheres. “It enables the robot to grade forces while interacting with the environment (including humans) and therefore it makes the robot safer. It also enables people to guide the robot via something called ‘kinesthetic teaching’ where a person can steer the robot to move in a certain way in order to solve a problem (reach for an object, grab it). This is a very natural way to ‘program’ a robot for a given task and opens up new ways for deploying robots into human-populated environments.”
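The kinesthetic teaching idea Metta describes can be sketched in a few lines of illustrative code: while a person physically guides a compliant arm, the controller records joint positions, and the robot later replays them. This is a minimal, hypothetical sketch — the function and variable names are assumptions for illustration, not part of any real robot API.

```python
# Hypothetical sketch of kinesthetic teaching: record the joint angles
# sampled while a person guides the compliant arm by hand, then replay
# the trajectory so the robot repeats the demonstrated motion.

def record_demonstration(joint_readings):
    """Store joint-angle samples captured during hand guidance."""
    return [tuple(sample) for sample in joint_readings]

def replay(trajectory, move_to):
    """Drive the arm through the recorded waypoints, one at a time."""
    for waypoint in trajectory:
        move_to(waypoint)

# Example: a two-joint arm guided through three poses (angles in degrees).
guided_samples = [(0.0, 10.0), (15.0, 30.0), (45.0, 60.0)]
trajectory = record_demonstration(guided_samples)

visited = []          # stand-in for the real motor controller
replay(trajectory, visited.append)
print(visited)        # the arm retraces the demonstrated poses
```

A real implementation would also filter sensor noise and interpolate smoothly between waypoints, but the record-and-replay loop above is the core of the idea.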
At the same time, other researchers are working towards human emotion recognition. “Emotions are a difficult question… The study of human emotion is not an exact science and there are things yet to be known,” says Davis. But as Chris Chesher, Senior Lecturer in Digital Cultures at The University of Sydney, explains, “Roboticists refer to the ‘emotional’ features of robots — components that allow the robot to appear to change expressions in ways that are meaningful to people around them.” Chesher’s suspicion is that while emotional expressivity can draw upon familiar interpersonal conventions, copying human expressions will be a dead end, as “there will be simpler and cheaper codes and gestures that robots and people can use to communicate.”
In 2009, after years of trials, a team of researchers from Carnegie Mellon University in the US published a paper in the journal Robotics and Autonomous Systems. Their conclusion stated: “A social robot needs to remember people who have interacted with it, and interact differently with those people than with newcomers. The robot should utilise its emotional expressions differently based on how much common ground it shares with a person.”
Back at the Powerhouse Museum, our photographer decides to discuss Australian politics and food with the Articulated Head. The conversation doesn’t last long. However, after an hour, our photographer decides to resume the dialogue. “Do you remember me?” The Articulated Head says yes; its answer is striking.
Although robots still have very limited ability to understand people’s emotions, it is possible to engage emotionally with them — judging by people’s reactions to Kismet, Paro, ASIMO and the Articulated Head. Based on these and other examples, researchers are trying to develop technologies that will successfully bring robots into our daily lives. Their challenges are to remove all uncanniness — Chesher says that as people become more accustomed to living with robots the uncanny valley “is likely to be seen as a historical curiosity” — and transform them into conscientious, agreeable and somewhat extroverted machines ready to improve our lives.
As assistants, teachers, nurses, actors, toys or companions for the elderly or disabled, this new generation of robots — which has the mobile manipulator as its flagship — is here to stay. The change will not be immediate, but in the long run robots will populate our houses, offices, hospitals and schools. “By 2020, I’m optimistic that we’ll see widespread use of commercially available mobile manipulators,” says Kemp. “I’m confident about this technology. I see it as the next car industry or personal computer industry, and I imagine there will be a mobile manipulator in every home.” That is the future.