On Tuesday 3 June, we hosted the second Being There workshop in the Studio. Funded by the EPSRC, the workshop was one in a series of activities bringing together a group of brilliant creative practitioners, technologists and academics to explore how cutting-edge robotics can enable people to participate in public spaces as places to meet and share ideas. The emphasis of this workshop was the exploration of gesture and interaction. The group has been working in part with NAO robots, so they brought a couple along to spend some time prototyping ideas and exploring how it feels to interact with them. Three groups formed, each looking to explore a different aspect of human/robot interaction and share what they had discovered:

One of the groups investigated the vocal and conversational aspects of human/robot interaction. They were specifically interested in whether a robot can shock or offend in the same way that a human can; whether you could push people to the point of wanting to turn the robot off. The nature of the NAO’s speech recognition and text-to-speech output meant that conversation was stop-start. Between utterances, the robot felt as though it had switched off, inhibiting any sense of conversational flow. They suggested that programming the NAO to fidget during conversation might help it to feel as though it was engaged in the conversation and thinking about what it was about to say. This could help the NAO to act as though it has a certain amount of autonomy, so that its audio output feels more as though it is coming from the robot itself rather than from someone programming it on the sidelines.
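For readers curious what that might look like in practice, here is a minimal sketch of the ‘fidget while thinking’ idea, assuming the NAOqi Python SDK (Python 2) and a NAO reachable over the network. The robot address, joint choices, amplitudes and timings are illustrative assumptions, not anything used at the workshop.

```python
# -*- coding: utf-8 -*-
# Minimal sketch: small idle movements between utterances so the robot
# does not appear to "switch off" while deciding what to say.
# Assumes the NAOqi Python SDK (Python 2); values below are illustrative.
import random
import time

from naoqi import ALProxy

ROBOT_IP = "<robot-ip>"  # placeholder: replace with your NAO's address
PORT = 9559

tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
motion = ALProxy("ALMotion", ROBOT_IP, PORT)


def fidget(duration=3.0):
    """Make slow, small random head movements for `duration` seconds."""
    motion.setStiffnesses("Body", 0.6)
    end = time.time() + duration
    while time.time() < end:
        # Small random offsets, moved at a low fraction of maximum speed.
        motion.setAngles(
            ["HeadYaw", "HeadPitch"],
            [random.uniform(-0.2, 0.2), random.uniform(-0.1, 0.1)],
            0.05,
        )
        time.sleep(random.uniform(0.5, 1.2))


def reply(text):
    """Fidget briefly, as if thinking, then speak."""
    fidget(random.uniform(1.5, 3.0))
    tts.say(text)


reply("That is an interesting question.")
```

Keeping the motion slow and small is the point: the aim is the impression of idle thought rather than yet another broad, deliberate gesture.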

The second group’s ideas were presented by Studio Resident Laura Kriefman. They were investigating how the different movements and gestures that the NAO makes could help or hinder our feelings of empathy towards it. They demonstrated that some of the NAO’s pre-programmed gestures were exaggerated and pantomimic. Puppeteer and roboticist Sarah Angliss commented that the NAO’s gestures feel more like a language than natural movement: they are there to say ‘I am happy’ or ‘I am angry’ rather than to make you feel as though the robot genuinely felt these emotions. The group began investigating the robot’s more subtle micro-movements, and found that these lent a much more natural feel to the NAO. They also found that when the robot was picked up, it felt quite rigid; the stiffness of its limbs made it feel more like an object than a responsive creature. They programmed the NAO to become limp when it was picked up, and said that this transition in and out of passivity made it feel much more complex and receptive. They are also looking into ways of getting rid of the ‘neutral position’ to which the robot returns between gestures, as it strips its movement of the fluidity that would help it feel more natural, ‘breaking its spell’.
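As a rough illustration of the ‘go limp when picked up’ behaviour, the sketch below relaxes every joint when the foot pressure sensors suggest the robot has left the ground, again assuming the NAOqi Python SDK. The ALMemory keys, the weight threshold and the polling loop are assumptions for illustration; the group may well have detected lifting differently.

```python
# -*- coding: utf-8 -*-
# Minimal sketch: go limp when lifted, stiffen again when set down.
# Assumes the NAOqi Python SDK (Python 2); the sensor keys and threshold
# are illustrative assumptions, not the workshop group's implementation.
import time

from naoqi import ALProxy

ROBOT_IP = "<robot-ip>"  # placeholder
PORT = 9559

motion = ALProxy("ALMotion", ROBOT_IP, PORT)
memory = ALProxy("ALMemory", ROBOT_IP, PORT)

# Total weight (kg) reported by the foot force-sensitive resistors.
FSR_KEYS = [
    "Device/SubDeviceList/LFoot/FSR/TotalWeight/Sensor/Value",
    "Device/SubDeviceList/RFoot/FSR/TotalWeight/Sensor/Value",
]
LIFT_THRESHOLD = 0.3  # assumed: almost no weight on either foot


def is_lifted():
    return sum(memory.getData(key) for key in FSR_KEYS) < LIFT_THRESHOLD


lifted = False
while True:  # simple polling loop for the sketch
    if is_lifted() and not lifted:
        motion.setStiffnesses("Body", 0.0)  # relax every joint: the robot goes limp
        lifted = True
    elif not is_lifted() and lifted:
        motion.setStiffnesses("Body", 1.0)  # restore stiffness once set down
        lifted = False
    time.sleep(0.2)
```

In a real behaviour you would want to return the robot to a safe posture before restoring full stiffness, but the heart of the effect is simply the transition between zero and full stiffness across the whole body.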

The last group had been working without a NAO robot, investigating human gesture. Slingshot’s Simon Johnson presented their work. The group had created a game called ‘Gess-ture’, in which players have to pass gestures to each other without saying a word. Each round of the game became more complicated, and the final round had us accumulating a number of gestures at once. The winner, Victoria, managed to incorporate four gestures (hopping, nodding, pointing and shaking) into one otherworldly and unforgettable dance. The game demonstrated the complexity of our own innate gesture recognition systems, and will surely prompt some interesting conversations around gestural communication and human/robot interaction.

It is exciting to see people pooling their expertise to determine the perimeter of the uncanny valley, whilst exploring how it feels, and what it means, to empathise with, understand and converse with robots. We’ll keep you informed of the questions and discoveries that will undoubtedly emerge over the course of this three-year project.