Last week I visited the Bristol Robotics Laboratory. This is part of my discussions with different technology researchers whose interest is in social and emotional human-computer interaction. I've also visited Microsoft Research Cambridge, where I'm being helped by Kenton O'Hara, John Helmes, Alex Taylor and Tim Regan, all of the Socio-Digital Systems group, and Bristol University's Department of Computer Science, where I'm being advised by Sri Subramanian, senior lecturer in Human-Computer Interaction. More on these later; for now, here's the first tale of my Bristol Robotics Lab visit.

I talked with Roger Moore, who's visiting BRL from Sheffield University's Computer Science Department. Professor Moore is also Chair of the Spoken Language Processing, Speech and Hearing Research Group. Roger and I had a long and very interesting chat in which we found a lot of common ground between me as an artist and him as a scientist, which I'll go over in a moment. We also had some interesting differences in approach: Roger's main field is the creative application of speech technology, that is, looking at the best ways in which computerised voice and speech can be applied to 'artefacts', as he calls them, or 'pervasive media', as I might call it. My interest when I met him, however, was in how non-verbal language might be developed as the best way of emotionally communicating with a computerised object or robot. I am looking at how we might treat a robot-type entity more like an alien, where we would wish to learn from each other by finding any common ground through which we can communicate should verbal language fail us. This might be visually, through body language or images, or with music. I mentioned to Roger that I was getting a lot from reading Rosalind Picard's book Affective Computing, and this led us to discuss Emotion Theory and some interesting projects around it.

The HUMAINE project, for instance, is concerned with developing interfaces that will register and respond to emotion, particularly pervasive emotion (the forms of feeling, expression and action that colour most of human life). Another key individual is Klaus Scherer of the Swiss Centre for Affective Sciences. Now, I made notes at the time, which made sense to me as Roger explained things, but now all I can do is try to surmise what any of it meant without having spent any time studying Scherer's work and Emotion Theory. One of the things I picked up on is our 'normative' use of emotion: this is where context and a sense of the appropriate dictate how much emotion we express. For instance, one is not supposed to grimace and pull extreme facial expressions at a business meeting, but at a sporting event it's fine. Of course, we're not usually aware of the intricate rules of emotional expression until they're broken; suddenly the rules are all too clear. Pointing out these rules of expression that we live by is often the job of art and entertainment of all kinds. Breaking these rules immediately creates very strong emotions in us, from humour and freedom to anxiety and fear. We learn a lot about ourselves and others when we become aware of the rules of appropriate emotional expression.

This brings us to empathy and Theory of Mind. Roger Moore pointed out to me that learning language is closely linked with being able to project oneself and to understand others' needs. Autism, for instance, is linked with both a lack of empathy and difficulty learning language. In other words, learning to communicate with strangers, foreigners or aliens teaches us empathy and emotional intelligence. What I'm now looking for with the Daemon project are ways in which we might learn to communicate emotionally with a different kind of being, forming some kind of emotional language between the two of us, so that people of all abilities might come away with a great deal of understanding of the emotionally suppressing rules we live by and perhaps, by communicating with our Daemon, break them.