Computers feel?
Robots can now pick up the mood of their users, and can even tell if they're drunk.

Many people have commented on the contrast between Tony Blair's urbane comments to last week's Chilcot Inquiry and his physical unease in its first minutes, as manifest in blinks, foot-tapping, crossed legs, and so on. Body language - non-verbal communication - is a valuable clue to inner feelings (a truth, or half-truth, that men's magazines often use when advising their readers how to tell whether a young lady might be interested in body language of another kind).

Their claims are dubious, but now science is getting in on the act. It began with Darwin, whose 1872 book The Expression of the Emotions in Man and Animals showed how blushes, smiles, raised eyebrows and the like add spice to the banal messages of the spoken word.

Now, computers can sense the mood of their users. Already they are able to identify smiles and frowns and even blushes (a subject of much interest to Darwin, who devotes many pages to it). Their programs generate well over a million combinations of facial expression and head position and, on a good day (or with an expressive face), can, nine times out of ten, correctly identify looks of fear, sadness, happiness, anger, disgust and surprise. They even do better than humans in differentiating the expression of a puzzled person from that of a drunk.

However, most of the research is based on actors making faces at a camera. Even simple computers can tell such faked expressions from real ones (which could be useful to policemen, bank managers and parliamentary enquiries). The real world, with expressions changing every few seconds, is harder to analyse. Tiredness and pain can sometimes be picked up, and there is much interest in training the machines to respond to expressions of distress or confusion in the elderly or the mentally impaired.

We ourselves respond to crude images of emotion as much as to subtle ones (as Edvard Munch's painting The Scream shows). Students do better when a recorded lesson is accompanied by a computer-based avatar that seems to be speaking, and better still when it smiles and gestures as the lesson proceeds.

Researchers at MIT have made robots that smile, wink or raise their eyebrows at a human when they interact, and make eye contact when playing a recorded message. They nod when the subject gives the right answer to a question (which will be useful when academics are replaced by electronics - a process already well under way). The machines point at unfamiliar objects, or at particular shapes and colours, and follow the gaze of their human partners. Babies are entranced by them, for non-verbal messages play a large part in their world.

Such interactions are always two-way. A cunning experiment used a video conference in which the head movements and facial expressions of one speaker underwent instant computerised reduction - a wide smile toned down to a grin, a vigorous nod to a slight inclination of the head. The other speaker at once responded by nodding and beaming more enthusiastically than before, in an unconscious attempt to draw his partner deeper into the conversation.

People with autism would almost certainly not respond in that way, for they find it difficult to express their own feelings and to understand those of others. Sometimes such children are distressed, yet their behaviour appears calm to an outside observer - until it explodes into furious flailing. Sensors that pick up rising muscle tension can now help identify their inner turmoil before it is released, not (as in most people) as a series of minor twitches, but as uncontrollable body language. Videos that show happy or puzzled faces superimposed on cars and trains (of great interest to many such children) help train them to read the feelings of others. The new generation of emotionally interactive robots may do even better.

The face is a window into the soul - and so is the body. Even three-month-old babies can understand its messages. A new coding scheme like that used for faces has shown that people recognise fear, nervousness, anger and the rest from another person's bearing in a predictable way, even when the visage is hidden. A frightened face causes a certain part of the brain to light up in the scanner, and a cringing posture does the same. We are also good at spotting a contradiction between the two: a picture of a terrified face pasted on to an image of an angry person's body, or vice versa, alters how we judge the subject's mood.

Perhaps that is why so many observers noted Mr Blair's nervous fidgets as his bland tale unfolded. Soon, though, it became clear that he had nothing to fear from the questioners and, in a further homage to Darwin, the ex-Prime Minister's performance was transformed into a classic of that political imperative, the survival of the glibbest.

Steve Jones is professor of genetics at University College London