Abstract:
The interface between humans and machines is changing. From merely being tools that we manipulate with buttons and switches, our computers, PDAs and electronic toys will become socially responsive entities. The advent of machines with 'personality' began with a spectrum of modest innovations, from cars that welcome their owners with pretty lights and soothing sounds, through mobile phones that predict what their users want to text, to the seductive voices of satnavs guiding gullible drivers into muddy fields and axle-snapping ditches. Many devices today can learn the vocal and textual patterns of their owners, and the time is fast approaching when we will talk to machines almost as naturally as we talk to each other. However, there's more to our conversational style than words alone. A vast amount of human signalling is non-verbal: the expressions on our faces and our physical gestures convey a great deal of information, especially about our emotional states. If machines are really going to take part in such complex exchanges, embodiment will be a key contributor. In other words, they'll have to look and behave less like machines and more like us. Welcome to the era of the 'socially interactive' robot, both as virtual screen interface and real-world physical object.