Diego-san the Infant Robot by Machine Perception Lab.

His name is Diego-san and he is an android infant developed by the Institute for Neural Computation’s Machine Perception Laboratory, based in the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2). The lab is funded by the National Science Foundation. For the hardware, the lab collaborated with Kokoro Co. Ltd. and Hanson Robotics, companies that specialize in building life-like animatronics and androids. Modeled on a one-year-old baby, the head was designed by Hanson Robotics, which sells a line of expressive robots called Robokind. The Machine Perception Lab and Hanson Robotics have worked together before, for example on an emotionally responsive robotic head of Albert Einstein. The body of Diego-san was designed by the Japanese company Kokoro Co., which had previously built the android infant CB2 for Osaka University. The robot is a product of the “Developing Social Robots” project, started in 2008 with the intention “to make progress on computational problems that elude the most sophisticated computers and Artificial Intelligence approaches, but that infants solve seamlessly during their first year of life.”

Diego-san with different facial expressions
Source: UCSD News press release

The project team studies both children and robots. The team leader is research scientist Javier Movellan, who previously worked on the RUBI Project, which showed that toddlers treated the Sony QRIO as if it were alive. The team also collaborates with researchers from the Early Play and Development Laboratory at the University of Miami and the Movement Control Laboratory at the University of Washington.

The purpose was to create a research platform for studying the cognitive development of children: how they read and mimic facial expressions, learn to use their bodies, and learn to communicate with others. Dr. Javier Movellan sums up the objectives in the Japan-based PlasticPals blog: “The project’s main goal is to try and understand the development of sensory motor intelligence from a computational point of view. It brings together researchers in developmental psychology, machine learning, neuroscience, computer vision and robotics. Basically we are trying to understand the computational problems that a baby’s brain faces, when learning to move its own body and use it to interact with the physical and social worlds.”

Diego-san is designed to learn and develop sensory-motor skills such as reaching and grasping, and communicative skills such as pointing and smiling, much as a one-year-old infant does. Studies of sensorimotor and social development will be the robot's main uses. However, Movellan's comments suggest that progress has been greater on the social-development side: “We’ve made good progress developing new algorithms for motor control, and they have been presented at robotics conferences, but generally on the motor-control side, we really appreciate the difficulties faced by the human brain when controlling the human body…We developed machine-learning methods to analyze face-to-face interaction between mothers and infants, to extract the underlying social controller used by infants, and to port it to Diego-san. We then analyzed the resulting interaction between Diego-san and adults.”
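The article does not describe the lab's actual algorithms, but the basic idea of a learned "social controller" can be sketched. The following Python snippet is a hypothetical, simplified illustration only: it treats each moment of a face-to-face interaction as a pair of adult-expression features and a discrete infant response, fits a small softmax classifier to such pairs, and then uses that classifier to choose the robot's response. All feature names, actions, and data here are invented for illustration and are not drawn from the project.

# Minimal sketch (not the lab's actual code): learning a simple "social
# controller" from face-to-face interaction data. Each time step is
# summarized by the adult's expression features (smile and gaze
# intensities, both invented for this example), and the infant's response
# is one of a few discrete actions. All data below is synthetic.
import numpy as np

rng = np.random.default_rng(0)
ACTIONS = ["neutral", "smile", "vocalize"]

# Synthetic "interaction log": adult features -> infant action.
# Feature vector per frame: [adult_smile, adult_gaze_at_infant]
X = rng.uniform(0.0, 1.0, size=(500, 2))
# Toy ground-truth behaviour: infants tend to smile back at smiling,
# attentive adults, and vocalize when gazed at without a smile.
logits_true = np.stack([
    0.5 - X[:, 0] - X[:, 1],          # neutral
    2.0 * X[:, 0] + X[:, 1] - 1.0,    # smile
    X[:, 1] - X[:, 0],                # vocalize
], axis=1)
y = logits_true.argmax(axis=1)

# Fit a softmax "controller" with plain gradient descent.
W = np.zeros((2, 3))
b = np.zeros(3)
for _ in range(2000):
    scores = X @ W + b
    scores -= scores.max(axis=1, keepdims=True)
    p = np.exp(scores)
    p /= p.sum(axis=1, keepdims=True)
    onehot = np.eye(3)[y]
    grad = (p - onehot) / len(X)      # gradient of cross-entropy loss
    W -= 0.5 * (X.T @ grad)
    b -= 0.5 * grad.sum(axis=0)

def respond(adult_smile: float, adult_gaze: float) -> str:
    """Pick the robot's response to the adult's current expression."""
    scores = np.array([adult_smile, adult_gaze]) @ W + b
    return ACTIONS[int(scores.argmax())]

print(respond(0.9, 0.9))   # smiling, attentive adult -> likely "smile"
print(respond(0.1, 0.9))   # attentive but not smiling -> likely "vocalize"

In practice the researchers describe analyzing real mother-infant interactions rather than synthetic data, and the controller driving Diego-san would map richer perceptual input to the robot's many facial and body actuators; the sketch above only conveys the general pattern of learning a response policy from observed interactions.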

After David Hanson of Hanson Robotics posted the video below on YouTube, more and more people came to admire Diego-san and his expressive face.

http://www.youtube.com/watch?feature=player_embedded&v=knRyDcnUc4U#!.

At 4 feet 3 inches (130 cm) tall and 66 pounds (30 kg), Diego-san is obviously bigger than a real infant, because building it at infant scale would have cost much more. It has 44 pneumatic joints in total, and its head alone contains 27 moving parts that produce different facial expressions. Its sensors and actuators approximate the complex dynamics of human muscles. Thanks to these characteristics, Diego-san is one of the most realistic robots of its kind. The project is significant for its potential contributions to the computational study of infant development and to the understanding of developmental disorders such as autism and Williams syndrome.

Machine Perception Technologies (MPT) is now looking for undergraduate interns and postgraduate programmers to work on expression recognition technology. More information about the research will appear in scientific publications soon. Furthermore, Diego-san will have some company within a few months, as similar robots such as Roboy come online.