Researchers from the University of Miami, in collaboration with a team at the University of California San Diego, have taken a new approach to robot construction with a project that involves the psychological analysis of infants as they interact with their mothers.
UM professor of psychology and pediatrics Dr. Daniel Messinger, along with two research associates who are graduate students, provides information for a team of three engineers at UCSD’s Machine Perception Lab, helping them create a robot with the ability to learn movements similar to those of a child. The team hopes that, with sufficient data and programming, the robot will be able to learn on its own.
“We have made some new discoveries that we are excited about,” said Messinger. “Babies reach with their arms and legs at first, and then gradually they can use individual limbs.”
The team collects data weekly from eight babies, between the ages of three and five months, as they and their mothers play with toys in a soundproof chamber. Lights attached to the infant’s clothing are recorded on video and used to interpret patterns of movement and development.
“The biggest challenge has been to build the infant’s suit from scratch. It is the first time motion capture with lights has been used on a child,” said Juan Artigas, the team member who built the suit. “Motion capture is the same method they used to create Avatar.”
The project is funded by a National Science Foundation grant. The team has recently added lights to mark more details and experimented with different camera angles. The baby robot, close to four feet tall, has also been given a new face; the baby-doll head previously used was criticized as creepy by online bloggers.
The details of the project are described in a paper titled “Rethinking Modes of Development.” The team submitted the paper to the Institute of Electrical and Electronics Engineers (IEEE) and hopes to present the project at the IEEE’s annual conference in Frankfurt, Germany, this August.