A collaboration between UW developmental psychologists and computer scientists aims to enable robots to learn in the same way that children naturally do. The team used research on how babies follow an adult’s gaze to “teach” a robot to perform the same task. University of Washington

(December 1, 2015) Babies
learn about the world by exploring how their bodies move in space, grabbing toys, pushing things off tables, and watching and imitating what adults do.
But when roboticists want to teach a robot how to do a task,
they typically either write code or physically move a robot’s arm or body to
show it how to perform an action.
Now a collaboration between University of Washington
developmental psychologists and computer scientists has demonstrated that
robots can “learn” much like kids — by amassing data through exploration, watching a human do something, and determining how to perform that task on their own.
“You can look at this as a first step in building robots
that can learn from humans in the same way that infants learn from humans,”
said senior author Rajesh Rao, a UW professor of computer science and
engineering.
“If you want people who don’t know anything about computer
programming to be able to teach a robot, the way to do it is through
demonstration — showing the robot how to clean your dishes, fold your clothes,
or do household chores. But to achieve that goal, you need the robot to be able
to understand those actions and perform them on its own.”
The research, which combines child development research from the UW’s Institute for Learning & Brain Sciences (I-LABS) with machine learning approaches, was published in November in the journal PLOS ONE.
This robot used the new UW model to imitate a human moving toy food objects around a tabletop. By learning which actions worked best with its own geometry, the robot could use different means to achieve the same goal — a key to enabling robots to learn through imitation. University of Washington
In the paper, the UW team developed a new probabilistic
model aimed at solving a fundamental challenge in robotics: building robots
that can learn new skills by watching people and imitating them.
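The paper itself specifies the model; as a loose, hypothetical illustration of the general idea — imitating the *goal* of a demonstration rather than copying the motions — the sketch below (not the authors' code; all names are invented) has a robot infer what a demonstration achieved and then search its own action repertoire for any action that produces the same end state, even if that action differs from the human's.

```python
# Hypothetical sketch of goal-directed imitation (not the UW team's model).
# The robot ignores *how* the human moved and instead infers *what* the
# demonstration achieved, then picks one of its own actions that reaches
# the same end state.

def infer_goal(before, after):
    """The goal is whatever changed between the start and end states."""
    return {k: v for k, v in after.items() if before.get(k) != v}

def imitate(goal, state, actions):
    """Return the robot's own action whose simulated effect satisfies the goal."""
    for name, effect in actions.items():
        result = {**state, **effect(state)}  # apply the action's effect
        if all(result.get(k) == v for k, v in goal.items()):
            return name
    return None

# Demonstration: a human slides a toy from the left to the right side.
before = {"toy": "left"}
after = {"toy": "right"}
goal = infer_goal(before, after)  # {"toy": "right"}

# The robot's repertoire differs from the human's (its geometry differs),
# but one of its actions still achieves the same end state.
robot_actions = {
    "push_with_gripper": lambda s: {"toy": "right"},
    "nudge_with_arm": lambda s: {"toy": "center"},
}
print(imitate(goal, before, robot_actions))  # push_with_gripper
```

Because the robot matches outcomes rather than motions, a different body plan can satisfy the same goal — the "different means, same goal" behavior described in the caption above.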
The roboticists collaborated with UW psychology professor
and I-LABS co-director Andrew Meltzoff, whose seminal research has shown that
children as young as 18 months can infer the goal of an adult’s actions and
develop alternate ways of reaching that goal themselves.