UW Roboticists Learn to Teach Robots from Babies



Babies learn about the world by exploring how their bodies move in space, grabbing toys, pushing things off tables, and watching and imitating what adults do. But when roboticists want to teach a robot how to do a task, they typically either write code or physically move the robot’s arm or body to show it how to perform an action.

Now a collaboration between University of Washington developmental psychologists and computer scientists has demonstrated that robots can “learn” much like kids — by amassing data through exploration, watching a human do something and determining how to perform that task on its own.

“You can look at this as a first step in building robots that can learn from humans in the same way that infants learn from humans,” said senior author Rajesh Rao, a UW professor of computer science and engineering.

“If you want people who don’t know anything about computer programming to be able to teach a robot, the way to do it is through demonstration — showing the robot how to clean your dishes, fold your clothes, or do household chores. But to achieve that goal, you need the robot to be able to understand those actions and perform them on its own.”

The research, which combines child development work from the UW’s Institute for Learning & Brain Sciences (I-LABS) with machine learning approaches, was published in November in the journal PLOS ONE.

A collaboration between UW developmental psychologists and computer scientists aims to enable robots to learn in the same way that children naturally do. The team used research on how babies follow an adult’s gaze to “teach” a robot to perform the same task. (Image: University of Washington)

In the paper, the UW team developed a new probabilistic model aimed at solving a fundamental challenge in robotics: building robots that can learn new skills by watching people and imitating them.
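The paper develops that model formally; as a loose illustration of the core idea, and not the authors’ actual formulation, a robot can treat the demonstrator’s goal as a hidden variable and update a posterior over candidate goals from the actions it observes. Everything in the sketch below (the goal names, actions, and probabilities) is invented for the example; in the real system, such likelihoods would be estimated from data the robot gathers through its own exploration.

```python
# Illustrative sketch only: Bayesian inference of a demonstrator's hidden goal.
# Goals, actions, and probabilities are hypothetical, not taken from the paper.

goals = ["look_left", "look_right", "look_at_toy"]
prior = {g: 1.0 / len(goals) for g in goals}  # uniform prior over candidate goals

# P(observed action | goal): in a real system this would be learned from the
# robot's own sensorimotor exploration rather than written down by hand.
likelihood = {
    ("turn_head_left",  "look_left"):   0.8,
    ("turn_head_left",  "look_right"):  0.1,
    ("turn_head_left",  "look_at_toy"): 0.3,
    ("turn_head_right", "look_left"):   0.1,
    ("turn_head_right", "look_right"):  0.8,
    ("turn_head_right", "look_at_toy"): 0.3,
}

def infer_goal(observed_actions):
    """Posterior over goals given the actions the robot watched the person perform."""
    posterior = dict(prior)
    for action in observed_actions:
        for g in goals:
            posterior[g] *= likelihood.get((action, g), 1e-6)
    total = sum(posterior.values())
    return {g: p / total for g, p in posterior.items()}

print(infer_goal(["turn_head_left"]))  # probability mass shifts toward "look_left"
```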

The roboticists collaborated with UW psychology professor and I-LABS co-director Andrew Meltzoff, whose seminal research has shown that children as young as 18 months can infer the goal of an adult’s actions and develop alternate ways of reaching that goal themselves.

In one example, infants saw an adult try to pull apart a barbell-shaped toy, but the adult failed to achieve that goal because the toy was stuck together and his hands slipped off the ends. The infants watched carefully and then decided to use alternate methods — they wrapped their tiny fingers all the way around the ends and yanked especially hard — duplicating what the adult intended to do.
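A robot built along these lines would do something similar: having inferred the goal, it would pick whichever of its own actions it judges most likely to succeed, even if that action differs from the one it watched. Continuing the hypothetical sketch above, with made-up actions and success estimates:

```python
# Illustrative continuation: act on the inferred goal using the robot's own means.
# The action set and success probabilities are hypothetical.

# P(goal achieved | action), which the robot would estimate from self-exploration.
success = {
    ("pull_with_fingertips", "separate_toy"): 0.2,  # the grip the adult fumbled
    ("wrap_hands_and_pull",  "separate_toy"): 0.9,  # an alternate means the robot can use
}

def choose_action(goal, actions=("pull_with_fingertips", "wrap_hands_and_pull")):
    """Pick the action with the highest estimated chance of achieving the goal."""
    return max(actions, key=lambda a: success.get((a, goal), 0.0))

print(choose_action("separate_toy"))  # -> "wrap_hands_and_pull"
```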



