Environment-adaptive interaction primitives for human-robot motor skill learning

Publication Type:
Conference Proceeding
Citation:
IEEE-RAS International Conference on Humanoid Robots, 2016, pp. 711-717
Issue Date:
2016-12-30
© 2016 IEEE. In complex environments where robots are expected to cooperate with human partners, it is vital for the robot to consider properties of the collaborative activity in addition to the behavior of its partner. In this paper, we propose to learn such complex interactive skills by observing demonstrations of a human-robot team together with additional external attributes. We propose Environment-adaptive Interaction Primitives (EaIPs) as an extension of Interaction Primitives. In human-robot cooperation tasks under varying environmental conditions, EaIPs not only improve the prediction of the robot's motor skills from a brief observation of the human's motion, but also generalize to new environmental conditions by learning the relationship between each condition and the corresponding motor skills from training samples. Our method is validated on the collaborative task of covering objects with a plastic bag, performed with a Baxter humanoid robot. To achieve the task successfully, the robot must coordinate with its partner while also taking into account information about the object to be covered.
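The abstract describes conditioning an Interaction-Primitive-style model on both environmental attributes and a brief observation of the human's motion. Below is a minimal, illustrative sketch of that general idea, not the authors' implementation: demonstrations are summarized as basis-function weights for the human and robot trajectories, stacked with environment features into a joint Gaussian, which is then conditioned at test time on the observed environment and a partial human trajectory to predict the robot's motion. All variable names, dimensions, and the synthetic data are assumptions made for illustration.

```python
import numpy as np

def rbf_features(phase, n_basis=10, width=0.05):
    """Normalized radial-basis features over a phase variable in [0, 1]."""
    centers = np.linspace(0.0, 1.0, n_basis)
    phi = np.exp(-0.5 * (phase[:, None] - centers[None, :]) ** 2 / width)
    return phi / phi.sum(axis=1, keepdims=True)

def fit_weights(trajectory, n_basis=10):
    """Least-squares fit of basis weights to one demonstrated trajectory."""
    phase = np.linspace(0.0, 1.0, len(trajectory))
    return np.linalg.lstsq(rbf_features(phase, n_basis), trajectory, rcond=None)[0]

n_basis, dim_env = 10, 2
rng = np.random.default_rng(0)

# Training: each demonstration is summarized as [env features, human weights, robot weights].
demos = []
for _ in range(30):
    env = rng.uniform(0.2, 0.8, size=dim_env)          # e.g. object size and position (assumed)
    t = np.linspace(0.0, 1.0, 100)
    human = np.sin(2 * np.pi * t) * env[0]             # synthetic human trajectory
    robot = np.cos(2 * np.pi * t) * env[1]             # synthetic robot trajectory
    demos.append(np.concatenate([env, fit_weights(human), fit_weights(robot)]))
D = np.vstack(demos)
mu = D.mean(axis=0)
Sigma = np.cov(D, rowvar=False) + 1e-6 * np.eye(D.shape[1])

# Test: observe the environment and only the first 30% of the human's motion.
env_test = np.array([0.5, 0.6])
phase_obs = np.linspace(0.0, 0.3, 30)
human_obs = np.sin(2 * np.pi * phase_obs) * env_test[0]

# Linear observation model: env features are seen directly, the partial human
# trajectory is seen through its basis functions; robot weights stay unobserved.
Phi_obs = rbf_features(phase_obs, n_basis)
A = np.zeros((dim_env + len(phase_obs), D.shape[1]))
A[:dim_env, :dim_env] = np.eye(dim_env)
A[dim_env:, dim_env:dim_env + n_basis] = Phi_obs
o = np.concatenate([env_test, human_obs])
R = 1e-4 * np.eye(len(o))                              # observation noise

# Gaussian conditioning: posterior over the joint vector given the observation.
K = Sigma @ A.T @ np.linalg.inv(A @ Sigma @ A.T + R)
posterior = mu + K @ (o - A @ mu)
w_robot = posterior[dim_env + n_basis:]

# Decode the predicted robot trajectory over the full phase.
robot_pred = rbf_features(np.linspace(0.0, 1.0, 100), n_basis) @ w_robot
print("predicted robot trajectory, first 5 samples:", robot_pred[:5])
```

The sketch treats the environment features as noiselessly observed inputs alongside the partial human motion; how the actual EaIPs formulation couples environmental conditions to the primitive parameters is detailed in the paper itself.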