Human Intention Prediction in Human-Robot Collaborative Tasks

Weitian Wang, Rui Li, Yi Chen, Yunyi Jia

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

25 Scopus citations

Abstract

Enabling a robot to predict human intentions in human-robot collaborative hand-over tasks is a challenging but important problem. We develop a novel and effective teaching-learning-prediction (TLP) model that allows the robot to learn online from natural multi-modal human demonstrations during human-robot hand-overs and then predict human hand-over intentions from human wearable sensing information. The human can program the robot with partial demonstrations according to updated tasks and his/her personal hand-over preferences, and the robot can leverage its learned strategy online to actively predict human hand-over intentions and assist the human in collaborative tasks. We evaluate the approach through experiments.

Original language: English
Title of host publication: HRI 2018 - Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction
Publisher: IEEE Computer Society
Pages: 279-280
Number of pages: 2
ISBN (Electronic): 9781450356152
DOIs
State: Published - 1 Mar 2018
Event: 13th Annual ACM/IEEE International Conference on Human Robot Interaction, HRI 2018 - Chicago, United States
Duration: 5 Mar 2018 - 8 Mar 2018

Publication series

Name: ACM/IEEE International Conference on Human-Robot Interaction
ISSN (Electronic): 2167-2148

Conference

Conference: 13th Annual ACM/IEEE International Conference on Human Robot Interaction, HRI 2018
Country/Territory: United States
City: Chicago
Period: 5/03/18 - 8/03/18

Keywords

  • human intention prediction
  • human-robot hand-over
  • learning from demonstrations
  • natural multi-modal information
