Predicting Human Intentions in Human-Robot Hand-Over Tasks Through Multimodal Learning

Weitian Wang, Rui Li, Yi Chen, Yi Sun, Yunyi Jia

Research output: Contribution to journal › Article › peer-review

49 Scopus citations

Abstract

In human-robot shared manufacturing contexts, the hand-over of product parts or tools between the robot and the human is an important collaborative task. Enabling the robot to correctly interpret and predict human hand-over intentions is therefore necessary for improving task efficiency in human-robot collaboration. In this study, a teaching-learning-prediction (TLP) framework is proposed that allows the robot to learn from its human partner's multimodal demonstrations and predict human hand-over intentions. In this approach, the human programs the robot through demonstrations using natural language and wearable sensors, according to the task requirements and the human's working preferences. The robot then learns from the human hand-over demonstrations online via extreme learning machine (ELM) algorithms to update its cognition capability, allowing it to use the learned policy to actively predict human intentions and assist its human companion in hand-over tasks. Experimental results and evaluations suggest that the human can easily reprogram the robot with the proposed approach when the task changes, and that the robot can effectively predict hand-over intentions with competitive accuracy to complete the hand-over tasks.

Note to Practitioners - This article is motivated by human-robot hand-over problems in smart manufacturing contexts. The delivery of product parts or tools in worker-robot partnerships is an important collaborative task. We develop a teaching-learning-prediction (TLP) framework that enables the robot to learn from its human partner's multimodal demonstrations and predict human hand-over intentions. The robot can be taught by the human through natural language and wearable sensing information. The extreme learning machine (ELM) approach is employed for the robot to build the cognition capability needed to actively predict human intentions and assist its human companion in hand-over tasks. We demonstrate that the proposed approach offers distinct and effective advantages for facilitating human-robot hand-over tasks in collaborative manufacturing contexts.
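The abstract does not detail the exact ELM formulation used, but the standard batch ELM on which online variants build can be summarized in a few lines: random, untrained hidden-layer weights followed by a closed-form least-squares solution for the output weights. The Python sketch below is a minimal, hypothetical illustration only; the feature dimensions, the number of intention classes, and the ELMClassifier interface are assumptions rather than the authors' implementation, and the paper's online learning would additionally require a sequential update (e.g., an OS-ELM-style recursive least-squares step) rather than full retraining as done here.

```python
import numpy as np

class ELMClassifier:
    """Minimal batch extreme learning machine (ELM) classifier.

    Input-to-hidden weights and biases are drawn randomly and never
    trained; only the output weights are solved in closed form via the
    Moore-Penrose pseudoinverse, which is what makes ELM cheap enough
    to retrain when the hand-over task changes.
    """

    def __init__(self, n_hidden=100, rng_seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(rng_seed)
        self.W = None     # random input-to-hidden weights
        self.b = None     # random hidden biases
        self.beta = None  # learned hidden-to-output weights

    def _hidden(self, X):
        # Sigmoid activation of the random hidden layer.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y, n_classes):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        T = np.eye(n_classes)[y]          # one-hot intention targets
        H = self._hidden(X)
        # Closed-form least-squares solution for the output weights.
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)


# Hypothetical usage: feature vectors fused from wearable-sensor and
# natural-language cues, labeled with discrete hand-over intentions.
X_demo = np.random.rand(200, 12)       # 200 demonstrations, 12 features
y_demo = np.random.randint(0, 4, 200)  # 4 hypothetical intention classes
model = ELMClassifier(n_hidden=64).fit(X_demo, y_demo, n_classes=4)
print(model.predict(X_demo[:5]))
```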

Original language: English
Pages (from-to): 2339-2353
Number of pages: 15
Journal: IEEE Transactions on Automation Science and Engineering
Volume: 19
Issue number: 3
DOIs
State: Published - 1 Jul 2022

Keywords

  • Extreme learning machine (ELM)
  • human-robot hand-over
  • intention prediction
  • learning from demonstrations
  • natural language
  • wearable sensors
