TY - JOUR
T1 - Predicting Human Intentions in Human-Robot Hand-Over Tasks Through Multimodal Learning
AU - Wang, Weitian
AU - Li, Rui
AU - Chen, Yi
AU - Sun, Yi
AU - Jia, Yunyi
N1 - Publisher Copyright:
© 2004-2012 IEEE.
PY - 2022/7/1
Y1 - 2022/7/1
N2 - In human-robot shared manufacturing contexts, handing over product parts or tools between the robot and the human is an important collaborative task. Enabling the robot to correctly recognize and predict human hand-over intentions, and thereby improve task efficiency in human-robot collaboration, is therefore a necessary issue to address. In this study, a teaching-learning-prediction (TLP) framework is proposed for the robot to learn from its human partner's multimodal demonstrations and predict human hand-over intentions. In this approach, the robot can be programmed by the human through demonstrations using natural language and wearable sensors, according to the task requirements and the human's working preferences. The robot then learns online from the human hand-over demonstrations via the extreme learning machine (ELM) algorithm to update its cognitive capacity, allowing it to use the learned policy to actively predict human intentions and assist its human companion in hand-over tasks. Experimental results and evaluations suggest that the human can easily reprogram the robot with the proposed approach when the task changes, and that the robot can effectively predict hand-over intentions with competitive accuracy to complete the hand-over tasks. Note to Practitioners - This article is motivated by human-robot hand-over problems in smart manufacturing contexts. The delivery of product parts or tools in worker-robot partnerships is an important collaborative task. We develop a teaching-learning-prediction (TLP) framework for the robot to learn from its human partner's multimodal demonstrations and predict human hand-over intentions. The robot can be taught by the human through natural language and wearable sensing information. The extreme learning machine (ELM) approach is employed for the robot to build its cognitive capacity, enabling it to actively predict human intentions and assist its human companion in hand-over tasks. We demonstrate that the proposed approach offers distinct and effective advantages for facilitating human-robot hand-over tasks in collaborative manufacturing contexts.
AB - In human-robot shared manufacturing contexts, handing over product parts or tools between the robot and the human is an important collaborative task. Enabling the robot to correctly recognize and predict human hand-over intentions, and thereby improve task efficiency in human-robot collaboration, is therefore a necessary issue to address. In this study, a teaching-learning-prediction (TLP) framework is proposed for the robot to learn from its human partner's multimodal demonstrations and predict human hand-over intentions. In this approach, the robot can be programmed by the human through demonstrations using natural language and wearable sensors, according to the task requirements and the human's working preferences. The robot then learns online from the human hand-over demonstrations via the extreme learning machine (ELM) algorithm to update its cognitive capacity, allowing it to use the learned policy to actively predict human intentions and assist its human companion in hand-over tasks. Experimental results and evaluations suggest that the human can easily reprogram the robot with the proposed approach when the task changes, and that the robot can effectively predict hand-over intentions with competitive accuracy to complete the hand-over tasks. Note to Practitioners - This article is motivated by human-robot hand-over problems in smart manufacturing contexts. The delivery of product parts or tools in worker-robot partnerships is an important collaborative task. We develop a teaching-learning-prediction (TLP) framework for the robot to learn from its human partner's multimodal demonstrations and predict human hand-over intentions. The robot can be taught by the human through natural language and wearable sensing information. The extreme learning machine (ELM) approach is employed for the robot to build its cognitive capacity, enabling it to actively predict human intentions and assist its human companion in hand-over tasks. We demonstrate that the proposed approach offers distinct and effective advantages for facilitating human-robot hand-over tasks in collaborative manufacturing contexts.
KW - Extreme learning machine (ELM)
KW - human-robot hand-over
KW - intention prediction
KW - learning from demonstrations
KW - natural language
KW - wearable sensors
UR - http://www.scopus.com/inward/record.url?scp=85105846600&partnerID=8YFLogxK
U2 - 10.1109/TASE.2021.3074873
DO - 10.1109/TASE.2021.3074873
M3 - Article
AN - SCOPUS:85105846600
SN - 1545-5955
VL - 19
SP - 2339
EP - 2353
JO - IEEE Transactions on Automation Science and Engineering
JF - IEEE Transactions on Automation Science and Engineering
IS - 3
ER -