Learn How to Assist Humans Through Human Teaching and Robot Learning in Human-Robot Collaborative Assembly

Yi Sun, Weitian Wang, Yi Chen, Yunyi Jia

Research output: Contribution to journal › Article › peer-review

21 Scopus citations

Abstract

Human-robot collaborative assembly is one of the next-generation manufacturing paradigms, in which the complementary strengths of humans and robots can be fully leveraged. To enable robots to collaborate effectively with humans, much as humans collaborate with one another, robot learning from human demonstrations has been adopted to learn assembly tasks. However, existing feature-based approaches require a critical feature design and extraction process and are usually complex to extend with task contexts, while existing learning-based approaches usually require a large amount of manual effort for data labeling and also rarely consider task contexts. This article proposes a dual-input deep learning approach that incorporates task contexts into the robot-learning-from-human-demonstration process so that the robot can assist humans in assembly. In addition, online automated data labeling during human demonstration is proposed to reduce the training effort. Experimental validation on a realistic human-robot model car assembly task, with safety-aware execution designs, demonstrates the effectiveness and advantages of the proposed approaches.
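The abstract describes a dual-input deep learning model that fuses a perception input with a task-context input. The paper's actual architecture is not given here, so the following is only a minimal illustrative sketch of the general dual-input pattern: two branches encode their respective inputs, and the fused representation feeds an action-prediction head. All layer sizes, names, and the fusion-by-concatenation choice are assumptions for illustration, not the authors' design.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """Elementwise rectified linear activation."""
    return np.maximum(x, 0.0)

class DualInputNet:
    """Toy dual-input model: one branch encodes perception features
    (e.g. an image embedding of the workspace), the other encodes a
    task-context vector (e.g. assembly progress); the two branches
    are concatenated before a shared output head."""

    def __init__(self, d_img, d_ctx, d_hidden, n_actions):
        # Small random weights stand in for trained parameters.
        self.W_img = rng.standard_normal((d_img, d_hidden)) * 0.1
        self.W_ctx = rng.standard_normal((d_ctx, d_hidden)) * 0.1
        self.W_out = rng.standard_normal((2 * d_hidden, n_actions)) * 0.1

    def forward(self, x_img, x_ctx):
        h_img = relu(x_img @ self.W_img)             # perception branch
        h_ctx = relu(x_ctx @ self.W_ctx)             # task-context branch
        h = np.concatenate([h_img, h_ctx], axis=-1)  # feature fusion
        return h @ self.W_out                        # action logits

net = DualInputNet(d_img=16, d_ctx=4, d_hidden=8, n_actions=3)
logits = net.forward(rng.standard_normal(16), rng.standard_normal(4))
print(logits.shape)  # one logit per candidate assist action: (3,)
```

In a trained system, the logits would be turned into a predicted assist action; here the weights are random and the sketch only shows how task context enters the network alongside the perceptual input.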

Original language: English
Pages (from-to): 728-738
Number of pages: 11
Journal: IEEE Transactions on Systems, Man, and Cybernetics: Systems
Volume: 52
Issue number: 2
DOIs
State: Published - 1 Feb 2022

Keywords

  • Assembly
  • deep learning
  • human demonstration
  • human-robot collaboration
  • robot learning
