Natural language and gesture perception based robot companion teaching for assisting human workers in assembly contexts

Rui Li, Weitian Wang, Yi Chen, Yunyi Jia

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this work, we propose a teaching-learning framework that enables a robot to learn from multi-modal human demonstrations, build task knowledge, and assist its human partner in collaborative tasks. The multi-modal demonstrations are parameterized by natural language and forearm gestures. The Random Forests algorithm is employed for the robot to learn from these demonstrations and construct its task knowledge in assembly contexts. Experimental results suggest that the proposed approach not only lets the robot gain task knowledge directly from human demonstrations but also offers a more natural and user-friendly teaching pattern for non-expert users. In addition, the method allows users to customize the robot's motion pattern according to their working habits.
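The abstract names the learning component (Random Forests over language and gesture features) without implementation detail here. Below is a minimal sketch of that kind of model, assuming a hypothetical feature layout: scikit-learn's RandomForestClassifier is trained on concatenated natural-language and forearm-gesture feature vectors to predict an assembly action label. All dimensions, feature encodings, and action labels are illustrative placeholders, not taken from the paper.

```python
# Hypothetical sketch of the learning step described in the abstract:
# a Random Forests model maps multi-modal demonstration features
# (natural-language command encoding + forearm-gesture features) to an
# assembly action label. Feature layout, dimensions, and labels are
# illustrative assumptions, not from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder demonstration data:
#   lang_feats    - e.g. a keyword/bag-of-words encoding of the spoken command
#   gesture_feats - e.g. sampled forearm orientation and position features
n_demos = 200
lang_feats = rng.random((n_demos, 16))
gesture_feats = rng.random((n_demos, 12))
X = np.hstack([lang_feats, gesture_feats])  # one multi-modal vector per demo

# Placeholder action labels the robot should learn to select,
# e.g. 0 = "hand over part", 1 = "hold fixture", 2 = "fetch tool"
y = rng.integers(0, 3, size=n_demos)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# At run time, the robot would encode the worker's current command and
# gesture the same way and query the learned task knowledge:
new_demo = rng.random((1, 28))
print("predicted action:", model.predict(new_demo)[0])
```

Random Forests are a plausible fit for this setting because they handle heterogeneous concatenated feature spaces without feature scaling and tend to generalize reasonably from the small demonstration sets a human teacher can provide.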

Original language: English
Title of host publication: Advanced Driver Assistance and Autonomous Technologies; Advances in Control Design Methods; Advances in Robotics; Automotive Systems; Design, Modeling, Analysis, and Control of Assistive and Rehabilitation Devices; Diagnostics and Detection; Dynamics and Control of Human-Robot Systems; Energy Optimization for Intelligent Vehicle Systems; Estimation and Identification; Manufacturing
Publisher: American Society of Mechanical Engineers (ASME)
ISBN (Electronic): 9780791859148
DOIs
State: Published - 2019
Event: ASME 2019 Dynamic Systems and Control Conference, DSCC 2019 - Park City, United States
Duration: 8 Oct 2019 – 11 Oct 2019

Publication series

Name: ASME 2019 Dynamic Systems and Control Conference, DSCC 2019
Volume: 1

Conference

Conference: ASME 2019 Dynamic Systems and Control Conference, DSCC 2019
Country/Territory: United States
City: Park City
Period: 8/10/19 – 11/10/19
