Hands-Free Maneuvers of Robotic Vehicles via Human Intentions Understanding Using Wearable Sensing

Weitian Wang, Rui Li, Longxiang Guo, Z. Max Diekel, Yunyi Jia

Research output: Contribution to journal › Article › peer-review


Abstract

Intelligent robotic vehicles are increasingly fully automated, with no steering wheel, gas/brake pedals, or gearshift. However, allowing the human driver to step in and maneuver the robotic vehicle under specific driving requirements remains a necessary capability. To this end, we propose a wearable-sensing-based hands-free maneuver intention understanding approach that assists the human in operating the robotic vehicle naturally and without physical contact. Human intentions are interpreted and modeled using fuzzy control based on forearm postures and muscle-activity information detected by a wearable sensory system, which incorporates electromyography (EMG) sensors and an inertial measurement unit (IMU). Based on the maneuver intention understanding model, the human can flexibly, intuitively, and conveniently control diverse vehicle maneuvers using only intention expressions. The approach was implemented and evaluated through a series of experiments in practical situations on a lab-based 1/10-scale robotic vehicle research platform. Experimental results and evaluations demonstrated that, by taking advantage of the nonphysical contact and natural handleability of this approach, the robotic vehicle was successfully and effectively maneuvered to complete the driving tasks with considerable accuracy and robustness in human-robotic vehicle interaction.
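
As a rough illustration of the kind of fuzzy intention mapping the abstract describes, the sketch below converts forearm posture angles (from an IMU) and a normalized EMG activation level into steering and speed commands. The membership functions, thresholds, rules, and names (`tri`, `infer_maneuver`) are assumptions for illustration only, not the paper's actual intention model.

```python
# Illustrative sketch only: a simplified Mamdani-style fuzzy mapping from
# wearable-sensor features (forearm pitch/roll from an IMU, normalized EMG
# activation) to vehicle maneuver commands. All thresholds, membership
# functions, and rules are hypothetical, not taken from the paper.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def infer_maneuver(pitch_deg, roll_deg, emg_level):
    """Map forearm posture and muscle activity to (steering in [-1, 1], speed in [0, 1])."""
    # Fuzzify forearm roll into left / neutral / right steering intent.
    left = tri(roll_deg, -90, -45, 0)
    neutral = tri(roll_deg, -30, 0, 30)
    right = tri(roll_deg, 0, 45, 90)

    # Fuzzify forearm pitch and EMG activation into stop / slow / fast speed intent.
    raised = tri(pitch_deg, 10, 45, 90)      # forearm raised -> drive
    lowered = tri(pitch_deg, -90, -45, -10)  # forearm lowered -> stop
    strong = tri(emg_level, 0.4, 0.8, 1.2)   # firm contraction -> faster

    # Rule evaluation (min as fuzzy AND), then weighted-average defuzzification.
    stop = lowered
    slow = min(raised, 1.0 - strong)
    fast = min(raised, strong)
    speed_w = stop + slow + fast
    speed = (stop * 0.0 + slow * 0.4 + fast * 1.0) / speed_w if speed_w > 0 else 0.0

    steer_w = left + neutral + right
    steering = (left * -1.0 + neutral * 0.0 + right * 1.0) / steer_w if steer_w > 0 else 0.0
    return steering, speed

if __name__ == "__main__":
    # Example: forearm raised and rolled to the right with moderate contraction.
    print(infer_maneuver(pitch_deg=40, roll_deg=30, emg_level=0.6))
```

In an actual system the crisp steering and speed outputs would be sent to the vehicle's drive-by-wire interface each control cycle; the details of that interface are outside the scope of this sketch.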

Original language: English
Article number: 4546094
Journal: Journal of Robotics
Volume: 2018
DOIs
State: Published - 2018

