An intuitive robot learning from human demonstration

Uchenna Emeoha Ogenyi, Gongyue Zhang, Chenguang Yang, Zhaojie Ju, Honghai Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



This paper presents a new way to teach a robot motions remotely from a human demonstrator. The human–robot interface is built with a Kinect sensor connected directly to a remote computer running the Processing software. Cartesian coordinates are extracted, converted into joint angles, and sent to the workstation to control the Sawyer robot. Kinesthetic teaching was used to correct the reproduced demonstrations, and only valid resolved joint angles are recorded to ensure consistency in the transmitted data. The recorded dataset is encoded using a Gaussian mixture model (GMM), while Gaussian mixture regression (GMR) is employed to extract and reproduce a generalised trajectory with respect to the associated time step. To evaluate the proposed approach, an experiment in which a robot follows a human arm motion was performed. The approach could help non-expert users teach a robot to perform assembly tasks in a more human-like way.
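The GMM/GMR pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes synthetic single-joint demonstrations and uses scikit-learn's `GaussianMixture` for the encoding step, with GMR conditioning written out by hand.

```python
# Hedged sketch of GMM encoding + GMR trajectory reproduction.
# Assumptions (not from the paper): synthetic demonstrations of one
# joint angle over normalised time, 5 Gaussian components.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

# Synthetic demonstrations: joint angle as a noisy function of time
rng = np.random.default_rng(0)
t = np.tile(np.linspace(0, 1, 100), 5)          # 5 repeated demonstrations
q = np.sin(2 * np.pi * t) + rng.normal(0, 0.05, t.size)
data = np.column_stack([t, q])                  # rows: [time step, joint angle]

# Encode the recorded dataset with a GMM over (t, q)
gmm = GaussianMixture(n_components=5, covariance_type="full",
                      random_state=0).fit(data)

def gmr(t_query):
    """Gaussian mixture regression: E[q | t] under the fitted GMM."""
    means, covs, weights = gmm.means_, gmm.covariances_, gmm.weights_
    # Responsibility of each component for the query time step
    h = np.array([w * norm.pdf(t_query, m[0], np.sqrt(c[0, 0]))
                  for w, m, c in zip(weights, means, covs)])
    h /= h.sum()
    # Conditional mean of q given t for each component
    cond = np.array([m[1] + c[1, 0] / c[0, 0] * (t_query - m[0])
                     for m, c in zip(means, covs)])
    return float(h @ cond)

# Reproduce a generalised trajectory over the time steps
trajectory = [gmr(tq) for tq in np.linspace(0, 1, 50)]
```

Conditioning each component on time and blending the conditional means by responsibility is the standard GMR step; a real Sawyer setup would apply the same regression per joint angle before sending commands to the robot.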
Original language: English
Title of host publication: Intelligent Robotics and Applications
Subtitle of host publication: 11th International Conference, ICIRA 2018, Newcastle, NSW, Australia, August 9–11, 2018, Proceedings, Part I
Editors: Zhiyong Chen, Alexandre Mendes, Yamin Yan, Shifeng Chen
ISBN (Electronic): 978-3-319-97586-3
ISBN (Print): 978-3-319-97585-6
Publication status: Published - Sept 2018
Event: 11th International Conference on Intelligent Robotics and Applications - Newcastle, Australia
Duration: 9 Aug 2018 – 11 Aug 2018

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 11th International Conference on Intelligent Robotics and Applications
Abbreviated title: ICIRA 2018


