
An intuitive robot learning from human demonstration

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

This paper presents a new way to teach a robot motions remotely from a human demonstrator. The human–robot interface is built using a Kinect sensor connected directly to a remote computer running the processing software. Cartesian coordinates are extracted, converted into joint angles, and sent to the workstation that controls the Sawyer robot. Kinesthetic teaching was used to correct the reproduced demonstrations, and only valid resolved joint angles are recorded to ensure consistency in the transmitted data. The recorded dataset is encoded with a Gaussian Mixture Model (GMM), and Gaussian Mixture Regression (GMR) is employed to extract and reproduce a generalised trajectory with respect to the associated time step. To evaluate the proposed approach, an experiment in which the robot follows a human arm motion was performed. The approach could help non-expert users teach a robot to perform assembly tasks in a more human-like way.
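The GMM encoding and GMR reproduction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic sine-wave demonstration data, the number of mixture components, and the 1-D output are all assumptions; the paper applies the same idea to recorded joint angles indexed by time step.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical demonstration data: (time, position) pairs. In the paper,
# these would be valid resolved joint angles recorded via kinesthetic teaching.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
demos = np.stack([t, np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(200)], axis=1)

# Encode the dataset with a GMM over the joint (time, position) space.
gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0).fit(demos)

def gmr(gmm, t_query):
    """Gaussian Mixture Regression: condition each Gaussian component on the
    time step and blend the conditional means by the posterior responsibilities."""
    t_query = np.atleast_1d(np.asarray(t_query, dtype=float))
    means, covs, weights = gmm.means_, gmm.covariances_, gmm.weights_
    out = np.zeros_like(t_query)
    for j, tq in enumerate(t_query):
        # Responsibility of each component for this time step (Gaussian in t).
        h = np.array([w * np.exp(-0.5 * (tq - m[0]) ** 2 / c[0, 0]) / np.sqrt(c[0, 0])
                      for w, m, c in zip(weights, means, covs)])
        h /= h.sum()
        # Conditional mean of the position given the time, per component.
        cond = np.array([m[1] + c[1, 0] / c[0, 0] * (tq - m[0])
                         for m, c in zip(means, covs)])
        out[j] = h @ cond
    return out

# Reproduce a generalised trajectory over the associated time steps.
traj = gmr(gmm, np.linspace(0.0, 1.0, 50))
```

The blending weights make the reproduced trajectory a smooth average of the local linear regressors, which is what lets GMR generalise across several noisy demonstrations rather than replay any single one.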
Original language: English
Title of host publication: Intelligent Robotics and Applications
Subtitle of host publication: 11th International Conference, ICIRA 2018, Newcastle, NSW, Australia, August 9–11, 2018, Proceedings, Part I
Editors: Zhiyong Chen, Alexandre Mendes, Yamin Yan, Shifeng Chen
Publisher: Springer
Pages: 176-185
ISBN (Electronic): 978-3-319-97586-3
ISBN (Print): 978-3-319-97585-6
DOIs
Publication status: Published - Sep 2018
Event: 11th International Conference on Intelligent Robotics and Applications, Newcastle, Australia
Duration: 9 Aug 2018 – 11 Aug 2018
http://www.icira2018.org

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer
Volume: 10984
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 11th International Conference on Intelligent Robotics and Applications
Abbreviated title: ICIRA 2018
Country: Australia
City: Newcastle
Period: 9/08/18 – 11/08/18



ID: 11484873