This research paper presents a novel programming-by-demonstration (PbD) interface for Human-Robot Interaction (HRI). The mixed reality (MR) based interface provides users with a more immersive user experience (UX) while teleoperating the robot. The operator's hand gestures are captured and used to control the robot, and users can see their own hands in the virtual environment. An experimental test was carried out with a dual-arm robot, in which the robot reacted to gestures made by an operator viewing the robot workspace through a virtual reality (VR) headset.
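The abstract does not describe how captured hand motion is turned into robot commands; a minimal sketch of one common approach is mapping the frame-to-frame change in tracked hand position to a clamped end-effector velocity. All names, the gain, and the velocity limit below are illustrative assumptions, not the paper's actual implementation.

```python
def hand_delta_to_velocity(prev, curr, gain=2.0, v_max=0.25):
    """Scale the change in tracked hand position (metres per frame)
    into a clamped end-effector velocity command (m/s).

    Hypothetical mapping: gain and v_max are assumed values."""
    return tuple(
        max(-v_max, min(v_max, gain * (c - p)))
        for p, c in zip(prev, curr)
    )

# Example: the hand moved 5 cm along x between tracker frames,
# yielding a 0.1 m/s command along x, well under the 0.25 m/s clamp.
cmd = hand_delta_to_velocity((0.0, 0.0, 0.0), (0.05, 0.0, 0.0))
```

A velocity (rather than position) mapping like this is often chosen for teleoperation because it tolerates tracker dropouts and lets the clamp bound how fast the robot can ever be driven.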
|IEEE ICMLC Proceedings Series
|16th International Conference on Machine Learning and Cybernetics
|9/07/17 → 12/07/17