2018/10 Gesture-based human-robot interaction for human assistance in manufacturing.
“The paradigm for robot usage has changed in the last few years, from a scenario in which robots work in isolation to one where robots collaborate with human beings, exploiting and combining the best abilities of robots and humans. The development and acceptance of collaborative robots depend heavily on reliable and intuitive human-robot interaction (HRI) on the factory floor. This paper proposes a gesture-based HRI framework in which a robot assists a human co-worker by delivering tools and parts and holding objects for an assembly operation. Wearable sensors, inertial measurement units (IMUs), are used to capture the human upper-body gestures. Captured data are segmented into static and dynamic blocks using an unsupervised sliding-window approach. The static and dynamic data blocks feed an artificial neural network (ANN) for static, dynamic, and composed gesture classification. For the HRI interface, we propose a parameterization robotic task manager (PRTM), in which the co-worker selects and validates robot options using gestures, guided by the system's speech and visual feedback. Experiments in an assembly operation demonstrated the efficiency of the proposed solution.”
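The segmentation step described in the abstract can be illustrated with a minimal sketch: an unsupervised sliding window that labels blocks of an IMU signal as static or dynamic by thresholding their variance. The window size, threshold, and the synthetic signal below are illustrative assumptions, not the paper's actual parameters.

```python
import math
from statistics import pvariance

def segment_motion(signal, window=10, threshold=0.05):
    """Label each non-overlapping window of a 1-D IMU channel as
    'static' or 'dynamic' by thresholding its variance.
    (window and threshold are illustrative assumptions.)"""
    labels = []
    for start in range(0, len(signal) - window + 1, window):
        block = signal[start:start + window]
        # A low-variance block is treated as a static posture,
        # a high-variance block as part of a dynamic gesture.
        labels.append("dynamic" if pvariance(block) > threshold else "static")
    return labels

# Synthetic channel: a still phase (tiny jitter) followed by an oscillation.
still = [0.001 * ((i % 3) - 1) for i in range(50)]
moving = [math.sin(6 * math.pi * i / 50) for i in range(50)]
print(segment_motion(still + moving))
```

In the paper's pipeline, the blocks produced by a segmenter like this would then be fed to the ANN gesture classifier; here the thresholded variance merely separates rest from motion.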