
Hands-Free: A robot augmented reality teleoperation system

Nuzzi C. (Methodology); Ghidini S. (Validation); Pagani R. (Data Curation); Pasinetti S. (Validation); Coffetti G. (Collaboration Group Member); Sansoni G. (Supervision)
2020-01-01

Abstract

In this paper, the novel teleoperation method Hands-Free is presented. Hands-Free is a vision-based augmented reality system that allows users to teleoperate a robot end-effector with their hands in real time. The system leverages the OpenPose neural network to detect the human operator's hand in a given workspace, achieving an average inference time of 0.15 s. The position of the user's index finger is extracted from the image and converted into real-world coordinates to move the robot end-effector in a different workspace. The user's hand skeleton is visualized in real time moving in the actual robot workspace, allowing the user to teleoperate the robot intuitively, regardless of the differences between the user workspace and the robot workspace. Since a set of calibration procedures is involved in converting the index position to the robot end-effector position, we designed three experiments to determine the different errors introduced by this conversion. A detailed explanation of the mathematical principles adopted in this work is provided in the paper. Finally, the proposed system has been developed using ROS and is publicly available at the following GitHub repository: https://github.com/Krissy93/hands-free-project.
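The abstract describes converting a detected index-finger position from image pixels to real-world coordinates via a calibration procedure. As a minimal sketch of that kind of pixel-to-world conversion, the snippet below applies a planar homography to a fingertip pixel; the matrix values and the `pixel_to_world` helper are illustrative assumptions, not the calibration actually used in the paper.

```python
# Hypothetical sketch: mapping an index-fingertip pixel position (u, v)
# to planar workspace coordinates (x, y) with a 3x3 homography H.
# H below is an illustrative example, not the paper's calibration result.

def pixel_to_world(u, v, H):
    """Apply homography H (3x3 nested list) to the image point (u, v)."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    # Homogeneous normalization: divide by the projective scale factor.
    return x / w, y / w

# Example calibration: 1 px = 1 mm, with the workspace origin at pixel (320, 240).
H = [[0.001, 0.0,   -0.32],
     [0.0,   0.001, -0.24],
     [0.0,   0.0,    1.0]]

x, y = pixel_to_world(320, 240, H)  # the assumed workspace origin, (0.0, 0.0)
```

In practice such a homography would be estimated from point correspondences collected during the calibration procedures the paper mentions, and the resulting workspace coordinates would then be sent to the robot controller.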
2020
ISBN: 978-1-7281-5715-3
Files in this record:

File: 2020_Hands-Free a robot augmented reality teleoperation system.pdf
Type: Post-print document
License: NOT PUBLIC - Private/restricted access
Size: 2.97 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11379/536128
Warning: the displayed data have not been validated by the university.

Citations
  • Scopus: 18
  • Web of Science: 11