
Deep learning-based hand gesture recognition for collaborative robots

Nuzzi, Cristina (Methodology); Pasinetti, Simone (Validation); Lancini, Matteo (Collaboration Group Member); Docchio, Franco (Collaboration Group Member); Sansoni, Giovanna (Supervision)
2019-01-01

Abstract

This paper is a first step towards a smart hand gesture recognition setup for collaborative robots, using a Faster R-CNN object detector to locate the hands accurately in RGB images. In this work, a gesture is defined as a combination of two hands, where one acts as an anchor and the other encodes the command for the robot. Additional spatial requirements are used to improve the performance of the model and to filter out incorrect predictions made by the detector. As a first step, we used only four gestures.
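The abstract describes a post-detection step in which the detector's bounding-box predictions are filtered by spatial requirements before a two-hand gesture is accepted. The sketch below illustrates that idea only; the box format, score threshold, label names, and the specific spatial rule are assumptions for illustration, not the authors' actual implementation.

```python
# Hypothetical sketch of filtering Faster R-CNN hand detections into a
# two-hand gesture (one "anchor" hand, one "command" hand).
# All thresholds, labels, and spatial rules here are illustrative
# assumptions, not the method from the paper.

def filter_gesture(detections, score_thr=0.7):
    """detections: list of (label, score, (x1, y1, x2, y2)) tuples,
    e.g. from a Faster R-CNN detector. Returns (anchor, command)
    if a valid gesture is found, else None."""
    # Discard low-confidence predictions.
    strong = [d for d in detections if d[1] >= score_thr]
    anchors = [d for d in strong if d[0] == "anchor"]
    commands = [d for d in strong if d[0] != "anchor"]
    if not anchors or not commands:
        return None
    # Keep the most confident candidate of each kind.
    anchor = max(anchors, key=lambda d: d[1])
    command = max(commands, key=lambda d: d[1])
    # Example spatial requirement (assumed): the two hand boxes must not
    # overlap horizontally and must sit at a similar height in the image.
    ax1, ay1, ax2, ay2 = anchor[2]
    cx1, cy1, cx2, cy2 = command[2]
    overlap = not (ax2 < cx1 or cx2 < ax1)
    similar_height = abs((ay1 + ay2) / 2 - (cy1 + cy2) / 2) < (ay2 - ay1)
    if overlap or not similar_height:
        return None
    return anchor, command
```

For example, an "anchor" box at (10, 10, 50, 90) and a "stop" box at (100, 15, 140, 95) pass the checks, while two horizontally overlapping boxes are rejected.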
Files in this product:

08674634.pdf
Open access
Description: Main article
Type: Full Text
License: DRM not defined
Size: 1.29 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11379/515481

Citations
  • PMC: not available
  • Scopus: 45
  • ISI: 34