
dc.contributor.author: Sánchez Brizuela, Guillermo
dc.contributor.author: Cisnal de la Rica, Ana
dc.contributor.author: Fuente López, Eusebio de la
dc.contributor.author: Fraile Marinero, Juan Carlos
dc.contributor.author: Pérez Turiel, Javier
dc.date.accessioned: 2023-09-22T09:07:25Z
dc.date.available: 2023-09-22T09:07:25Z
dc.date.issued: 2023
dc.identifier.citation: Virtual Reality, 2023.
dc.identifier.issn: 1359-4338
dc.identifier.uri: https://uvadoc.uva.es/handle/10324/61780
dc.description: Producción Científica
dc.description.abstract: Real-time hand segmentation is a key process in applications that require human–computer interaction, such as gesture recognition or augmented reality systems. However, the infinite shapes and orientations that hands can adopt, their variability in skin pigmentation and the self-occlusions that continuously appear in images make hand segmentation a truly complex problem, especially with uncontrolled lighting conditions and backgrounds. The development of robust, real-time hand segmentation algorithms is essential to achieve immersive augmented reality and mixed reality experiences by correctly interpreting collisions and occlusions. In this paper, we present a simple but powerful algorithm based on the MediaPipe Hands solution, a highly optimized neural network. The algorithm processes the landmarks provided by MediaPipe using morphological and logical operators to obtain the masks that allow dynamic updating of the skin color model. Different experiments were carried out comparing the influence of the color space on skin segmentation, with the CIELab color space chosen as the best option. An average intersection over union of 0.869 was achieved on the demanding Ego2Hands dataset running at 90 frames per second on a conventional computer without any hardware acceleration. Finally, the proposed segmentation procedure was implemented in an augmented reality application to add hand occlusion for improved user immersion. An open-source implementation of the algorithm is publicly available at https://github.com/itap-robotica-medica/lightweight-hand-segmentation.
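The abstract describes seeding a dynamically updated skin color model from MediaPipe Hands landmarks and classifying pixels in CIELab. The authors' actual code is in the linked GitHub repository; the sketch below is only an illustrative approximation of that idea in Python, assuming OpenCV and the MediaPipe Python package. The convex-hull seeding, dilation kernel size and the k = 2.5 threshold are hypothetical choices for illustration, not values taken from the paper.

import cv2
import mediapipe as mp
import numpy as np

# Detect hand landmarks with MediaPipe Hands (video mode, up to two hands).
mp_hands = mp.solutions.hands
hands = mp_hands.Hands(static_image_mode=False, max_num_hands=2,
                       min_detection_confidence=0.5)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

    if results.multi_hand_landmarks:
        # Seed mask: fill the convex hull of the 21 landmarks of each detected
        # hand, then dilate so the seed covers the hand interior.
        seed = np.zeros((h, w), dtype=np.uint8)
        for hand in results.multi_hand_landmarks:
            pts = np.array([[int(lm.x * w), int(lm.y * h)] for lm in hand.landmark],
                           dtype=np.int32)
            cv2.fillConvexPoly(seed, cv2.convexHull(pts), 255)
        seed = cv2.dilate(seed, np.ones((15, 15), np.uint8))

        # Re-estimate the skin color model in CIELab from pixels under the seed mask.
        lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
        skin = lab[seed > 0].astype(np.float32)
        mean, std = skin.mean(axis=0), skin.std(axis=0)

        # Classify every pixel within k standard deviations of the model.
        k = 2.5
        lower = np.clip(mean - k * std, 0, 255).astype(np.uint8)
        upper = np.clip(mean + k * std, 0, 255).astype(np.uint8)
        mask = cv2.inRange(lab, lower, upper)

        # Morphological cleanup of the per-pixel classification.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        cv2.imshow("hand mask", mask)

    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

Re-estimating the Lab model on every frame is what lets such a mask adapt to changing illumination and skin tone, which is the dynamic skin-color-model update the abstract refers to.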
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: Springer
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject.classification: Augmented reality
dc.subject.classification: Hand segmentation
dc.subject.classification: MediaPipe
dc.subject.classification: Online processing
dc.subject.classification: Semantic segmentation
dc.title: Lightweight real-time hand segmentation leveraging MediaPipe landmark detection
dc.type: info:eu-repo/semantics/article
dc.rights.holder: © 2023 The Author(s)
dc.identifier.doi: 10.1007/s10055-023-00858-0
dc.relation.publisherversion: https://link.springer.com/article/10.1007/s10055-023-00858-0
dc.identifier.publicationtitle: Virtual Reality
dc.peerreviewed: SI
dc.description.project: Ministerio de Ciencia e Innovación (under Grant Agreement No. RTC2019-007350-1)
dc.description.project: Open-access publication funded by the Consorcio de Bibliotecas Universitarias de Castilla y León (BUCLE), under the Operational Programme 2014ES16RFOP009 FEDER 2014-2020 of Castilla y León, Action: 20007-CL - BUCLE Consortium support
dc.identifier.essn: 1434-9957
dc.rights: Attribution 4.0 International
dc.type.hasVersion: info:eu-repo/semantics/publishedVersion
dc.subject.unesco: 33 Technological Sciences

