dc.contributor.author | Sánchez Brizuela, Guillermo | |
dc.contributor.author | Cisnal de la Rica, Ana | |
dc.contributor.author | Fuente López, Eusebio de la | |
dc.contributor.author | Fraile Marinero, Juan Carlos | |
dc.contributor.author | Pérez Turiel, Javier | |
dc.date.accessioned | 2023-09-22T09:07:25Z | |
dc.date.available | 2023-09-22T09:07:25Z | |
dc.date.issued | 2023 | |
dc.identifier.citation | Virtual Reality, 2023. | es |
dc.identifier.issn | 1359-4338 | es |
dc.identifier.uri | https://uvadoc.uva.es/handle/10324/61780 | |
dc.description | Scientific Production | es
dc.description.abstract | Real-time hand segmentation is a key process in applications that require human–computer interaction, such as gesture recognition or augmented reality systems. However, the infinite shapes and orientations that hands can adopt, their variability in skin pigmentation and the self-occlusions that continuously appear in images make hand segmentation a truly complex problem, especially with uncontrolled lighting conditions and backgrounds. The development of robust, real-time hand segmentation algorithms is essential to achieve immersive augmented reality and mixed reality experiences by correctly interpreting collisions and occlusions. In this paper, we present a simple but powerful algorithm based on the MediaPipe Hands solution, a highly optimized neural network. The algorithm processes the landmarks provided by MediaPipe using morphological and logical operators to obtain the masks that allow dynamic updating of the skin color model. Different experiments were carried out comparing the influence of the color space on skin segmentation, with the CIELab color space chosen as the best option. An average intersection over union of 0.869 was achieved on the demanding Ego2Hands dataset running at 90 frames per second on a conventional computer without any hardware acceleration. Finally, the proposed segmentation procedure was implemented in an augmented reality application to add hand occlusion for improved user immersion. An open-source implementation of the algorithm is publicly available at https://github.com/itap-robotica-medica/lightweight-hand-segmentation. | es
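The abstract describes a landmark-driven pipeline: MediaPipe Hands provides 21 landmarks per hand, which are turned into masks via morphological and logical operations and used to update a skin color model in CIELab. The Python sketch below illustrates one way such a pipeline could look; it is not the authors' published code (see the linked repository for the reference implementation), and the helper name, the convex-hull seed mask and the percentile-based thresholds are illustrative assumptions.

```python
# Hedged sketch of a landmark-driven skin segmentation pipeline.
# Assumes the 'mediapipe' and 'opencv-python' packages; segment_hands(),
# the convex-hull seed and the percentile bounds are illustrative choices.
import cv2
import numpy as np
import mediapipe as mp

hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=2)

def segment_hands(frame_bgr):
    """Return a binary hand mask for one BGR frame."""
    h, w = frame_bgr.shape[:2]
    result = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return np.zeros((h, w), np.uint8)

    # Seed mask: filled convex hull of the landmarks of each detected hand.
    seed = np.zeros((h, w), np.uint8)
    for hand in result.multi_hand_landmarks:
        pts = np.array([[lm.x * w, lm.y * h] for lm in hand.landmark], np.int32)
        cv2.fillConvexPoly(seed, cv2.convexHull(pts), 255)

    # Dynamic skin model: sample CIELab values under the eroded seed so the
    # color statistics follow the current lighting and skin tone.
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
    sample = cv2.erode(seed, np.ones((5, 5), np.uint8))
    skin = lab[sample > 0]
    if skin.size == 0:
        return seed
    lo = np.percentile(skin, 5, axis=0).astype(np.uint8)   # illustrative bounds
    hi = np.percentile(skin, 95, axis=0).astype(np.uint8)

    # Threshold the frame with the updated model, restrict it to a dilated
    # region around the landmarks, and clean up with a morphological close.
    roi = cv2.dilate(seed, np.ones((31, 31), np.uint8))
    mask = cv2.bitwise_and(cv2.inRange(lab, lo, hi), roi)
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((7, 7), np.uint8))
```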
dc.format.mimetype | application/pdf | es |
dc.language.iso | eng | es |
dc.publisher | Springer | es |
dc.rights.accessRights | info:eu-repo/semantics/openAccess | es |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | * |
dc.subject.classification | Augmented reality | es |
dc.subject.classification | Hand segmentation | es |
dc.subject.classification | MediaPipe | es |
dc.subject.classification | Online processing | es |
dc.subject.classification | Semantic segmentation | es |
dc.title | Lightweight real-time hand segmentation leveraging MediaPipe landmark detection | es |
dc.type | info:eu-repo/semantics/article | es |
dc.rights.holder | © 2023 The Author(s) | es |
dc.identifier.doi | 10.1007/s10055-023-00858-0 | es |
dc.relation.publisherversion | https://link.springer.com/article/10.1007/s10055-023-00858-0 | es |
dc.identifier.publicationtitle | Virtual Reality | es |
dc.peerreviewed | SI | es |
dc.description.project | Ministerio de Ciencia e Innovación (under Grant Agreement No. RTC2019-007350-1) | es |
dc.description.project | Open access publication funded by the Consorcio de Bibliotecas Universitarias de Castilla y León (BUCLE), under the Operational Programme 2014ES16RFOP009 FEDER 2014-2020 of Castilla y León, Action: 20007-CL - BUCLE Consortium support | es
dc.identifier.essn | 1434-9957 | es |
dc.rights | Attribution 4.0 International (CC BY 4.0) | *
dc.type.hasVersion | info:eu-repo/semantics/publishedVersion | es |
dc.subject.unesco | 33 Technological Sciences | es