dc.contributor.author: Tamayo-Alonso, Diego
dc.contributor.author: Poza-Casado, Irene
dc.contributor.author: Meiss, Alberto
dc.date.accessioned: 2026-01-30T11:28:02Z
dc.date.available: 2026-01-30T11:28:02Z
dc.date.issued: 2026-01-29
dc.identifier.citation: Tamayo-Alonso, D., Poza-Casado, I., & Meiss, A. (2025). Application of Deep Neural Networks for Leakage Airflow Rate Estimation From Three-Dimensional Thermal Patterns. Indoor Air, 2026(1), 5960599. https://doi.org/10.1155/ina/5960599
dc.identifier.issn: 0905-6947
dc.identifier.uri: https://uvadoc.uva.es/handle/10324/82402
dc.description: Producción Científica
dc.description.abstract: Deep convolutional neural networks (CNNs) represent a substantial advance in image analysis. This approach is particularly suitable when the image set represents a spatial structure and predictive analysis would otherwise rely on computationally complex Gaussian processes. The uncontrolled flow of air into buildings, known as infiltration, poses a significant challenge in terms of characterisation and quantification. The irregular contours of gaps and cracks through the enclosure create a virtually endless variety of cases, making a generalisable scientific interpretation applicable to existing buildings very difficult. This circumstance is consistently manifested as an irregular, three-dimensional incoming airflow. This study presents an innovative methodology for estimating airflow rates based on three-dimensional thermal patterns captured through infrared thermography. The experimental setup employs a 3D-printed matrix of spheres, facilitating the characterisation of the spatial temperature distribution within the airflow. The resulting thermal images are processed using a CNN, which integrates the spatial information contained in the thermograms with a scalar input representing the inlet air temperature. The model's performance was assessed under a range of conditions, including reduced image resolutions, varying experimental configurations (involving different flow apertures), and a comparison between full thermographic inputs and thermal difference-based features. The results indicate that the model can accurately infer airflow rates within the same aperture (mean absolute error [MAE] < 2%). While generalisation to new apertures presents a greater challenge, the experiments demonstrate that a sufficiently diverse training dataset can enhance the model's predictive capacity for configurations not included in the training phase. These findings underscore the potential of deep learning as a nonintrusive and efficient tool for estimating airflow in systems where conventional measurement techniques are either difficult to apply or impractical.
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: John Wiley & Sons A/S
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject.classification: building airtightness
dc.subject.classification: deep convolutional neural network
dc.subject.classification: infiltration
dc.subject.classification: pressurisation test
dc.subject.classification: thermography
dc.title: Application of Deep Neural Networks for Leakage Airflow Rate Estimation From Three-Dimensional Thermal Patterns
dc.type: info:eu-repo/semantics/article
dc.identifier.doi: 10.1155/ina/5960599
dc.relation.publisherversion: https://onlinelibrary.wiley.com/doi/10.1155/ina/5960599
dc.identifier.publicationfirstpage: 1
dc.identifier.publicationissue: 1
dc.identifier.publicationlastpage: 15
dc.identifier.publicationtitle: Indoor Air
dc.identifier.publicationvolume: 2026
dc.peerreviewed: SI
dc.description.project: Ministerio de Ciencia e Innovación (10.13039/100014440) (PID2022-142104OB-I00)
dc.identifier.essn: 1600-0668
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.type.hasVersion: info:eu-repo/semantics/publishedVersion
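
The abstract above describes a CNN that fuses the spatial information of a thermogram with a scalar inlet-air temperature to regress a leakage airflow rate. The following Python (PyTorch) sketch illustrates one way such a dual-input architecture can be wired. The layer sizes, the 64x64 single-channel input resolution, and the ThermalAirflowCNN name are illustrative assumptions, not the authors' published implementation.

# Illustrative sketch only: a dual-input CNN that combines a thermogram with a
# scalar inlet-air temperature to regress a leakage airflow rate. Layer sizes
# and the assumed 64x64 single-channel input are NOT the published architecture.
import torch
import torch.nn as nn


class ThermalAirflowCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional branch: extracts spatial features from the thermogram.
        # Adaptive pooling makes the branch usable at reduced image resolutions.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # Regression head: fuses image features with the scalar inlet temperature.
        self.head = nn.Sequential(
            nn.Linear(64 + 1, 64), nn.ReLU(),
            nn.Linear(64, 1),  # predicted airflow rate (e.g. m3/h)
        )

    def forward(self, thermogram: torch.Tensor, inlet_temp: torch.Tensor) -> torch.Tensor:
        x = self.features(thermogram)                        # (N, 64) image features
        x = torch.cat([x, inlet_temp.unsqueeze(1)], dim=1)   # append the scalar input
        return self.head(x).squeeze(1)                       # (N,) airflow estimates


# Example forward pass on a random batch (shapes only; no real measurement data).
model = ThermalAirflowCNN()
thermograms = torch.randn(8, 1, 64, 64)   # batch of single-channel thermal images
inlet_temps = torch.randn(8)              # one inlet-air temperature per image
print(model(thermograms, inlet_temps).shape)  # torch.Size([8])

Training such a model against measured airflow rates with an L1 loss would directly optimise the mean absolute error reported in the abstract; this, too, is an assumption about how a model of this kind could be fitted, not a description of the study's procedure.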

