RT info:eu-repo/semantics/article
T1 Multimodal teaching analytics: Automated extraction of orchestration graphs from wearable sensor data
A1 Prieto, Luis P.
A1 Sharma, Kshitij
A1 Kidziński, Łukasz
A1 Rodríguez Triana, María Jesús
A1 Dillenbourg, Pierre
K1 activity detection
K1 eye-tracking
K1 multimodal learning analytics
K1 sensors
K1 teaching analytics
AB The pedagogical modelling of everyday classroom practice is an interesting kind of evidence, both for educational research and for teachers' own professional development. This paper explores the use of wearable sensors and machine learning techniques to automatically extract orchestration graphs (teaching activities and their social plane over time) from a dataset of 12 classroom sessions enacted by two different teachers in different classroom settings. The dataset included mobile eye-tracking as well as audiovisual and accelerometry data from sensors worn by the teacher. We evaluated both time-independent and time-aware models, achieving median F1 scores of about 0.7–0.8 on leave-one-session-out k-fold cross-validation. Although these results show the feasibility of this approach, they also highlight the need for larger datasets, recorded in a wider variety of classroom settings, to provide automated tagging of classroom practice that can be used in everyday practice across multiple teachers.
PB Wiley
YR 2018
FD 2018-01-24
LK https://uvadoc.uva.es/handle/10324/74412
UL https://uvadoc.uva.es/handle/10324/74412
LA eng
NO Journal of Computer Assisted Learning, April 2018, vol. 34, n. 2, p. 193-203
NO Producción Científica
DS UVaDOC
RD 23-feb-2025
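
Note: the evaluation protocol named in the abstract (leave-one-session-out cross-validation with median F1 scores across held-out sessions) can be sketched in Python as below. This is a minimal illustration only, not the authors' pipeline: the random placeholder features and activity labels, the equal-sized 12-session grouping, and the RandomForestClassifier stand-in for the paper's models are all assumptions.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score
    from sklearn.model_selection import LeaveOneGroupOut

    # Hypothetical data: one feature vector per time window, an activity
    # label per window, and the session each window came from (12 sessions,
    # matching the dataset size reported in the abstract).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1200, 20))           # placeholder sensor features
    y = rng.integers(0, 4, size=1200)         # placeholder activity labels
    sessions = np.repeat(np.arange(12), 100)  # session id per window

    # Leave-one-session-out: each fold holds out all windows of one session.
    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=sessions):
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        pred = clf.predict(X[test_idx])
        scores.append(f1_score(y[test_idx], pred, average="macro"))

    print(f"median F1 across held-out sessions: {np.median(scores):.2f}")

Grouping folds by session (rather than shuffling windows into k folds) matters because adjacent time windows from the same session are highly correlated; holding out whole sessions gives a more honest estimate of how the model generalizes to an unseen lesson.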