Model for Hybrid Teaching (HyTea)

HyTea (www.hytea.de) is my first project as a principal investigator (PI). It is a collaborative project between DIPF and the Cologne Game Lab, funded by the German Federal Ministry of Education and Research (BMBF), that aims to design a multimodal tutor for training presentation skills using artificial intelligence (AI) and immersive technologies. The project also addresses the alignment problem of AI systems: the importance of designing AI systems that align with human values. The project started in September 2022 and will run for three years.

In the HyTea project, we are redesigning the Presentation Trainer, focusing on presentation content preparation skills and on training non-verbal aspects (gestures, body positions, engagement with the audience, etc.).


Learning Progression Analytics (ALICE and AFLEK)

ALICE and AFLEK are two parallel projects between DIPF, Bochum University, and the IPN in Kiel. In these two projects, we collect various Moodle data traces from school children to generate learning analytics, and I am co-supervising Sebastian Gombert and Onur Karademir.

MILKI-PSY (2021-24)

Multimodal Immersive Learning with Artificial Intelligence for Psychomotor Skills (https://milki-psy.de/) is a BMBF-funded project which aims to create an innovative learning environment that uses AI, big data, and mixed reality to support the independent learning of psychomotor skills. The Cologne Game Lab of TH Köln leads the project. In this project, I am co-supervising PhD candidates Fernando Cardenaz and Gianluca Romano.


Visual Inspection Tool (2019)

The Visual Inspection Tool allows users to visualise and annotate the data collected by different sensors, such as microphones, cameras, eye trackers, or motion trackers. 

The CPR Tutor (2020)

The CPR Tutor helps users learn and practise cardiopulmonary resuscitation (CPR) with real-time feedback based on multimodal data. Sensors capture kinematic and electromyographic data while the user performs CPR on a manikin, and recurrent neural networks detect and classify each chest compression according to five performance indicators: compression rate, compression depth, compression release, hand position, and arm posture. The system then provides audio feedback to correct the most critical mistakes, aiming to enhance the learning experience and outcomes of CPR training through personalised and adaptive feedback.
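The project's actual models are recurrent networks trained on manikin data, which cannot be reproduced here. As a much simpler, purely illustrative sketch of two of the five indicators, the hypothetical code below estimates compression rate and mean depth from a synthetic chest-displacement signal and checks them against the commonly cited guideline targets of 100–120 compressions per minute and 5–6 cm depth.

```python
import numpy as np

def compression_indicators(depth_cm, fs=50):
    """Estimate compression rate and mean depth from a chest-displacement
    signal (cm) sampled at `fs` Hz, using a crude local-maximum detector."""
    threshold = depth_cm.max() / 2
    # A sample counts as a compression peak if it exceeds the threshold
    # and is a local maximum relative to its neighbours.
    peaks = [i for i in range(1, len(depth_cm) - 1)
             if depth_cm[i] > threshold
             and depth_cm[i] > depth_cm[i - 1]
             and depth_cm[i] >= depth_cm[i + 1]]
    duration_min = len(depth_cm) / fs / 60
    rate = len(peaks) / duration_min
    mean_depth = float(np.mean(depth_cm[peaks])) if peaks else 0.0
    return {
        "rate_per_min": rate,
        "mean_depth_cm": mean_depth,
        "rate_ok": 100 <= rate <= 120,        # guideline: 100-120 per minute
        "depth_ok": 5.0 <= mean_depth <= 6.0,  # guideline: 5-6 cm
    }

# Synthetic signal: ~110 compressions/min of 5.5 cm depth for 10 seconds.
fs, rate_hz = 50, 110 / 60
t = np.arange(0, 10, 1 / fs)
signal = 5.5 * np.clip(np.sin(2 * np.pi * rate_hz * t), 0, None)
print(compression_indicators(signal, fs))
```

A rule-based check like this is only a stand-in: the project's networks additionally classify release quality, hand position, and arm posture, which a depth signal alone cannot capture.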

The CPR Tutor was part of the SafePAT Interreg project and belongs to my PhD thesis, the Multimodal Tutor – Adaptive Feedback from Multimodal Experiences.

Sense The Classroom (2021)

Sense The Classroom is a research project that uses sensors and multimodal data to understand and improve classroom dynamics. Krist Shingjergji leads the project at the Open University of the Netherlands. In 2022, we published a paper entitled “Privacy-Preserving and Scalable Affect Detection in Online Synchronous Learning”; a demo of the tool is available online.

MOBIUS project (2020)

The MOBIUS project investigated smart mobility. We gathered smartphone sensor data and applied machine learning to classify the mode of transportation. This application won the best paper award at the S-Cube conference.
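The project's actual classifiers are not described here, but the general idea of transport-mode detection can be sketched as follows: window the smartphone accelerometer magnitude, extract simple statistics, and classify each window. Everything below (the feature set, the nearest-centroid rule, the toy data) is illustrative, not the project's method.

```python
import numpy as np

def features(window):
    """Simple statistics of an accelerometer-magnitude window (m/s^2)."""
    return np.array([window.mean(), window.std()])

def train_centroids(windows, labels):
    """Nearest-centroid training: the per-class mean feature vector."""
    feats = np.array([features(w) for w in windows])
    return {lbl: feats[np.array(labels) == lbl].mean(axis=0)
            for lbl in set(labels)}

def classify(window, centroids):
    """Assign the class whose centroid is closest in feature space."""
    f = features(window)
    return min(centroids, key=lambda lbl: np.linalg.norm(f - centroids[lbl]))

# Toy data: walking produces large, jittery accelerations around gravity
# (9.8 m/s^2); a train ride is much smoother.
rng = np.random.default_rng(0)
walking = [9.8 + 3.0 * rng.standard_normal(100) for _ in range(5)]
train   = [9.8 + 0.3 * rng.standard_normal(100) for _ in range(5)]
centroids = train_centroids(walking + train, ["walking"] * 5 + ["train"] * 5)
print(classify(9.8 + 3.0 * rng.standard_normal(100), centroids))
```

In practice, a richer feature set (frequency-domain statistics, GPS speed) and a stronger classifier would be used; the sketch only shows the window-features-classify structure.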

The Multimodal Pipeline (2019)

The MMLA (multimodal learning analytics) pipeline is a conceptual work that I started during my PhD. It suggests a practical approach for collecting, annotating, analysing, and using multimodal data for learning.
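As a loose illustration of those stages, the sketch below models a multimodal recording session that collects timestamped sensor samples, attaches human annotations to time intervals, and retrieves the samples inside an annotated interval for analysis. The data structures are invented for this sketch, not part of the published pipeline.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    start: float   # seconds from session start
    end: float
    label: str     # e.g. "good posture"

@dataclass
class Session:
    """One multimodal recording: named sensor streams plus annotations."""
    streams: dict = field(default_factory=dict)   # sensor -> [(t, value), ...]
    annotations: list = field(default_factory=list)

    def collect(self, sensor, t, value):
        """Append one timestamped sample to a sensor stream."""
        self.streams.setdefault(sensor, []).append((t, value))

    def annotate(self, start, end, label):
        """Attach a human label to a time interval."""
        self.annotations.append(Annotation(start, end, label))

    def analyse(self, sensor, start, end):
        """Return the samples of one stream inside an interval."""
        return [v for t, v in self.streams.get(sensor, []) if start <= t <= end]

s = Session()
for t in range(10):
    s.collect("heart_rate", t, 70 + t)
s.annotate(2, 5, "presentation start")
print(s.analyse("heart_rate", 2, 5))  # samples recorded between t=2 and t=5
```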

Learning Pulse (2017)

The Learning Pulse project used machine learning and multimodal data to predict learning performance in self-regulated learning. PhD students wore a Fitbit HR wristband and had their computer activities recorded. The data included heart rate, step count, weather conditions, and learning activity, and the participants rated their learning experience using an Activity Rating Tool. The project aimed to provide feedback and improve self-regulation skills; the tool was connected to the Feedback Cube and a Learner Dashboard.
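The core modelling idea, predicting a self-reported rating from wearable and activity features, can be sketched as an ordinary least-squares fit. The feature choice, the toy numbers, and the linear model below are all illustrative assumptions, not Learning Pulse's actual data or model.

```python
import numpy as np

# Toy multimodal features per hour: mean heart rate (bpm) and step count,
# with a self-reported rating (1-5) as the prediction target.
X = np.array([[62, 120], [75, 900], [80, 1500],
              [68, 400], [90, 2200], [70, 600]], dtype=float)
y = np.array([2.0, 3.5, 4.0, 3.0, 4.5, 3.2])

# Add an intercept column and fit ordinary least squares.
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(heart_rate, steps):
    """Predict the rating for a new hour of activity."""
    return float(coef @ [1.0, heart_rate, steps])

print(round(predict(72, 800), 2))
```

A production system would use many more feature streams and a model that handles temporal structure; the sketch only shows the features-to-rating regression idea.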