Sensor-Based Recognition of Engagement During Work and Learning Activities

Dean's Office - Faculty of Informatics

Date: 10 September 2021 / 13:30 - 16:00

Online

You are cordially invited to attend the PhD Dissertation Defence of Elena Di Lascio on Friday 10 September 2021 at 13:30 on MS Teams.

Abstract:
Personal computing systems such as laptops, smartphones, and smartwatches are nowadays ubiquitous in people's everyday life. People use such systems not only for communicating or searching for information but also as digital companions, able to track and support daily activities such as sleep, food intake, physical exercise, and even work. Sensors embedded in personal computing systems enable the continuous collection of heterogeneous data about their users: location, heart rate, and more can nowadays be measured reliably. Processed with machine learning and data analytics techniques, sensor data can be used to infer information about users, such as their activities, behavior, and even affective states. In this thesis, we investigate the feasibility of using data derived from personal devices to automatically recognize the affective state of engagement as it occurs during daily activities. We focus on the specific use cases of inferring students' engagement during learning activities and knowledge workers' engagement during work activities.

Engagement, generally considered in terms of emotional and attentional involvement in an activity, is a well-known predictor of learning outcomes and job performance. Consequently, engagement-aware systems, able to sense, recognize, and promote engagement, have great potential for improving the learning and work experience. Measuring engagement has for years been a central focus of research in psychology. Traditional methods, such as self-reports and observations, which require significant manual effort from researchers and study participants, have long been used to derive knowledge about engagement. Bulky devices measuring physiological parameters, e.g., electrodermal activity and heart rate variability, have also been used to study engagement from a physiological perspective, mostly in laboratory settings or during pre-defined activities. Today, taking advantage of the availability of personal devices and the sensors they are equipped with, computer science researchers are investigating methods for automatically measuring engagement in everyday activities, with little or no effort from users.

Despite the knowledge gained from years of research on engagement, its automatic assessment using sensor data remains a challenging goal. Indeed, there is no pre-defined mapping between sensor data and engagement, and it is not clear which transformations and combinations of data can provide a reliable engagement assessment. Furthermore, definitions of engagement and its expressions are context-dependent; thus, a system aiming to infer engagement should be able to retrieve and use information about the user's context. However, in the work environment, context information such as the type of activity is difficult to infer: people use several tools to perform their tasks and work in different locations, alone and with others, making activity inference challenging.

In this thesis, we target two main problems: (1) the engagement recognition problem and (2) the activity recognition problem. To evaluate our approach, we designed and ran three user studies and collected data both in laboratory settings and in the wild, e.g., during lectures in the classroom and during actual workdays. Further, we performed an extensive data analysis.

We first address the sensor data transformation and combination problem for engagement recognition. To this end, in the first study presented in this thesis, we leveraged electrodermal activity data and proposed a method for translating findings from educational research into sensor data representations, i.e., features. We then used these features as input to machine learning algorithms to recognize students' engagement during lectures. In the second study, we proposed a novel method to recognize a behavioral expression, i.e., laughter, that can be used for recognizing engagement: we leveraged typical physiological and body movement reactions of laughter and quantified them using sensor data gathered from wristbands. In the third study, we investigated sensor fusion strategies based on traditional machine learning and deep learning, and combined physiological data, i.e., electrodermal activity and cardiac activity, with context information to recognize engagement during work activities.

Second, we address the problem of recognizing activities in the workplace. To this end, we proposed a method to combine behavioral expressions such as physiological activation, physical movement, and laptop and phone usage. We performed a thorough analysis and investigated which types of devices and sensor data provide relevant information, especially for distinguishing between work and break activities.

The insights and technical contributions of this thesis aim to enable the design and development of engagement-aware systems able to support people during their daily activities.
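For illustration only, the kind of pipeline outlined in the first study (statistical features computed over windows of electrodermal activity, used as input to a machine learning classifier) could be sketched as below. This is a minimal sketch under assumed choices (window length, feature set, random-forest classifier, scikit-learn, synthetic data); it is not the implementation used in the thesis.

    # Minimal, illustrative sketch of an EDA-feature + classifier pipeline of the kind
    # described in the abstract. Window length, feature set, and classifier are
    # assumptions for illustration, not the method used in the thesis.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def eda_window_features(eda_window):
        """Simple statistical features over one window of electrodermal activity."""
        diffs = np.diff(eda_window)
        return np.array([
            eda_window.mean(),   # approximate tonic level
            eda_window.std(),    # variability
            diffs.mean(),        # average slope
            (diffs > 0).mean(),  # fraction of rising samples (phasic-activity proxy)
        ])

    # Synthetic stand-in data: 100 windows of 240 samples (e.g., 60 s at 4 Hz),
    # each labeled 0 (low engagement) or 1 (high engagement).
    rng = np.random.default_rng(0)
    windows = [rng.normal(loc=2.0, scale=0.3, size=240) for _ in range(100)]
    labels = rng.integers(0, 2, size=100)

    X = np.vstack([eda_window_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print("Cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())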

Dissertation Committee:

  • Prof. Silvia Santini, Università della Svizzera italiana, Switzerland (Research Advisor)
  • Prof. Gabriele Bavota, Università della Svizzera italiana, Switzerland (Internal Member)
  • Prof. Fabio Crestani, Università della Svizzera italiana, Switzerland (Internal Member)
  • Prof. Jakob Bardram, Technical University of Denmark & University of Copenhagen, Denmark (External Member)
  • Prof. Fahim Kawsar, Bell Labs Cambridge, UK & TU Delft, Netherlands (External Member)
  • Prof. Akane Sano, Rice University, USA (External Member)