George Kyei Boateng - Academia.edu

Uploads

Papers by George Kyei Boateng

Emotion Recognition among Couples: A Survey

arXiv (Cornell University), Feb 16, 2022

Couples' relationships affect the physical health and emotional well-being of partners. Automatically recognizing each partner's emotions could give a better understanding of their individual emotional well-being, enable interventions, and provide clinical benefits. In this paper, we summarize and synthesize works that have focused on developing and evaluating systems to automatically recognize the emotions of each partner based on couples' interaction or conversation contexts. We identified 28 articles from IEEE, ACM, Web of Science, and Google Scholar that were published between 2010 and 2021. We detail the datasets, features, algorithms, evaluation, and results of each work, as well as present the main themes. We also discuss current challenges and research gaps, and propose future research directions. In summary, most works have used audio data collected in the lab with annotations done by external experts, and used supervised machine learning approaches for binary classification of positive and negative affect. Performance results leave room for improvement, with significant research gaps such as no recognition using data from daily life. This survey will enable new researchers to get an overview of this field and eventually enable the development of emotion recognition systems to inform interventions that improve the emotional well-being of couples.

“You made me feel this way”: Investigating Partners’ Influence in Predicting Emotions in Couples’ Conflict Interactions using Speech Data

Companion Publication of the 2021 International Conference on Multimodal Interaction

How romantic partners interact with each other during a conflict influences how they feel at the end of the interaction and is predictive of whether the partners stay together in the long term. Hence, understanding the emotions of each partner is important. Yet current approaches, such as self-reports, are burdensome and hence limit how frequently this data can be collected. Automatic emotion prediction could address this challenge. Insights from psychology research indicate that partners' behaviors influence each other's emotions in conflict interactions; hence, the behavior of both partners could be considered to better predict each partner's emotion. However, it has yet to be investigated how doing so compares to using only each partner's own behavior in terms of emotion prediction performance. In this work, we used BERT to extract linguistic features (i.e., what partners said) and openSMILE to extract paralinguistic features (i.e., how they said it) from a data set of 368 German-speaking Swiss couples (N = 736 individuals) who were videotaped during an 8-minute conflict interaction in the laboratory. Based on those features, we trained machine learning models to predict whether partners feel positive or negative after the conflict interaction. Our results show that including the behavior of the other partner improves the prediction performance. Furthermore, for men, considering how their female partners spoke is most important, and for women, considering what their male partners said is most important for better prediction performance. This work is a step towards automatically recognizing each partner's emotion based on the behavior of both, which would enable a better understanding of couples in research, therapy, and the real world.
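The comparison described above, predicting a partner's post-conflict emotion from their own features versus from both partners' features, can be sketched in a few lines. This is a minimal illustration using random synthetic vectors as stand-ins for the BERT/openSMILE features and a logistic regression classifier; it is not the authors' actual pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_feat = 200, 16  # synthetic stand-ins for linguistic/paralinguistic features

own = rng.normal(size=(n_samples, n_feat))      # each partner's own behavior features
partner = rng.normal(size=(n_samples, n_feat))  # the other partner's behavior features
# synthetic binary label that depends on BOTH partners' behavior
y = ((own[:, 0] + partner[:, 0]) > 0).astype(int)

clf = LogisticRegression(max_iter=1000)
acc_own = cross_val_score(clf, own, y, cv=5).mean()
acc_both = cross_val_score(clf, np.hstack([own, partner]), y, cv=5).mean()
print(f"own-only accuracy:    {acc_own:.2f}")
print(f"own+partner accuracy: {acc_both:.2f}")
```

Because the synthetic label is constructed from both partners' features, concatenating the partner's features improves cross-validated accuracy, mirroring the direction of the paper's finding.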

BERT meets LIWC: Exploring State-of-the-Art Language Models for Predicting Communication Behavior in Couples’ Conflict Interactions

Companion Publication of the 2021 International Conference on Multimodal Interaction, 2021

Many processes in psychology are complex, such as dyadic interactions between two interacting partners (e.g., patient-therapist, intimate relationship partners). Nevertheless, many basic questions about interactions are difficult to investigate because dyadic processes can occur within a person and between partners, are based on multimodal aspects of behavior, and unfold rapidly. Current analyses are mainly based on the behavioral coding method, whereby human coders annotate behavior based on a coding schema. But coding is labor-intensive, expensive, slow, focuses on few modalities, and produces sparse data, which has forced the field to use average behaviors across entire interactions, thereby undermining the ability to study processes on a fine-grained scale. Current approaches in psychology use LIWC for analyzing couples' interactions. However, advances in natural language processing such as BERT could enable the development of systems to potentially automate behavioral coding, which in turn could substantially improve psychological research. In this work, we train machine learning models to automatically predict positive and negative communication behavioral codes of 368 German-speaking Swiss couples during an 8-minute conflict interaction on a fine-grained scale (10-second sequences) using linguistic features and paralinguistic features derived with openSMILE. Our results show that both simpler TF-IDF features as well as more complex BERT features performed better
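The TF-IDF baseline mentioned above can be illustrated as a text classifier over short transcript sequences labeled with positive vs. negative communication codes. The utterances and labels below are invented toy data, not material from the study:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# toy utterances standing in for 10-second transcript sequences (hypothetical examples)
texts = [
    "you never listen to me",
    "i really appreciate what you did",
    "stop interrupting me all the time",
    "thank you for your support",
    "this is all your fault",
    "we can work this out together",
]
codes = [0, 1, 0, 1, 0, 1]  # 0 = negative, 1 = positive communication code

# TF-IDF features feeding a linear classifier
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, codes)
print(model.predict(["i am grateful for your support"]))
```

In the actual work, such TF-IDF vectors would be compared against BERT sentence embeddings as input to the same kind of classifier.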

Emotion Elicitation and Capture among Real Couples in the Lab

Couples’ relationships affect partners’ mental and physical well-being. Automatic recognition of couples’ emotions will not only help to better understand the interplay of emotions, intimate relationships, and health and well-being, but also provide crucial clinical insights into protective and risk factors of relationships, and can ultimately guide interventions. However, several works developing emotion recognition algorithms use data from actors in artificial dyadic interactions, and those algorithms are unlikely to perform well on real couples. We are developing emotion recognition methods using data from real couples and, in this paper, we describe two studies we ran in which we collected emotion data from real couples: Dutch-speaking couples in Belgium and German-speaking couples in Switzerland. We discuss our approach to eliciting and capturing emotions and make five recommendations based on their relevance for developing well-performing emotion recognition systems for couples.

Emotion Capture among Real Couples in Everyday Life

Tobias Kowatsch, ETH Zürich, University of St. Gallen. Abstract: Illness management among married adults is mainly shared with their spouses and involves social support. Social support among couples has been shown to affect emotional well-being positively or negatively and to result in healthier habits among diabetes patients. Hence, through automatic emotion recognition, we could assess the emotional well-being of couples, which could inform the development and triggering of interventions to help couples better manage chronic diseases. We are developing an emotion recognition system to recognize the emotions of

Speech Emotion Recognition among Elderly Individuals using Multimodal Fusion and Transfer Learning

Companion Publication of the 2020 International Conference on Multimodal Interaction, 2020

Recognizing the emotions of the elderly is important as it could give insight into their mental health. Emotion recognition systems that work well on the elderly could be used to assess their emotions in places such as nursing homes and could inform the development of various activities and interventions to improve their mental health. However, several emotion recognition systems are developed using data from younger adults. In this work, we train machine learning models to recognize the emotions of elderly individuals by performing a 3-class classification of valence and arousal as part of the INTERSPEECH 2020 Computational Paralinguistics Challenge (ComParE). We used speech data from 87 participants who gave spontaneous personal narratives. We leveraged a transfer learning approach in which we used pretrained CNN and BERT models to extract acoustic and linguistic features, respectively, and fed them into separate machine learning models. We also fused these two modalities in a multimodal approach. Our best model used a linguistic approach and outperformed the official competition baseline in unweighted average recall (UAR) by 8.8% for valence and by 3.2% for the mean of valence and arousal. We also showed that feature engineering is not necessary, as transfer learning without fine-tuning performs as well or better and could be leveraged for the task of recognizing the emotions of elderly individuals. This work is a step towards better recognition of the emotions of the elderly, which could eventually inform the development of interventions to manage their mental health.
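The multimodal fusion step described above can be sketched as late fusion: train one classifier per modality and average their class probabilities. The feature matrices here are random synthetic stand-ins for the pretrained-CNN and BERT embeddings, and the fusion rule is one common choice, not necessarily the exact one used in the paper:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 120
acoustic = rng.normal(size=(n, 8))    # stand-in for pretrained-CNN acoustic embeddings
linguistic = rng.normal(size=(n, 8))  # stand-in for BERT linguistic embeddings
# synthetic label depending on both modalities
y = (acoustic[:, 0] + linguistic[:, 0] > 0).astype(int)

tr, te = slice(0, 90), slice(90, None)
m_a = LogisticRegression().fit(acoustic[tr], y[tr])
m_l = LogisticRegression().fit(linguistic[tr], y[tr])

# late fusion: average the two unimodal models' class probabilities
fused = (m_a.predict_proba(acoustic[te]) + m_l.predict_proba(linguistic[te])) / 2
pred = fused.argmax(axis=1)
print("fused accuracy:", (pred == y[te]).mean())
```

Averaging probabilities lets each modality contribute even when one is uninformative for a given sample, which is one reason late fusion is a common baseline for combining acoustic and linguistic streams.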

Speech Emotion Recognition among Couples using the Peak-End Rule and Transfer Learning

Companion Publication of the 2020 International Conference on Multimodal Interaction, 2020

Extensive couples' literature shows that how couples feel after a conflict is predicted by certain emotional aspects of that conversation. Understanding the emotions of couples leads to a better understanding of partners' mental well-being and, consequently, their relationships. Hence, automatic emotion recognition among couples could potentially guide interventions to help couples improve their emotional well-being and their relationships. It has been shown that people's global emotional judgment after an experience is strongly influenced by the emotional extremes and the ending of that experience, known as the peak-end rule. In this work, we leveraged this theory and used machine learning to investigate which audio segments can be used to best predict the end-of-conversation emotions of couples. We used speech data collected from 101 Dutch-speaking couples in Belgium who engaged in 10-minute-long conversations in the lab. We extracted acoustic features from (1) the audio segments with the most extreme positive and negative ratings, and (2) the ending of the audio. We used transfer learning, in which we extracted these acoustic features with a pre-trained convolutional neural network (YAMNet). We then used these features to train machine learning models (support vector machines) to predict the end-of-conversation valence ratings (positive vs. negative) of each partner. The results of this work could inform how to best recognize the emotions of couples after conversation sessions and, eventually, lead to a better understanding of couples' relationships, whether in therapy or in everyday life.
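The segment selection implied by the peak-end rule above is simple to state in code: given a sequence of per-segment valence ratings, pick the most extreme positive segment, the most extreme negative segment, and the final segment. The helper below is a hypothetical sketch of that selection step only (the paper then extracts YAMNet features from the chosen audio segments):

```python
import numpy as np

def peak_end_segments(ratings):
    """Indices of the most extreme positive, most extreme negative,
    and final segments of a per-segment valence rating sequence."""
    ratings = np.asarray(ratings)
    peak_pos = int(ratings.argmax())   # most positive "peak"
    peak_neg = int(ratings.argmin())   # most negative "peak"
    end = len(ratings) - 1             # the "end" of the experience
    return peak_pos, peak_neg, end

ratings = [0.1, 0.8, -0.5, -0.9, 0.3, 0.6]  # hypothetical per-segment valence
print(peak_end_segments(ratings))  # → (1, 3, 5)
```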

Social Support and Common Dyadic Coping in Couple’s Dyadic Management of Type II Diabetes: Study Protocol for an Ambulatory Assessment Application (Preprint)

Diabetes mellitus Type II (T2DM) is a common chronic disease. To manage blood glucose levels, patients need to follow medical recommendations for healthy eating, physical activity, and medication adherence in their everyday life. Illness management is mainly shared with partners and involves social support and common dyadic coping (CDC). Social support and CDC have been identified as having implications for people’s health behavior and well-being. Visible support, however, may also be negatively related to people’s well-being. Thus, the concept of invisible support was introduced. It is unknown which of these concepts (visible support, invisible support, CDC) displays the most beneficial associations with health behavior and well-being when considered together in the context of illness management in couples’ everyday life. Therefore, a novel ambulatory assessment application for the open-source behavioral intervention platform MobileCoach (AAMC) was developed. It utilize...

VADLite

Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers

Smartwatches provide a unique opportunity to collect more speech data because they are always with the user and also have a more exposed microphone compared to smartphones. Speech data could be used to infer various indicators of mental well-being such as emotions, stress, and social activity. Hence, real-time voice activity detection (VAD) on smartwatches could enable the development of applications for mental health monitoring. In this work, we present VADLite, an open-source, lightweight system that performs real-time VAD on smartwatches. It extracts mel-frequency cepstral coefficients (MFCCs) and classifies speech versus non-speech audio samples using a linear Support Vector Machine (SVM). The real-time implementation runs on the Wear OS Polar M600 smartwatch. Offline and online evaluations of VADLite using real-world data showed better performance than WebRTC's open-source VAD system. VADLite can be easily integrated into Wear OS projects that need a lightweight VAD module running on a smartwatch.
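The core classification step of a system like the one above (MFCC features into a linear SVM) can be sketched as follows. The 13-dimensional vectors here are random synthetic stand-ins for per-frame MFCCs rather than features computed from real audio, and the actual VADLite implementation runs on-device in a Wear OS app, not in Python:

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(2)
n = 200
# synthetic 13-dim vectors standing in for per-frame MFCC feature vectors
speech = rng.normal(loc=1.0, size=(n, 13))
nonspeech = rng.normal(loc=-1.0, size=(n, 13))
X = np.vstack([speech, nonspeech])
y = np.array([1] * n + [0] * n)  # 1 = speech, 0 = non-speech

# linear SVM: cheap to evaluate per frame, suitable for a wearable
clf = LinearSVC(random_state=0).fit(X, y)

frame = rng.normal(loc=1.0, size=(1, 13))  # a new "speech-like" frame
print("speech" if clf.predict(frame)[0] == 1 else "non-speech")
```

A linear model keeps per-frame inference to a single dot product, which is the kind of cost profile a battery-constrained smartwatch requires.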
