George Caridakis | University of the Aegean

Papers by George Caridakis

Context in Affective Multiparty and Multimodal Interaction (Why, Which, How and Where?)

Proceedings of the 2014 Workshop on Understanding and Modeling Multiparty, Multimodal Interactions - UM3I '14, Nov 16, 2014


Context in Affective Multiparty and Multimodal Interaction: Why, Which, How and Where?


1st CAA GR Conference


Social things - The SandS instantiation

2013 IEEE 14th International Symposium on "A World of Wireless, Mobile and Multimedia Networks" (WoWMoM), 2013

At a time when socialism as an economic option is variously questioned, very few people are against social instances of our life such as entertainment, customer assistance, and so on. The same holds for the management of the many things accompanying our life. We can find both the reason and the evidence for the viability of this trend in one very basic fact: things are social because they work better. However, in this sphere too, social politics are highly questionable. Here we introduce the perspective adopted in the European project SandS within an Internet of Things framework. In this case things are agents interacting on the network within a service-centric approach where a sound hierarchy dispatches instructions. It is a complete ecosystem in which the social network develops a collective intelligence underpinning new concrete functionalities that are centered on the user's wishes and fostered by his/her feedback. The central role of the user is reflected in all aspects of the ecosystem, from the family of socially governed things, namely the household appliances (the white goods) that affect our everyday life, up to the hardware and software employed, which are strictly open source.


Natural interaction expressivity modeling and analysis

Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA '13, 2013


A Particle Swarm Optimization (PSO) Model for Scheduling Nonlinear Multimedia Services in Multicommodity Fat-Tree Cloud Networks

Communications in Computer and Information Science, 2013


Induction, recording and recognition of natural emotions from facial expressions and speech prosody

Journal on Multimodal User Interfaces, 2013


Context in Affective Multiparty and Multimodal Interaction

Proceedings of the 2014 workshop on Understanding and Modeling Multiparty, Multimodal Interactions - UM3I '14, 2014


Context-Aware User Modeling and Semantic Interoperability in Smart Home Environments

2013 8th International Workshop on Semantic and Social Media Adaptation and Personalization, 2013


Investigating Context Awareness of Affective Computing Systems: A Critical Approach

Procedia Computer Science, 2014

Intelligent Human-Computer Interaction systems should be affect-aware, and Affective Computing systems should be context-aware. Positioned at the intersection of the research areas of Interaction Context and Affective Computing, this paper investigates whether and how context is incorporated in the automatic analysis of human affective behavior. Several related aspects are discussed, ranging from modeling, acquisition and annotation issues in affectively enhanced corpora to issues related to incorporating context information in a multimodal fusion framework for affective analysis. These aspects are critically discussed in terms of the challenges they pose while, in a wider framework, future directions of this recently active, yet largely unexplored, research area are identified. Overall, the paper aims both to document the present status and to comment on the evolution of the emerging topic of Context in Affective Computing.


Coordinating the Generation of Signs in Multiple Modalities in an Affective Agent

In order to be believable, embodied conversational agents (ECAs) must express emotions in a consistent and natural-looking way across modalities. The ECA has to be able to display coordinated signs of emotion during realistic emotional behaviour. Such a capability requires studying and representing emotions and the coordination of modalities during non-basic, realistic human behaviour; defining languages for representing the behaviours to be displayed by the ECA; and having access to mono-modal representations such as gesture repositories. This chapter is concerned with coordinating the generation of signs in multiple modalities in such an affective agent. Designers of an affective agent need to know how the agent should coordinate its facial expression, speech, gestures and other modalities in order to show emotion. This synchronisation of modalities is a main feature of emotions.
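
As a minimal sketch of the coordination problem described here (the data model and timings below are assumptions for illustration, not the chapter's representation language), each emotional display can be treated as a set of mono-modal signs scheduled against a shared timeline so that their peaks co-occur:

```python
from dataclasses import dataclass

@dataclass
class Sign:
    modality: str    # "face", "gesture", "speech", ...
    name: str
    duration: float  # seconds from onset to the sign's peak (stroke)

def schedule(signs: list[Sign]) -> dict[str, float]:
    """Align all signs so their peaks coincide: the slowest modality's
    peak sets the common stroke time, and faster signs start later."""
    stroke = max(s.duration for s in signs)
    return {f"{s.modality}:{s.name}": stroke - s.duration for s in signs}

# Toy usage: a smile peaks faster than a beat gesture, so it starts later.
display = [Sign("face", "smile", 0.4),
           Sign("gesture", "beat", 0.9),
           Sign("speech", "emphasis", 0.6)]
print(schedule(display))  # onset times (s) per sign, peaks aligned at 0.9 s
```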


Recognition of emotional states in natural human-computer interaction

Affective and human-centered computing have attracted a lot of attention during the past years, mainly due to the abundance of environments and applications able to exploit and adapt to multimodal input from the users. Combining facial expressions with prosody information allows us to capture the users' emotional state in an unobtrusive manner, relying on the best-performing modality in cases where one modality suffers from noise or bad sensing conditions.
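
To make the modality-weighting idea concrete, here is a hedged sketch (interfaces and the label set are assumptions for illustration, not the paper's implementation) of confidence-weighted late fusion, where a degraded modality contributes less to the combined decision:

```python
import numpy as np

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # illustrative label set

def fuse(face_scores: np.ndarray, prosody_scores: np.ndarray,
         face_quality: float, prosody_quality: float) -> str:
    """Confidence-weighted late fusion of two per-class score vectors.

    face_quality / prosody_quality are assumed reliability estimates in
    [0, 1] (e.g. face-detection confidence, audio SNR), so a noisy
    modality is down-weighted rather than discarded outright.
    """
    w_face = face_quality / (face_quality + prosody_quality + 1e-9)
    combined = w_face * face_scores + (1.0 - w_face) * prosody_scores
    return EMOTIONS[int(np.argmax(combined))]

# Toy usage: the face channel is unreliable here, so prosody dominates.
face = np.array([0.4, 0.3, 0.2, 0.1])
prosody = np.array([0.1, 0.1, 0.1, 0.7])
print(fuse(face, prosody, face_quality=0.2, prosody_quality=0.9))  # -> "angry"
```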


Natural, Affect Aware interfaces: Gesture and Body Expressivity Aspects

Recently, a growing number of researchers have focused their work on a fledgling field related to the identification, recording, interpretation, processing and simulation of emotion and affective states. Affective computing, along with its wide range of related application areas, is expanding rapidly due to the emerging demand for natural, intelligent, adaptive and personalized multimodal interaction. This paper focuses on gesture and body expressivity analysis, investigating and reporting on the entire spectrum of related aspects, such as issues involved in affectively enhanced corpora (stimuli and input streams), respective expressivity formalization approaches, as well as machine learning and pattern recognition aspects (feature extraction, multimodal fusion and classification). Finally, the synthesis counterpart of affectively enhanced behavior is examined and future directions of the research area are identified.
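
To illustrate what expressivity formalization can look like in practice, here is a hedged sketch (feature names follow common usage in this literature; the exact formulas are illustrative, not the paper's definitions) computing three gesture expressivity cues from a sampled 2D hand trajectory:

```python
import numpy as np

def expressivity_features(traj: np.ndarray, fps: float = 25.0) -> dict:
    """Illustrative gesture expressivity cues from an (N, 2) hand trajectory.

    - overall_activation: total amount of movement (summed speed)
    - spatial_extent: area of the bounding box swept by the hand
    - fluidity: inverse of mean absolute acceleration (smoother = higher)
    """
    v = np.diff(traj, axis=0) * fps   # per-frame velocity, px/s
    speed = np.linalg.norm(v, axis=1)
    a = np.diff(v, axis=0) * fps      # per-frame acceleration, px/s^2
    accel = np.linalg.norm(a, axis=1)
    return {
        "overall_activation": float(speed.sum()),
        "spatial_extent": float(np.prod(traj.max(axis=0) - traj.min(axis=0))),
        "fluidity": float(1.0 / (accel.mean() + 1e-9)),
    }

# Toy usage on a short synthetic trajectory.
t = np.linspace(0, 1, 50)
trajectory = np.stack([100 * t, 50 * np.sin(2 * np.pi * t)], axis=1)
print(expressivity_features(trajectory))
```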



Detecting human behavior emotional cues in Natural Interaction

This work focuses on the detection of human behavior emotional cues and their incorporation into affect-aware Natural Interaction. Techniques for extracting emotional cues from visual, non-verbal human behavior are presented: qualitative gesture expressivity features are derived from hand movement, while head pose and eye gaze are estimated from facial movement.


Manual annotation and automatic image processing of multimodal emotional behaviors in TV interviews

Designing affective Human-Computer Interfaces such as Embodied Conversational Agents requires modeling the relations between spontaneous emotions and behaviors in several modalities. There has been a great deal of psychological research on emotion and nonverbal communication, yet these studies were based mostly on acted basic emotions. This paper explores how manual annotation and image processing might cooperate towards the representation of spontaneous emotional behavior in low-resolution videos from TV.


Natural Interaction Multimodal Analysis: Expressivity Analysis towards Adaptive and Personalized Interfaces

Intelligent personalized systems often ignore the affective aspect of human behavior and focus more on tactile cues of the user activity. Complete user modeling, though, should also incorporate cues such as facial expressions, speech prosody and gesture or body posture expressivity features in order to dynamically profile the user, fusing all available modalities, since these qualitative affective cues carry significant information about the user's non-verbal behavior and communication.


Affective E-Learning System: Analysis of Learners’ State

Learning systems typically adapt to the learners' preferences without taking into consideration the learners' current status. The more a learning system exchanges relevant fragments of information about the learner's affective status, the better it adapts to the learner. Following this direction, we propose an integrated learning system that takes the learners' emotional state into consideration in order to provide a personalized e-learning experience.
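
As a toy illustration of the adaptation loop implied above (the state labels and policy below are invented for illustration, not the system's actual rules), a detected affective state can steer difficulty and pacing:

```python
from dataclasses import dataclass

@dataclass
class LessonPlan:
    difficulty: int       # 1 (easy) .. 5 (hard)
    break_suggested: bool

def adapt(plan: LessonPlan, affective_state: str) -> LessonPlan:
    """Hypothetical policy mapping the learner's detected state to an
    adjustment of the current lesson plan."""
    if affective_state == "frustrated":
        return LessonPlan(max(1, plan.difficulty - 1), break_suggested=True)
    if affective_state == "bored":
        return LessonPlan(min(5, plan.difficulty + 1), break_suggested=False)
    return plan  # engaged / neutral: keep the current plan

# Toy usage: frustration eases the material and suggests a pause.
print(adapt(LessonPlan(difficulty=3, break_suggested=False), "frustrated"))
```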


Body gesture and facial expression analysis for automatic affect recognition

Affect recognition plays an important role in everyday life. This explains why researchers in the human–computer and human–robot interaction community have increasingly been addressing the issue of endowing machines with affect sensitivity. Affect sensitivity refers to the ability to analyse verbal and non-verbal behavioural cues displayed by the user in order to infer the underlying communicated affect.


A cross-cultural, multimodal, affective corpus for gesture expressivity analysis

A multimodal, cross-cultural corpus of affective behavior is presented in this research work. The corpus construction process, including issues related to the design and implementation of an experiment, is discussed along with resulting acoustic prosody, facial expressions and gesture expressivity features.
