Integrating models of personality and emotions into lifelike characters
Related papers
Life-Like Characters. Tools, Affective Functions, and Applications
2003
Galatea is a software toolkit for developing human-like spoken dialog agents. To easily integrate modules with different characteristics, including a speech recognizer, a speech synthesizer, a facial-image synthesizer, and a dialog controller, each module is modeled as a virtual machine with a simple common interface and connected to the others through a broker (communication manager). Galatea employs model-based speech and facial-image synthesizers whose model parameters can easily be adapted to those of an existing person if his/her training data is given. The software toolkit, which runs on both UNIX/Linux and Windows operating systems, will be publicly available in the middle of 2003 [1, 2].
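The broker-and-virtual-machine arrangement described above can be illustrated with a minimal sketch. The class names, message format, and routing logic here are illustrative assumptions, not Galatea's actual API: each component exposes one common send/receive interface, and a broker routes text commands between registered modules.

```python
class Module:
    """Common interface shared by all virtual-machine modules (assumed shape)."""
    def __init__(self, name):
        self.name = name
        self.broker = None

    def receive(self, sender, command):
        raise NotImplementedError

    def send(self, target, command):
        self.broker.route(self.name, target, command)


class Broker:
    """Communication manager connecting the modules."""
    def __init__(self):
        self.modules = {}

    def register(self, module):
        module.broker = self
        self.modules[module.name] = module

    def route(self, sender, target, command):
        self.modules[target].receive(sender, command)


class Recognizer(Module):
    def hear(self, utterance):
        # forward a recognition result to the dialog controller
        self.send("dialog", f"RECOGNIZED {utterance}")


class DialogController(Module):
    def receive(self, sender, command):
        if command.startswith("RECOGNIZED"):
            self.send("synthesizer", "SPEAK hello")


class Synthesizer(Module):
    def __init__(self, name):
        super().__init__(name)
        self.spoken = []

    def receive(self, sender, command):
        if command.startswith("SPEAK"):
            self.spoken.append(command.split(" ", 1)[1])


broker = Broker()
asr = Recognizer("asr")
dialog = DialogController("dialog")
tts = Synthesizer("synthesizer")
for m in (asr, dialog, tts):
    broker.register(m)

asr.hear("hi there")
print(tts.spoken)  # → ['hello']
```

The point of the common interface is that new modules (e.g., a facial-image synthesizer) can be registered with the broker without any existing module needing to know about them.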
Imparting individuality to virtual humans
2002
In this paper, we present an integrated method of linking personality and emotion with the response generation and expression synthesis of Virtual Humans. The characters are powered by a dialogue system that consists of a large set of basic interactions between user and computer. These interactions are encoded in finite state machines. Transitions are linked with conditions and actions that can be connected with external modules. One of these modules is a personality module. In this way, responses of the virtual human depend not only on input given by a user, but also on its personality and emotional state. The dialogue system is connected to a 3D face that performs the speech and facial animation, together with facial expressions that reflect the personality specification.
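The core mechanism described above, finite state machines whose transitions carry conditions and actions that can consult a personality module, can be sketched as follows. The state names, the extraversion trait, and the canned responses are hypothetical, not taken from the paper; the sketch only shows how the same user input can yield different responses depending on personality.

```python
class Personality:
    """Hypothetical personality module exposing one trait in [0, 1]."""
    def __init__(self, extraversion):
        self.extraversion = extraversion


class DialogueFSM:
    def __init__(self, personality):
        self.state = "idle"
        self.personality = personality
        # transitions: (state, input) -> list of (condition, action, next_state);
        # conditions are the hook through which external modules are consulted
        self.transitions = {
            ("idle", "greet"): [
                (lambda p: p.extraversion > 0.5,
                 lambda: "Hi! Great to see you, tell me everything!",
                 "chatting"),
                (lambda p: True,  # fallback transition
                 lambda: "Hello.",
                 "chatting"),
            ],
        }

    def step(self, user_input):
        # take the first transition whose condition holds for this personality
        for cond, action, nxt in self.transitions.get((self.state, user_input), []):
            if cond(self.personality):
                self.state = nxt
                return action()
        return None


extravert = DialogueFSM(Personality(extraversion=0.9))
introvert = DialogueFSM(Personality(extraversion=0.2))
print(extravert.step("greet"))  # → Hi! Great to see you, tell me everything!
print(introvert.step("greet"))  # → Hello.
```

Encoding conditions as callables keeps the FSM itself personality-agnostic: swapping in a different personality or emotion module changes behavior without touching the transition table's structure.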
Personality models to and from virtual characters
2017
In order to be believable, virtual agents must possess both a behavioral model simulating emotions and personality, and convincing aesthetics [4]. A great deal of research already exists on models of emotions, and some seminal work now investigates the role of personality [1, 2]. While emotions are dynamic and variable in time, personality is a static feature of humans, changing only very slowly over the course of a life. The emotional state drives the style of a character's behavior (how it accomplishes actions), while the personality drives the intention of an autonomous agent (what to do next). However, there is not much work investigating the relationships between the personality of a virtual agent, its behavior, and its physical appearance. The work we are conducting in the SLSI group is based on the observation that people very quickly form ideas about the personality of others in zero-acquaintance encounters [11]. The judgment of personality can be modeled, ...
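The distinction drawn above, static personality driving intention versus dynamic emotion driving style, can be made concrete with a toy sketch. The trait names, the decay rule, and the thresholds are assumptions for illustration, not the SLSI group's model.

```python
class Agent:
    def __init__(self, sociability):
        self.sociability = sociability   # static personality trait, fixed for life
        self.arousal = 0.0               # dynamic emotional state, varies in time

    def feel(self, event_intensity, decay=0.5):
        # emotion is variable in time: events push it up, decay pulls it back
        self.arousal = decay * self.arousal + (1 - decay) * event_intensity

    def intend(self):
        # personality drives the intention: WHAT to do next
        return "approach_user" if self.sociability > 0.5 else "keep_distance"

    def act(self):
        # emotion drives the style: HOW the chosen action is performed
        style = "energetic" if self.arousal > 0.5 else "calm"
        return f"{self.intend()} ({style})"


agent = Agent(sociability=0.8)
print(agent.act())                  # → approach_user (calm)
agent.feel(event_intensity=1.0)
agent.feel(event_intensity=1.0)
print(agent.act())                  # → approach_user (energetic)
```

Note that repeated stimulation changes only the style of the action, never the intention; only constructing an agent with a different `sociability` changes what it chooses to do.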
NPCs and Chatterbots with Personality and Emotional Response
2006 IEEE Symposium on Computational Intelligence and Games, 2006
Chatterbots are computer programs that simulate intelligent conversation. They are situated between games and toys: their aim is mostly to be entertaining, but the user does not have to follow precise rules when playing with the program. Currently, business and educational applications have started to emerge as a further development of the idea of intelligent dialog. For the game industry, chatterbots come close to the concept of the NPC, or Non-Player Character, and they may become part of making such virtual beings more believable and lifelike in the future. In this paper we present an application introducing an emotional component designed to enhance the realism of the conversation.
Virtual Characters as Emotional Interaction Element in the User Interfaces
Lecture Notes in Computer Science, 2006
Virtual assistants, also called avatars, are virtual characters that make communication between the user and the machine more natural and interactive. In this research we have given avatars the capacity to have and express emotions by means of a computational emotional model based on the cognitive perspective. We have also worked on how, once the system knows the emotional expressiveness that the virtual character will show to the user, this expressiveness can be conveyed through facial animation techniques.
2009
The development of intelligent virtual agents (IVAs) is a complex task featuring many sub-problems. Concerning education in this field, there is a good theoretical basis; however, when it comes to practical education, the platforms that can be used are scarce and mostly still not fully developed. Our goal is to create a platform that would allow for good practical education in the field of IVA development. The first step towards this platform is a prototype implementation, project Emohawk, which is described in this thesis. Project Emohawk features a partly emergent story and an affect-driven architecture for IVA control based on a psychologically plausible emotion model. Moreover, a methodology was created for analyzing this prototype implementation with regard to believability and emergent-story potential.
Simulating emotional personality in human computer interfaces
International Conference on Fuzzy Systems, 2010
Currently, there are quite a number of computational systems that attend to humans automatically, e.g., using natural language. However, the interaction with these machines is still too artificial: they are usually insensitive to the emotional content expressed by the communication partner, as well as incapable of expressing emotional content themselves. This paper presents the architecture of a computational system that simulates emotional states. The simulator is split into three modules that give the designer a great number of possibilities to model different aspects of a simulated emotional personality.
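One plausible way to split such a simulator into three modules is an appraisal → state → expression pipeline. The module boundaries, keyword-based appraisal, and thresholds below are assumptions based only on the abstract, not the paper's actual architecture.

```python
class Appraisal:
    """Module 1: map communicative input to an emotional impulse (toy rules)."""
    def evaluate(self, text):
        if "thanks" in text:
            return +0.6
        if "useless" in text:
            return -0.7
        return 0.0


class EmotionalState:
    """Module 2: integrate impulses over time, decaying toward neutral."""
    def __init__(self, decay=0.8):
        self.valence = 0.0
        self.decay = decay

    def update(self, impulse):
        self.valence = self.decay * self.valence + impulse
        self.valence = max(-1.0, min(1.0, self.valence))  # clamp to [-1, 1]


class Expression:
    """Module 3: render the current state as an observable label."""
    def render(self, state):
        if state.valence > 0.3:
            return "pleased"
        if state.valence < -0.3:
            return "upset"
        return "neutral"


appraisal, state, expression = Appraisal(), EmotionalState(), Expression()
for utterance in ["hello", "this is useless"]:
    state.update(appraisal.evaluate(utterance))
print(expression.render(state))  # → upset
```

Keeping the three concerns separate is what gives the designer the flexibility the abstract mentions: each module can be swapped independently, e.g., replacing the keyword appraisal with a classifier, or the label renderer with facial animation.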