Affective Dialogue Systems [electronic resource] : Tutorial and Research Workshop, ADS 2004, Kloster Irsee, Germany, June 14-16, 2004. Proceedings / edited by Elisabeth André, Laila Dybkjær, Wolfgang Minker, Paul Heisterkamp.

Contributor(s): André, Elisabeth [editor] | Dybkjær, Laila [editor] | Minker, Wolfgang [editor] | Heisterkamp, Paul [editor] | SpringerLink (Online service)
Material type: Text
Series: Lecture Notes in Computer Science ; 3068
Publisher: Berlin, Heidelberg : Springer Berlin Heidelberg, 2004
Description: XII, 328 p. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783540248422
Subject(s): Computer science | Information systems | Multimedia systems | Artificial intelligence | Computer graphics | Social sciences -- Data processing | Computer Science | Artificial Intelligence (incl. Robotics) | User Interfaces and Human Computer Interaction | Multimedia Information Systems | Information Systems Applications (incl. Internet) | Computer Appl. in Social and Behavioral Sciences | Computer Graphics
Additional physical formats: Printed edition: No title
DDC classification: 006.3
LOC classification: Q334-342; TJ210.2-211.495
Online resources: Click here to access online
Contents:

Emotion Recognition
- From Emotion to Interaction: Lessons from Real Human-Machine-Dialogues
- Emotions in Short Vowel Segments: Effects of the Glottal Flow as Reflected by the Normalized Amplitude Quotient
- Towards Real Life Applications in Emotion Recognition
- Emotion Recognition Using Bio-sensors: First Steps towards an Automatic System
- Neural Architecture for Temporal Emotion Classification

Affective User Modeling
- Empathic Embodied Interfaces: Addressing Users’ Affective State
- Cognitive-Model-Based Interpretation of Emotions in a Multi-modal Dialog System
- Affective Advice Giving Dialogs

Emotional Databases, Annotation Schemes, and Tools
- A Categorical Annotation Scheme for Emotion in the Linguistic Content of Dialogue
- Data-Driven Tools for Designing Talking Heads Exploiting Emotional Attitudes
- Design of a Hungarian Emotional Database for Speech Analysis and Synthesis

Affective Conversational Agents and Dialogue Simulation
- Emotion and Dialogue in the MRE Virtual Humans
- Coloring Multi-character Conversations through the Expression of Emotions
- Domain-Oriented Conversation with H.C. Andersen
- Simulating the Emotion Dynamics of a Multimodal Conversational Agent
- Design and First Tests of a Chatter
- Endowing Spoken Language Dialogue Systems with Emotional Intelligence
- Do You Want to Talk About It?
- Application of D-Script Model to Emotional Dialogue Simulation

Synthesis of Emotional Speech and Facial Animations
- Modeling and Synthesizing Emotional Speech for Catalan Text-to-Speech Synthesis
- Dimensional Emotion Representation as a Basis for Speech Synthesis with Non-extreme Emotions
- Extra-Semantic Protocols; Input Requirements for the Synthesis of Dialogue Speech
- How (Not) to Add Laughter to Synthetic Speech
- Modifications of Speech Articulatory Characteristics in the Emotive Speech
- Expressive Animated Agents for Affective Dialogue Systems

Affective Tutoring Systems
- Affective Feedback in a Tutoring System for Procedural Tasks
- Generating Socially Appropriate Tutorial Dialog

Evaluation of Affective Dialogue Systems
- The Role of Affect and Sociality in the Agent-Based Collaborative Learning System
- Evaluation of Synthetic Faces: Human Recognition of Emotional Facial Displays
- How to Evaluate Models of User Affect?
- Preliminary Cross-Cultural Evaluation of Expressiveness in Synthetic Faces

Demonstrations
- Conversational H.C. Andersen First Prototype Description
- Experiences with an Emotional Sales Agent
- A Freely Configurable, Multi-modal Sensor System for Affective Computing
- Gesture Synthesis in a Real-World ECA
In: Springer eBooks
Summary: Human conversational partners are able, at least to a certain extent, to detect the speaker’s or listener’s emotional state and may attempt to respond to it accordingly. When instead one of the interlocutors is a computer, a number of questions arise, such as the following: To what extent are dialogue systems able to simulate such behaviors? Can we learn the mechanisms of emotional behaviors from observing and analyzing the behavior of human speakers? How can emotions be automatically recognized from a user’s mimics, gestures and speech? What possibilities does a dialogue system have to express emotions itself? And, very importantly, would emotional system behavior be desirable at all? Given the state of ongoing research into incorporating emotions in dialogue systems, we found it timely to organize a Tutorial and Research Workshop on Affective Dialogue Systems (ADS 2004) at Kloster Irsee in Germany during June 14–16, 2004. After two successful ISCA Tutorial and Research Workshops on Multimodal Dialogue Systems at the same location in 1999 and 2002, we felt that a workshop focusing on the role of affect in dialogue would be a valuable continuation of the workshop series. Due to its interdisciplinary nature, the workshop attracted submissions from researchers with very different backgrounds and from many different research areas, working on, for example, dialogue processing, speech recognition, speech synthesis, embodied conversational agents, computer graphics, animation, user modelling, tutoring systems, cognitive systems, and human-computer interaction.
Item type: E-BOOKS
Current library: IMSc Library
Home library: IMSc Library
URL: Link to resource
Status: Available
Barcode: EBK3179



The Institute of Mathematical Sciences, Chennai, India
