ELSNET-list archive

Category:   E-Job
Subject:   PhD Studentship on "Joint Attention in Human-Agent-Interaction" at University of Saarland, Germany
Email:   mariastaudte_(on)_googlemail.com
Date received:   04 Mar 2013

PhD Studentship "Joint Attention in Human-Agent Interaction" in Psycholinguistics/Human-Agent Interaction
Saarland University, Saarbruecken, Germany

The Embodied Spoken Interaction (ESI) group at Saarland University, headed by Dr. Maria Staudte, invites applications for a PhD studentship, available from June 1st, 2013 (75% TV-L E13).

People interact using speech as well as other non-verbal cues. Gaze, for instance, ubiquitously accompanies utterances in face-to-face interaction: it may provide additional referential information from the speaker, or it may reveal whether the listener has understood. Following a partner's gaze to share visual information (and thus establish joint attention) can therefore be essential for successful and efficient communication. Understanding and modeling the dynamic interplay of speech and non-verbal cues such as gaze, and how they combine to encode a particular message, is a complex enterprise. Virtual agents as interaction partners offer one way to approach this problem: the artificial partner introduces a precise, controllable, and yet dynamic component into the interaction with humans. Using such agents as test beds enables us to observe, model, and test complex multi-modal behaviors.

The proposed PhD project will develop and simulate interactive behaviors of a virtual character using state-of-the-art virtual agent software (developed here at the MMCI Cluster of Excellence) and modern eye- and motion-tracking systems. Besides this technical component, the research is also empirical and will involve the design of user studies and data analysis, addressing questions such as when joint attention is (or should be) established and how it affects spoken content.

Applicants should hold a Master's degree in computational linguistics, computer science, cognitive science, psychology, or psycholinguistics (or equivalent) and should have an interest in modeling and understanding the dynamics of gaze and speech in interaction.
Basic programming skills are necessary. Experience with experiment design and statistics is an advantage but not required. Most importantly, the successful applicant should be enthusiastic about the general research questions and be prepared to learn new methods.

The Embodied Spoken Interaction (ESI) group is part of the "Multi-modal Computing and Interaction" Cluster of Excellence (http://www.mmci.uni-saarland.de/) at Saarland University, which provides a fruitful and constructive research environment with excellent opportunities for exchange and cooperation. The group has access to numerous state-of-the-art eye-tracking laboratories, a 64-channel EEG/ERP lab, and modern computing infrastructure, and conducts research at an international level of excellence. The candidate will be expected to contribute to the high standards of the group and to be actively involved in the preparation and publication of new results. Further information about the group can be found at: http://www.mmci.uni-saarland.de/en/independent_research_groups/esi

Applicants should submit a research statement, a CV, copies of their school and university degrees, a representative reprint (thesis or paper, if applicable), and the names and contact information of two references. The position remains open until filled, but preference will be given to applications received by 1 April.

All documents should be e-mailed as a single PDF to:

Dr. Maria Staudte
e-mail: masta_(at)_coli.uni-saarland.de
Department of Computational Linguistics
Saarland University
66123 Saarbruecken, Germany

Page generated 05-03-2013 by Steven Krauwer