This seminar, Speech-Gesture Coordination Beyond Intonation, will take place on Wednesday 2 April 2025, 12:00–13:00, via Zoom link:
Speaker: Dr Olcay Türk (Research Associate, Bielefeld University, Germany)
Abstract
In this talk, Dr Olcay Türk will present an overview of his work on the relationships between different aspects of spoken language and bodily movements. The overarching theme of his research is understanding and modeling how multimodal behavior becomes coordinated with linguistic functions and forms in a way that facilitates meaningful and stable communication.
Researchers agree that speech and gesture production are related, particularly in how prosody and gesture coordinate around prominence. His work has shown that prominence-based coordination of prosody and gesture holds in Turkish, but that prosodic phrasing also plays an important role. Moreover, his work provides evidence for a formal and functional link between information-structural categories (i.e., topic/focus) and different types of gestural units (i.e., phrases).
More recently, his research has extended the study of the gesture-prosody relationship to rhythm, focusing on developing and adapting signal-based rhythm metrics that can be applied to both speech and gesture. Through cross-linguistic production and perception studies, this investigation aims to determine whether gestural rhythm reflects underlying speech rhythm and whether differences exist between languages, as predicted by traditional speech rhythm classifications.
Finally, he will discuss his work on the TRR 318 – A02 project at Bielefeld University. The project aims to develop a model of conversation for an intelligent system capable of detecting the signaled level of understanding of an explanation by analyzing the coordination and use of relevant verbal and non-verbal cues (e.g., nods, gaze, hand gestures, backchannels, hesitations) identified among a large set of communicative signals.