How Hands Help Us Hear
HearingHands aims to explore the impact of gesture-speech coupling on audiovisual communication across languages and populations, enhancing our understanding of multimodal interaction.
Project details
Introduction
Human communication in face-to-face conversation is inherently multimodal, combining spoken language with a wealth of visual cues such as hand gestures. Although most of our understanding of human language comes from unimodal research, the multimodal literature suggests that hand gestures are produced in close synchrony with speech prosody, aligning, for instance, with stressed syllables in free-stress languages such as English.
The Role of Prosody
Furthermore, prosody plays a vital role in spoken word recognition in many languages, influencing core cognitive processes involved in speech perception, such as:
- Lexical activation
- Segmentation
- Recognition
Consequently, viewing gestural timing as an audiovisual prosodic cue raises the possibility that the temporal alignment of hand gestures with speech directly influences what we hear (e.g., distinguishing OBject from obJECT).
Research Gap
However, research to date has largely overlooked the functional contribution of gestural timing to human communication.
Project Objectives
Therefore, HearingHands aims to uncover how gesture-speech coupling contributes to audiovisual communication in human interaction. Its objectives are:
- [WP1] To chart the PREVALENCE of gesture-speech coupling as a multimodal prominence cue in production and perception across a typologically diverse set of languages.
- [WP2] To capture the VARIABILITY in production and perception of gesture-speech coupling in both neurotypical and atypical populations.
- [WP3] To determine the CONSTRAINTS that govern gestural timing effects in more naturalistic communicative settings.
Methodology
These objectives will be achieved through:
- Cross-linguistic comparisons of gesture-speech production and perception
- Neuroimaging of multimodal integration in autistic and neurotypical individuals
- Psychoacoustic tests of gestural timing effects employing eye-tracking and virtual reality
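To make the notion of gesture-speech coupling more concrete, below is a minimal illustrative sketch of one common way such coupling can be quantified: the signed temporal offset between each gesture apex and the nearest stressed-syllable onset. The function name, toy timings, and choice of measure are assumptions for illustration only, not the project's actual analysis pipeline.

```python
import numpy as np

# Illustrative only: quantify gesture-speech coupling as the signed offset
# between each gesture apex and the nearest stressed-syllable onset.
# In practice apex and onset times would come from motion tracking and a
# forced aligner, respectively; the values below are made up.

def coupling_offsets(gesture_apexes, stressed_onsets):
    """Signed offset (s) from each gesture apex to its nearest stressed-syllable
    onset; negative means the gesture apex leads the syllable."""
    apexes = np.asarray(gesture_apexes, dtype=float)
    onsets = np.asarray(stressed_onsets, dtype=float)
    nearest = onsets[np.argmin(np.abs(onsets[None, :] - apexes[:, None]), axis=1)]
    return apexes - nearest

apexes = [0.42, 1.95, 3.10]        # gesture apex times in seconds (toy data)
onsets = [0.50, 2.00, 3.25, 4.00]  # stressed-syllable onsets in seconds (toy data)
print(coupling_offsets(apexes, onsets))  # approximately [-0.08 -0.05 -0.15]
```

In this convention, negative offsets indicate that the gesture apex precedes the stressed syllable it accompanies.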
Conclusion
Thus, HearingHands has the potential to revolutionize models of multimodal human communication, delineating how hands help us hear.
Financiële details & Tijdlijn
Financiële details
Subsidiebedrag | € 1.499.988 |
Totale projectbegroting | € 1.499.988 |
Tijdlijn
Startdatum | 1-9-2022 |
Einddatum | 31-8-2027 |
Subsidiejaar | 2022 |
Partners & Locaties
Projectpartners
- STICHTING RADBOUD UNIVERSITEITpenvoerder
Country(ies)
Similar projects within the European Research Council
Project | Summary | Scheme | Amount | Year |
---|---|---|---|---|
Human communication as joint epistemic engineering | MINDSHARING explores how humans effectively communicate using context-dependent signals through computational, cultural, and neurocognitive methods to uncover universal communicative principles. | ERC Advanced... | € 2.499.105 | 2023 |
Defining an integrated model of the neural processing of speech in light of its multiscale dynamics | This project aims to develop an integrated model of speech perception by analyzing neural oscillatory dynamics and their relationship with linguistic timescales using advanced neuroimaging techniques. | ERC Consolid... | € 1.861.100 | 2022 |
Fundamentals of formal properties of nonmanuals: A quantitative approach | This project aims to establish a quantitative typology of nonmanuals in five sign languages to explore universal linguistic features and their differences in expression across modalities. | ERC Starting... | € 1.500.000 | 2023 |
Towards an emerging field of social neuroscience in human groups | The GROUPS project investigates how multimodal synchrony among group members influences individual and collective outcomes, aiming to enhance understanding of group dynamics and societal functioning. | ERC Consolid... | € 2.000.000 | 2024 |
Empirical and mechanistic foundations for synergistic predictive processing in the sensory brain | SynPrePro aims to integrate hierarchical predictive coding with subcortical processing to enhance understanding of sensory input processing and its implications for perceptual disorders. | ERC Starting... | € 1.499.945 | 2024 |
Similar projects from other funding schemes
Project | Summary | Scheme | Amount | Year |
---|---|---|---|---|
Eye contact device for video calls | This project investigates the development of an application that optimizes eye contact and social cues during video consultations to improve communication in healthcare. | Mkb-innovati... | € 20.000 | 2021 |
SpeakScript | AmberScript and SpeakSee are developing a cost-effective service for real-time transcription of conversations, aimed at deaf and hard-of-hearing users, to improve their communication and employment opportunities. | Mkb-innovati... | € 348.250 | 2019 |
Precision Hearing Diagnostics and Augmented-hearing Technologies | The project aims to develop a portable diagnostic device for cochlear synaptopathy and augmented-hearing technologies, transitioning innovative research into practical clinical applications. | EIC Transition | € 2.499.416 | 2022 |
Luisterinspanning objectief bepalen met EEG | The project focuses on measuring listening effort via EEG and AI in order to optimize hearing aids for people with hearing problems. | Mkb-innovati... | € 20.000 | 2021 |
Right Hearing | The project is developing a hearing system that amplifies speech and minimizes background noise, improving social interaction for people with hearing loss. | 1.1 - RSO1.1... | € 435.705 | 2024 |