
Search in the Catalogues and Directories

Hits 1 – 9 of 9

1. Musical Sophistication and Speech Auditory-Motor Coupling: Easy Tests for Quick Answers
In: Front Neurosci (2022)
2. Motor representations underlie the reading of unfamiliar letter combinations
Taitz, Alan; Assaneo, M. Florencia; Shalom, Diego E. - Nature Publishing Group UK, 2020
3. MEG and Language
In: https://hal.archives-ouvertes.fr/hal-02265485 (2019)
4. Spontaneous synchronization to speech reveals neural mechanisms facilitating language learning
5. Spontaneous synchronization to speech reveals neural mechanisms facilitating language learning
Assaneo, M. Florencia; Ripollés, Pablo; Orpella, Joan. - Nature Publishing Group, 2019
6. The Lateralization of Speech-Brain Coupling Is Differentially Modulated by Intrinsic Auditory and Top-Down Mechanisms
Assaneo, M. Florencia; Rimmele, J. M.; Orpella, Joan. - Frontiers Media, 2019
7. The audiovisual structure of onomatopoeias: An intrusion of real-world physics in lexical creation
In: PLoS ONE, Public Library of Science, 2018, 13 (3), e0193466. ISSN: 1932-6203. DOI: 10.1371/journal.pone.0193466. https://hal.sorbonne-universite.fr/hal-01774896
Abstract: Sound-symbolic word classes are found in different cultures and languages worldwide. These words are continuously produced to code complex information about events. Here we explore the capacity of creative language to transport complex multisensory information in a controlled experiment in which participants improvised onomatopoeias from noisy moving objects presented in audio, visual and audiovisual formats. We found that consonants communicate movement types (slide, hit or ring) mainly through the manner of articulation in the vocal tract. Vowels communicate shapes in visual stimuli (spiky or rounded) and sound frequencies in auditory stimuli through the configuration of the lips and tongue. A machine learning model was trained to classify movement types and used to validate generalizations of our results across formats. We applied the classifier to a list of cross-linguistic onomatopoeias: simple actions were correctly classified, while different aspects were selected to build onomatopoeias of complex actions. These results show how the different aspects of complex sensory information are coded and how they interact in the creation of novel onomatopoeias.
Keywords: [PHYS] Physics; [SDV] Life Sciences [q-bio]
URL: https://hal.sorbonne-universite.fr/hal-01774896/document
https://hal.sorbonne-universite.fr/hal-01774896/file/journal.pone.0193466.pdf
https://doi.org/10.1371/journal.pone.0193466
https://hal.sorbonne-universite.fr/hal-01774896
8. The audiovisual structure of onomatopoeias: An intrusion of real-world physics in lexical creation
Taitz, Alan; Assaneo, M. Florencia; Elisei, Natalia. - Public Library of Science, 2018
9. Exploring the anatomical encoding of voice with a mathematical model of the vocal system
In: NeuroImage, Elsevier, 2016, 141, pp. 31-39. ISSN: 1053-8119. DOI: 10.1016/j.neuroimage.2016.07.033. https://hal.inria.fr/hal-01498364
