
Search in the Catalogues and Directories

Hits 1 – 10 of 10

1
Cross-Situational Learning Towards Robot Grounding
In: https://hal.archives-ouvertes.fr/hal-03628290 ; 2022 (2022)
2
Cross-Situational Learning Towards Robot Grounding
In: https://hal.archives-ouvertes.fr/hal-03628290 ; 2022 (2022)
3
Teach Your Robot Your Language! Trainable Neural Parser for Modelling Human Sentence Processing: Examples for 15 Languages
In: ISSN: 2379-8920 ; EISSN: 2379-8939 ; IEEE Transactions on Cognitive and Developmental Systems ; https://hal.inria.fr/hal-01964541 ; IEEE Transactions on Cognitive and Developmental Systems, Institute of Electrical and Electronics Engineers, Inc, 2019, ⟨10.1109/TCDS.2019.2957006⟩ ; https://doi.org/10.1109/tcds.2019.2957006 (2019)
4
Syntactic Reanalysis in Language Models for Speech Recognition
In: 2017 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob) ; https://hal.inria.fr/hal-01558462 ; 2017 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob), Sep 2017, Lisbon, Portugal ; http://icdl-epirob.org/ (2017)
5
Teach Your Robot Your Language! Trainable Neural Parser for Modelling Human Sentence Processing: Examples for 15 Languages
In: https://hal.inria.fr/hal-01665807 ; 2017 (2017)
6
Recurrent Neural Network for Syntax Learning with Flexible Predicates for Robotic Architectures
In: The Sixth Joint IEEE International Conference Developmental Learning and Epigenetic Robotics (ICDL-EPIROB) ; https://hal.inria.fr/hal-01417697 ; The Sixth Joint IEEE International Conference Developmental Learning and Epigenetic Robotics (ICDL-EPIROB), Sep 2016, Cergy, France ; http://icdl-epirob.org/ (2016)
7
Recurrent Neural Network Sentence Parser for Multiple Languages with Flexible Meaning Representations for Home Scenarios
In: IROS Workshop on Bio-inspired Social Robot Learning in Home Scenarios ; https://hal.inria.fr/hal-01417667 ; IROS Workshop on Bio-inspired Social Robot Learning in Home Scenarios, Oct 2016, Daejeon, South Korea ; https://www.informatik.uni-hamburg.de/wtm/SocialRobotsWorkshop2016/index.php (2016)
8
Semantic Role Labelling for Robot Instructions using Echo State Networks
In: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN) ; https://hal.inria.fr/hal-01417701 ; European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Apr 2016, Bruges, Belgium ; https://www.elen.ucl.ac.be/esann/index.php?pg=esann16_programme (2016)
9
Using Natural Language Feedback in a Neuro-inspired Integrated Multimodal Robotic Architecture
In: 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) ; https://hal.inria.fr/hal-01417706 ; 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Aug 2016, New York City, United States. pp.52 - 57, ⟨10.1109/ROMAN.2016.7745090⟩ ; http://www.tc.columbia.edu/conferences/roman2016/ (2016)
10
A Recurrent Neural Network for Multiple Language Acquisition: Starting with English and French
In: Proceedings of the NIPS Workshop on Cognitive Computation: Integrating Neural and Symbolic Approaches (CoCo 2015) ; https://hal.inria.fr/hal-02561258 ; Proceedings of the NIPS Workshop on Cognitive Computation: Integrating Neural and Symbolic Approaches (CoCo 2015), Dec 2015, Montreal, Canada ; http://ceur-ws.org/Vol-1583/ (2015)
Abstract: How humans acquire language, and in particular two or more different languages with the same neural computing substrate, is still an open issue. To address this issue we suggest building models that are able to process any language from the very beginning. Here we propose a developmental and neuro-inspired approach that processes sentences word by word with no prior knowledge of the semantics of the words. Our model has no "pre-wired" structure but only random and learned connections: it is based on Reservoir Computing. Our previous model was implemented on robotic platforms where users could teach basics of the English language to instruct a robot to perform actions. In this paper, we add the ability to process infrequent words, which keeps the vocabulary size very small while still processing natural language sentences. Moreover, we extend this approach to the French language and demonstrate that the network can learn both languages at the same time. Even with small corpora, the model is able to learn and generalize in monolingual and bilingual conditions. For small corpora in different languages, this approach promises to be a more practical alternative to supervised learning methods that rely on big data sets, or to hand-crafted parsers that require more manual encoding effort.
Keyword: [INFO.INFO-LG]Computer Science [cs]/Machine Learning [cs.LG]; [INFO.INFO-NE]Computer Science [cs]/Neural and Evolutionary Computing [cs.NE]; [INFO.INFO-RB]Computer Science [cs]/Robotics [cs.RO]; [SCCO.LING]Cognitive science/Linguistics; [SDV.NEU]Life Sciences [q-bio]/Neurons and Cognition [q-bio.NC]
URL: https://hal.inria.fr/hal-02561258
https://hal.inria.fr/hal-02561258/file/CoCoNIPS_2015_paper_14.pdf
https://hal.inria.fr/hal-02561258/document
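The abstract above describes a reservoir-computing (echo state network) parser that reads a sentence word by word through fixed random recurrent connections and trains only a linear readout to produce the meaning. The following is a minimal illustrative sketch of that idea in Python; the toy vocabulary, the one-hot role targets, the reservoir size, and all other details are assumptions made for illustration, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and two toy command sentences, each paired with a one-hot
# target standing in for a meaning/role label. Purely hypothetical data.
vocab = ["put", "grasp", "the", "ball", "cup", "left"]
word_to_idx = {w: i for i, w in enumerate(vocab)}
sentences = [["grasp", "the", "ball"], ["put", "the", "cup", "left"]]
targets = np.array([[1.0, 0.0], [0.0, 1.0]])

n_in, n_res = len(vocab), 100
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))   # fixed random input weights
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))     # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius below 1 (echo state property)

def final_state(sentence):
    # Drive the reservoir word by word with one-hot inputs; return the last state.
    x = np.zeros(n_res)
    for word in sentence:
        u = np.zeros(n_in)
        u[word_to_idx[word]] = 1.0
        x = np.tanh(W_in @ u + W @ x)
    return x

# Only the linear readout is trained, here with ridge regression on final states.
X = np.stack([final_state(s) for s in sentences])
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ targets)

print(np.round(X @ W_out, 2))   # readout predictions for the two training sentences

Only the readout weights W_out are learned; the input and recurrent weights stay random, which is what the abstract means by a model with "no 'pre-wired' structure but only random and learned connections".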

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 10