
Search in the Catalogues and Directories

Hits 1 – 19 of 25

1
Cross-Situational Learning Towards Robot Grounding
In: https://hal.archives-ouvertes.fr/hal-03628290 ; 2022 (2022)
BASE
2
Hierarchical-Task Reservoir for Online Semantic Analysis from Continuous Speech
In: ISSN: 2162-237X ; IEEE Transactions on Neural Networks and Learning Systems ; https://hal.inria.fr/hal-03031413 ; IEEE Transactions on Neural Networks and Learning Systems, IEEE, 2021, ⟨10.1109/TNNLS.2021.3095140⟩ ; https://ieeexplore.ieee.org/abstract/document/9548713/metrics#metrics (2021)
BASE
3
Editorial: Language and Robotics
In: ISSN: 2296-9144 ; Frontiers in Robotics and AI ; https://hal.inria.fr/hal-03533733 ; Frontiers in Robotics and AI, Frontiers Media S.A., 2021, 8, ⟨10.3389/frobt.2021.674832⟩ (2021)
BASE
4
Cross-Situational Learning with Reservoir Computing for Language Acquisition Modelling
In: 2020 International Joint Conference on Neural Networks (IJCNN 2020) ; https://hal.inria.fr/hal-02594725 ; 2020 International Joint Conference on Neural Networks (IJCNN 2020), Jul 2020, Glasgow, Scotland, United Kingdom ; https://wcci2020.org/ (2020)
BASE
5
Language Acquisition with Echo State Networks: Towards Unsupervised Learning
In: ICDL 2020 - IEEE International Conference on Development and Learning ; https://hal.inria.fr/hal-02926613 ; ICDL 2020 - IEEE International Conference on Development and Learning, Oct 2020, Valparaiso / Virtual, Chile (2020)
BASE
6
A Journey in ESN and LSTM Visualisations on a Language Task
In: https://hal.inria.fr/hal-03030248 ; 2020 (2020)
Abstract: Echo State Networks (ESN) and Long Short-Term Memory networks (LSTM) are two popular architectures of Recurrent Neural Networks (RNN) for solving machine learning tasks involving sequential data. However, little has been done to compare their performance and internal mechanisms on a common task. In this work, we trained ESNs and LSTMs on a Cross-Situational Learning (CSL) task. This task aims at modelling how infants learn language: they create associations between words and visual stimuli in order to extract meaning from words and sentences. The results are of three kinds: performance comparison, internal dynamics analyses and visualization of latent space. (1) We found that both models were able to successfully learn the task: the LSTM reached the lowest error for the basic corpus, but the ESN was quicker to train. Furthermore, the ESN was able to outperform LSTMs on more challenging datasets without any further tuning. (2) We also conducted an analysis of the internal unit activations of LSTMs and ESNs. Despite the deep differences between both models (trained or fixed internal weights), we were able to uncover similar inner mechanisms: both put emphasis on units encoding aspects of the sentence structure. (3) Moreover, we present Recurrent States Space Visualisations (RSSviz), a method to visualize the structure of the latent state space of RNNs, based on dimension reduction (using UMAP). This technique enables us to observe a fractal embedding of sequences in the LSTM. RSSviz is also useful for the analysis of ESNs (i) to spot difficult examples and (ii) to generate animated plots showing the evolution of activations across learning stages. Finally, we explore qualitatively how RSSviz could provide an intuitive visualisation to understand the influence of hyperparameters on the reservoir dynamics prior to ESN training.
Keyword: [INFO.INFO-LG]Computer Science [cs]/Machine Learning [cs.LG]; [INFO.INFO-NE]Computer Science [cs]/Neural and Evolutionary Computing [cs.NE]; [INFO.INFO-RB]Computer Science [cs]/Robotics [cs.RO]; [SDV.NEU]Life Sciences [q-bio]/Neurons and Cognition [q-bio.NC]; Cross-Situational Learning; Dimension Reduction; ESN; LSTM; UMAP; Visualisation
URL: https://hal.inria.fr/hal-03030248
https://hal.inria.fr/hal-03030248/file/Comparison_between_LSTM_and_ESN%2812%29.pdf
https://hal.inria.fr/hal-03030248/document
BASE
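The RSSviz idea described in the abstract above (projecting the recurrent state space of a network to two dimensions with UMAP) can be sketched in a few lines. The snippet below is only a minimal illustrative sketch, not the authors' implementation: a plain NumPy leaky echo state network stands in for the paper's models, and the vocabulary, sentences, reservoir size and hyperparameters are invented for the example. It assumes numpy and the umap-learn package are installed.

```python
import numpy as np
import umap  # provided by the umap-learn package

rng = np.random.default_rng(seed=0)

# Hypothetical toy vocabulary and sentences (word-index sequences); these are
# illustrative stand-ins, not the corpus used in the paper.
vocab = ["the", "cup", "ball", "is", "red", "blue", "on", "left", "right"]
sentences = [
    [0, 1, 3, 4],        # "the cup is red"
    [0, 2, 3, 5],        # "the ball is blue"
    [0, 1, 3, 6, 7],     # "the cup is on left"
    [0, 2, 3, 6, 8],     # "the ball is on right"
]

n_inputs, n_reservoir = len(vocab), 300
leak_rate, spectral_radius = 0.3, 0.9   # arbitrary but typical ESN settings

# Random input and recurrent weights; the recurrent matrix is rescaled so its
# spectral radius matches the chosen value, as is standard for ESNs.
W_in = rng.uniform(-1.0, 1.0, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(word_ids):
    """Run the leaky reservoir over one sentence; return the state at each word."""
    x = np.zeros(n_reservoir)
    states = []
    for w in word_ids:
        u = np.zeros(n_inputs)
        u[w] = 1.0  # one-hot encoding of the current word
        x = (1.0 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Collect reservoir states for every word of every sentence ...
all_states = np.vstack([run_reservoir(s) for s in sentences])

# ... and embed them in 2-D with UMAP, giving an RSSviz-style view of the
# recurrent state space.
embedding = umap.UMAP(n_components=2, n_neighbors=5, random_state=0).fit_transform(all_states)
print(embedding.shape)  # one 2-D point per processed word
```

Plotting the two columns of `embedding` (for instance with matplotlib, colouring points by sentence or by position in the sentence) gives the kind of latent-space view RSSviz builds on; the paper additionally animates such plots across learning stages.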
7
Learning to Parse Grounded Language using Reservoir Computing
In: ICDL-Epirob 2019 - Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics ; https://hal.inria.fr/hal-02422157 ; ICDL-Epirob 2019 - Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics, Aug 2019, Oslo, Norway. ⟨10.1109/devlrn.2019.8850718⟩ ; https://ieeexplore.ieee.org/abstract/document/8850718 (2019)
BASE
8
Teach Your Robot Your Language! Trainable Neural Parser for Modelling Human Sentence Processing: Examples for 15 Languages
In: ISSN: 2379-8920 ; EISSN: 2379-8939 ; IEEE Transactions on Cognitive and Developmental Systems ; https://hal.inria.fr/hal-01964541 ; IEEE Transactions on Cognitive and Developmental Systems, Institute of Electrical and Electronics Engineers, Inc, 2019, ⟨10.1109/TCDS.2019.2957006⟩ ; https://doi.org/10.1109/tcds.2019.2957006 (2019)
BASE
9
A Reservoir Model for Intra-Sentential Code-Switching Comprehension in French and English
In: CogSci'19 - 41st Annual Meeting of the Cognitive Science Society ; https://hal.inria.fr/hal-02432831 ; CogSci'19 - 41st Annual Meeting of the Cognitive Science Society, Jul 2019, Montréal, Canada ; https://cognitivesciencesociety.org/cogsci-2019/ (2019)
BASE
10
Which Input Abstraction is Better for a Robot Syntax Acquisition Model? Phonemes, Words or Grammatical Constructions?
In: 2018 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob) ; https://hal.inria.fr/hal-01889919 ; 2018 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob), Sep 2018, Tokyo, Japan (2018)
BASE
11
From Phonemes to Sentence Comprehension: A Neurocomputational Model of Sentence Processing for Robots
In: SBDM2018 Satellite-Workshop on interfaces between Robotics, Artificial Intelligence and Neuroscience ; https://hal.inria.fr/hal-01964524 ; SBDM2018 Satellite-Workshop on interfaces between Robotics, Artificial Intelligence and Neuroscience, May 2018, Paris, France (2018)
BASE
12
From Phonemes to Robot Commands with a Neural Parser
In: IEEE ICDL-EPIROB Workshop on Language Learning ; https://hal.inria.fr/hal-01665823 ; IEEE ICDL-EPIROB Workshop on Language Learning, Sep 2017, Lisbon, Portugal. pp.1-2 (2017)
BASE
13
Modelling sentence processing with random recurrent neural networks and applications to robotics
In: Workshop "The role of the basal ganglia in the interaction between language and other cognitive functions" ; https://hal.inria.fr/hal-01673440 ; Workshop "The role of the basal ganglia in the interaction between language and other cognitive functions", Anne-Catherine Bachoud-Lévi, Maria Giavazzi, Charlotte Jacquemot, Laboratoire de NeuroPsychologie Interventionnelle., Oct 2017, Paris, France ; http://www.ens.fr/agenda/role-basal-ganglia-interaction-between-language-and-other-cognitive-functions/2017-10 (2017)
BASE
14
Syntactic Reanalysis in Language Models for Speech Recognition
In: 2017 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob) ; https://hal.inria.fr/hal-01558462 ; 2017 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob), Sep 2017, Lisbon, Portugal ; http://icdl-epirob.org/ (2017)
BASE
15
Teach Your Robot Your Language! Trainable Neural Parser for Modelling Human Sentence Processing: Examples for 15 Languages
In: https://hal.inria.fr/hal-01665807 ; 2017 (2017)
BASE
16
Recurrent Neural Network for Syntax Learning with Flexible Predicates for Robotic Architectures
In: The Sixth Joint IEEE International Conference Developmental Learning and Epigenetic Robotics (ICDL-EPIROB) ; https://hal.inria.fr/hal-01417697 ; The Sixth Joint IEEE International Conference Developmental Learning and Epigenetic Robotics (ICDL-EPIROB), Sep 2016, Cergy, France ; http://icdl-epirob.org/ (2016)
BASE
17
Recurrent Neural Network for Syntax Learning with Flexible Representations
In: IEEE ICDL-EPIROB Workshop on Language Learning ; https://hal.inria.fr/hal-01417060 ; IEEE ICDL-EPIROB Workshop on Language Learning, Dec 2016, Cergy, France ; https://sites.google.com/site/epirob2016language/ (2016)
BASE
18
Reservoir Computing for Robot Language Acquisition
In: IROS Workshop on Machine Learning Methods for High-Level Cognitive Capabilities in Robotics ; https://hal.inria.fr/hal-01417683 ; IROS Workshop on Machine Learning Methods for High-Level Cognitive Capabilities in Robotics, Oct 2016, Daejeon, South Korea ; http://mlhlcr2016.tanichu.com/home (2016)
BASE
19
Recurrent Neural Network Sentence Parser for Multiple Languages with Flexible Meaning Representations for Home Scenarios
In: IROS Workshop on Bio-inspired Social Robot Learning in Home Scenarios ; https://hal.inria.fr/hal-01417667 ; IROS Workshop on Bio-inspired Social Robot Learning in Home Scenarios, Oct 2016, Daejeon, South Korea ; https://www.informatik.uni-hamburg.de/wtm/SocialRobotsWorkshop2016/index.php (2016)
BASE


Sources: all hits are open access documents (BASE); no results in the catalogues, bibliographies, Linked Open Data catalogues or online resources.