41. Deep Learning Methods for Human Behavior Recognition
    Lu, Jia. Auckland University of Technology, 2021. Source: BASE.

42. La saillance : origines perceptives, applications linguistiques, enjeux interdisciplinaires
    In: Semen - Revue de sémio-linguistique des textes et discours, 49 (Signifiant et matière : l'iconicité et la plasticité dans le document numérique verbal et visuel), pp. 35-50. Presses Universitaires de Franche-Comté (Pufc), 2021. ISSN: 0761-2990; EISSN: 1957-780X. https://hal.archives-ouvertes.fr/hal-03153505

43. Overt speech critically changes lateralization index and did not allow determination of hemispheric dominance for language: an fMRI study
    In: BMC Neuroscience, 22 (1), p. 74. BioMed Central, 2021. ISSN/EISSN: 1471-2202. DOI: 10.1186/s12868-021-00671-y. https://www.hal.inserm.fr/inserm-03474160

44. Bayesian modelling of reading (Modélisation bayésienne de la lecture)
    Doctoral thesis in Linguistics, Université Grenoble Alpes, 2021. In French. NNT: 2021GRALS014. https://tel.archives-ouvertes.fr/tel-03364950

45. Development of thalamus mediates paternal age effect on offspring reading: A preliminary investigation
    In: Human Brain Mapping, 42 (14), 2021.

46. The Visual System Prioritizes High-Level Scene Properties for Attentional Selection

48. L'article partitif indique-t-il une perception "quantitative / partitive" du référent ?
    In: C. Lacassain-Lagoin, F. Marsac, W. Ucherek, K. Chovancova (eds.), Sens (inter)dits, Construction du sens et représentation des référents, pp. 109-125. L'Harmattan, Collection Dixit Grammatica, 2021. https://hal.archives-ouvertes.fr/hal-02323549

49. Qui gardera les gardiens ? Sur certaines déclinaisons sémiotiques de la transparence en vue d'une évaluation critique des nudges
    In: Actes Sémiotiques, Université de Limoges, 2021. EISSN: 2270-4957. DOI: 10.25965/as.6720. https://hal-unilim.archives-ouvertes.fr/hal-03108925

50. Deep learning and the Global Workspace Theory
    In: Trends in Neurosciences, Elsevier, 2021. ISSN: 0166-2236; EISSN: 1878-108X. DOI: 10.1016/j.tins.2021.04.005. https://hal.archives-ouvertes.fr/hal-03311492

51. Attention aux robots ! Comment des artefacts de téléprésence modifient les processus attentionnels pendant un séminaire doctoral
    In: Drôles d'objets : un nouvel art de faire, Oct 2021, La Rochelle, France. DOI: 10.5281/zenodo.6059591. drolesdobjets20.sciencesconf.org. https://hal.archives-ouvertes.fr/hal-03597054

52. The grammar of engagement I: framework and initial exemplification
    In: Language and Cognition, 2021.

53. Challenges of Teaching English in a Non-Linguistic Higher School ... (original title in Russian: Трудности преподавания английского языка в неязыковом вузе)

56. Attending to Learn While Learning to Attend: Reciprocal Relations Between Infant Attention and Contingent Interactions and Implications for Language Development ...

57. Prosodic predictability of child-directed speech in home language environments ...

58. Effects of attention to form on second language comprehension: A multi-site replication study - Registered materials and procedures with audio files ...

59. Auditory attention development from infancy to adolescence: A systematic review ...

60. Language Representation Models: An Overview
    In: Entropy, 23 (11), 2021.

Abstract:
Over the last few decades, text mining has been used to extract knowledge from free text. Applying neural networks and deep learning to natural language processing (NLP) tasks has produced many advances on real-world language problems, and developments of the last five years have made transfer learning practical in NLP. The progress has been substantial: models now outperform the human baseline on the General Language Understanding Evaluation (GLUE) benchmark. This paper presents a targeted literature review that outlines, describes, explains, and contextualizes the key techniques behind this milestone, focusing on neural language models that represent vital steps towards a general language representation model.
Keywords:
attention-based models; deep learning; embeddings; multi-task learning; natural language processing; neural networks; transformer
URL: https://doi.org/10.3390/e23111422
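
The keywords above (attention-based models, transformer) refer to the scaled dot-product attention at the core of the transformer models this survey covers. As a minimal illustrative sketch (NumPy self-attention on random toy embeddings, not code from the paper):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the basic attention operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of values

# Toy example: 3 tokens with 4-dimensional embeddings (random, illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)          # self-attention: Q = K = V = X
print(out.shape)                                     # (3, 4)
```

Each output row is a convex combination of the value rows, so the output keeps the input's shape; full transformer layers add learned projections for Q, K, and V plus multiple heads.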