141. Les traducteurs, créateurs et usagers des terminologies juridiques multilingues. Enjeux, méthodes et ressources en évolution [Translators, creators and users of multilingual legal terminologies: evolving issues, methods and resources]
In: Parallèles, Vol. 30, No. 1 (2018), pp. 4-7. ISSN 1015-7573. (BASE)
142. Beyond bilingualism: multilingual experience correlates with caudate volume
In: Brain Structure and Function (2018). ISSN 1863-2653. (BASE)
143. Translanguaging and Multilingual Picturebooks: Gloria Anzaldúa's Friends from the Other Side / Amigos Del Otro Lado (BASE)
144. Effects of Bilingualism on Auditory Lexical Decision Tasks (BASE)
145. A deep learning approach to bilingual lexicon induction in the biomedical domain
Abstract:
Background: Bilingual lexicon induction (BLI) is an important task in the biomedical domain, as translation resources are usually available for general language use but often lacking in domain-specific settings. In this article we treat BLI as a classification problem and train a neural network combining recurrent long short-term memory (LSTM) and deep feed-forward networks to obtain word-level and character-level representations.
Results: The word-level and character-level representations each improve state-of-the-art results for BLI and biomedical translation mining. The best results are obtained by exploiting the synergy between these word-level and character-level representations in the classification model. We evaluate the models both quantitatively and qualitatively.
Conclusions: Translation of domain-specific biomedical terminology benefits from character-level representations compared to relying solely on word-level representations. It is beneficial to take a deep learning approach and learn character-level representations rather than relying on the handcrafted representations typically used. Our combined model captures semantics at the word level while also accounting for the fact that specialized terminology often derives from a common root form (e.g., Greek or Latin).
Keywords: Data Mining; Deep Learning; Humans; Knowledge Bases; Multilingualism; Natural Language Processing; Semantics
URL: https://www.repository.cam.ac.uk/handle/1810/288980 ; https://doi.org/10.17863/CAM.36243 (BASE)
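The abstract in entry 145 frames BLI as a classification problem over candidate (source, target) term pairs, scoring concatenated word-level and character-level representations with a feed-forward network. The following is a minimal illustrative sketch of that framing only, not the paper's model: the mean of character embeddings stands in for the character-level LSTM, deterministic pseudo-vectors stand in for trained word embeddings, the scorer's weights are random rather than learned, and all names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

CHARS = "abcdefghijklmnopqrstuvwxyzéèàïü-"
CHAR_DIM, WORD_DIM = 8, 16
char_emb = rng.normal(size=(len(CHARS), CHAR_DIM))

def char_repr(word):
    # Mean of character embeddings: a stand-in for the character-level LSTM.
    idx = [CHARS.index(c) for c in word.lower() if c in CHARS]
    return char_emb[idx].mean(axis=0) if idx else np.zeros(CHAR_DIM)

def word_repr(word):
    # Deterministic pseudo-embedding: a stand-in for trained word vectors.
    seed = sum(ord(c) for c in word) % (2**32)
    return np.random.default_rng(seed).normal(size=WORD_DIM)

def pair_features(src, tgt):
    # Word-level and character-level views of both words, concatenated.
    return np.concatenate(
        [word_repr(src), word_repr(tgt), char_repr(src), char_repr(tgt)]
    )

# Feed-forward scorer (random weights here; trained end-to-end in the paper).
W = rng.normal(size=2 * (WORD_DIM + CHAR_DIM))

def score(src, tgt):
    # Sigmoid over a linear score: probability that the pair is a translation.
    return 1.0 / (1.0 + np.exp(-W @ pair_features(src, tgt)))

# Cognates from a shared Latin/Greek root overlap in character features,
# which is the motivation the abstract gives for the character-level view:
print(score("hepatitis", "hépatite"))
```

With trained weights, pairs like "hepatitis"/"hépatite" would receive high scores partly through their shared character features, which is what the abstract means by exploiting the common root form of specialized terminology.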
146. Resisting attraction: Individual differences in executive control are associated with subject-verb agreement errors in production (BASE)
147. Constructing L3 selves: A study of undergraduate learners' motivation to learn a third language in China
Wang, Tianyi. University of Cambridge, Faculty of Education, Clare Hall College, 2018. (BASE)
148. From Empire to Nation: The Politics of Language in Manchuria (1890-1911)
He, Jiani. University of Cambridge, Faculty of Asian and Middle Eastern Studies, Newnham College, 2018. (BASE)
149. Language identity and multiculturalism: A case study on Singlish (BASE)
150. Sociocultural implications of French in Middle English texts (BASE)
152. Do interlocutors or conversation topics affect migrants' sense of feeling different when switching languages? (BASE)
153. Esto necesito y así lo soluciono: necesidades y herramientas de estudiantes y profesionales de la traducción y la interpretación frente al reto multilingüe digital [This is what I need and this is how I solve it: needs and tools of translation and interpreting students and professionals facing the digital multilingual challenge] (BASE)
155. Machine-translation inspired reordering as preprocessing for cross-lingual sentiment analysis (BASE)
158. La competència plurilingüe en la formació inicial de mestres. Estudi longitudinal de casos sobre l'evolució de les creences relacionades amb l'educació plurilingüe [Plurilingual competence in initial teacher education: a longitudinal case study of the evolution of beliefs about plurilingual education] (BASE)
160. Esmorzem?: proposta didàctica per a l'escola La Sínia de Vic [Shall we have breakfast? A teaching proposal for the La Sínia school in Vic] (BASE)