82 | Data of the Shared Task on the Disambiguation of German Verbal Idioms at KONVENS 2021 ...
BASE
84 | Hebrew Transformed: Machine Translation of Hebrew Using the Transformer Architecture
85 | Scripted-sentence learning in Spanish speakers (Quique et al., 2022) ...
94 | StaResGRU-CNN with CMedLMs: a stacked residual GRU-CNN with pre-trained biomedical language models for predictive intelligence
95 | Questions in argumentative dialogue
In: Journal of Pragmatics 188 (2022), pp. 56-79. Elsevier. ISSN 0378-2166, eISSN 1879-1387.
96 | Machine Learning approaches for Topic and Sentiment Analysis in multilingual opinions and low-resource languages: From English to Guarani
97 | Brazilian Portuguese verbal databases
In: Domínios de Lingu@gem, Ahead of Print (2022). ISSN 1980-5799.
98 | Representation learning of natural language and its application to language understanding and generation
Abstract:
How to properly represent language is a crucial and fundamental problem in Natural Language Processing (NLP). Language representation learning aims to encode rich information, such as the syntax and semantics of a language, into dense vectors; this facilitates the modeling, manipulation, and analysis of natural language in computational linguistics. Existing algorithms use corpus statistics such as word co-occurrences to learn general-purpose language representations. Recent advances in generic representation integrate richer information, such as contextualized features, drawn from unlabeled text corpora. In this dissertation, we continue this line of research and incorporate rich knowledge into generic embeddings. We show that word representations can be enriched with various information, including temporal and spatial variation as well as syntactic function, and that text representations can be refined with topical knowledge. Moreover, we develop insight into the geometry of pre-trained representations and connect it to semantic understanding, such as identifying idiomatic word usage. Besides generic representation, task-dependent representation is also studied extensively in downstream applications, where the representation is trained to encode domain information from labeled datasets. This dissertation leverages the capability of neural network models to integrate task-specific supervision into language representations. We introduce new deep learning models and algorithms to train representations with external knowledge in annotated data. The learned representations are shown to assist in various downstream tasks in language understanding, such as text classification, and in language generation, such as text style transfer.

Access: U of I Only. Author requested U of Illinois access only (OA after 2 yrs) in Vireo ETD system.
Keywords: Language Generation; Language Understanding; Natural Language Processing; Representation Learning
URL: http://hdl.handle.net/2142/108110
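The abstract of entry 98 notes that existing algorithms learn general-purpose word representations from corpus statistics such as word co-occurrences. As a hedged illustration only (the dissertation's own methods are not shown here, and the toy corpus and window size are invented for the example), the classic recipe of counting co-occurrences, reweighting with positive pointwise mutual information (PPMI), and factorizing with truncated SVD can be sketched as:

```python
# Illustrative sketch: co-occurrence counts -> PPMI -> truncated SVD.
# The corpus and window size below are arbitrary assumptions for the demo.
import numpy as np

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                C[idx[w], idx[sent[j]]] += 1

# Positive pointwise mutual information weighting of the counts.
total = C.sum()
row = C.sum(axis=1, keepdims=True)
col = C.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore"):
    pmi = np.log(C * total / (row * col))
ppmi = np.maximum(pmi, 0.0)          # -inf (from log 0) clips to 0

# Truncated SVD yields dense, low-dimensional word vectors.
dim = 4
U, S, _ = np.linalg.svd(ppmi)
vectors = U[:, :dim] * S[:dim]

def similarity(a: str, b: str) -> float:
    """Cosine similarity between the learned vectors of two words."""
    va, vb = vectors[idx[a]], vectors[idx[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

print(round(similarity("cat", "dog"), 3))
```

Words appearing in similar contexts ("the cat sat" / "the dog sat") end up with nearby vectors; contextualized models, as the abstract notes, go further by conditioning each vector on its surrounding sentence.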
99 | Neural Natural Language Generation: A Survey on Multilinguality, Multimodality, Controllability and Learning
100 | Detecting weak and strong Islamophobic hate speech on social media