1
Representation Learning beyond Semantic Similarity: Character-aware and Function-specific Approaches

Abstract:
Representation learning is a research area within machine learning and natural language processing (NLP) concerned with building machine-understandable representations of discrete units of text. Continuous representations are at the core of modern machine learning applications, and representation learning has thereby become one of the central research areas in NLP. The induction of text representations is typically based on the distributional hypothesis and consequently encodes general information about word similarity: words or phrases with similar meaning obtain similar representations in a vector space constructed for this purpose. This established methodology excels for morphologically simple languages such as English and in data-rich settings. However, several useful lexical relations, such as entailment or selectional preference, are not captured or get conflated with other relations. Another challenge is dealing with low-data regimes for morphologically complex and under-resourced languages. In this ... : ERC Consolidator Grant LEXICAL (648909) ...
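The abstract's core idea, that words with similar meaning obtain similar vectors, is usually quantified with cosine similarity. A minimal sketch follows; the three-dimensional "embeddings" below are invented for illustration only and do not come from any real model or from the thesis itself:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy vectors (hypothetical values for illustration).
vec_cat = [0.9, 0.1, 0.30]
vec_dog = [0.8, 0.2, 0.35]
vec_car = [0.1, 0.9, 0.05]

print(cosine_similarity(vec_cat, vec_dog))  # high: similar meaning, similar direction
print(cosine_similarity(vec_cat, vec_car))  # lower: dissimilar words point elsewhere
```

Note that cosine similarity is symmetric, which is exactly why asymmetric relations such as entailment or selectional preference, mentioned in the abstract, cannot be read off a single similarity score.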

Keywords:
multilingual; representation learning; word vector spaces

URL: https://dx.doi.org/10.17863/cam.52043 https://www.repository.cam.ac.uk/handle/1810/304962

BASE

2
Representation Learning beyond Semantic Similarity: Character-aware and Function-specific Approaches

Gerz, Daniela Susanne. University of Cambridge: Theoretical and Applied Linguistics, Lucy Cavendish College, 2020.