
Search in the Catalogues and Directories

Hits 1 – 4 of 4

1
Universal Dependencies 1.4
Nivre, Joakim; Agić, Željko; Ahrenberg, Lars. Universal Dependencies Consortium, 2016
BASE
2
Universal Dependencies 1.3
Nivre, Joakim; Agić, Željko; Ahrenberg, Lars. Universal Dependencies Consortium, 2016
BASE
3
Many Languages, One Parser ...
BASE
4
What Do Recurrent Neural Network Grammars Learn About Syntax? ...
Abstract: Recurrent neural network grammars (RNNGs) are a recently proposed probabilistic generative modeling family for natural language. They show state-of-the-art language modeling and parsing performance. We investigate what information they learn, from a linguistic perspective, through various ablations to the model and the data, and by augmenting the model with an attention mechanism (GA-RNNG) to enable closer inspection. We find that explicit modeling of composition is crucial for achieving the best performance. Through the attention mechanism, we find that headedness plays a central role in phrasal representation (with the model's latent attention largely agreeing with predictions made by hand-crafted head rules, albeit with some important differences). By training grammars without nonterminal labels, we find that phrasal representations depend minimally on nonterminals, providing support for the endocentricity hypothesis. ... : 10 pages. To appear in EACL 2017, Valencia, Spain ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1611.05774
https://dx.doi.org/10.48550/arxiv.1611.05774
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 4