1. Domain-Adaptive Pretraining Methods for Dialogue Understanding ...
   Source: BASE
2. Instance-adaptive training with noise-robust losses against noisy labels ...
3. Character-based PCFG Induction for Modeling the Syntactic Acquisition of Morphologically Rich Languages ...
4. Connect-the-Dots: Bridging Semantics between Words and Definitions via Aligning Word Sense Inventories ...

5. The Importance of Category Labels in Grammar Induction with Child-directed Utterances ...

6. Unsupervised Grammar Induction with Depth-bounded PCFG ...

   Abstract: There has been recent interest in applying cognitively or empirically motivated bounds on recursion depth to limit the search space of grammar induction models (Ponvert et al., 2011; Noji and Johnson, 2016; Shain et al., 2016). This work extends this depth-bounding approach to probabilistic context-free grammar induction (DB-PCFG), which has a smaller parameter space than hierarchical sequence models, and therefore more fully exploits the space reductions of depth-bounding. Results for this model on grammar acquisition from transcribed child-directed speech and newswire text exceed or are competitive with those of other models when evaluated on parse accuracy. Moreover, grammars acquired from this model demonstrate a consistent use of category labels, something which has not been demonstrated by other acquisition models. (Accepted by Transactions of the Association for Computational Linguistics.)

   Keywords: Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences

   URL: https://dx.doi.org/10.48550/arxiv.1802.08545 ; https://arxiv.org/abs/1802.08545
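The depth-bounding idea behind the DB-PCFG abstract above can be sketched with a toy recursive parser: enumerate parse trees under a small CFG, but prune any subtree that exceeds a fixed depth, shrinking the search space. This is only an illustration, assuming a minimal hand-written grammar and lexicon (not the paper's induced grammar), and using overall tree depth as a crude stand-in for the left-corner memory depth that the paper actually bounds.

```python
# Toy CFG in Chomsky normal form (hypothetical, for illustration only).
RULES = {
    "S":  [("NP", "VP")],
    "NP": [("D", "N"), ("NP", "PP")],
    "VP": [("V", "NP"), ("VP", "PP")],
    "PP": [("P", "NP")],
}
LEXICON = {"D": {"the"}, "N": {"dog", "park"}, "V": {"saw"}, "P": {"in"}}

def parses(cat, words, depth, max_depth):
    """Yield parse trees of `words` rooted in `cat`, pruning any node
    deeper than `max_depth` (a crude proxy for a recursion-depth bound)."""
    if depth > max_depth:
        return  # depth bound: this branch of the search space is cut off
    if len(words) == 1 and words[0] in LEXICON.get(cat, ()):
        yield (cat, words[0])
    for left, right in RULES.get(cat, ()):
        for split in range(1, len(words)):
            for lt in parses(left, words[:split], depth + 1, max_depth):
                for rt in parses(right, words[split:], depth + 1, max_depth):
                    yield (cat, lt, rt)

sent = "the dog saw the dog in the park".split()
unbounded = list(parses("S", sent, 0, 10))  # both PP attachments survive
bounded = list(parses("S", sent, 0, 4))     # deeper NP attachment is pruned
print(len(unbounded), len(bounded))
```

With a generous bound the classic PP-attachment ambiguity yields two trees; tightening the bound to 4 prunes the deeper NP-attachment analysis and leaves one, which is the sense in which depth-bounding reduces the space an induction model must explore.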