
Search in the Catalogues and Directories

Hits 81–100 of 1,643

81
Idiomatic Expression Identification using Semantic Compatibility
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 1546-1562 (2021)
BASE
82
KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 176-194 (2021)
BASE
83
Reducing Confusion in Active Learning for Part-Of-Speech Tagging
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 1-16 (2021)
BASE
84
Differentiable Subset Pruning of Transformer Heads
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 1442-1459 (2021)
BASE
85
Compressing Large-Scale Transformer-Based Models: A Case Study on BERT
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 1061-1080 (2021)
BASE
86
Parameter Space Factorization for Zero-Shot Learning across Tasks and Languages
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 410-428 (2021)
BASE
87
Data-to-text Generation with Macro Planning
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 510-527 (2021)
BASE
88
Structured Self-Supervised Pretraining for Commonsense Knowledge Graph Completion
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 1268-1284 (2021)
BASE
89
RYANSQL: Recursively Applying Sketch-based Slot Fillings for Complex Text-to-SQL in Cross-Domain Databases
In: Computational Linguistics, Vol 47, Iss 2, Pp 309-332 (2021)
BASE
90
Narrative Question Answering with Cutting-Edge Open-Domain QA Techniques: A Comprehensive Study
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 1032-1046 (2021)
BASE
91
Maintaining Common Ground in Dynamic Environments
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 995-1011 (2021)
BASE
92
Infusing Finetuning with Semantic Dependencies
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 226-242 (2021)
BASE
93
On Generative Spoken Language Modeling from Raw Audio
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 1336-1354 (2021)
BASE
94
Pretraining the Noisy Channel Model for Task-Oriented Dialogue
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 657-674 (2021)
BASE
95
Approximating Probabilistic Models as Weighted Finite Automata
In: Computational Linguistics, Vol 47, Iss 2, Pp 221-254 (2021)
BASE
96
Sensitivity as a Complexity Measure for Sequence Classification Tasks
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 891-908 (2021)
BASE
97
Unsupervised Learning of KB Queries in Task-Oriented Dialogs
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 374-390 (2021)
BASE
98
ParsiNLU: A Suite of Language Understanding Challenges for Persian
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 1147-1162 (2021)
BASE
99
Adaptive Semiparametric Language Models
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 362-373 (2021)
Abstract: We present a language model that combines a large parametric neural network (i.e., a transformer) with a non-parametric episodic memory component in an integrated architecture. Our model uses extended short-term context by caching local hidden states, similar to Transformer-XL, and global long-term memory by retrieving a set of nearest neighbor tokens at each timestep. We design a gating function to adaptively combine multiple information sources to make a prediction. This mechanism allows the model to use either local context, short-term memory, or long-term memory (or any combination of them) on an ad hoc basis depending on the context. Experiments on word-based and character-based language modeling datasets demonstrate the efficacy of our proposed method compared to strong baselines. (A minimal sketch of the gating mechanism follows this record.)
Keyword: Computational linguistics. Natural language processing; P98-98.5
URL: https://doaj.org/article/4d6a91c6e8ff40bfb1af0dd1fd3c888a
https://doi.org/10.1162/tacl_a_00371
BASE
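
The abstract above describes a gating function that adaptively mixes three information sources: the local context, a short-term cache of hidden states, and long-term memory retrieved by nearest-neighbor lookup. As a rough illustration only, here is a minimal PyTorch sketch of one way such a gate could be built. The class name AdaptiveGate, the choice of a softmax gate predicted from the local state, and all dimensions are assumptions made for this sketch, not the paper's implementation; see the DOI above for the actual method.

```python
import torch
import torch.nn as nn

class AdaptiveGate(nn.Module):
    """Hypothetical sketch: blend local-context, short-term, and
    long-term memory vectors with a learned gate. Illustrative only;
    the paper's gating function may be parameterized differently."""

    def __init__(self, d_model: int, n_sources: int = 3):
        super().__init__()
        # Assumption: gate weights are predicted from the local hidden state.
        self.gate = nn.Linear(d_model, n_sources)

    def forward(self, h_local, h_short, h_long):
        # Each input: (batch, d_model) representation from one source.
        sources = torch.stack([h_local, h_short, h_long], dim=1)  # (B, 3, d)
        weights = torch.softmax(self.gate(h_local), dim=-1)       # (B, 3)
        # Convex combination: the model can lean on any mix of sources.
        return (weights.unsqueeze(-1) * sources).sum(dim=1)       # (B, d)

# Usage: blend three per-token states of width 512.
gate = AdaptiveGate(d_model=512)
h = gate(torch.randn(2, 512), torch.randn(2, 512), torch.randn(2, 512))
print(h.shape)  # torch.Size([2, 512])
```

A softmax gate yields a convex combination, so driving one weight toward 1 lets the model fall back entirely on local context when the memories are unhelpful, which matches the abstract's "ad hoc" use of sources.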
100
Strong Equivalence of TAG and CCG
In: Transactions of the Association for Computational Linguistics, Vol 9, Pp 707-720 (2021)
BASE


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 1,643
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings