
Search in the Catalogues and Directories

Page: 1 2
Hits 1 – 20 of 29

1. Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale (BASE)
2. Green NLP panel (BASE)
3. A Generative Framework for Simultaneous Machine Translation (BASE)
4. Pretraining the Noisy Channel Model for Task-Oriented Dialogue (BASE)
5. Counterfactual Data Augmentation for Neural Machine Translation
   Blunsom, Phil; Kusner, Matt. NAACL 2021. Underline Science Inc., 2021 (BASE)
6. Learning Robust and Multilingual Speech Representations (BASE)
7. Better Document-Level Machine Translation with Bayes’ Rule
   In: Transactions of the Association for Computational Linguistics, Vol. 8, pp. 346-360 (2020) (BASE)
8. Learning and Evaluating General Linguistic Intelligence (BASE)
9. Learning to Discover, Ground and Use Words with Segmental Neural Language Models (BASE)
10. From Characters to Understanding Natural Language (C2NLU): Robust End-to-End Deep Learning for NLP (Dagstuhl Seminar 17042)
    Cho, Kyunghyun; Dyer, Chris; Blunsom, Phil. Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik, 2017. Dagstuhl Reports, Volume 7, Issue 1, 2017 (BASE)
11. Learning to Create and Reuse Words in Open-Vocabulary Neural Language Modeling (ACL 2017)
    Abstract: Fixed-vocabulary language models fail to account for one of the most characteristic statistical facts of natural language: the frequent creation and reuse of new word types. Although character-level language models offer a partial solution in that they can create word types not attested in the training corpus, they do not capture the "bursty" distribution of such words. In this paper, we augment a hierarchical LSTM language model that generates sequences of word tokens character by character with a caching mechanism that learns to reuse previously generated words. To validate our model we construct a new open-vocabulary language modeling corpus (the Multilingual Wikipedia Corpus, MWC) from comparable Wikipedia articles in 7 typologically diverse languages and demonstrate the effectiveness of our model across this range of languages.
    Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
    URL: https://dx.doi.org/10.48550/arxiv.1704.06986
    https://arxiv.org/abs/1704.06986
    (BASE)
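The caching mechanism described in the abstract above can be illustrated with a toy sketch. This is not the paper's model (which generates words character by character inside a hierarchical LSTM with a learned cache); it only shows the general cache idea: interpolating a base next-word distribution with a unigram distribution over recently generated words, so that novel words, once produced, become likely to recur. The function name `cache_mixture` and the fixed weight `lam` are illustrative assumptions.

```python
from collections import Counter

def cache_mixture(base_probs, cache, lam=0.3):
    """Toy cache language model: mix a base next-word distribution
    with a unigram cache of recently generated words.

    base_probs: dict mapping word -> probability (sums to 1)
    cache: list of recently generated word tokens
    lam: illustrative cache interpolation weight (assumption)
    """
    counts = Counter(cache)
    total = sum(counts.values())
    mixed = {}
    # The mixed vocabulary covers both the base vocabulary and any
    # novel words that appear only in the cache.
    for w in set(base_probs) | set(counts):
        p_base = base_probs.get(w, 0.0)
        p_cache = counts[w] / total if total else 0.0
        mixed[w] = (1 - lam) * p_base + lam * p_cache
    return mixed

# A novel word ("zylph") absent from the base vocabulary gets
# nonzero probability once it has been generated and cached.
base = {"the": 0.5, "cat": 0.5}
mixed = cache_mixture(base, ["zylph", "zylph", "cat"], lam=0.3)
```

Here `mixed["zylph"]` is 0.3 * (2/3) = 0.2 even though the base model assigns it zero mass, which is the "bursty reuse" effect the abstract describes.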
12. From Characters to Understanding Natural Language (C2NLU): Robust End-to-End Deep Learning for NLP (Dagstuhl Seminar 17042)
    Blunsom, Phil; Cho, Kyunghyun; Dyer, Chris. Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik GmbH, Wadern/Saarbruecken, Germany, 2017 (BASE)
13. Grounded Language Learning in a Simulated 3D World (BASE)
14. Robust Incremental Neural Semantic Graph Parsing
    Buys, Jan; Blunsom, Phil. arXiv, 2017 (BASE)
15. Learning to Compose Words into Sentences with Reinforcement Learning (BASE)
16. Reasoning about Entailment with Neural Attention (BASE)
17. Tree Transduction Tools for cdec
    In: The Prague Bulletin of Mathematical Linguistics 102 (2014), pp. 27-36 (OLC Linguistik)
18. OxLM: A Neural Language Modelling Framework for Machine Translation
    In: The Prague Bulletin of Mathematical Linguistics 102 (2014), pp. 81-92 (OLC Linguistik)
19. A Fast and Simple Online Synchronous Context Free Grammar Extractor
    In: The Prague Bulletin of Mathematical Linguistics 102 (2014), pp. 17-26 (OLC Linguistik)
20. Learning Bilingual Word Representations by Marginalizing Alignments (BASE)


Hits by source: Catalogues: 5 | Bibliographies: 2 | Linked Open Data catalogues: 0 | Online resources: 0 | Open access documents: 24
© 2013 - 2024 Lin|gu|is|tik