
Search in the Catalogues and Directories

Page: 1 2 3 4 5
Hits 1 – 20 of 82

1. Grad-SAM: Explaining Transformers via Gradient Self-Attention Maps ... (BASE)
2. Universal Dependencies and Semantics for English and Hebrew Child-directed Speech. In: Proceedings of the Society for Computation in Linguistics (2022). (BASE)
3. Forecasting the macrolevel determinants of entrepreneurial opportunities using artificial intelligence models. In: Technological Forecasting and Social Change, Elsevier, 2021, 121353. ISSN 0040-1625. ⟨10.1016/j.techfore.2021.121353⟩. https://hal.archives-ouvertes.fr/hal-03442122 (BASE)
4. On Neurons Invariant to Sentence Structural Changes in Neural Machine Translation ... (BASE)
5. A constraint on co-occurrence of partitive quantifiers and gradable predicates. In: Proceedings of Sinn und Bedeutung 25 (2021), 55–72. ISSN 2629-6055. (BASE)
6. Paths to Relation Extraction through Semantic Structure ... (BASE)
7. Cross-linguistically Consistent Semantic and Syntactic Annotation of Child-directed Speech ... (BASE)
8. On the Relation between Syntactic Divergence and Zero-Shot Performance ... (BASE)
9. The Grammar-Learning Trajectories of Neural Language Models ... (BASE)
10. A diachronic explanation for cross-linguistic variation in the use of inverse-scope constructions. In: Proceedings of SALT 31 (2021), 21–41. ISSN 2163-5951. (BASE)
11. From Unsupervised Machine Translation To Adversarial Text Generation ... (BASE)
    Abstract: We present a self-attention-based bilingual adversarial text generator (B-GAN) that learns to generate text from the encoder representation of an unsupervised neural machine translation system. B-GAN generates a distributed latent-space representation that can be paired with an attention-based decoder to produce fluent sentences. When trained on an encoder shared between two languages and paired with the appropriate decoder, it can generate sentences in either language. B-GAN is trained with a combination of a reconstruction loss for the auto-encoder, a cross-domain loss for translation, and a GAN-based adversarial loss for text generation. We demonstrate that B-GAN, trained on monolingual corpora only using multiple losses, generates more fluent sentences than monolingual baselines while effectively using half the number of parameters. Accepted at ICASSP 2020.
    Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
    URL: https://arxiv.org/abs/2011.05449
    https://dx.doi.org/10.48550/arxiv.2011.05449
12. UCCA's Foundational Layer: Annotation Guidelines v2.1 ... (BASE)
13. Part 6 - Cross-linguistic Studies ... (BASE)
14. Language (Re)modelling: Towards Embodied Language Understanding ... Tamari, Ronen; Shani, Chen; Hope, Tom. arXiv, 2020. (BASE)
15. Part 1.3 - A Bird's Eye View ... (BASE)
16. Part 1.2 - A Bird's Eye View ... (BASE)
17. Part 1.1 - A Bird's Eye View ... (BASE)
18. Semantic Structural Decomposition for Neural Machine Translation ... (BASE)
19. Comparison by Conversion: Reverse-Engineering UCCA from Syntax and Lexical Semantics ... (BASE)
20. Non-maximality and homogeneity: Parallels between collective predicates and absolute adjectives. In: Proceedings of Sinn und Bedeutung 24(1) (2020), 66–83. ISSN 2629-6055. (BASE)


Sources: Catalogues 3 · Bibliographies 4 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 76
© 2013 – 2024 Lin|gu|is|tik