
Search in the Catalogues and Directories

Hits 1 – 20 of 79

1
Parallel processing in speech perception with local and global representations of linguistic context
In: eLife (2022)
BASE
2
Using surprisal and fMRI to map the neural bases of broad and local contextual prediction during natural language comprehension ...
BASE
3
Community-level Research on Suicidality Prediction in a Secure Environment: Overview of the CLPsych 2021 Shared Task
BASE
4
Connecting Documents, Words, and Languages Using Topic Models
Yang, Weiwei. - 2019
BASE
5
Assessing Composition in Sentence Vector Representations ...
BASE
6
Relating lexical and syntactic processes in language: Bridging research in humans and machines
BASE
7
Guided Probabilistic Topic Models for Agenda-setting and Framing
Nguyen, Viet An. - 2015
BASE
8
Soft syntactic constraints for Arabic-English hierarchical phrase-based translation
In: Machine Translation. - Dordrecht [etc.]: Springer Science + Business Media 26 (2012) 1-2, 137-157
BLLDB
OLC Linguistik
9
Crowdsourced Monolingual Translation
Hu, Chang. - 2012
BASE
10
Decision Tree-based Syntactic Language Modeling
BASE
11
Modeling Dependencies in Natural Languages with Latent Variables
BASE
12
Exploiting syntactic relationships in a phrase-based decoder: an exploration
In: Machine Translation. - Dordrecht [etc.]: Springer Science + Business Media 24 (2010) 2, 123-140
BLLDB
OLC Linguistik
13
Gibbs Sampling for the Uninitiated
In: DTIC (2010)
BASE
14
Structured local exponential models for machine translation
BASE
15
A Formal Model of Ambiguity and its Applications in Machine Translation
BASE
16
Extending Phrase-Based Decoding with a Dependency-Based Reordering Model
In: DTIC (2009)
BASE
17
Extending Phrase-Based Decoding with a Dependency-Based Reordering Model
BASE
18
Computational Analysis of the Conversational Dynamics of the United States Supreme Court
Hawes, Timothy. - 2009
BASE
19
Fine-Grained Linguistic Soft Constraints on Statistical Natural Language Processing Models
Abstract: This dissertation focuses on effective combination of data-driven natural language processing (NLP) approaches with linguistic knowledge sources that are based on manual text annotation or word grouping according to semantic commonalities. I gainfully apply fine-grained linguistic soft constraints -- of syntactic or semantic nature -- on statistical NLP models, evaluated in end-to-end state-of-the-art statistical machine translation (SMT) systems. The introduction of semantic soft constraints involves intrinsic evaluation on word-pair similarity ranking tasks, extension from words to phrases, application in a novel distributional paraphrase generation technique, and an introduction of a generalized framework of which these soft semantic and syntactic constraints can be viewed as instances, and in which they can be potentially combined.

Fine granularity is key in the successful combination of these soft constraints, in many cases. I show how to softly constrain SMT models by adding fine-grained weighted features, each preferring translation of only a specific syntactic constituent. Previous attempts using coarse-grained features yielded negative results. I also show how to softly constrain corpus-based semantic models of words (“distributional profiles”) to effectively create word-sense-aware models, by using semantic word grouping information found in a manually compiled thesaurus. Previous attempts, using hard constraints and resulting in aggregated, coarse-grained models, yielded lower gains.

A novel paraphrase generation technique incorporating these soft semantic constraints is then also evaluated in an SMT system. This paraphrasing technique is based on the Distributional Hypothesis. The main advantage of this novel technique over current “pivoting” techniques for paraphrasing is the independence from parallel texts, which are a limited resource. The evaluation is done by augmenting translation models with paraphrase-based translation rules, where fine-grained scoring of paraphrase-based rules yields significantly higher gains. The model augmentation includes a novel semantic reinforcement component: In many cases there are alternative paths of generating a paraphrase-based translation rule. Each of these paths reinforces a dedicated score for the “goodness” of the new translation rule. This augmented score is then used as a soft constraint, in a weighted log-linear feature, letting the translation model learn how much to “trust” the paraphrase-based translation rules.

The work reported here is the first to use distributional semantic similarity measures to improve performance of an end-to-end phrase-based SMT system. The unified framework for statistical NLP models with soft linguistic constraints enables, in principle, the combination of both semantic and syntactic constraints -- and potentially other constraints, too -- in a single SMT model.
Keywords: computational linguistics; Computer Science; hybrid; Language; Linguistics; paraphrase generation; semantic distance; soft constraints; statistical machine translation
URL: http://hdl.handle.net/1903/9861
BASE
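The abstract above turns on one recurring idea: linguistic evidence (syntactic constituency, distributional paraphrase similarity) enters the SMT system as additional weighted features in a log-linear model rather than as hard filters. The sketch below illustrates that general mechanism only; the feature names, weights, toy distributional profiles, and the cosine-based similarity scorer are illustrative assumptions, not the dissertation's actual implementation.

```python
# Minimal sketch of soft linguistic constraints as weighted log-linear features.
# All names, weights, and data here are illustrative assumptions.
import math
from collections import Counter

def cosine_similarity(profile_a, profile_b):
    """Cosine similarity between two distributional profiles
    (mappings from context word to co-occurrence count)."""
    shared = set(profile_a) & set(profile_b)
    dot = sum(profile_a[w] * profile_b[w] for w in shared)
    norm_a = math.sqrt(sum(v * v for v in profile_a.values()))
    norm_b = math.sqrt(sum(v * v for v in profile_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def log_linear_score(features, weights):
    """Standard log-linear model: score = sum_i lambda_i * h_i."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

# Hypothetical distributional profiles for a source phrase and a candidate paraphrase.
profile_phrase = Counter({"buy": 12, "price": 7, "market": 5})
profile_paraphrase = Counter({"buy": 9, "price": 6, "store": 4})

# Feature values for one candidate translation rule. The paraphrase-derived rule
# contributes a "trust" feature based on distributional similarity, so it competes
# with the usual translation-model and language-model scores instead of replacing them.
features = {
    "translation_model_logprob": -2.3,   # assumed baseline feature value
    "language_model_logprob": -4.1,      # assumed baseline feature value
    "paraphrase_rule_similarity": cosine_similarity(profile_phrase, profile_paraphrase),
}

# The lambdas would normally be tuned on held-out data; fixed here for illustration.
weights = {
    "translation_model_logprob": 1.0,
    "language_model_logprob": 0.8,
    "paraphrase_rule_similarity": 0.5,
}

print("soft-constraint score:", round(log_linear_score(features, weights), 3))
```

Expressing paraphrase evidence as one more weighted feature, rather than as a filter, is what lets the model learn how much to “trust” paraphrase-based translation rules, as the abstract puts it.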
20
Generalizing Word Lattice Translation
In: DTIC (2008)
BASE
