
Search in the Catalogues and Directories

Hits 1 – 20 of 134

1. Learning Stress Patterns with a Sequence-to-Sequence Neural Network
In: Proceedings of the Society for Computation in Linguistics (2022)
BASE
2. Learning Repetition, but not Syllable Reversal
In: Proceedings of the 2020 Annual Meeting on Phonology (Proceedings of the Annual Meetings on Phonology, ISSN 2377-3324) (2021)
BASE
3. Learning syntactic parameter settings without triggers by assigning credit and blame
In: CLS 55, 2019: Proceedings of the Fifty-Fifth Annual Meeting of the Chicago Linguistic Society (2020), pp. 337-350
Leibniz-Zentrum Allgemeine Sprachwissenschaft
4. French schwa and gradient cumulativity
In: Glossa: a journal of general linguistics, Vol. 5, No. 1, Art. 24, ISSN 2397-1835 (2020)
BASE
5. Assimilation triggers metathesis in Balantak: Implications for theories of possible repair in Optimality Theory
In: University of Massachusetts Occasional Papers in Linguistics (2020)
BASE
6. *NC
In: North East Linguistics Society (2020)
BASE
7. French schwa and gradient cumulativity
Joe Pater (2019)
BASE
8. Learning Reduplication with a Neural Network without Explicit Variables
Joe Pater (2019)
BASE
9. Phonological typology in Optimality Theory and Formal Language Theory: Goals and future directions
Joe Pater (2019)
BASE
10. Learning syntactic parameters without triggers by assigning credit and blame
Joe Pater (2019)
BASE
11. Generative linguistics and neural networks at 60: foundation, friction, and fusion
Joe Pater (2019)
BASE
12. Preface: SCiL 2019 Editors’ Note
In: Proceedings of the Society for Computation in Linguistics (2019)
BASE
13. Substance matters: A reply to Jardine 2016
Joe Pater (2018)
BASE
14. Seq2Seq Models with Dropout can Learn Generalizable Reduplication
Joe Pater (2018)
BASE
15. Preface: SCiL 2018 Editors’ Note
In: Proceedings of the Society for Computation in Linguistics (2018)
BASE
16. Learning opacity in Stratal Maximum Entropy Grammar ...
Nazarov, Aleksei; Pater, Joe. arXiv, 2017
BASE
17. The Oxford handbook of developmental linguistics
Lidz, Jeffrey (ed.); Snyder, William (ed.); Pater, Joe (ed.). Oxford: Oxford University Press, 2016
UB Frankfurt Linguistik
18. The Oxford handbook of developmental linguistics
Lidz, Jeffrey; Snyder, William; Pater, Joe. Oxford: Oxford University Press, 2016
MPI für Psycholinguistik
19. Harmonic grammar and harmonic serialism
McCarthy, John J. (ed.); Pater, Joe (ed.). Bristol, CT: Equinox Publishing, 2016
BLLDB
UB Frankfurt Linguistik
20. Gradient Exceptionality in Maximum Entropy Grammar with Lexically Specific Constraints
Abstract: The number of exceptions to a phonological generalization appears to gradiently affect its productivity. Generalizations with relatively few exceptions are relatively productive, as measured in tendencies to regularization, as well as in nonce word productions and other psycholinguistic tasks. Gradient productivity has been previously modeled with probabilistic grammars, including Maximum Entropy Grammar, but these models often fail to capture the fixed pronunciations of the existing words in a language, as opposed to nonce words. Lexically specific constraints allow existing words to be produced faithfully, while permitting variation in novel words that are not subject to those constraints. When each word has its own lexically specific version of a constraint, an inverse correlation between the number of exceptions and the degree of productivity is straightforwardly predicted.
Keywords: Computational phonology; Exceptions; Indexed constraints; Maximum entropy grammar; Variation
URL: https://ddd.uab.cat/record/166691
https://doi.org/10.5565/rev/catjl.183
BASE
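The abstract describes a Maximum Entropy Grammar, in which a candidate output's probability is proportional to the exponential of the negative weighted sum of its constraint violations, and in which a lexically specific faithfulness constraint applies only to words indexed to it. The following is a minimal illustrative sketch of that mechanism, not the paper's implementation: the constraint names, the weights, and the maxent_probs helper are all hypothetical.

import math

def maxent_probs(candidates, weights, violations):
    # MaxEnt: P(c) = exp(-H(c)) / Z, where H(c) is the weighted
    # sum of constraint violations incurred by candidate c.
    harmonies = {c: sum(weights[k] * v for k, v in violations[c].items())
                 for c in candidates}
    z = sum(math.exp(-h) for h in harmonies.values())
    return {c: math.exp(-harmonies[c]) / z for c in candidates}

# Hypothetical weights: a general markedness constraint, a weak general
# faithfulness constraint, and a strong lexically specific faithfulness
# constraint indexed to one existing word.
weights = {"Markedness": 3.0, "Faith": 1.0, "Faith-LexA": 10.0}

# Existing word indexed to Faith-LexA: the unfaithful (repaired)
# candidate violates both the general and the indexed constraint.
existing = maxent_probs(
    ["faithful", "unfaithful"], weights,
    {"faithful": {"Markedness": 1},
     "unfaithful": {"Faith": 1, "Faith-LexA": 1}})

# Nonce word: not indexed, so only the general constraints apply.
nonce = maxent_probs(
    ["faithful", "unfaithful"], weights,
    {"faithful": {"Markedness": 1},
     "unfaithful": {"Faith": 1}})

print(existing)  # faithful ~ 1.0: the listed word keeps its fixed form
print(nonce)     # faithful ~ 0.12: the nonce word varies, favoring repair

With these assumed weights, the indexed word surfaces faithfully with probability near 1, while the nonce word, subject only to the general constraints, shows gradient variation. This is the sense in which lexically specific constraints reconcile the fixed pronunciations of existing words with gradient productivity in novel ones.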

