Why is scaling up models of language evolution hard?
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)
Abstract: Computational model simulations have been very fruitful for gaining insight into how the systematic structure we observe in the world's natural languages could have emerged through cultural evolution. However, these model simulations operate on a toy scale compared to the size of actual human vocabularies, due to the prohibitive computational resource demands that simulations with larger lexicons would pose. Using computational complexity analysis, we show that this is not an implementational artifact, but instead it reflects a deeper theoretical issue: these models are (in their current formulation) computationally intractable. This has important theoretical implications, because it means that there is no way of knowing whether or not the properties and regularities observed for the toy models would scale up. All is not lost however, because awareness of intractability allows us to face the issue of scaling head-on, and can guide the development of our theories.
Keyword: cognitive science
URL: https://escholarship.org/uc/item/021734q4
Source: BASE
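The abstract's point about prohibitive resource demands is easy to see in miniature. The Python sketch below is not taken from the paper; it only assumes, for illustration, that a lexicon is a mapping from M meanings to S signals and that an exact learner must weigh every candidate mapping, so the hypothesis space has S ** M members and explodes long before vocabularies reach realistic size.

def lexicon_count(num_meanings: int, num_signals: int) -> int:
    """Number of distinct meaning-to-signal mappings: signals ** meanings."""
    return num_signals ** num_meanings

if __name__ == "__main__":
    # Toy scales typical of simulations versus a still-modest real vocabulary.
    for meanings in (3, 6, 10, 100):
        signals = meanings  # illustrative assumption: signal inventory matches meaning count
        print(f"{meanings:>4} meanings: {lexicon_count(meanings, signals):.3e} candidate lexicons")

Running it gives roughly 10^10 candidate lexicons at 10 meanings and 10^200 at 100, which illustrates the scaling gap between toy simulations and human-sized vocabularies that the abstract refers to.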

Result counts by source type: Catalogues 0 | Bibliographies 0 | Linked Open Data catalogues 0 | Online resources 0 | Open access documents 2