
Search in the Catalogues and Directories

Page: 1 2 3 4 5...20
Hits 1 – 20 of 392

1. Perturbations in Non-Flat Cosmology for $f(T)$ gravity. (BASE)
2. Quantifying knowledge synchronisation in the 21st century. (BASE)
3. Application of Quantum Density Matrix in Classical Question Answering and Classical Image Classification. Zhao, X. Q.; Wan, H.; Chen, H. arXiv, 2022. (BASE)
4. Networks and Identity Drive Geographic Properties of the Diffusion of Linguistic Innovation. (BASE)
5. Power laws prevail in medical ultrasound. Parker, Kevin J. arXiv, 2022. (BASE)
6. Quantum Accelerator Stack: A Research Roadmap. Bertels, K.; Sarkar, A.; Krol, A. arXiv, 2021. (BASE)
7. Composable constraints. (BASE)
8. Characterization of 128x128 MM-PAD-2.1 ASIC: A Fast Framing Hard X-Ray Detector with High Dynamic Range. Gadkari, D.; Shanks, K. S.; Hu, H. arXiv, 2021. (BASE)
9. Quantifying language changes surrounding mental health on Twitter. (BASE)
10. Poincare pressure and vorticity monopoles in the kinetic model of matter-extension. Bulyzhenkov, Igor E. arXiv, 2021. (BASE)
11. Modified Gravity and Cosmology: An Update by the CANTATA Network. (BASE)
12. A MLIR Dialect for Quantum Assembly Languages. (BASE)
13. Teleparallel Gravity: From Theory to Cosmology. (BASE)
14. Drift in a Popular Metal Oxide Sensor Dataset Reveals Limitations for Gas Classification Benchmarks. (BASE)
15. Inferring the drivers of language change using spatial models. Burridge, James; Blaxter, Tamsin. arXiv, 2021. (BASE)
16. Transient Chaos in BERT. (BASE)
Abstract: Language is an outcome of our complex and dynamic human interactions, and natural language processing (NLP) is hence built on human linguistic activities. Bidirectional Encoder Representations from Transformers (BERT) has recently gained popularity by establishing state-of-the-art scores on several NLP benchmarks. A Lite BERT (ALBERT) is a lightweight version of BERT in which the number of parameters is reduced by repeatedly applying the same neural network, the Transformer's encoder layer. By pre-training the parameters on a massive amount of natural language data, ALBERT can convert input sentences into versatile high-dimensional vectors potentially capable of solving multiple NLP tasks. In that sense, ALBERT can be regarded as a well-designed high-dimensional dynamical system whose operator is the Transformer's encoder, and essential structures of human language are thus expected to be encapsulated in its dynamics. In this study, we investigated ... (11 pages, 5 figures)
Keywords: Artificial Intelligence (cs.AI); Chaotic Dynamics (nlin.CD); Computation and Language (cs.CL); Dynamical Systems (math.DS); Machine Learning (cs.LG); FOS: Computer and information sciences; FOS: Mathematics; FOS: Physical sciences
URL: https://arxiv.org/abs/2106.03181
DOI: https://dx.doi.org/10.48550/arxiv.2106.03181
17. Frequency compression in bimodal CI users (Sharma et al., 2021). (BASE)
18. Clinical and patient-reported voice measures (Houle & Johnson, 2021). Houle, Nichole; Johnson, Aaron M. ASHA journals, 2021. (BASE)
19. Frequency compression in bimodal CI users (Sharma et al., 2021). (BASE)
20. Clinical and patient-reported voice measures (Houle & Johnson, 2021). Houle, Nichole; Johnson, Aaron M. ASHA journals, 2021. (BASE)


© 2013 – 2024 Lin|gu|is|tik