1. XTREME-S: Evaluating Cross-lingual Speech Representations ...
   Source: BASE
2. Self-supervised Learning with Random-projection Quantizer for Speech Recognition ...
3. Unsupervised Data Selection via Discrete Speech Representation for ASR ...
4. mSLAM: Massively multilingual joint pre-training for speech and text ...
5. Topic Discovery via Latent Space Clustering of Pretrained Language Model Representations ...
6. MAESTRO: Matched Speech Text Representations through Modality Matching ...
7. LPInsider: a webserver for lncRNA–protein interaction extraction from the literature
   In: BMC Bioinformatics (2022)
8. Observation of new excited $B_s^0$ states
   In: Eur. Phys. J. C 81 (7), pp. 601 (2021). DOI: 10.1140/epjc/s10052-021-09305-3. https://hal.archives-ouvertes.fr/hal-03010999
9. Parental use of relational language with 3-year-olds in math and spatial activities: A cross-cultural perspective
   Zhang, Yu. eScholarship, University of California, 2021
10. Joint Unsupervised and Supervised Training for Multilingual ASR ...
11. Scaling End-to-End Models for Large-Scale Multilingual ASR ...
12. Distantly-Supervised Named Entity Recognition with Noise-Robust Learning and Language Model Augmented Self-Training ...
13. Improving Confidence Estimation on Out-of-Domain Data for End-to-End Speech Recognition ...
15. Book Review: Data Collection Research Methods in Applied Linguistics
    In: Front Psychol (2021)
19. Offline Handwritten Chinese Text Recognition with Convolutional Neural Networks ...
    Abstract: Deep learning based methods have been dominating text recognition tasks in diverse and multilingual scenarios. Offline handwritten Chinese text recognition (HCTR) is one of the most challenging tasks because it involves thousands of characters, varied writing styles, and a complex data-collection process. Recently, recurrent-free architectures for text recognition have become competitive thanks to their high parallelism and comparable results. In this paper, we build models using only convolutional neural networks and use CTC as the loss function. To reduce overfitting, we apply dropout after each max-pooling layer, with an extremely high rate on the last one before the linear layer. The CASIA-HWDB database is selected to tune and evaluate the proposed models. With the existing text samples as templates, we randomly choose isolated character samples to synthesize more text samples for training. We finally achieve a 6.81% character error rate (CER) on the ICDAR 2013 competition set, which is the ... (6 pages, 5 figures, and 3 tables)
    Keywords: Computer Vision and Pattern Recognition (cs.CV); FOS: Computer and information sciences
    URL: https://arxiv.org/abs/2006.15619 ; https://dx.doi.org/10.48550/arxiv.2006.15619
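The 6.81% figure above is a character error rate: the Levenshtein (edit) distance between the recognized text and the reference transcript, normalized by the reference length. A minimal illustrative sketch of that metric (the function name `cer` is our own, not from the paper):

```python
def cer(ref: str, hyp: str) -> float:
    """Character error rate: Levenshtein distance / reference length."""
    # Dynamic-programming edit distance over characters, one row at a time.
    prev = list(range(len(hyp) + 1))  # distances for an empty reference prefix
    for i, r in enumerate(ref, 1):
        cur = [i]  # deleting all i reference chars so far
        for j, h in enumerate(hyp, 1):
            cur.append(min(
                prev[j] + 1,              # deletion from reference
                cur[j - 1] + 1,           # insertion into hypothesis
                prev[j - 1] + (r != h),   # substitution (free if chars match)
            ))
        prev = cur
    return prev[-1] / len(ref)
```

For example, one substituted character in a four-character reference gives a CER of 0.25; identical strings give 0.0.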
20. Is POS Tagging Necessary or Even Helpful for Neural Dependency Parsing? ...