
Search in the Catalogues and Directories

Hits 1–20 of 3,291

1
Proficiency and the Use of Machine Translation: A Case Study of Four Japanese Learners
In: L2 Journal, Vol. 14, Iss. 1 (2022)
Abstract: While the use of machine translation (MT) in the classroom has been explored from various perspectives, the relationship between language proficiency and MT use, in terms of learners’ behaviors and beliefs, remains unclear in the research literature. This study focused on four learners of Japanese with varying language proficiencies from a fourth-year Japanese language class (two advanced-level, one intermediate-high-level, and one novice-high-level) and investigated how they edited self-written text with MT, examining the scope and types of revisions they made as well as their perceptions of using MT for editing. The data included four types of drafts of a writing assignment, plus the learners’ reflection papers: (1) D1, self-written drafts in Japanese without the help of MT; (2) D2, revised corresponding drafts in the L1 produced by MT; (3) D3, drafts in Japanese produced by MT from D2; and (4) D4, revised drafts based on a comparison of D1 and D3. The results show that the four participants edited their self-written text in different ways. Although all participants’ revisions were at the local level, the two advanced-level learners focused primarily on vocabulary, whereas the other two learners’ revisions extended to the sentence level. The findings also show that the advanced-level and intermediate-high-level learners held varying degrees of positive attitudes toward using MT, while the novice-high-level learner, despite acknowledging the positive effects of MT use, also felt ashamed and dishonest when using it. The article concludes with insights that can help instructors employ MT as a pedagogical tool for language learning and teaching with diverse students.
Keywords: Japanese; L2 writing; Machine Translation; Proficiency
URL: https://escholarship.org/uc/item/1fw545k9
BASE
2
Generating Authentic Adversarial Examples beyond Meaning-preserving with Doubly Round-trip Translation ...
Lai, Siyu; Yang, Zhen; Meng, Fandong. - : arXiv, 2022
BASE
3
SMDT: Selective Memory-Augmented Neural Document Translation ...
Zhang, Xu; Yang, Jian; Huang, Haoyang. - : arXiv, 2022
BASE
4
Conditional Bilingual Mutual Information Based Adaptive Training for Neural Machine Translation ...
BASE
5
MSCTD: A Multimodal Sentiment Chat Translation Dataset ...
BASE
6
How does our brain respond to moral violations and semantic violations? Evidence from Event-related Potentials ...
Xu, Xiaodong. - : Open Science Framework, 2022
BASE
7
Improving the Compilation of English–Chinese Children's Dictionaries: A Children's Cognitive Perspective
In: Lexikos, Vol. 32 (2022), 49–65; ISSN 2224-0039
BASE
8
Learn from Structural Scope: Improving Aspect-Level Sentiment Analysis with Hybrid Graph Convolutional Networks ...
BASE
9
huggingface/datasets: 1.18.1 ...
BASE
10
Multilingual CoNaLa Datset, train data ...
Zhiruo Wang; Cuenca, Grace; Shuyan Zhou. - : Zenodo, 2022
BASE
11
Multilingual CoNaLa Datset, train data ...
Zhiruo Wang; Cuenca, Grace; Shuyan Zhou. - : Zenodo, 2022
BASE
12
The temporal (re-)construal of experience: how native speakers of English and advanced Chinese learners select and interpret simple past/present tenses ...
Xu, Jiahuan. - : Macquarie University, 2022
BASE
13
The temporal (re-)construal of experience: how native speakers of English and advanced Chinese learners select and interpret simple past/present tenses ...
Xu, Jiahuan. - : Macquarie University, 2022
BASE
14
MCoNaLa: A Benchmark for Code Generation from Multiple Natural Languages ...
BASE
15
Focus on the Target's Vocabulary: Masked Label Smoothing for Machine Translation ...
Chen, Liang; Xu, Runxin; Chang, Baobao. - : arXiv, 2022
BASE
16
Towards the Next 1000 Languages in Multilingual Machine Translation: Exploring the Synergy Between Supervised and Self-Supervised Learning ...
BASE
17
CINO: A Chinese Minority Pre-trained Language Model ...
Yang, Ziqing; Xu, Zihang; Cui, Yiming. - : arXiv, 2022
BASE
18
Wukong: 100 Million Large-scale Chinese Cross-modal Pre-training Dataset and A Foundation Framework ...
Gu, Jiaxi; Meng, Xiaojun; Lu, Guansong. - : arXiv, 2022
BASE
19
Zero-shot Cross-lingual Conversational Semantic Role Labeling ...
Wu, Han; Tan, Haochen; Xu, Kun. - : arXiv, 2022
BASE
20
Probing Structured Pruning on Multilingual Pre-trained Models: Settings, Algorithms, and Efficiency ...
Li, Yanyang; Luo, Fuli; Xu, Runxin. - : arXiv, 2022
BASE

