
Search in the Catalogues and Directories

Hits 1 – 20 of 39

1
A research report on the development of the Test of English for Academic Purposes (TEAP) writing test for Japanese university entrants
Weir, Cyril J. Eiken Foundation of Japan, 2020.
BASE
2
Are current academic reading tests fit for purpose?
BASE
3
Testing communicative language use: a brief overview
Weir, Cyril J. 2020.
BASE
4
Automated approaches to establishing context validity in reading tests
BASE
5
Validating performance on writing test tasks
Weir, Cyril J. 2020.
BASE
6
Research and practice in assessing academic reading: the case of IELTS
Weir, Cyril J.; Chan, Sathena Hiu Chong. Cambridge University Press, 2020.
BASE
7
The historical frame of reference
Weir, Cyril J. 2020.
BASE
8
Reading in a second language: process, product and practice
Urquhart, A. H.; Weir, Cyril J. Routledge, 2019.
BASE
9
The relative significance of syntactic knowledge and vocabulary breadth in the prediction of reading comprehension test performance
Shiotsu, Toshiko; Weir, Cyril J. SAGE, 2019.
BASE
10
Continuity and innovation: a history of the Cambridge Proficiency in English examination 1913-2002
Weir, Cyril J.; Milanovic, Michael. Cambridge University Press, 2019.
BASE
11
Examining writing: research and practice in assessing second language writing
Shaw, Stuart D.; Weir, Cyril J. Cambridge University Press, 2019.
BASE
12
Language testing and validation: an evidence based approach
Weir, Cyril J. Palgrave, 2019.
BASE
13
An empirical investigation of the componentiality of L2 reading in English for academic purposes
BASE
14
Researching the comparability of paper-based and computer-based delivery in a high-stakes writing test
BASE
15
Researching participants taking IELTS Academic Writing Task 2 (AWT2) in paper mode and in computer mode in terms of score equivalence, cognitive validity and other factors
Chan, Sathena Hiu Chong; Bax, Stephen; Weir, Cyril J. British Council and IDP: IELTS Australia, 2017.
Abstract: Computer-based (CB) assessment is becoming more common in most university disciplines, and international language testing bodies now routinely use computers for many areas of English language assessment. Given that IELTS will, in the near future, also need to offer CB options alongside traditional paper-based (PB) modes, the research reported here prepares for that possibility, building on earlier research into the statistical comparability of the IELTS writing test across the two delivery modes and offering a fresh look at the relevant issues. By means of questionnaires and interviews, the current study investigates the extent to which 153 test-takers' cognitive processes, while completing IELTS Academic Writing in PB mode and in CB mode, compare with the real-world cognitive processes of students completing academic writing at university. A major contribution of the study is its use, for the first time in the academic literature, of data on cognitive processes in real-world academic settings as a comparison with cognitive processing during academic writing under test conditions. The most important conclusion is that, according to the 5-facet MFRM analysis, there were no significant differences in the scores awarded by two independent raters for candidates' performances under the two conditions, one paper-and-pencil and the other computer. Regarding the analytic scoring criteria, the differences in three areas (Task Achievement, Coherence and Cohesion, and Grammatical Range and Accuracy) were not significant, while the difference in Lexical Resource was significant but slight. In summary, the score difference between the two modes is at an acceptable level.
With respect to the cognitive processes students employ under the two test conditions, results of the Cognitive Process Questionnaire (CPQ) survey indicate a similar pattern between the processes involved in writing on a computer and writing with paper-and-pencil: there were no major differences in the general tendency of the mean for each questionnaire item across the two test modes. In short, the cognitive processes were employed in a similar fashion under the two delivery conditions. The interview data (n=30) likewise suggest that participants used most of the processes in a similar way in both modes, although a few potential differences indicated in the interviews may be worth further investigation in future studies. The Computer Familiarity Questionnaire survey shows that these students are generally familiar with computer usage and that their overall reactions to working with a computer are positive. Multiple regression analysis, used to determine whether computer familiarity affected performance in the two modes, suggested that in computer mode test-takers without a suitable familiarity profile might perform slightly worse than those with one. In summary, the research reported here offers a unique comparison with real-world academic writing and makes a significant contribution to the research base that IELTS and comparable international testing bodies will need to consider if they are to introduce CB test versions in future.
Keyword: computer-based testing; L2 writing; language assessment; language testing; testing; writing; writing assessment; X162 Teaching English as a Foreign Language (TEFL)
URL: http://hdl.handle.net/10547/622176
BASE
16
Assessing English on the global stage: the British Council and English language testing, 1941-2016
Weir, Cyril J.; O'Sullivan, Barry. Equinox, 2017.
BASE
17
Measured constructs: a history of Cambridge English Examinations, 1913-2012
Weir, Cyril J.; Vidakovic, Ivana; Galaczi, Evelina D. Cambridge University Press, 2013.
BASE
18
Measured constructs: a history of Cambridge English language examinations, 1913-2012
Weir, Cyril J. Cambridge ESOL, 2013.
BASE
19
Investigating learners' cognitive processes during a computer-based CAE Reading test
Bax, Stephen; Weir, Cyril J. Cambridge ESOL, 2012.
BASE
20
Language testing: theories and practices
O'Sullivan, Barry; Kiely, Richard; Graham, Suzanne. Basingstoke [et al.]: Palgrave Macmillan, 2011.
BLLDB
UB Frankfurt Linguistik


Sources: Catalogues 10 · Bibliographies 11 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 22
© 2013-2024 Lin|gu|is|tik