21. Applying the socio-cognitive framework: gathering validity evidence during the development of a speaking test ; Lessons and Legacy: A Tribute to Professor Cyril J Weir (1950–2018)
22. Development of empirically driven checklists for learners’ interactional competence
23. Validating speaking test rating scales through microanalysis of fluency using PRAAT
24. Interactional Competence measured in group oral tests: how do test-taker characteristics, task types and group sizes affect co-constructed discourse in groups?
26. Researching metadiscourse markers in candidates’ writing at Cambridge FCE, CAE and CPE levels
27. Effects of pre-task planning time on paired oral test performance
28. Aspects of fluency across assessed levels of speaking proficiency
29. Towards a model of multi-dimensional performance of C1 level speakers assessed in the Aptis Speaking Test
30. Developing tools for learning-oriented assessment of interactional competence: bridging theory and practice
32. Researching L2 writers’ use of metadiscourse markers at intermediate and advanced levels
33. Aspects of fluency across assessed levels of speaking proficiency
34. Researching L2 writers’ use of metadiscourse markers at intermediate and advanced levels
35. The role of the L1 in testing L2 English ; Ontologies of English. Conceptualising the language for learning, teaching, and assessment
36. The discourse of the IELTS Speaking Test: interactional design and practice
37. Exploring the use of video-conferencing technology in the assessment of spoken language: a mixed-methods study

Abstract: This research explores how internet-based video-conferencing technology can be used to deliver and conduct a speaking test, and what similarities and differences can be discerned between the standard face-to-face and computer-mediated modes. The context of the study is a high-stakes speaking test, and the motivation for the research is the need for test providers to keep under constant review the extent to which their tests are accessible and fair to a wide constituency of test takers. The study examines test-takers’ scores and linguistic output, and examiners’ test administration and rating behaviors across the two modes. A convergent parallel mixed-methods research design was used, analyzing test-takers’ scores and the language functions elicited, examiners’ written comments, feedback questionnaires and verbal reports, as well as observation notes taken by researchers. While the two delivery modes generated similar test score outcomes, some differences were observed in test-takers’ functional output and in the behavior of examiners, who served as both raters and interlocutors.

Keywords: language assessment; mixed-methods research; Q330 English as a second language; speaking; speaking assessment; video conferencing

URL: https://doi.org/10.1080/15434303.2016.1263637 ; http://hdl.handle.net/10547/621954
|
38. Exploring performance across two delivery modes for the same L2 speaking test: face-to-face and video-conferencing delivery: a preliminary comparison of test-taker and examiner behaviour
39. Exploring performance across two delivery modes for the IELTS Speaking Test: face-to-face and video-conferencing delivery (Phase 2)
40. Investigating examiner interventions in relation to the listening demands they make on candidates in oral interview tests ; Emerging issues in the assessment of second language listening