Evaluating rater and rubric performance on a writing placement exam

Date
2012
Authors
Meier, Valerie
Advisor
Brown, James D.
Abstract
As higher education institutions pursue internationalization in response to globalization, the lingua franca status of English is driving expectations that, even in countries where English is not a national language, graduate students should have the language skills to disseminate their research in English. It is within this wider context that several departments at Universidad de Los Andes (Los Andes), a well-respected private research university in Colombia, asked the Department of Languages and Socio-Cultural Studies (D-LESC) to create a program that would promote their PhD students’ ability to write for publication and present at academic conferences in English. Faculty in D-LESC developed both the curriculum for the resulting Inglés para Doctorados (IPD) program and the IPD Placement Exam, which includes a reading section already in wider use at Los Andes as well as speaking and writing sections written specifically for the new program. During a pilot phase and after the IPD exam became operational, the faculty involved in test development checked its reliability and monitored how well students were being placed into IPD classes. However, the potential consequences of test use soon became more serious: shortly after the IPD program was approved, completion through the third level (IPD 3) became a requirement for all PhD students, and some departments began to limit admissions to their PhD programs based on IPD exam results. In light of these higher stakes, the lead test developer felt that a more thorough evaluation of the exam’s reliability and validity was in order. Thus, in the spring of 2012, I joined the lead test developer in a comprehensive evaluation of the IPD Placement Exam. One part of this larger evaluation involved investigating the writing section in order to address practical concerns of administrators and faculty in the IPD program: namely, whether the raters and the scoring rubric were functioning effectively. I assumed responsibility for this part of the evaluation, and the current report is a more extensive and technical presentation of findings that will be shared with IPD program stakeholders so that they can make informed decisions about whether any aspects of rater training and/or scoring materials and procedures could benefit from revision.
Extent
55 pages
Related To
University of Hawai'i Second Language Studies Paper 31(1)