Please use this identifier to cite or link to this item: http://hdl.handle.net/10125/40721

Evaluating rater and rubric performance on a writing placement exam

File: Meier (2012)_31(1).pdf
Size: 1.63 MB
Format: Adobe PDF

Item Summary

Title: Evaluating rater and rubric performance on a writing placement exam
Authors: Meier, Valerie
Advisor: Brown, James D.
Issue Date: 2012
Abstract: As higher education institutions pursue internationalization in response to globalization, the lingua franca status of English is driving expectations that even in countries where English is not a national language, graduate students should have the language skills to disseminate their research in English. It is within this wider context that several departments at Universidad de Los Andes (Los Andes), a well-respected private research university in Colombia, asked the Department of Languages and Socio-Cultural Studies (D-LESC) to create a program that would promote their PhD students’ ability to write for publication and present at academic conferences in English. Faculty in D-LESC developed both the curriculum for the resulting Inglés para Doctorados (IPD) program and the IPD Placement Exam, which includes a reading section already in wider use at Los Andes as well as speaking and writing sections written specifically for the new program. During a pilot phase and after the IPD exam became operational, the faculty involved in test development checked its reliability and monitored how well students were being placed into IPD classes. However, the potential consequences of test use became more extreme: shortly after the IPD program was approved, completion through the third level (IPD 3) became required for all PhD students, and some departments began to limit admissions to their PhD programs based on IPD exam results. The lead test developer therefore felt a more thorough evaluation of the exam’s reliability and validity was in order. Thus, in the spring of 2012, I joined the lead test developer in a comprehensive evaluation of the IPD Placement Exam. One part of this larger evaluation project involved investigating the writing section in order to address practical concerns of administrators and faculty in the IPD program: namely, whether raters and the scoring rubric were functioning effectively. I assumed responsibility for this part of the evaluation, and the current report is a more extensive and technical presentation of findings that will be shared with IPD program stakeholders so that they can make informed decisions about whether any aspects of rater training and/or scoring materials and procedures could benefit from revision.
Pages/Duration: 55 pages
URI/DOI: http://hdl.handle.net/10125/40721
Appears in Collections: SLS Papers



Items in ScholarSpace are protected by copyright, with all rights reserved, unless otherwise indicated.