Collaborative Rubric Adaptation—Engaging Multiple Perspectives to Define Quality
|Title:||Collaborative Rubric Adaptation—Engaging Multiple Perspectives to Define Quality|
|Authors:||Hill, Yao Zhang|
|Keywords:||program SLOs assessment|
|Date Issued:||25 Apr 2014|
|Citation:||Hill, Y. Z. (2014, April). Collaborative rubric adaptation: Engaging multiple perspectives to define quality. Mini-workshop session presented at the WASC Academic Resource Conference, Los Angeles, CA.|
|Abstract:||This mini workshop provides techniques and simulation opportunities for participants to facilitate collaborative adaptation of a rubric for program learning assessment purposes. Participants will discuss how multiple perspectives in rubric adaptation help program quality evaluation.|
|Description:||A rubric is a scoring guide that describes the criteria faculty use to evaluate student performance, understanding, or behavior. With abundant sample rubrics available, it is common practice to adapt an existing rubric for program learning assessment. Through a collaborative rubric adaptation process, in which all faculty from the program contribute, a program can establish a common set of criteria for evaluating student work. This process also helps faculty establish shared interpretations of the quality of student work, which leads to more reliable and robust evaluation of the evidence of student learning. This workshop will walk participants through the process of collaborative adaptation of a rubric: (1) evaluating the draft rubric against program outcomes, curriculum, and expectations; (2) testing the draft rubric on sample student work; and (3) finalizing the list of modifications. Participants will have the opportunity to facilitate each part of the process by following a script. The facilitator will also provide strategies for involving employers, alumni, and/or faculty from peer institutions in a collaborative rubric adaptation process and lead discussions on its implications for measuring program quality.|
|Appears in Collections:||Workshops and Presentations|