Building evidence folders for learning through libraries
The controversy over the 65% Solution underscores the public’s perception that school libraries do not provide vital instructional programs. This legislation, which has been enacted in states such as Texas, Kansas, and Louisiana, directs 65 percent of a state’s educational budget into direct classroom resources (Toppo 2006). Unfortunately, school library media specialists are classified as non-instructional support personnel by the National Center for Education Statistics (NCES). This has led the American Association of School Librarians to produce a position statement on the instructional classification of school library media specialists (http://www.ala.org/ala/aasl/aaslproftools/positionstatements/instclass.htm) in hopes of including certified library media specialists in the NCES “instruction” classification.
Even without the 65% Solution, however, library media programs are frequently on the chopping block when school budgets shrink. A principal in that predicament recently told me, “I would love to keep my librarian but I have to consider my priorities. I need to retain the positions and programs that show my students are actually learning.” Indeed!
When building-level administrators and school advisory councils meet to wrangle over budgets, they seek to support programs that demonstrate positive student growth in areas of high need. The big question is: As a library media specialist, are you able to produce this type of evidence? While many library media specialists spend a major portion of their week engaged in instructional activities, the impact of their teaching is often invisible (Harada and Yoshina 2006). The following are critical questions that library media specialists must wrestle with:
- How does your library media center support student learning?
- What compelling evidence do you have that students have achieved the learning targets?
For library media specialists who have been comfortable with traditional forms of reporting, responding to the above questions requires a dramatic paradigm shift from an object-oriented to a student-oriented approach to assessment and evaluation. The object-oriented approach centers on evaluation reports built on statistical counts of “things” such as new acquisitions, circulation figures, and numbers of instructional sessions and planning meetings. The student-oriented approach focuses on assessment of student performance. It considers not only what students learn but also the degree to which that learning is demonstrated (Harada and Yoshina 2005).