Student Assessment and Program Evaluation
Aims, Actions, Adaptations
For an assessment system to support standards-based learning, it must have what the National Research Council’s Committee on the Foundations of Assessment refers to as coherence. Specifically, “as one moves up and down the levels of the system, from the classroom through the school, district, and state, assessments along this vertical dimension should align” (National Research Council (U.S.), Atkin, Black, & Coffey, 2001). This concept of assessment coherence fits well with the district’s other efforts to bring coherence to its mathematics and science reform strategies.
Reform leaders saw two kinds of opportunities to develop coherence in relation to Illinois mathematics and science assessments. One attempted to buffer the effects of the state testing program on CPS stakeholders; the other created a bridge between the district and the state. Teachers concerned about student performance on standardized tests often felt compelled to “teach to the test,” at the risk of abandoning the content and pacing of the district-supported materials. OMS attempted to buffer school staff from the potential ramifications of this “test-driven” behavior. Through both formal communications and messages delivered in professional development and coaching sessions, OMS sent a consistent message: standardized test preparation focused on sample test items is not the same thing as science or math instruction, but student-focused instruction that is faithful to the design of CMSI-supported math or science instructional materials does prepare students for high-stakes exams. To build the bridge to state assessment policy, OMS nominated individuals to serve on the state’s review committees for the assessment development process. Additionally, OMS provided feedback to state leaders concerning assessment proposals.
In order to meet the goal of “assessment coherence,” the district took a unique approach to the design of the benchmark assessment system. Teachers worked with OMS and university staff to score extended-response items. The district shared multiple-choice distractor analyses and incorporated professional development on assessment literacy directly into the assessment development process. What made the process unique was that it went beyond developing psychometrically valid assessments: it simultaneously developed aligned professional development for using the assessments, incorporated scoring by teachers as an integral part of the assessment, and built a platform for delivering the results, along with associated tools teachers could use to analyze them.
The design of the benchmark assessment program contributed to district-wide coherence in additional ways. The forms had the same item formats (multiple choice, short answer, and extended response) that appear on the state’s high-stakes assessment. Also, the benchmark assessments used the same rubric for scoring extended-response questions as the state assessment. Panels of both external measurement experts and district teachers scored actual CPS student work to use as concrete exemplars at each dimension and score point of the rubric. These exemplars, along with scoring guides, were disseminated to all district staff. As more schools adopted the district-supported instructional materials, the benchmark assessments were refined to more closely match the instructional pacing of the materials.
Coherence between elementary and high school algebra
As a way to bridge the elementary–high school divide, the reform leaders used assessment to provide a platform for increased coherence. This was done through the development of an end-of-course, 8th-grade algebra assessment, the Algebra Exit Exam. As the district established structures and protocols to enable more middle-grade students to take a high school-level algebra course, it was necessary to establish consensus around what it meant to master algebra, regardless of student grade level. By working with high school and elementary teachers and administrators, as well as university mathematics faculty, to create an end-of-course assessment, the district built instructional coherence around algebra.
Coherence in classrooms
At the classroom level, the district used CMSI professional development to focus teachers on using the assessments embedded within the district-supported curricula as well as on analyzing student work. The district worked with external assessment experts to understand the kinds of assessments embedded in each set of instructional materials and how they were designed to be used; the results of this project were integrated into the CMSI professional development. OMS also developed protocols for examining student work and incorporated these into the materials-specific professional development sessions. In this way, teachers learned how to adapt their instructional practices based on results from both curriculum-embedded assessments and analysis of their students’ work. The district also worked with external assessment experts on a project to capture the nature and use of the benchmark assessments it had developed.
Taken together, these efforts created a coherent assessment system by incorporating several different elements. First, they addressed vertical coherence by aligning classroom, benchmark, and statewide high-stakes assessments. Second, they addressed content coherence by closely matching district-developed assessments with the content and pacing of district-supported mathematics and science instructional programs at both the elementary and high school levels. Third, the efforts addressed format coherence by structuring the elementary mathematics benchmark assessments to use formats similar to those on the statewide assessments, and by having both classroom-embedded and benchmark assessments use formats similar to those in the instructional materials.
The reform effort addressed scoring coherence in the design of the benchmark assessment program and in training. The benchmark assessment program addressed scoring coherence by providing teachers with benchmark assessment guides that used actual examples of student work. In addition, the benchmark assessment scoring rubrics closely mirrored those used in the statewide assessment. By providing benchmark assessment training to teachers and district staff, and by employing and modeling inter-rater reliability techniques, the district improved scoring coherence.
Finally, by developing and distributing training modules and materials, the district attempted to address coherence in assessment training. Given the size of the district, variability among presenters, and uneven assessment literacy among district staff, this has remained a continuing challenge.