Outcomes

Student Outcomes


Evaluators conducted many different studies in an effort to understand how various aspects of CMSI may have affected student outcomes. They sought to identify correlational and cause-and-effect relationships in a complex environment, across a complex set of interconnected activities. These relationships are important not just for CPS, but for the broader community of school districts that undertake systemic reform. The complexity of the studies described below reflects the complexity of the task at hand: determining what makes a difference.

There are several ways to examine the effects of the reform’s efforts on students over the years. This section begins with a discussion of various analyses related to student performance on the Illinois Standards Achievement Test (ISAT), the state of Illinois’ criterion-referenced assessment for grades 3-8, including a broad summary of the findings from across these studies. Interested readers may examine the individual studies for more detail, including their strengths and limitations, a variety of charts and graphs, additional statistical detail, and nuances of the findings. This section also describes other elementary student outcomes, such as the performance of elementary students on an end-of-course algebra assessment and their subsequent performance in high school mathematics.

It is also important when reading the following sections to keep in mind several difficulties inherent in this type of work in applied settings. First, it is difficult to measure the effects of complex initiatives, especially those that evolve over time, both because the reform activities themselves change and because teachers and schools change their participation status (i.e., participating or not, or changing their dosage of participation). Second, there is no universally accepted outcome measure: the reform involved multiple grade levels and several different measures of student outcomes, some of which are more relevant to certain stakeholders than others. Third, it is challenging to unpack and systematically measure all of the intermediate causal steps in a complex reform's theory of action (e.g., professional development and in-classroom coaching leading to changes in classroom practice, which in turn lead to changes in student performance on high-stakes assessments; or enhanced teacher content knowledge leading to changes in the rigor of assigned student tasks, which in turn lead to changes in student performance on interim assessments).

Evaluators attempted to address these difficulties by conducting a range of studies, using different outcome measures and methodologies, so that a body of evidence could be assembled about the initiative. Although each of these individual approaches has weaknesses, the idea was to see what could be gleaned from the totality of the analyses. Evaluators also sought to pair these quantitative outcome studies with more qualitative studies of implementation; had we had more years of implementation and study, our plan was to bring these two lines of inquiry into closer alignment. For example, we did this in the study of 8th grade algebra, where we carefully measured classroom implementation and correlated it with student performance. We also developed and validated measures of implementation for classrooms implementing CMSI but, as of this writing, have not completed analyses linking these measures to student outcomes. However, we believe that in an applied setting where implementation of a complex initiative is likely to change over time, this multi-method, evolving approach may be more advantageous than more rigorous randomized controlled studies, which are better suited to situations where the treatment can be carefully defined and remains constant over time. In some ways, this approach is philosophically analogous to the Design, Educational Engineering, and Development approach that recently seems to be gaining more visibility (Bryk, 2009).
