Systemic Nature of Reform in Chicago

Studying the Systemic Reform


This website exists because the program to transform mathematics and science teaching and learning that CPS undertook in 2002 included in its scope a redefined role for evaluation research. Before 2002, CPS systemic reform efforts did not include an evaluation research component. CPS did submit annual grant reports to funders for compliance purposes, but these were essentially retrospective aggregations of activity data that included no evaluation or analysis of program processes or outcomes. Beginning in 2002, as it embarked on its systemic reform effort, the district dedicated resources to develop an “internal” evaluation team and to hire “external” evaluators. These evaluators provided information to district management throughout the lifecycle of project activities.

This study seeks to improve the understanding of mathematics and science education reform by addressing three issues that are too seldom addressed in research on systemic reform:

Bridging the research and practice divide through dialogue

There is a need for research that addresses the complexity of science and math education reform efforts by looking not only at the impacts of the reform, but also at the processes by which innovations are used and brought to scale (Hamilton et al., 2003). This calls for research and evaluation that combine summative (impact-focused) methodologies and analyses with more formative, process-oriented approaches (Nevo, 2002). Those who attempt this integration of process and outcome emphases have the opportunity to overcome the notorious theory/practice divide and apply educational theories (of learning, instruction, policy implementation, etc.) to the lived experience of K-12 math and science schooling.

The district math and science leaders held explicit beliefs about evaluation activities, described in detail in another section of this document. At the core of these beliefs was the understanding that “real-time” presentation of and dialogue around research findings strengthened reform efforts. The internal evaluation team met weekly with the reform’s leadership team, and those meetings were informed by the weekly meetings in which internal and external evaluators shared evaluation findings and plans. The delivery of evaluation findings was timed and refined to align with key budget and policy decisions.

The studies described on these Scale Up: Systemic Reform of Math and Science Education in Chicago web pages are the result of these evaluation efforts and draw on the evaluation reports that described math and science reform activities from 2002–2008. Throughout this website there are links to over 100 original evaluation reports posted on CPS’s public website.

Grounding understanding in empirical longitudinal evidence

Across communities of educational reform researchers and practitioners, the “promising” practices of math and science systemic reform are often shared informally rather than systematically. This is partly because studies of reform implementation are generally neither rich nor longitudinal in nature, and partly because few systemic reform efforts plan strong evaluation from the beginning. Tables 1 and 2 below provide a sampling of the types and numbers of evaluation data sources collected from 2002 to 2008 as part of the CMSI effort to evaluate the initiative from its inception.

Longitudinal case studies of schools affected by the CMSI, using a sample that mirrored the district demographically, were a central component of the evaluation data collected across the first five years of the initiative. These cases, which included interviews, focus groups, classroom observations, and shadowing of specialists and lead teachers, have allowed researchers to link teachers’ beliefs and practices to school contextual issues and student outcomes, both in the original reports and throughout this document. Table 1 below summarizes a selection of the empirical case study data used in this effort. The entries are not discrete pieces of data collected each year; the same schools were followed longitudinally.

Table 1: Inventory of Select CMSI Case Study Evaluation Data

| Year | Principal Interviews | Teacher Interviews/Focus Groups | Classroom Observations | School Meetings or Professional Development Workshops | Specialist/Lead Teacher Shadows and Interviews |
|---|---|---|---|---|---|
| 2003-2004 (17 schools) | 28 | 28 | — | 56 | 18 |
| 2004-2005 (14 schools) | 28 | 28 | — | 56 | 33 |
| 2005-2006 (13 schools) | 26 | 26 | — | 26 | 33 |
| 2006-2007 (13 schools) | 11 | 47 | 24 | 17 | 9 |
| 2007-2008 (14 schools) | 7 | 40 | 40 | 18 | 2 |
| Totals | 100 | 169 | 64 | 173 | 95 |

In addition to the case studies, evaluators conducted interviews and focus groups, administered surveys, collected written reflections, and observed district and school personnel between 2002 and 2008. Table 2 displays a sampling of these data sources.

Table 2: Inventory of Select Non-Case Study Evaluation Data

| Year | Interviews/Focus Groups, OMS Staff (# of staff) | Interviews/Focus Groups, Instructional Leaders* (# of staff) | “Drop-in” school studies** (# of schools) | Surveys and written reflections*** (# of administrations) |
|---|---|---|---|---|
| 2002-2003 | 20 | 9 | — | 13 |
| 2003-2004 | 17 | 16 | — | 4 |
| 2004-2005 | 34 | 32 | 48 | 10 |
| 2005-2006 | 33 | 48 | 27 | 3 |
| 2006-2007 | 3 | 10 | — | 5 |
| 2007-2008 | 3 | 30 | — | 1 |
| Totals | 110 | 145 | 75 | 36 (over 2,000 completed surveys and reflections) |

*Includes: School-based specialists, citywide specialists, Area coaches, and university instructors
**Data collected in an open-ended format via researcher visit
***Data collected from OMS staff, instruction leaders, teachers, and administrators

In addition to the data inventoried in Tables 1 and 2, many other quantitative data sources were used in this study. For example, evaluators analyzed district data on teacher endorsement and certification as well as student test scores, pass rates, and AP data.

The data include not only quantitative and qualitative information about the processes and impacts of reform activities at many levels of implementation; they also describe the processes and impacts of collaboration between math and science educators, school and university administrators, funding agencies, and external and internal evaluators. As such, the evaluation teams’ records of their interactions with CMSI stakeholders were also used for this study.

Addressing complexities inherent in evaluating systemic reform efforts

The complexities inherent in large-scale systemic reforms make documenting and evaluating them a comparably complex task. Others studying systemic education reform have pondered how to examine such a large-scale, complex, and dynamic process. Supovitz and Taylor (2005) outline various approaches that they and others have used in evaluating systemic reform.

Focus on one facet. One approach is to examine a single facet, or component, of the reform. In Chicago, both internal and external evaluators studied individual components of the CPS math and science reforms, as seen in the body of evaluation reports. These component evaluations were useful to CPS staff involved in planning and implementing reform activities. For example, an internal evaluation of a summer program for students entering high school and an external evaluation of algebra courses for 9th graders were done separately and used by program planners. However, neither the evaluators nor the program managers made connections between the studies, even though the same students often took the summer school program and then enrolled in algebra in the fall.

Focus on the parts of the whole. A second approach is to “decompose the key components of a systemic reform effort and to examine each component individually, such that the collective studies provide a concerted picture” (Supovitz & Taylor, 2005, p. 206). The authors of this study employed this approach in the evaluation of different facets of the Chicago Math and Science Initiative (CMSI). Using an evaluation framework aligned to the CMSI theory of action, they generated separate reports on professional development, instructional leadership, and teachers’ use of instructional materials over six years. While these evaluations were useful to district staff, they failed to document the systemic nature of the reform, in which all facets affect each other.

Focus on the parts, together. A third approach is to “examine relevant components of an integrated system simultaneously” (Supovitz & Taylor, 2005, p. 206); this is the design concept behind this study. The material on this website is based on evaluation and research that focused on parts of the systemic reform effort (Supovitz and Taylor’s second approach). However, this study integrates those pieces to examine the whole systemic reform effort.

The framework used in this study looks across “programs” and “policies” to the contexts, visions, and dynamics that fueled the interactive nature of math and science education reform across time.


Please see the program evaluation search tool (keyword = “CMSI”) at the Chicago Public Schools Office of Performance for research reports.
