University Courses on
Mathematics and Science Content Knowledge

Aims, Actions, Adaptations:

Sidebar
An Example of Assessing Teacher Learning Related
to University Courses


Loyola University, one of the university partners working with CPS, has found initial evidence regarding the influence of its expanded endorsement program, a three-year program designed to improve the quality of middle school science teaching. In summer 2005, 23 teachers from 23 different high-needs schools began a nine-course sequence that included six science content courses and three courses on middle school science pedagogy. In addition to the courses, participants worked with research-active science faculty during enhancement sessions and academic-year follow-up sessions, which gave them insight into what it means to “do” science. Teachers also completed an action research project on a topic of their choice to further improve their classroom practice.

Evaluation teams measured changes in teacher content knowledge, changes in instructional practice, and changes in student achievement, gathering both quantitative and qualitative data to describe changes in teacher content knowledge. In each content course, Loyola researchers found statistically significant gains in teacher content knowledge as measured by pre- and post-testing. Participants were also asked to write a series of reflections, which were analyzed to identify recurring themes. Many teachers expressed a greater understanding of science content: looking at science with a new outlook, a better understanding of the connection between mathematics and science, and increased personal confidence in their content knowledge and teaching. Journal reflections also described changes in instructional practice, including using hands-on and lab activities to explain content, connecting new knowledge to other disciplines or real-world phenomena, and attending to students’ prior knowledge and misconceptions.
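Gains measured by pre- and post-testing, as described above, are commonly evaluated with a paired t-test on each teacher’s score change. The sketch below illustrates that computation with hypothetical scores for ten teachers; the actual Loyola test data and analysis procedure are not reproduced here.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic on per-person gains (df = n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    # mean gain divided by the standard error of the gains
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Hypothetical pre/post content-test scores for 10 teachers
# (illustrative only; not the actual program data).
pre  = [52, 48, 61, 55, 45, 58, 50, 63, 47, 56]
post = [61, 57, 66, 60, 52, 67, 59, 70, 55, 64]

t = paired_t(pre, post)
# With 9 degrees of freedom, |t| > 2.262 corresponds to p < .05 (two-tailed)
print(round(t, 2), t > 2.262)
```

A statistically significant result here would correspond to the kind of course-by-course gain the evaluation reports.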

Another data source for monitoring change in instructional practice was the Survey of Enacted Curriculum (SEC) (www.seconline.org), a self-report online instrument that has been used to analyze differences in instructional practices in a variety of settings (Blank, Birman & Garet, 2005). The SEC assesses both what content a teacher teaches and how it is taught. Loyola compared the responses of the 23 participants with those of 21 non-participating teachers from the same schools, who served as a comparison group. SEC data revealed that participating teachers, relative to comparison teachers, self-reported that they more often allowed students to work individually on scientific projects, modeled science for their students, engaged students in hands-on scientific experiments, encouraged students to reflect on their classroom experiences through journals and portfolios, and used technology in the classroom. Although these differences cannot be attributed directly to participation in Loyola’s program, it is encouraging to note at least some association between participation and classroom behavior. Analyses of how teachers’ responses changed over the course of the program (which ended in spring 2008) may also indicate whether the program affected instructional practice.

Loyola has also begun to analyze the state assessment results of students in participant teachers’ classrooms. Only data from teachers of 4th or 7th grade can be analyzed, because those are the grades in which the state science assessments were administered. Of the 23 participants, 13 taught 4th or 7th grade science, and data from two participants’ students were not available. Performance data from the remaining 11 participant schools were matched to comparison schools selected on the basis of geographic location, school size, racial/ethnic make-up of the student body, and percentage of low-income families. A test of the difference in medians indicated that the median percentage of students who met or exceeded standards in participants’ schools (median = 84%) was significantly higher than in comparison schools (median = 74%; χ²(1) = 4.56, p < .05). Thus, students attending the schools of participating teachers appeared to meet or exceed state science standards at a higher rate than students in matched comparison schools. Once again, these differences cannot be causally attributed to Loyola’s program or to their teachers’ success in the classroom. Loyola intends to extend this analysis to assessment data from the same sets of schools for academic years 2004-2005 (pre-program) and 2006-2007 (year 2) as the data become available, which would indicate whether the observed differences are consistent over time.
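The difference-in-medians result above, reported with a chi-square statistic at one degree of freedom, is consistent with a median test: classify each school as above or at-or-below the grand median of all schools combined, then apply a chi-square test to the resulting 2×2 table. The sketch below uses hypothetical percent-meeting-standards figures for 11 participant and 11 comparison schools; the actual school-level data are not reproduced here.

```python
from statistics import median

def median_test_chi2(x, y):
    """Median-test chi-square statistic (1 df, no continuity correction):
    classify each observation as above vs. not above the grand median,
    then compute the chi-square statistic for the 2x2 table."""
    grand = median(x + y)
    a = sum(v > grand for v in x)   # participant schools above grand median
    b = len(x) - a                  # participant schools at or below
    c = sum(v > grand for v in y)   # comparison schools above
    d = len(y) - c                  # comparison schools at or below
    n = len(x) + len(y)
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical percentages of students meeting/exceeding standards
# (illustrative only; not the actual CPS assessment data).
participant = [84, 72, 88, 79, 90, 85, 76, 86, 83, 87, 84]
comparison  = [74, 70, 82, 76, 68, 75, 80, 71, 77, 73, 74]

chi2 = median_test_chi2(participant, comparison)
# chi-square above 3.84 corresponds to p < .05 at 1 degree of freedom
print(round(chi2, 2), chi2 > 3.84)
```

As in the reported analysis, a statistic above the 3.84 critical value indicates that the two groups of schools differ significantly in their medians.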

