Evaluating DMI Research Overview

This multi-year, full-scale efficacy study of the Developing Mathematical Ideas (DMI) elementary mathematics professional development program builds on prior work showing DMI's impact on teachers' mathematical and pedagogical knowledge. It asks: How does elementary teachers' participation in DMI affect teacher knowledge, teaching practice, and student learning?

The study uses experimental and quasi-experimental methods, works with teachers and students in urban and suburban school districts, and begins to gather evidence of the impact of implementing DMI at scale. It also builds and tests new measures of teacher content knowledge and teaching practice.

Research Questions

  1. Does participation in the DMI professional development program lead to increases in reform-oriented teaching?
  2. Does participation in DMI lead to increases in students' mathematics learning and achievement, especially in their ability to explain their thinking and justify their answers?
  3. What is the process by which a reform-oriented professional development program can influence teaching practice and, thus, student learning? Through what mechanisms does DMI have impact, and with what kinds of support do we see the desired changes in our outcome measures when the larger professional development context is examined?


Research Design

Evaluating DMI uses a randomized experimental design in which volunteer teachers are assigned either to take one or two DMI professional development seminars (24 to 48 hours of PD) during one academic year or to wait until the following year. (Our first cohort was assigned to summer/fall/winter 2011-12 or summer/fall/winter 2012-13 sessions. Our second cohort will be assigned to summer/fall/winter 2012-13 or summer/fall 2013 sessions.) We follow both the Treatment and Comparison groups for two years, allowing for experimental comparison between groups in Year 1 and quasi-experimental follow-up in Year 2. This provides evidence of longer-term changes attributable to DMI in the Treatment group, as well as a direct pre- to post-treatment comparison of teaching practice in the Comparison group.
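The random assignment described above can be sketched as a simple coin-flip split of a volunteer pool. This is a minimal illustration with a made-up teacher list, not the study's actual assignment procedure:

```python
import random

def assign_cohort(teachers, seed=2011):
    """Randomly split volunteer teachers into a Treatment group
    (takes DMI seminars this year) and a Comparison group
    (wait-listed until the following year)."""
    pool = list(teachers)
    rng = random.Random(seed)   # fixed seed keeps the split reproducible
    rng.shuffle(pool)
    half = len(pool) // 2
    return {"treatment": pool[:half], "comparison": pool[half:]}

# Hypothetical roster matching the study's estimated sample size
groups = assign_cohort([f"teacher_{i}" for i in range(195)])
```

Because assignment is random rather than based on teacher characteristics, Year 1 differences between the two groups can be attributed to DMI participation.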

The estimated 195 participants will be Grades 1-5 teachers drawn from several Massachusetts and North Carolina districts currently using the Investigations in Number, Data and Space curriculum; drawing from these districts reduces the impact of curriculum differences.


Evaluating DMI Advisory Board

Linda Davenport
Boston Public Schools
Megan Franke
University of California, Los Angeles (UCLA)
Nicole Kersting
University of Arizona, Tucson
Susan Jo Russell
TERC
Aline Sayer
University of Massachusetts, Amherst
Deborah Schifter
Education Development Center (EDC)
Myriam Steinback
TERC
Bill Nave
External Evaluator

Logic model (figure): the pink diamond represents the experimental intervention; green ovals, the outcomes of interest; yellow boxes, the measures used to operationalize each outcome; blue boxes, fidelity and background measures.


Measures and Analyses

We study the impact of teachers' DMI participation on their knowledge, their classroom practice, and their students' learning and achievement, using several measures for each. We also gather background-experience and fidelity-of-implementation data to use as covariates in our analyses. Fidelity measures include the extent of teachers' participation in DMI and in other PD or professional supports with similar characteristics (i.e., a focus on deepening understanding of mathematics and student learning, and on discourse that supports inquiry).

We conduct several analyses, using ordinary least squares (OLS) regression, hierarchical linear modeling (HLM), or structural equation modeling (SEM) as appropriate. These examine the experimental impact of DMI on knowledge, practice, and student achievement, along with a variety of quasi-experimental comparisons. We also explore how teacher knowledge mediates DMI's impact on teaching practice, and how teacher knowledge and practice together mediate DMI's impact on student achievement.


Project Outcomes

We anticipate the following research outcomes from the project:

  • Classroom Video Assessment (CVA) measure on number and operations. This measure of teachers' situated knowledge of mathematics content has been developed with collaborator Nicole Kersting, extending her prior work on Fractions, Ratios and Proportions and on Variables, Expressions and Equations.
  • Published papers, conference presentations, and a report for academic and practitioner audiences
  • Evaluation of the utility of this report to large districts across the country

Findings from the study will be posted here as they become available.