Does participation in the DMI professional development program lead to increases in reform-oriented teaching?
Does participation in DMI lead to increases in students' mathematics learning and achievement, especially in their ability to explain their thinking and justify their answers?
What is the process by which a reform-oriented professional development program can influence teaching practice and, thus, student learning? Through what mechanisms does DMI have its impact, and with what kinds of support do we see the desired changes in our outcome measures when the larger professional development context is examined?
Evaluating DMI uses a randomized experimental design in which volunteer teachers are assigned either to take one or two DMI professional development seminars (24 to 48 hours of PD) during one academic year or to wait until the following year. (Our first cohort was assigned to summer/fall/winter 2011-12 or summer/fall/winter 2012-13 sessions. Our second cohort will be assigned to summer/fall/winter 2012-13 or summer/fall 2013 sessions.) We follow both Treatment and Comparison groups for two years, allowing for experimental comparison between groups in Year 1 and quasi-experimental follow-up in Year 2. This design provides evidence of changes attributable to DMI in the Treatment group over a longer period, as well as a direct pre- to post-treatment comparison of teaching practice in the Comparison group.
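The waitlist design above can be sketched in code. This is a minimal, hypothetical illustration of randomly splitting volunteer teachers into a Treatment group (DMI seminars in Year 1) and a Comparison group (DMI deferred to Year 2); the function name, teacher labels, and seed are assumptions for illustration, not part of the study's actual assignment procedure.

```python
import random

def assign_waitlist(teachers, seed=0):
    """Randomly split teachers into Treatment and Comparison halves.

    Treatment takes the DMI seminars in Year 1; Comparison waits
    until the following year (waitlist control).
    """
    rng = random.Random(seed)          # fixed seed for a reproducible sketch
    shuffled = teachers[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {
        "treatment": shuffled[:half],   # DMI in Year 1
        "comparison": shuffled[half:],  # DMI deferred to Year 2
    }

# Illustrative roster matching the estimated 195 participants.
teachers = [f"teacher_{i}" for i in range(195)]
groups = assign_waitlist(teachers)
print(len(groups["treatment"]), len(groups["comparison"]))  # 97 98
```

In practice a study like this would typically randomize within blocks (e.g., by district or grade) rather than over the full pool, but the core idea of a randomized waitlist assignment is the same.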
An estimated 195 participants will be Grades 1-5 teachers drawn from several Massachusetts and North Carolina districts, all of which currently use the Investigations in Number, Data and Space curriculum; restricting recruitment to these districts reduces the impact of curriculum differences.
Evaluating DMI Advisory Board
Linda Davenport, Boston Public Schools
Megan Franke, University of California, Los Angeles (UCLA)
Nicole Kersting, University of Arizona, Tucson
Susan Jo Russell, TERC
Aline Sayer, University of Massachusetts, Amherst
Deborah Schifter, Education Development Center (EDC)
Myriam Steinback, TERC
Bill Nave, External Evaluator