In a climate of assessment and accountability, school districts constantly search for product solutions to the pressing issue of student reading performance, as measured by annual state standardized accountability testing. In an attempt to increase student reading growth rates, some districts are turning to formative assessment systems, such as the formative online reading assessment Diagnostic Online Reading Assessment, or DORA (commercially available from Let’s Go Learn, Inc.), both to encourage more differentiated classroom instruction and to serve as a growth-monitoring tool for instruction. The implementation of DORA, while top-down in nature, is carried out in collaboration with the product developers, whose professional development model uses a Concerns-Based Adoption Model (CBAM) framework, providing extensive administrator support, modeling of usage, a teacher cohort model, and opportunities for teacher concerns to be addressed during the professional development cycle.
The purpose of this study was to examine the effects of teacher-level implementation decisions on student reading growth within a top-down, district-wide implementation of a formative online reading assessment, undertaken as part of district-wide measures to improve student reading performance and, subsequently, student reading test scores.
Data were collected from a large, urban school district in Southern California with a large minority population, many of whom speak English as a second language. Existing student DORA data were collected, and a survey was administered to teachers to gather teacher-level professional development implementation data. Student and teacher data were analyzed using regression analysis and Hierarchical Linear Modeling (HLM) to determine 1) the relationship between teacher-level professional development implementation decisions and overall teacher use of DORA, and 2) the relationship between teacher-level professional development implementation decisions and student DORA growth over time.
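As an illustrative sketch of the second analysis, a two-level growth model of this kind (repeated DORA scores modeled over time, with growth curves allowed to vary by teacher) can be fit with Python's statsmodels mixed-effects API. The simulated data and all variable names below are hypothetical, not drawn from the study; a full HLM specification would additionally include student-level random effects and the teacher-level professional development predictors.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate illustrative data: 10 teachers, 8 students each, 3 testing waves.
# True average growth is 10 score points per wave, with teacher-level
# variation in both starting level (intercept) and growth rate (slope).
rng = np.random.default_rng(0)
rows = []
for t in range(10):
    teacher_intercept = rng.normal(0, 2)
    teacher_slope = rng.normal(0, 0.5)
    for s in range(8):
        base = 500 + teacher_intercept + rng.normal(0, 5)
        for wave in range(3):
            score = base + (10 + teacher_slope) * wave + rng.normal(0, 3)
            rows.append({"teacher": t, "student": f"{t}-{s}",
                         "wave": wave, "score": score})
df = pd.DataFrame(rows)

# Growth model: score regressed on wave, with a random intercept and a
# random slope on wave for each teacher (groups="teacher").
model = smf.mixedlm("score ~ wave", df, groups="teacher", re_formula="~wave")
result = model.fit()
print(result.params["wave"])  # estimated average growth per wave
```

The random-effects variance for `wave` estimates how much growth curves differ across teachers; teacher-level predictors (e.g., professional development variables) would then be tested for their ability to account for that variance.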
For the first research question, none of the study variables pertaining to teacher professional development and implementation decisions were statistically significant predictors of overall teacher usage of the DORA formative assessment program. For the second and third research questions, although student DORA growth curves did vary by teacher in grades two, three, and five, few of the study variables pertaining to teacher professional development and program implementation accounted for variance in student growth curves at the teacher level, and none did so consistently across grade levels.
Although many of the study variables pertaining to professional development were not found to be significant in this study, future research on the implementation of large-scale formative assessment programs should examine how the effects of professional development vary with the duration of implementation. Further research should also examine the effects of a Concerns-Based Adoption Model (CBAM) on the implementation of district-wide formative assessment programs. Within a climate of assessment and accountability, more emphasis is being placed on formative assessment use in the classroom, and further study of these formative assessments is warranted.