Consulting & Evaluation

Documents


Santa & Santa; Reading Research

Date added: 06/21/2010
Date modified: 06/21/2010
Filesize: 53.47 kB
Downloads: 1176

Carol Minnick Santa and John L. Santa

Reading Research Plagued by Poor Designs and Misleading Conclusions

 

A comment on the IES study “Effectiveness…”, highlighting several major problems with studies of this kind.

Robert E. Slavin; Response to Greenleaf and Petrosino

Date added: 10/16/2010
Date modified: 10/16/2010
Filesize: 65.83 kB
Downloads: 1072

Robert E. Slavin responds to concerns raised by Cynthia Greenleaf and Anthony Petrosino (2009) in their letter to the editors.

Slavin, Robert E., Cheung, Alan, Groff, Cynthia, & Lake, Cynthia. (2008, July/Aug/Sept). Effective reading programs for middle and high schools: A best-evidence synthesis. Reading Research Quarterly, 43(3), 290-322. Reprinted with permission of the International Reading Association. (This permission also includes the responses/letters to the editor.)

IES Report (Sept. 2008): The Impact

Date added: 06/21/2010
Date modified: 06/21/2010
Filesize: 371.29 kB
Downloads: 2047

Institute of Education Sciences

NATIONAL CENTER FOR EDUCATION EVALUATION AND REGIONAL ASSISTANCE

 

The Impact of Two Professional Development Interventions on Early Reading Instruction and Achievement

 

The study produced the following results:

  • Although there were positive impacts on teachers’ knowledge of scientifically based reading instruction and on one of the three instructional practices promoted by the study’s PD, neither PD intervention resulted in significantly higher student test scores at the end of the one-year treatment. Teachers in schools that were randomly assigned to receive the study’s PD scored significantly higher on the teacher knowledge test than did teachers in control schools, with standardized mean difference effect sizes (hereafter referred to as “effect sizes”; see the sketch after this list) of 0.37 for the institute series alone (treatment A) and 0.38 for the institute series plus coaching (treatment B). Teachers in both treatment A and treatment B used explicit instruction to a significantly greater extent during their reading instruction blocks than teachers in control schools (effect sizes of 0.33 for treatment A and 0.53 for treatment B). However, there were no statistically significant differences in achievement between students in the treatment and control schools.
  • The added effect of the coaching intervention on teacher practices in the implementation year was not statistically significant. The effect sizes for the added impact of coaching were 0.21 for using explicit instruction, 0.17 for encouraging independent student activity, and 0.03 for differentiating instruction, but these effects may be due to chance.
  • There were no statistically significant impacts on measured teacher or student outcomes in the year following the treatment.
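
For reference, the “standardized mean difference” behind these effect sizes can be sketched as follows. This is the conventional Cohen’s-d-style definition, given here as a minimal sketch; the report’s exact estimator (which may adjust for covariates and the clustered design) is not reproduced in this summary, so treat the symbols below as illustrative assumptions.

\[
d = \frac{\bar{X}_{T} - \bar{X}_{C}}{s_{\mathrm{pooled}}},
\qquad
s_{\mathrm{pooled}} = \sqrt{\frac{(n_{T}-1)\,s_{T}^{2} + (n_{C}-1)\,s_{C}^{2}}{n_{T} + n_{C} - 2}}
\]

Read this way, the 0.37 effect size for treatment A means that teachers in those schools scored, on average, roughly 0.37 pooled standard deviations higher on the knowledge test than teachers in control schools.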

 

IES Report (Nov. 2008): Enhanced Reading

Date added: 06/21/2010
Date modified: 06/21/2010
Filesize: 786.08 kB
Downloads: 1320

Institute of Education Sciences

NATIONAL CENTER FOR EDUCATION EVALUATION AND REGIONAL ASSISTANCE

 

The Enhanced Reading Opportunities Study

Findings from the Second Year of Implementation

  

This report presents findings from the Enhanced Reading Opportunities (ERO) study, a demonstration and rigorous evaluation of two supplemental literacy programs that aim to improve the reading comprehension skills and school performance of struggling ninth-grade readers: RAAL and Xtreme Reading.

 

The key findings discussed in the report include the following:

  • On average, across the 34 participating high schools, the supplemental literacy programs improved student reading comprehension test scores by 0.08 standard deviation. This represents a statistically significant improvement in students’ reading comprehension (p-value = 0.042).
  • Seventy-seven percent of the students who enrolled in the ERO classes in the second year of the study were still reading at two or more years below grade level at the end of ninth grade, relative to the expected reading achievement of a nationally representative sample of ninth-grade students. One of the two interventions, Reading Apprenticeship Academic Literacy (RAAL), had a positive and statistically significant impact on reading comprehension test scores (0.14 standard deviation; p-value = 0.015). The other intervention, Xtreme Reading, also produced a positive impact on reading comprehension (0.02 standard deviation), although it was not statistically significant. The difference in impacts between the two programs is not statistically significant, and thus it cannot be concluded that RAAL had a different effect on reading comprehension than Xtreme Reading.

 

The overall impact of the ERO programs on reading comprehension test scores in the second year of implementation (0.08 standard deviation) is not statistically different from their impact in the first year of implementation (0.09 standard deviation), nor is each intervention’s impact in the second year of implementation statistically different from its impact in the first year.

The implementation fidelity of the ERO programs was more highly rated in the second year of the study than in the first year. In comparison with the first year, a greater number of schools in the second year of the study were deemed to have programs that were well aligned with the program developers’ specifications for implementation fidelity (26 schools in the second year, compared with 16 schools in the first year), and fewer schools were considered to be poorly aligned (one school in the second year, compared with 10 schools in the first year).

 

IES Report (May 2010): Effectiveness

Date added: 06/21/2010
Date modified: 06/21/2010
Filesize: 2.44 MB
Downloads: 3577

Institute of Education Sciences

NATIONAL CENTER FOR EDUCATION EVALUATION AND REGIONAL ASSISTANCE

Effectiveness of Selected Supplemental Reading Comprehension Interventions: Findings from Two Student Cohorts

The Institute of Education Sciences (IES) of the Department of Education (ED) has undertaken a rigorous evaluation of curricula designed to improve reading comprehension as one step toward addressing this research gap. The study used a rigorous experimental design to assess the effects of four reading comprehension curricula in selected districts across the country, where schools were randomly assigned to use one of the four treatment curricula in their fifth-grade classrooms or to a control group. The four curricula included in the study are: (1) Project CRISS, developed by CRISS (Santa et al. 2004); (2) ReadAbout, developed by Scholastic (Scholastic 2005); (3) Read for Real, developed by Chapman University and Zaner-Bloser (Crawford et al. 2005); and (4) Reading for Knowledge, developed by the Success for All Foundation (Madden and Crenson 2006).

 

The main findings are:

  • The curricula did not have an impact on students one year after the end of their implementation. In the second year, after the first cohort of students was no longer using the interventions, there were no statistically significant impacts of any of the four curricula.
  • Impacts were not statistically significantly larger after schools had one year of experience using the curricula. Impacts for the second cohort of students were not statistically significantly different from zero or from the impacts for the first cohort of students. (Treatment students in the second cohort attended schools that had one prior year of experience using the study curricula, while treatment students in the first cohort attended schools with no prior experience using the study curricula. Reading for Knowledge was not implemented with the second cohort of students.)
  • The impact of one of the curricula (ReadAbout) was statistically significantly larger after teachers had one year of experience using it. There was a positive, statistically significant impact of ReadAbout on the social studies reading comprehension assessment for second-cohort students taught by teachers who were in the study both years (effect size: 0.22). This impact was statistically significantly larger than the impact for first-cohort students taught by the same teachers in the first year of the study.

In summary, our findings do not support the hypothesis that these four supplemental reading comprehension curricula improve students’ reading comprehension, except when ReadAbout teachers have had one prior year of experience using the ReadAbout curriculum.