When evidence-based literacy programs fail



What can districts learn from an intervention for struggling readers that didn’t work as well as expected?


More than ever, educators are expected to implement evidence-based interventions to improve outcomes for students who have fallen behind. No doubt, that’s why the Every Student Succeeds Act (ESSA) mentions the term evidence-based 61 times. There’s broad agreement that interventions that have been proven effective in one or more research studies stand a better chance of having a positive effect on students and increasing the return on education investments than those without any evidence to back them. But what should educators do when there aren’t many programs that have been proven to work?

Secondary literacy teachers in California’s Oakland Unified School District (OUSD) recently faced that very challenge. In 2015, almost one-third of OUSD students in the secondary grades were four or more years behind grade level in reading, but educators struggled to find intervention programs that had been proven effective at accelerating these older, struggling readers.

To date, the What Works Clearinghouse — the primary source of scientific evidence on what works in education — has identified high-quality research on 21 adolescent literacy programs. Only one of them, READ 180, has strong evidence of positive effects. Yet after offering READ 180 for years, OUSD found that many secondary readers languished in intervention classes without making sufficient progress. The district’s experience was consistent with research showing that evidence-based programs may not always work for the adolescent readers who are furthest behind. A study of four reading interventions (including READ 180) among students in grade 9 found that none of the interventions improved the achievement of students who had been reading five or more grade levels below expectations (Lang et al., 2009).

To find a solution, OUSD educators expanded their search for evidence-based solutions and decided to try Leveled Literacy Intervention (LLI), an intervention that multiple research studies had found to be effective for students in early grades. But OUSD would later learn that the program could be difficult to implement in secondary schools as designed and could actually do more harm than good.

A promising intervention

LLI is a short-term, intensive intervention system designed to help teachers provide daily, small-group instruction to students who are not achieving grade-level expectations in reading. Although it was originally developed for students in early grades, LLI has since expanded across grades K-12.

Many school districts across the country have used LLI, which has been shown to rapidly improve reading outcomes for students in early grades. A review of the research on LLI by the What Works Clearinghouse (2017) determined that, after 12-18 weeks, LLI had positive effects on general reading achievement, potentially positive effects on reading fluency, and no discernible effects on alphabetics for students in grades K-2.

To implement LLI, teachers begin by assessing students to determine which of LLI’s levels match students’ instructional and independent reading abilities. Teachers then form small groups of up to five students with similar assessment scores and deliver 45-minute daily lessons using a series of texts and lesson guides of progressing difficulty. The recommended program length ranges from 12 to 24 or more weeks, depending on students’ starting reading level and progress.

Given the shortage of evidence on effective adolescent reading interventions and the investment required by LLI (including materials, training and support, and staffing), OUSD was interested in evaluating the program’s usefulness at the secondary level. “We’d been making big investments, taking students out of other classes, and didn’t know if it was benefiting students,” said Nancy Lai, OUSD’s director of language and literacy.

In 2015, the district began piloting LLI in a number of secondary schools. In 2016, after one year of piloting the program, OUSD partnered with Mathematica Policy Research to evaluate LLI’s effectiveness with struggling readers in grades 6-9 in a study funded by the U.S. Department of Education’s National Center for Education Research. The study was the nation’s first randomized controlled trial of LLI in secondary grades.

Unexpected findings

In fall 2016, intervention teachers in the 10 participating schools identified groups of struggling readers to be part of the study. Each group included three to seven students with a similar starting reading level. More than 90% were eligible for free or reduced-price lunch, and almost one-third were English language learners.

Researchers used a lottery to randomly assign half of the groups (145 students) to receive LLI and the other half (147 students) to the control group. Teachers then began LLI instruction. At the end of the school year, students in all grades took the Scholastic Reading Inventory, which assesses reading comprehension skills, and students in grades 6-8 took the Smarter Balanced Assessment, which tests mastery of grade-level standards in English language arts (ELA)/literacy.

When LLI began, the students scored an average of four years below their expected grade level on the Scholastic Reading Inventory, and 77% scored in the lowest performance level on the Smarter Balanced ELA/literacy assessment. The study found that, after an average of 19 weeks of instruction, LLI had no impact on students’ reading comprehension and a negative impact on their mastery of ELA/literacy standards (Gonzalez et al., 2018). LLI’s impact on Smarter Balanced ELA/literacy scores was roughly equivalent to students losing more than five months of learning, based on the typical annual growth of students in grades 6-8. Students who were pulled out of other classes to receive LLI (rather than receiving instruction in a regularly scheduled intervention class) were particularly negatively affected, possibly as a result of missing grade-level content covered in the ELA/literacy assessment.

Unpacking the results

During the trial, secondary schools in Oakland faced various challenges implementing LLI, including delayed start dates and varying end dates, trade-offs between pullout groups and scheduled classes, skipped and modified lesson components, limited teacher training, and lower student attendance and engagement at the high school level. As a result, most students fell short of the recommended minimum number of LLI sessions and received instruction with low to medium fidelity to the program model.

These implementation challenges likely affected the results. The existing evidence on LLI featured stronger implementation: Students received a greater number of sessions through daily lessons, instructional fidelity ratings were high, and all teachers received eight days of professional development, along with continuing support throughout the period of implementation (Ransford-Kaldon et al., 2010, 2013).

However, the existing evidence on LLI was also based on students in a very different grade span (K-2 rather than 6-9), and some of the implementation challenges reported by study schools were unique to secondary settings. Even with stronger implementation, LLI might not be as effective with older students. The research team did not find evidence that results differed for students who were taught LLI with higher instructional fidelity or by an experienced LLI teacher, although additional research is needed.

The contrasting results show that finding a promising evidence-based program is only one piece of the puzzle. As stated by Henry Levin and colleagues (2010), “Progress in strengthening young people’s literacy now depends on schools a) choosing appropriate programs and b) implementing them consistently and effectively.” The study of LLI in Oakland revealed important lessons for selecting and implementing a literacy intervention in secondary schools.

Lessons for secondary schools

Some of the challenges that study schools faced related to scheduling. Scheduling a regular intervention class can present logistical challenges for middle and high schools, particularly given the small number of students that can be in LLI at one time. Yet offering LLI in pullout groups required students to miss other classes and make up the work, which concerned some students, parents, and classroom teachers. Students in pullout groups had worse results, were more likely to refuse LLI, and attended fewer sessions than those in regularly scheduled classes, particularly at the high school level.

Study schools also found that it is important to consider whether materials are appropriate for students’ reading level and age. Although LLI has developed materials for students in grades 6-12, many teachers used materials designed for earlier grades that matched the starting reading levels of their students, as recommended by the program. This might help explain why teachers tended to skip some lesson components (such as phonics, which some teachers said their students didn’t need) and why high school students were less engaged. Finding materials that are appropriate for struggling readers in high school might be particularly challenging.

The study also suggested that secondary reading intervention teachers might have different training needs from those at the elementary level. Unlike elementary teachers, secondary literacy teachers are not typically trained in the foundational reading skills that are part of LLI and other intervention programs for students with limited literacy. Of the 20 teachers in the study, 14 received training, typically in a two-day session led by an LLI representative, a half-day training led by OUSD staff, or both. Observations showed that these training opportunities did not sufficiently prepare all teachers to conduct LLI as designed. Additional professional development could focus on the intervention as well as foundational reading more broadly.

Looking ahead

Despite the study’s findings, some secondary schools in OUSD are continuing to offer LLI. Some of these schools did not participate in the study but say they have seen positive results due to better implementation. To improve results, the district now recommends that secondary schools offer LLI in regularly scheduled classes rather than pullout groups. In addition, the district is offering teachers training on the program and on foundational reading skills. The district also formed a collaborative of secondary literacy intervention teachers to study the features of successful reading intervention programs, discuss implementation, and track the growth of their students on reading assessments.

There’s reason to believe that additional professional development for teachers could improve implementation of adolescent literacy interventions like LLI, potentially leading to better results. A systematic review of 33 studies of adolescent literacy programs found that the effective approaches provided extensive professional development that led to significantly changed teaching practices; in addition, programs designed to change daily teaching practices had greater research support than those focused on curriculum alone (Slavin et al., 2008).

Like other intensive intervention programs, LLI is fast-paced and involves nuanced judgment calls, so it might take extensive teacher training and support to be implemented successfully. This is particularly true at the secondary level, where teachers might be less familiar with the strategies that underlie the program. For district staff, this has been one of the biggest lessons from the study. “The results helped to reemphasize that training intervention teachers is really important. But because we’re very decentralized and have limited funding, we sometimes support the purchase of a program and release it to schools to figure out on their own,” said Abbey Kerins, OUSD’s secondary literacy coordinator.

Beyond evidence-based

More broadly, the district is encouraging schools to consider both whether an intervention is backed by evidence from a similar context and whether they have the resources and conditions in place to implement it as designed. District leaders encourage teachers and principals to consult the What Works Clearinghouse, but they’ve found that the information busy educators need to make good decisions is not always easily accessible there. To address this need, staff developed summaries of evidence from the clearinghouse as well as implementation requirements and challenges to consider based on educators’ experiences.

There is also room for more research to identify promising reading interventions for secondary students and to better understand whether programs can work under different implementation contexts and what adaptations might be most beneficial as contexts change. “We have to prove that reading intervention can be effective at the secondary level — we still have such inconsistent results,” said Lai. Partnerships between school districts and researchers, like the one in this study, provide opportunities to study both impacts and implementation in a real-world setting. As OUSD learned, educators must carefully consider both when choosing an evidence-based intervention intended to help students with the greatest needs.


Gonzalez, N., MacIntyre, S., & Beccar-Varela, P. (2018, June). Challenges in adolescent reading intervention: Evidence from a randomized control trial (Working Paper 62). Oakland, CA: Mathematica Policy Research.

Lang, L., Torgesen, J., Vogel, W., Chanter, C., Lefsky, E., & Petscher, Y. (2009). Exploring the relative effectiveness of reading interventions for high school students. Journal of Research on Educational Effectiveness, 2 (2), 147–175.

Levin, H.M., Catlin, D., & Elson, A. (2010). Adolescent literacy programs: Costs of implementation. New York: Carnegie Corporation of New York.

Ransford-Kaldon, C., Ross, C., Lee, C., Sutton Flynt, E., Franceschini, L., & Zoblotsky, T. (2013). Efficacy of the Leveled Literacy Intervention system for K–2 urban students: An empirical evaluation of LLI in Denver Public Schools. Memphis, TN: Center for Research in Education Policy.

Ransford-Kaldon, C., Sutton Flynt, E., Ross, C., Franceschini, L., Zoblotsky, T., Huang, Y., & Gallagher, B. (2010). Implementation of effective intervention: An empirical study to evaluate the efficacy of Fountas & Pinnell’s Leveled Literacy Intervention System (LLI). Memphis, TN: Center for Research in Education Policy.

Slavin, R.E., Cheung, A., Groff, C., & Lake, C. (2008). Effective reading programs for middle and high schools: A best evidence synthesis. Reading Research Quarterly, 43 (3), 290–322.

What Works Clearinghouse. (2017). Leveled Literacy Intervention: Intervention report. Washington, DC: U.S. Department of Education, Institute of Education Sciences.


Citation: Gonzalez, N. (2018). When evidence-based literacy programs fail. Phi Delta Kappan, 100 (4), 54-58.

NAIHOBE GONZALEZ (ngonzalez@mathematica-mpr.com) is a researcher at Mathematica Policy Research in Oakland, Calif.

One Comment

  • Dear PDK Team and Education Colleagues-

    My Reading Apprenticeship colleagues and I were disappointed to read this article suggesting that programs like ours do not exist. While we agree with some of the author’s points (about READ 180, for example), there are other programs like ours that demonstrate with STRONG evidence that it is possible to positively impact secondary teachers’ literacy teaching practices across disciplines and thereby improve academic outcomes for secondary students. We engage thousands of teachers and their students — working at all literacy levels — in deep disciplinary reading, thinking, academic conversation, and learning every single day across the U.S. and beyond.

    Below, please find a link to our broader evidence base on the Reading Apprenticeship website, which includes several efficacy trials and randomized controlled trials funded over the years by IES (U.S. Dept. of Education), National Science Foundation, Carnegie Corporation of New York, William and Flora Hewlett Foundation, Stuart Foundation, Walter S. Johnson Foundation, among others. More recently, we received two SEED grants (Supporting Effective Educator Development: 2015-2018, and 2018-2022) from the U.S. Dept of Education’s Office of Innovation and Improvement. You can read more about our most recent SEED grant here: https://readingapprenticeship.org/sli-wins-seed-grant-and-top-score-in-2018-award-program/

    Reading Apprenticeship: Research and Evidence

    We would be very happy to connect with anyone who would like to know more about Reading Apprenticeship — how it works, our evidence base, and/or our ongoing professional learning institutes (face-to-face and online) for secondary and post-secondary teachers interested in supporting disciplinary literacy in their subject area classrooms.


    Mira-Lisa Katz, Ph.D.
    Associate Director
    WestEd, Strategic Literacy Initiative
    730 Harrison Street, Third Floor
    San Francisco, CA 94107
    Follow us! http://www.twitter.com/readapprentice



