Students learn line dancing as part of the Super Summer Academy program at St. Claire Evans Elementary School in Duval County, Fla. Photo by Lisa Spofford
Children from low-income families earned higher test scores following participation in structured summer learning programs that shared some key characteristics.
By Daniel Browne
After spending the morning learning fractions, a group of eight- and nine-year-olds are sailing in Boston Harbor, building teamwork and soaking up the sun. You might think they attend an elite prep school, but this is summer, and these are students of the Boston Summer Learning Project, a voluntary program for children from low-income families run by Boston Public Schools.
A study, the largest of its kind, conducted by the RAND Corporation and funded by The Wallace Foundation, now offers evidence that such programs can provide academic and other benefits — particularly to students with high attendance (Augustine et al., 2016).
Why summer matters
School districts face a pair of persistent gaps: gaps in opportunities for students and gaps in academic achievement. Children from low-income families don’t have access to the same enriching activities and learning experiences as their more affluent peers. They also lose ground in learning over the summer relative to children from higher-income families.
Summer programs can help shrink these gaps, exposing disadvantaged students to stimulating new activities and helping them make up ground in core academic subjects. Voluntary programs run by the school district that offer a mix of academic and enrichment activities have the potential to reach more students than traditional summer school or boutique programs run by outside organizations.
Yet until now, research on the effectiveness of summer programs has mostly focused on programs that were either mandatory for students or not run by the district. To fill this gap, The Wallace Foundation launched the National Summer Learning Project in 2011. Wallace commissioned RAND to study the large-scale, voluntary summer learning programs led by five public school districts: Boston, Mass.; Dallas, Texas; Duval County, Fla.; Pittsburgh, Pa.; and Rochester, N.Y. (Two districts, Boston and Dallas, worked with a local nonprofit intermediary to handle the selection, preparation, and oversight of community-based enrichment providers.) The goals of the project were to provide summer learning opportunities to thousands of children in low-income communities, help the districts improve their programs, and understand what effect, if any, the summer programs have on participating students — and what factors influence results.
What the programs look like
The programs created by the districts in the study had several elements in common:
- A mix of academics and enrichment activities;
- Certified teachers providing academic instruction;
- Small class size (no more than 15 students);
- Full-day programming, provided five days a week, except holidays, for five to six weeks;
- At least three hours of instruction in math and English language arts daily;
- No fee for participation; and
- Free transportation and meals.
These elements reflect what experts and evidence say makes for a successful summer program. They also help remove barriers that could prevent families from participating, including cost and transportation.
Beyond these common features, districts had the freedom to make a number of decisions about program design. They chose from a list of established math and English language arts curricula. They also varied significantly in how they approached enrichment. Some had a strong focus on the arts; others offered a broader menu of options, including cooking, computer coding, and sailing. In some districts, enrichment activities differed in each program site, depending on the needs, interests, and nonprofit providers in the community.
“There are going to be some areas in our district where the kids are going to want to learn mariachi music,” said Crystal Rentz, director of summer learning for the Dallas Independent School District. “There are going to be some areas where the kids are going to want to learn how to step [dance].”
Programs also varied in how they incorporated enrichment into the schedule. Duval County interspersed academic and enrichment classes throughout the day. Pittsburgh offered academics in the morning and enrichment in the afternoon but sought opportunities to connect the two when appropriate. “Each of our enrichment partners designs a writing component to their programming,” said Christine Cray, Pittsburgh’s director of student service reforms. “As kids are getting on bikes and riding around the trails, they’re also writing public service announcements for their peers; they’re writing to their public officials advocating for bike safety and bike lanes.”
What the research says so far
Starting in 2011, RAND looked closely at each district’s summer learning program, identifying strengths and weaknesses and helping get the programs ready to be tested for effectiveness. With two years of program improvements under the districts’ belts, the researchers began a randomized controlled trial in 2013 to evaluate education outcomes, focusing on children who were in 3rd grade in spring 2013. The 5,600 students who applied to summer programs in the five districts were randomly assigned to one of two groups — those selected to take part in the programs for two summers (the treatment group) and those not selected (the control group). Random assignment allowed researchers to attribute any differences in outcomes between the two groups to the summer programs. It also ensured the selection process was fair, given that there were more students who wanted to attend the programs than available slots. Children not selected for the program were given information about free, nonacademic, recreation-based programs and, in one case, scholarships to attend them.
RAND used two types of analysis in the study: causal and correlational. The causal analysis looked at the results of the randomized controlled trial for students in both the treatment and control groups to determine the effect of the summer programs on school grades and attendance, measures of social-emotional skills, and student performance on standardized assessments of math and reading.
The most notable finding of the causal analysis to date is that children who were selected to take part in the summer learning programs scored higher on the math test taken in fall 2013, after the first summer, than children who applied but were not selected. This edge in math — equivalent to about 15% of what students of the same age typically learn over the course of a year — was statistically significant. This qualifies as strong evidence under the Every Student Succeeds Act (ESSA).
By contrast, in fall 2014 after the second summer, the causal analysis did not show a similar advantage in math for students in the programs. There’s at least one possible explanation why the second summer of programming didn’t yield the same benefit in math as the first: Far fewer students showed up the second summer. By summer 2014, 11% had moved out of the district offering the program (more students changed addresses within districts); another 37% simply didn’t attend. In all, nearly half were no-shows for the second summer.
In a randomized controlled trial, students in the treatment group are included in the analysis, whether they show up for the program or not. The fewer students who show up, the bigger the effect the program would need to have for the researchers to detect it. In other words, the no-shows diluted any effect there may have been on students who did attend.
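This dilution mechanism can be illustrated with a back-of-the-envelope simulation. The numbers below — the score scale, the size of the true effect, and the show-up rates — are illustrative assumptions chosen only to show the mechanism, not figures from the study:

```python
import random

random.seed(0)

def simulate_itt(n=5000, true_effect=0.15, show_up_rate=0.5):
    """Estimate the intent-to-treat (ITT) effect when only a fraction
    of the treatment group actually attends the program.

    Scores are in units of a typical year of learning; true_effect is
    the hypothetical benefit for students who attend. No-shows receive
    no benefit but are still counted in the treatment group, which
    dilutes the measured treatment-vs.-control difference.
    """
    treatment, control = [], []
    for _ in range(n):
        baseline = random.gauss(0.0, 0.3)           # background score variation
        attends = random.random() < show_up_rate    # no-shows stay in the group
        treatment.append(baseline + (true_effect if attends else 0.0))
        control.append(random.gauss(0.0, 0.3))
    return sum(treatment) / n - sum(control) / n

full = simulate_itt(show_up_rate=1.0)   # everyone attends
half = simulate_itt(show_up_rate=0.5)   # roughly half are no-shows
print(f"ITT estimate, full attendance: {full:.3f}")
print(f"ITT estimate, 50% attendance:  {half:.3f}")
```

With half the treatment group absent, the measured effect comes out at roughly half the true per-attender effect, even though the program worked just as well for every student who showed up — which is why a high no-show rate can push a real benefit below the threshold of statistical detection.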
It’s worth pointing out that, for the purposes of the study, students signed up for both summer 2013 and summer 2014 at the same time. By the time the 2014 programs rolled around, 14 months had passed since they first committed to attend. In any other scenario, a program would be unlikely to enroll students so far in advance, so districts shouldn’t necessarily expect to see such a high no-show rate in a typical summer. In 2013, the first year of the trial, the no-show rate was substantially lower — 21%.
High-attending students benefited
The story is different when we shift from the causal analysis to the correlational analysis. Researchers collected extensive data on factors like attendance and the quality of instruction, enabling them to explore the relationship between these factors and student outcomes.
RAND found that those who attended a summer program for 20 or more days in 2013 did better on state math tests than similar students in the control group. This bump was statistically significant and lasted through the following school year.
The results are even more striking for high attenders in 2014: They outperformed control group students in both math and English language arts on fall tests and later in the spring. The difference in performance translates to 20% to 25% of the typical annual gain in math and 20% to 23% of the typical annual gain in English language arts. They also received higher scores on assessments of their social-emotional strengths.
Because most high attenders in 2014 were also high attenders in 2013, the reason for the broader positive outcomes in 2014 isn’t clear. There may have been a cumulative effect of attending two summers in a row; it’s also possible that improvements to the programs in the second summer made a difference. For example, the 2014 English language arts curriculum was better aligned to state standards, and teachers reported that students had reading texts that were more appropriate for their reading level than they had been in 2013. RAND believes the explanation is likely a combination of factors.
High attendance isn’t the only factor that appears to have played a role in student outcomes. Students who spent more time on task — not just sitting in the classroom but actually receiving academic instruction — showed signs of benefiting, too. Those who received at least 25 hours of math or 34 hours of English language arts instruction during their summer program did better than control group students on tests in fall 2013 and fall 2014. After the second summer, these gains lasted into the spring. RAND also found a link between the quality of instruction (which consists of a number of elements, including whether teachers make sure students understand the material and correct inaccurate information) and how well students did in English language arts.
These findings are correlational, but because they control for prior achievement and demographics, researchers are confident the benefits are likely due to the programs; they meet the requirements for promising evidence under ESSA.
When it came to social and emotional outcomes, the researchers were unable to control for students’ preexisting skills, so they are less confident that improved performance was due to the summer programs.
Neither the causal analysis nor the analysis of the benefits for high-attending students found a statistically significant effect on grades. RAND will continue to look at outcomes for students in both the treatment and control groups through 7th grade.
What the study means for districts
“Our study clearly shows the benefits for the students who had high attendance rates or high amounts of academic instruction in the summer learning programs,” said Catherine Augustine, senior policy researcher at RAND and the report’s lead author. “I’d add that first, it may be easier to improve reading outcomes than it is to improve math outcomes in a five- to six-week summer program; second, students do need to attend these programs in order to benefit — but getting students to attend consistently is not easy; and third, it’s not just attendance, it’s how time is used.”
The question for districts interested in launching or improving a voluntary summer program is how to translate these findings into action. RAND has issued a series of reports full of practical recommendations. In particular, Getting to Work on Summer Learning: Recommended Practices for Success focuses on practical guidance for those managing programs (Augustine et al., 2013). Here are some highlights from those reports.
Start planning early. “Too often the assumption is you can’t start thinking about summer programs in January, February, or March because it’s still cold out,” said James Doyle, Pittsburgh’s coordinator of out-of-school time. The reality is districts need a long lead time to work through institutional obstacles, from attendance systems that go offline in June to nurses and security personnel who are busy wrapping up school-year tasks when their summer responsibilities begin.
RAND recommends committing to a summer program by December and starting the planning process — covering areas like curriculum, teacher selection, tech support, transportation, etc. — no later than January. Rentz from the Dallas Independent School District goes even further. “If you try to start a brand-new program after October, you set yourself up for failure,” she said.
Invest in high-quality instruction. RAND advises districts to recruit teachers with subject and grade-level experience so they can make connections between the summer curriculum and what students are learning during the school year. The trick is convincing top teachers to give up two precious months of time off.
Dallas recruits only “distinguished” teachers — those who score at the high end of its evaluation system — and offers them an opportunity to further build their skills. “We’re giving them development around leadership, mentorship, coaching, giving effective feedback, and working with peers to grow as a team,” said Tim Hise, executive director of the Thomas Jefferson Feeder Pattern in Dallas, which includes three summer program sites.
Pittsburgh recognizes the need to create a fun and rewarding environment for teachers as well as students. “We’ve got a strong group of folks who come back year after year,” Cray said. “They say things like, ‘This feels like family.’ They love to connect with colleagues from other schools, pick up new strategies, meet new students, take elements of the curriculum that we use over the summer and figure out how to implement that in their classrooms.”
When it comes to wooing teachers, a reminder of how much their work means to students can go a long way. “We survey campers at the end of the summer. Any quotes that speak to how important their teacher has been, we use that as kind of a tug at the heartstrings,” Cray said.
Track and maximize attendance. Across 2013 and 2014, students who came to the programs attended about 75% of program days, though rates ranged from 60% to 80% by district. Among those who participated, about 60% were high attenders, meaning they attended for at least 20 days. (Because some of the programs lasted only five weeks, students could attend 75% of the days and still not meet the 20-day threshold for high attendance.)
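The arithmetic behind that parenthetical is easy to check. Assuming a five-week program meeting five days a week with no holidays (the shorter end of the article's five-to-six-week range):

```python
# A five-week program, five days a week, no holidays:
days_offered = 5 * 5                  # 25 program days
days_attended = 0.75 * days_offered   # attendance at the 75% average rate
print(days_attended)                  # 18.75
print(days_attended >= 20)            # False: below the high-attendance bar
```

So a student with perfectly average attendance in a five-week program still falls just short of the study's 20-day definition of a high attender.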
Students miss days for many reasons, including family vacations, the need to care for siblings, and fear of bullying. Addressing these issues requires a combination of approaches. Offering the program to multiple age levels may head off the need for older children to care for little brothers and sisters. Hiring staff who have time to focus on student behavior can help cut down on bullying and fighting.
To compete with a trip to the playground or beach, districts need to recognize that attending a summer program is a choice that is as much the student’s as it is the parents’. “It has to be a warm and welcoming place that kids wake up wanting to come to every day,” Cray said. That means “a lot of recognitions and rewards and celebrations.” One event in Pittsburgh this summer was Camp Hollywood, where teachers played the role of paparazzi. “They roll out the red carpet, kids walk down it with their families and have a photo taken,” Cray said.
Contain costs. Even as districts strive to raise attendance numbers, they must be ever-mindful of the bottom line. RAND counsels districts to use historical data on no-show and attendance rates when deciding how many teachers to hire, how much space is needed, and other matters that are affected by actual attendance rather than enrollment. If they don’t keep no-show and attendance records, they can refer to RAND’s findings — a 20% to 30% no-show rate and a 75% attendance rate — as a guide.
What the study doesn’t measure
Upcoming reports from RAND’s study will add more to our knowledge of summer learning. Researchers will check in on students who participated in the study in spring 2017, when they’ll be rising 8th graders, to see if the effects of those two summers in district-run programs grow stronger, dwindle, or disappear altogether. They will also add context to the study results with a set of smaller reports on the public policies that help and hinder summer learning, how districts can incorporate summer planning and programming into their broader operations, and an update of best practices for launching and running a summer program.
Lead author Augustine is the first to acknowledge that there is much more to these programs than what she and her colleagues were able to capture in the study. “Some programs provide clothing to students in need. They send meals home with kids who are getting second and third helpings at breakfast and lunch. They set up girls’ breakfasts and lunches for preteens starting to have interpersonal conflicts. They send social workers into the home when they suspect there’s an incident happening.”
Cray points out that Pittsburgh’s summer program provides a service not only to students and their families but also to the district because it acts as an incubator for innovative new approaches. “We’re able to say, ‘Let’s get it going for 300 kids and then think about lessons learned and how we can expand it for 3,000 kids in the school year,’” she said. She credits the program’s enrichment partners with leading the way in developing a “digital badge” system for middle school students. The system enables college admission officers and potential employers to view an online profile showing the skills, experiences, and accomplishments that an individual student picks up outside the traditional school day and year.
Dallas uses its summer program to foster the next generation of teachers. Three of its sites are designated “learning labs,” where Teach for America members serve as apprentices in classrooms led by experienced educators. “Some of our distinguished teachers are really interested in being a part of something different, not just owning their classrooms but also supporting new teachers who are coming into the profession,” Hise said.
In the end, the overriding goal is to help students succeed in school and life. Chris Smith, president and executive director of Boston After School and Beyond, the district’s nonprofit partner, said the National Summer Learning Project is advancing the conversation in the field, from whether voluntary, district-run programs can benefit children to how they can do so. “It’s been incredibly valuable to have rigorous research on an approach many programs would like to implement,” he said. “When we liberate individual programs from the need to prove themselves, we can focus on the practices that are linked with the results.” And the study described here focuses on just that.
Augustine, C.H., McCombs, J.S., Pane, J.F., Schwartz, H.L., Schweig, J., McEachin, A., & Siler-Evans, K. (2016, September). Learning from summer: Effects of voluntary summer learning programs on low-income urban youth. New York, NY: The Wallace Foundation. http://bit.ly/WallaceSummerLearning
Augustine, C.H., McCombs, J.S., Schwartz, H.L., & Zakaras, L. (2013, August). Getting to work on summer learning. Santa Monica, CA: RAND Corporation. http://bit.ly/RandSummerLearning
DANIEL BROWNE is a freelance writer based in Birmingham, Ala. He works frequently with The Wallace Foundation (@WallaceFdn).
Originally published in December 2016/January 2017 Phi Delta Kappan 98 (4), 15-20. © 2017 Phi Delta Kappa International. All rights reserved.