by Alexander Russo
There’ve been some dramatic and confusing education stories coming out of Tennessee these past few weeks, all focused on a state report that revealed that an eye-popping percentage of high school graduates had been given diplomas without taking all of the state’s required courses.
“One-third of Tennessee high school graduates receive a diploma without ever completing the state’s minimum course requirements,” according to the state report, released publicly late last month.
School board members and others were shocked. How could that be? Were schools and districts cheating on their graduation rate numbers? Would Tennessee students have their diplomas taken back? The state had recently declared an 88 percent graduation rate.
News outlets were all over the story: “Tennessee has been praised nationally for its high graduation rate while also maintaining rigorous graduation requirements for high school. But it turns out, that’s not entirely true,” reported Chalkbeat. Other outlets soon followed:
Initial coverage of state graduation audit
But questions about that 33 percent number started coming up quickly – some of them from the state itself, which had known from the start that its number might be off, and some also from local districts who saw the headlines and questioned the figures.
Not too long afterwards, the state revealed that a more accurate estimate of kids who graduated without fulfilling all the course requirements was more like 22 percent. It was still a big number, but much lower than the original 33 percent figure.
And so a second wave of stories came out, attempting to clarify the situation. A February 15th Chalkbeat story was headlined "That stunning statistic about a third of Tennessee graduates not meeting requirements? It's not true."
Whose mess was this?
Based on a review of the report and accompanying coverage, as well as interviews with administrators and journalists who covered the story, it seems clear that everyone could have done a better job.
The state issued a bad number without carefully considering its flaws or making them clear to reporters and board members, then belatedly realized its mistake and walked the initial figure back. But news outlets contributed to the problem by rushing to report the initial figure without questioning just how iffy it might be, unintentionally delivering inaccurate information to the public. The end result has been widespread confusion that will take a long time to clear up – if it ever is.
It’s a good lesson for us all, assuming journalists can understand and acknowledge their role in the debacle. Can they?
Screengrab from state PowerPoint presentation
What the state was trying to do – audit its graduation rate – was by all accounts a necessary and admirable thing to do. But the state’s initial estimate of kids who’d graduated without fulfilling diploma requirements was overstated in ways that were apparent from the start and became much clearer over time.
What had happened, according to Tennessee Organization of School Superintendents executive director Wayne Miller, is that the data released in the report were the product of an incomplete and flawed information reporting system that doesn’t capture detailed information about students’ course-taking. “The state doesn’t have a way of knowing the course profile of every student,” said Miller in a phone interview. “There’s no way to record substitutions and other courses in the software package.” There were also data entry problems.
These issues weren't unknown to the state when it delivered the report. The initial PowerPoint presentation from the state included some caveats: "We believe up to 50% are data related."
"We believed that potentially up to 50 percent of the data would be substitution or entry error," said state education department staffer Sara Gast. "We said that at the event and on the PowerPoint slide."
However, the state did not flag questions about the number in the press release or the original version of the report.
How did news outlets do at reporting the dramatic figure and the many questions about it?
Initial news reports essentially took the report at face value and passed the inaccurate figure along to readers. Journalists reporting on the audit all included some mention of the possibility of data errors, but their stories didn't seem to contemplate that the state's estimate might not have been thoroughly vetted, or to grasp how large the margin of error might be.
The initial story from The Tennessean was headlined "State says 1 in 3 high school grads don't meet requirements." The story noted at bottom that data entry mistakes and other technical issues might have inflated the figure. "Tennessee Department of Education spokeswoman Sara Gast said [that] the state believes some portion of the third can be attributed to data entry mistakes." But it didn't take the question any further.
A Nashville Public Radio story was headlined "Tennessee Says A Third Of Its High School Graduates Didn't Meet Requirements." The story mentioned the possibility of poor data but didn't quantify the impact.
Chalkbeat’s original January 27th story mentioned possible errors but they were not quantified or elaborated: “Reasons for missed credits included a lack of teachers in some subject areas, especially foreign language; data entry errors; and a dearth of school counselors.”
A February 15th Chalkbeat story described the revisions and clarifications to the state’s original report, and included a correction of the state’s initial report: “This version corrects that, based on current information, only 22 percent of Tennessee graduates did not meet requirements. In a previous version, Commissioner Candace [sic] McQueen misspoke regarding the percentage of missing requirements attributed to data errors.”
Professional journalists are generally very careful about what statistics they share with the public. However, none of the published accounts took responsibility for transmitting information that turned out to be incorrect. And none of the journalists interviewed about the situation said that they or their media colleagues were even partially culpable for the confusion.
“Part of what we do at Chalkbeat is go to the meetings that other people aren’t going to and share the news that comes out of them,” said Chalkbeat managing editor Philissa Cramer. “What we did in this story is exactly what we do. In this case, the state’s statistic turned out not to be exactly as it was initially discussed. Our ongoing reporting helped bring that to light.”
“You sort of take the state department at their word that what they put out is accurate,” said Nashville Public Radio’s Blake Farmer. “I don’t know what more could have been done. It’s hard to imagine seeing a story like that and not getting something out there pretty quickly…. Even with the state’s caveats, one believes as a journalist that a government agency compiles a report, they’re doing so because they feel like the number is meaningful.”
Educators and administrators interviewed for this column saw a more mixed picture.
State Commissioner of Education Candice McQueen accepted a large portion of the responsibility for the confusion that’s taken place. As The Tennessean reported, “The [state education] department said it didn’t do enough to provide proper context behind the numbers.”
However, the state was clear from the start that “a good portion of that number was related to human entry error or confusion around substitutions,” she said in a phone interview. “The stories that ran focused on that one number, and the headlines and stories that resulted were somewhat misleading.”
"The difficulty with this statistic was that it was an easy one to blow up into a major story," said Tennessee education department head researcher Nate Schwartz. "A big bunch of Tennessee students didn't appear to have met all the requirements. The natural next step for a lot of people was to conclude that a whole lot of Tennessee high school graduates didn't deserve to graduate. But that wasn't accurate or fair."
For his part, school administrators association head Wayne Miller said he would have appreciated more of a chance to debunk the story before it was published initially. “It would have been really nice [for reporters] to have come to this group of stakeholders and said, ‘Hey, what do you think?’”
To be clear, the state department should have checked with the districts before presenting this information to the board and to the public. It should also have anticipated that the graduation rate number would attract enormous amounts of attention and included additional warnings and caveats about the preliminary nature of the figure. The primary responsibility was the state's.
However, media outlets that glommed onto the initial statistic should also have featured the known concerns more prominently, taken the time to check with districts and others that the number was solid, and made clear to readers that the state was putting out a confusing and potentially misleading number that hadn't been fully vetted. Just because a number comes from a state agency doesn't mean it shouldn't get checked.
Sure, there are realistic limits on what journalists can do in a breaking news situation. And in a perfect world reporters wouldn't have to vet official reports for inaccuracies. But they can certainly do a better job than they did in this instance of indicating to readers, from the start, the uncertainties surrounding new information. They could have kept updating their pieces in the hours after initial publication, adding information from the state and additional responses from districts. And they could have corrected the inaccurate figures from the initial wave of coverage much more quickly. The initial Chalkbeat story was left without an update until very recently. The Nashville Public Radio story remains as it was originally.