Why reporters should use survey results more frequently in their stories in 2019-2020 – and smart ways to do so.
By Alexander Russo
Earlier this week, Phi Delta Kappa (PDK) International’s annual education poll came out, generating a surge in coverage.
Outlets including Governing, Education Dive, and LA School Report all reported the main findings of this year’s poll, including public views on teacher salaries, classroom morale, and raising taxes to fund education.
“Half of teachers say they have seriously considered leaving the profession, and most said they would strike if given the opportunity,” reported the Washington Post. “Most said they would not want one of their own children to follow them into teaching.”
If past coverage is any indicator, however, reporting on this and other similar education poll results will drop off sharply within a week or two of their release. And that's too bad.
Poll results provide more than a jolt of immediate information. On a beat as polarized as education, they allow reporters to get past the black-and-white claims of advocates and political opponents and share the often more nuanced views of the public on hot-button issues.
Disclosure: PDK is the fiscal sponsor for The Grade. However, The Grade is editorially independent from PDK.
The Washington Post and several other outlets reported the results of the new PDK International poll this week.
In the 2019-2020 school year that’s soon beginning, education reporters and editors should incorporate public opinion results more frequently in their stories than they have in the past.
Conversations with pollsters highlight this need.
“Clearly, there’s a role for journalists to take advantage of the public opinion data that we have,” says USC professor Morgan Polikoff, who helps run the university’s annual statewide education poll.
Yet surveys are too seldom used as a standard resource, say Polikoff and others. “On a typical one of our polls we will get a half dozen stories about our poll,” says Polikoff. “Then we might see dribs and drabs, someone will highlight a particular number down the road, maybe two or three times down the year.”
He’s not the only one who feels like poll results are under-used.
Aside from the annual release each August, “I hear only from maybe 6 to 10 reporters across the course of the year, looking for poll results,” PDK’s Joan Richardson said in a recent phone interview. “I’m always surprised that I don’t hear from more people during the year.”
However, it takes only a few stories that use polling results well to show how useful they can be.
Reporting on teacher strikes in the past year cited surveys showing public support for paying teachers more.
A handful of outlets — Chalkbeat most notably — have used survey results to reveal complicated but important truths about differences among white, black, and Latino communities.
Outlets including NPR and the New York Times have used public opinion polling to highlight the contrast between widespread public fears about school safety and available school safety statistics.
These are small but positive signs of reporters’ willingness to use polling results in their stories long after a poll result has been published.
This Chalkbeat story used survey data to describe a racial divide among Democratic voters on the topic of charter schools.
The world of education polling data has expanded in recent years, giving reporters a greater wealth of data to tap.
Some of these polls are able to tell us what educators think, along with broader public perceptions. Some polls are now also able to put journalists in touch with people who participated in the polls, to interview them for further insights. And a few, like this 2016 poll from the Leadership Conference Education Fund, are focused on education views of parents of color. Polls repeated on a regular basis to catch changes in opinion provide important historical context.
And, while there’s no ongoing meta-analysis of education poll results like you might find at FiveThirtyEight, a handful of attempts like this 2018 Public Agenda report have been made to pull together findings from a range of different surveys so that readers can see results from more than one poll, finding common ground or key contradictions.
Without polling data, journalists are left describing the news based on advocates, anecdotes, and social media, where often the loudest voices are heard.
As EdChoice polling guru Paul DiPerna puts it, polling data is “a helpful way to see where the general public stands.”
Polls can even help reporters avoid embarrassing mistakes. Last November, AEI’s Rick Hess noted that news reports that suggested that education could play a “historic” role in determining the midterm results often ignored Gallup results showing education was the 12th-ranked issue in national polls at the time.
For more on other problems with 2018 midterm coverage, check out Accuracy question plagues midterm 2018 education coverage.
Some of the wariness among education journalists about using poll data is understandable.
Polls can be misleading and controversial, and sometimes even seem to contradict each other. They take some work to use well. “You can’t just look at the topline results and grab what might be useful for the particular narrative you’re working on,” warns EWA’s Emily Richmond.
However, there are strengths and weaknesses to pretty much every kind of information that journalists provide in their stories. Many of the same caveats apply.
And some of the concerns about polling such as the issue of bias may be exaggerated, according to those involved in producing or analyzing education polls.
While the EdNext approach to its survey work is somewhat conservative and the PDK poll is written from a more teacher-oriented perspective, according to USC’s Polikoff, “for the most part those polls are conducted by respected polling firms, and the data can be trusted.”
This 2018 Vox story highlights public opinion on teacher strikes and teacher pay.
There will be plenty of opportunities for education reporters to use polling data in the weeks and months ahead, whether it's providing context for the next education proposal in the news or giving background on the return of a familiar debate.
For example, the brand-new PDK International poll finds that nearly two of three teachers don't believe that school discipline is sufficiently robust in their schools, according to the Washington Post writeup, despite ongoing concerns about the disproportionate effects of school discipline on black and Latino students. And yet whites were less likely than nonwhites to support automatic punishments for accidental offenses (such as bringing a folding knife to school).
Let’s hope this new information will be used in future stories.
Anecdotes and interviews may be more comfortable and familiar for reporters to use. Poll numbers are rarely as conclusive as they may seem at first glance, or as advocates would have you believe. But many of the concerns about using poll data can be overcome with careful reporting and transparency, and survey responses are too important to be ignored for much of the year.
Education Researcher, Expert Source, Media Critic (profile of Morgan Polikoff)