Toward more effective data use in teaching 



To make the best use of data, educators must go beyond the big tests and involve teachers and students in collecting and analyzing data. 



By the first decade of the 21st century, an enthusiasm for using data to inform teaching had swept into U.S. schooling from other fields, especially from manufacturing, medicine, and sports (where, as in teaching, decision making used to rely mainly on gut instinct). In particular, elected officials and other policy influencers — e.g., representatives in Congress, state governments, foundations, and school districts, as well as technology entrepreneurs and curriculum marketers — have argued that using data in teaching can improve student outcomes, especially in schools affected by neighborhood poverty.  

For the most part, however, advocates have neglected to say precisely how data might improve teaching and learning or to show teachers what data-informed practice looks like and how it feels — and yes, how it feels does matter. In first-rate manufacturing settings, sports teams, health clinics, and also schools, data use doesn’t simply replace gut-level intuition; rather, the use of data complements experienced practitioners’ intuitive sense of what to do next (Baker, 2016; Gawande, 2007; Senge, 2006).  

To date, proponents of data use in schools have tended to underappreciate the complexity of teaching, and they have placed too much trust in the power of standardized test data to improve instruction. Further, few advocates have given much attention to the ways in which students’ test data might be interesting and informative to the students themselves, fueling their own agency as they try to overcome the effects of poverty. 

Proponents of data use in schools have tended to underappreciate the complexity of teaching. 

Moreover, the educational research community has been complicit in spreading this overly simplistic view of the ways in which data can drive instruction. In 2012, following an extensive review of the existing research in this area, the Spencer Foundation concluded that policy reformers’ insistent calls for data-driven practice relied on a thin research base. In fact, researchers knew little about the ways in which data might best inform teaching, in what contexts, and under what conditions. The foundation went on to issue a call for proposals to deepen this research base, funding a number of macro-level studies followed by a set of 13 school- and classroom-level ones. The study that I describe here is one of the latter. (See Barnes & Fives, 2018, for methods and findings of this and 10 other foundation-funded studies.) 

Nora Isacoff, Dana Karin, Susan Neuman, and I conducted this research with the help of James Kemple and his colleagues at the Research Alliance for New York City Schools. We focused on nine high-poverty elementary and middle schools in New York City, which we selected for their reputations among district and network leaders for using data to inform instruction.  

Over three years, we studied these schools’ data use in teaching literacy at the 4th- and 7th-grade levels. We used low-inference transcriptions of teachers’ and students’ actions and words, followed by transcript-based conversations with the teachers. We dove deep inside their teaching, looking for all signs of data use in their practice. Relying on observations and interviews, we also studied other aspects of their schools’ data use systems, following Judith Warren Little’s (2012) advice to alternate between a close-up and wide-angle view when studying data use in schools, as well as James Spillane and his colleagues’ (2011, 2014) advice to attend to how a school’s organizational routines mediate data use. In our recent book, Data and Teaching: Moving Beyond Magical Thinking to Effective Practice (McDonald, Isacoff, & Karin, 2018), we share detailed findings from this study, including promising and cautionary portraits of practice. 

Major findings 

In this article, I offer a distilled accounting of four major findings from the study. The first has to do with the two main kinds of data teachers use to inform their teaching; the second with the ways in which teachers draw upon data when making complex, moment-to-moment decisions; the third with the systems schools need to build to support smart data use in teaching; and the fourth with the role students can play in the process.  

1. Two kinds of data  

Among the schools we studied, those that were most successful in using data to boost student learning gave equal, and complementary, attention to what we call big-test data (for example, data from state-mandated literacy tests or from annual tests of an English language learner’s growth) and what we call intimate data (data collected by teachers themselves).  

One teacher told us that he loves numbers, not just the ones passed on to his school in state testing reports, but the ones he generates himself in running records of his assessments of students’ oral reading and the ones his students generate when they self-assess their engagement in silent reading. Then he added, “And this is data too [showing us a stack of sticky notes] — little messages to myself like ‘Work with Maya on keeping track of characters.’” In fact, nearly all the educators we interviewed, both in our most-effective and least-effective schools, distinguished between big-test data and intimate data (though they called them by various names). And we found, across our 90 interview transcripts, that whenever a teacher referred to one of these kinds of data, they soon mentioned the other as well.   

Still, many of these interviewees told us that they thought “higher-ups,” especially district officials, underappreciated the intimate, most often formative, data that teachers collected. This assumption accounted in part for why some schools we studied underinvested in the teaching side of data use. A common pattern in such schools was to assume that most of the data a teacher needed for teaching could be found in annual test results — particularly item analyses indicating whether a student had met a given learning standard. Teachers were then expected to use the data to regroup students for reteaching, and their subsequent instruction, we observed, often relied on or mimicked old test items, so that their teaching became a kind of test prep.  

By contrast, in the more successful schools, teachers did not wait for big-test item analyses (which often arrived well after the first weeks of school) but generated their own standards-focused data by immediately engaging students in authentic reading and writing tasks and recording the results. Then, when they received big-test item analyses, teachers in these schools gathered in grade-level and subject-level teams to read both sets of data side by side and to plan instructional paths designed to yield future evidence of growth and additional evidence of needs. 

2. Using data in the moment 

An effective system of data use in teaching must help teachers decide what to do in the moment, as they wrestle with the complex demands and dilemmas they face in the classroom.  Guided by David Cohen’s (2011) parsing of teaching, we identified three key demands in particular.  

The first has to do with figuring out what to teach, not just when planning instruction but also when in the midst of a lesson. To make these decisions, teachers can look to their state standards documents and the local curriculum, and they can review data from big tests and their own formative assessments. But they also have to rely on their own knowledge of the subject, as well as their ability to recognize the subtle signs that students are, or are not, productively engaged with the material. In most schools, teachers rely on their private, gut-level instincts when making these decisions (e.g., whether to explain a concept again, skip ahead, remind students of something they learned earlier, and so on), but in the most effective schools in our study, subject-focused teacher teams shared their strategies with each other, creating a consensus around what to teach and how to adapt in certain situations (e.g., which concepts are so essential that they should be explained again, even if that changes the class schedule).  

The second demand involves figuring out how to build mutually respectful and beneficial relationships with students. To make decisions about classroom climate, rules, interpersonal norms, and such, it can be helpful to look through the school’s background data and collect information from families. But the most important data tends to be the information gathered through close and discerning contact with the students themselves. Here too, we’ve observed that in most schools, teachers go by their gut instincts to decide how to build good classroom relationships, but in the most effective schools, teachers share this information with each other (e.g., trading their impressions as to which students appear to struggle with social anxiety or whether a group has responded particularly well to opportunities to choose their own essay topics). 

Finally, the third demand has to do with assessing what students already know and what they’ve learned from a particular lesson or unit. Here, again, it’s helpful for teachers to be able to draw on deep content knowledge — for instance, a teacher who knows little about literary interpretation might not notice that a student has trouble keeping track of characters and plot developments. In the most effective schools, we’ve found, colleagues will help each other become more aware of the signs that a student is struggling with such skills, and they’ll fill each other in about individual students’ strengths and weaknesses.  

If teachers are to manage these demands — deciding what to teach, how to build relationships, and how to assess students’ progress — they’ll need more than just good data (both testing data and intimate data collected by teachers themselves). They’ll also need to have certain teaching moves in their repertoire, ways of acting upon what they know about individual students. From our observations, we found four of these moves to be particularly important: pressing students to rethink an idea or try something again (while keeping in mind the difference between a respectful press and a harsh push); pulling aside a student in a way that demonstrates caring and does not stigmatize; asking questions in ways that respect students’ own ways of thinking and knowing and don’t come across as fishing for right answers; and walking away — not in abandonment, but as a sign of confidence in the student’s capacity to achieve. 

3. Components of a school-based approach to using data  

Given that teachers rely on both big-test data and on-the-ground assessments of students’ needs, and given how demanding it can be to use that information on the fly, what does it mean to create an effective school-based system to support data use in teaching? 

What does it mean to create an effective school-based system to support data use in teaching? 

First, such a system has to be able to collect, organize, analyze, and share both kinds of data — not just the test scores, grades, and attendance records included in a typical data dashboard but also teachers’ more intimate knowledge about students’ needs (Boudett, City, & Murnane, 2013). Second, the system must have a means of curating materials, consultants, and technologies to support data use. (Two of the schools in our study had created sophisticated systems of this kind, involving teams of teachers who researched, built, and continually tinkered with a flexible data management platform that the whole school, including students, could access.) Third, the entire faculty must be involved in collecting, using, and learning from the data system. In our study, for example, the schools that did this well assigned more senior teachers to coach their colleagues on how to ask good questions, how to adapt one’s tone of voice to the given student, how to pose a question that invites divergent thinking, and so on (Allen, 2013; Horn, 2010).   

In the most successful of the schools we studied, data was often shared with students. 

4. Handing the keys over to the kids 

Typically, the work of monitoring student learning is assumed to lie with teachers and administrators. But in the most successful of the schools we studied, data was often shared with students, giving them opportunities to understand, monitor, and report on their own learning needs and learning gains. For example, one teacher showed us a data table he uses to track 7th graders’ progress. Of the table’s seven rows, three were taken up by data from testing and teacher-created assessments. The other four rows included data that students had reported themselves: the number of pages they had read yesterday, their engagement each day in independent reading (marked as a 0 or 1), their daily overall “productivity” (0 or 1), and their major (student-formulated) reading or writing goal of the day. To turn over the work of data collection in this way involved rewriting the school’s learning standards in kid-friendly language; expanding the use of self and peer critique in classroom routines; teaching students to read, create, and manage data sets; enhancing opportunities for public display and performance of student learning; and reorienting family conferences to put students in charge of the reporting and data sharing (Berger, Rugen, & Woodfin, 2014). 

In an interview near the end of our study, we asked Annamarie Smith, a teacher in a school that seemed especially successful in implementing data use in teaching, what she found most valuable about that work. She surprised us with the following answer: 

You know what? The kids are taking ownership of their learning. They’re really excelling, a lot of them. I think they like the fact that they’re more responsible now. They can see it. They can physically see it. They have something to show for it. Especially, I think, just putting in their head: No, when you come up for conferences, you’re gonna be the one telling your parent or guardian how you’re progressing. Your teacher is your adviser. She is taking the backseat to you. 

The value of giving the kids this level of ownership is backed up by emerging research about the value in general of goal-setting and self-testing, and about young people’s interest (whether in sports, gaming, or in this case, learning) in individual performance data (Carey, 2014; Drazan et al., 2017). A teacher from another successful school, Lucas Nguyen, also called our attention to the ways in which goal setting and monitoring can help counteract the effects of poverty. He told us about Wilson, a student he had taught the previous year who had shown signs then of gang interest. But Nguyen was determined, he said, “to brainwash” this 7th grader with his elaborate (and fascinating) system of goal setting and self-tracking. And it worked. “Every time I see him now,” Nguyen told us, “I’m like, are you still on the road to whatever high school you want? And being a Blood is not even part of who he is now.” 

Stepping back from the findings 

For years, proponents of data use in teaching have presumed that it can be incentivized, designed, installed, and supervised from afar. Here afar means not just Washington, D.C., or a state capital, but also a school district’s main office. Our findings from a very small sample of schools and teachers cannot necessarily be generalized, but they do at least warrant consideration of the following question: Might it be true that this complicated and potentially very valuable innovation cannot simply be purchased or otherwise imported, and that to some significant extent, it must be built by on-site leaders, including teachers and students themselves?  


Allen, A. (2013). Powerful teacher learning: What the theatre arts teach about collaboration. Lanham, MD: Rowman & Littlefield. 

Baker, G. (2016, July 24). Athletes, coaches trying to find balance between analytics and ‘gut feeling.’ Seattle Times. 

Barnes, N., & Fives, H. (2018). Cases of teachers’ data use. New York, NY: Routledge. 

Berger, R., Rugen, L., & Woodfin, L. (2014). Leaders of their own learning. San Francisco, CA: Jossey-Bass. 

Boudett, K.P., City, E.A., & Murnane, R.J. (2013). Data wise: A step-by-step guide to using assessment results to improve teaching and learning (revised and expanded ed.). Cambridge, MA: Harvard Education Press. 

Carey, B. (2014). How we learn: The surprising truth about when, where, and why it happens. New York, NY: Random House.  

Cohen, D.K. (2011).  Teaching and its predicaments. Cambridge, MA: Harvard University Press. 

Drazan, J.F., Loya, A.K., Horne, B.D., & Eglash, R. (2017). From sports to science: Using basketball analytics to broaden the appeal of math and science among youth. Cambridge, MA: MIT Sloan Sports Analytics Conference.  

Gawande, A. (2007). Better: A surgeon’s notes on performance. New York, NY: Henry Holt. 

Horn, I. (2010). Teaching replays, teaching rehearsals, and re-visions of practice: Learning from colleagues in a mathematics teacher community. Teachers College Record, 112 (1), 225-259. 

Little, J.W. (2012). Understanding data use practice among teachers: The contribution of micro-process studies. American Journal of Education, 118 (2), 143-166. 

McDonald, J.P., Isacoff, N.M., & Karin, D. (2018). Data and teaching: Moving beyond magical thinking to effective practice. New York, NY: Teachers College Press. 

Senge, P. (2006). The fifth discipline (rev. ed.). New York, NY: Doubleday. 

Spillane, J.P. (2014, February 7). Untitled talk at the Spencer Foundation Evidence for the Classroom (EFC) Project meeting. Chicago, IL. 

Spillane, J.P., Parise, L.M., & Sherer, J.Z. (2011). Organizational routines as coupling mechanisms: Policy, school, administration, and the technical core. American Educational Research Journal, 48 (3), 586-619. 


Citation: McDonald, J.P. (2019). Toward more effective data use in teaching. Phi Delta Kappan, 100 (6), 50-54. 

JOSEPH P. McDONALD (@HGLdad) is an emeritus professor of Teaching and Learning at New York University, New York, N.Y. He is a coauthor of Data and Teaching: Moving Beyond Magical Thinking to Effective Practice (Teachers College Press, 2018).
