
To use data dashboards effectively, principals will need more and better professional development from their districts. 


For the last 17 years, under both No Child Left Behind and the Every Student Succeeds Act, school leaders have been mandated to employ data-driven decision-making (DDDM) to diagnose student needs, implement targeted supports, and design school improvements (Player et al., 2014; Wayman et al., 2013). Typically, districts provide principals with data on attendance, grades, suspensions, and other easy-to-quantify measures (Gulson & Webb, 2017), and in turn, those principals are expected to interpret the data and use it to inform their work (Goldring & Schuermann, 2009).  

Yet, it has been difficult to help school leaders develop either the capacity for DDDM or the commitment to use it. Many principals and teachers distrust the validity and reliability of the data, which they associate with the “gotcha” tactics of high-stakes accountability reform (Fullan et al., 2015; Grissom et al., 2017). Many also disagree with the metrics used to measure indicators such as climate, student proficiency, and college readiness, especially when the quantitative data appear to conflict with practitioners’ working knowledge about those indicators. Further, many schools have no organizational routines in place to foster regular examinations of the data (Spillane et al., 2002). And finally, many schools struggle with the technical demands of managing, retrieving, and analyzing data (Williamson, 2017).  

In response to these challenges, some districts have invested in data dashboards, hoping they will simplify the work of DDDM and make it more attractive to practitioners. Single-screen dashboards, in particular, allow school leaders to monitor, compare, or evaluate trends at a glance (Few, 2013). These dashboards can display multiple sources of data in one easy-to-read system, cutting back on the time it takes to log in to various platforms, and requiring less expertise for manipulating data sets (Wayman et al., 2004). 

As with any initiative that involves technology, though, all sorts of things can go awry. For instance, schools may have limited internet connectivity, hardware problems, or a lack of technical know-how. Additionally, the behind-the-scenes work of creating and updating data spreadsheets can require enormous amounts of time and effort, and it can be expensive to pay for dashboard platforms, user licenses, and maintenance.  

Further, it can be particularly challenging for districts to engage principals in efforts to build capacity for DDDM. District staff themselves may not have the capacity or time to support principals’ professional development (PD) (Darling-Hammond et al., 2007). Also, principals may not be receptive to the PD that’s offered to them (Honig, 2012), and even if they do value that advice and support, they may not be able to follow through on it, given the daily demands of their job (Grogan & Andrews, 2002).   

At the same time, DDDM holds great promise for transforming school leadership, and ultimately teaching and learning. Applying real-time data to shape school and classroom interventions has the potential to allow schools to calibrate their interventions to their school context and to help with the identification and reduction of inequalities in schools (Bryk et al., 2015; Finnigan, Daly, & Che, 2013), in a way other reforms do not.  

To learn more about the challenges and opportunities associated with DDDM, we conducted an intensive study of one district’s effort to develop principals’ capacity to analyze, manage, and make good use of their school-level data. In this case, the district launched a comprehensive data dashboard and sponsored a bimonthly PD series to help principals learn how to use this data to more effectively meet several outcome goals for the elementary, middle, and high schools throughout the district. Conversations with key stakeholders during the first year revealed several important lessons for building upon this ambitious effort in the second year of the intervention, which we share here to inform similar efforts elsewhere.   

Four lessons 

We conducted our research in an urban district in the United States that developed an interactive data dashboard displaying current and historical information on quantitative school-level trends. During the 2017-18 school year, central office staff collaborated to provide a bimonthly series of workshops for the district’s principals to support their use of the dashboard to analyze trends and implement data-based solutions.    

In the spring of 2018, we observed two full days of DDDM workshops and interviewed 20 central office staff members, four principal supervisors, and one external consultant involved in the work. We also ran focus groups with 12 elementary, middle, and high school principals who had participated in the workshops. In our analysis of these data, we found that participants highlighted four main suggestions for other districts to consider: (1) Clearly define the goals for capacity building, (2) Plan for changes in PD culture and alignment, (3) Anticipate additional technical requests and desires for assistance, and (4) Build trust in DDDM.  

#1: Clearly define the goals for capacity building. 

The district’s commitment to implementing a multiyear DDDM intervention is commendable in and of itself. The overarching goal of the initiative was to equip principals with tools for meeting districtwide outcome goals specific to elementary, middle, and high schools. To that end, the central office regularly convened all principals, along with representatives from the district offices that provide supports to schools, creating invaluable opportunities for information dissemination and networking. This helped get the members of a large organization on the same page in pursuit of a common set of goals. Yet, as with any new initiative, the first year of the DDDM workshops brought typical implementation challenges. 


One recommendation that emerged from interviews with the PD planners and participants was to specify more clearly the purpose of each workshop scheduled throughout the two-day sessions, and to show how the sessions aligned with the overarching vision for the DDDM intervention. The workshops were perceived to be addressing several goals at once: training principals in the technical side of using the new data dashboard; offering PD on strategies for data analysis, intervention selection, and performance management; and providing opportunities for networked professional learning, in which principals shared DDDM approaches they had found helpful and relevant to the trends in the data. Many of the central office staff members, principal supervisors, and principals we interviewed indicated that the workshops would have benefited from more clarity about which of these purposes each individual workshop was meant to serve, to help shape the nature and intended outcomes of participants’ interactions.  

As several interview participants reminded us, there is a time and place for each of these goals. Training on the data dashboard is needed to familiarize principals with the new tool. PD on new strategies is needed when the district’s vision is to develop principals’ DDDM leadership capacity. Networked professional learning is needed for principals to reflect on their own DDDM practices. While the district pursued all of these goals over the course of the year, one recommendation is to communicate explicitly which goal a given workshop is meant to address, relay that goal clearly to the workshop planners and participants, and then design the workshop to target it.  

If training is the goal, then the structure of the meeting should be technical in nature. It should begin with an overview of the dashboard, and principals should be able to practice using the tool. Then they should have the time to ask questions in a psychologically safe environment (keeping in mind that people sometimes feel vulnerable asking questions about data and technology). The facilitators of this meeting should be those who are most familiar with the tool and those who can answer principals’ questions about the expectations for how they should use the tool.  

If PD is the goal, then the structure of the meeting should mirror the kinds of classroom teaching and learning the district encourages: differentiated direct instruction, small-group practice (e.g., fishbowl activities, role plays, collaborative action planning), independent practice, assessment of learning, and communication about next steps. The facilitators should be people the principals will regard as credible leadership experts.  

If networked professional learning is the goal, then the structure of the meeting should involve cohort discussions in which principals are strategically grouped to share their experiences and serve as support systems for one another. Cohort discussions can center on school-based problems of practice (Coburn, Penuel, & Geil, 2013; Darling-Hammond et al., 2007) or on meaningful deep dives into the data introduced in the technical training and the strategies introduced in previous PD sessions. Facilitation, in this scenario, is flexible.  

#2: Plan for changes in PD culture and alignment.   

In our study, respondents noted that the DDDM professional development called for principals to take charge of their own learning. Some principals were asked to colead sessions or speak on panels about how they used data to inform decisions about school interventions, and all principals were allowed to choose which sessions to attend. This contrasted with traditional approaches to PD, in which principals reported taking a less active role. Interview respondents believed that the culture of principal PD was shifting in a positive direction, but that to increase their ownership over their learning experiences, principals needed to be part of the PD planning process and get used to “flexing their muscles” as leaders with expertise to share with their colleagues. Acknowledging this point, the district set up a principal planning committee during the second year of the DDDM initiative.  

Additionally, interview respondents were eager to hear from the central office how they could use the data dashboard to further the district’s broader goals for student learning and school performance. They also suggested in future iterations that the DDDM meetings be aligned with other PD the district sponsored for principals, to allow deep engagement in a consistent area of focus.   

#3: Anticipate additional technical requests and desires for assistance.    

The data dashboard that the district rolled out as part of this DDDM intervention was generally well received. Several principals expressed how helpful it was to now have access to both this year’s trend data and historical data, and a few talked of using the data in their weekly staff newsletters and in communications with parents.  

Yet when the dashboard was initially rolled out, principals responded by requesting technical functionalities beyond what it provided. For example, the district intentionally built a dashboard that provides real-time information on school-level data points aligned with the district’s outcome goals. Yet principals also wanted individual student-level data to help them make more context-specific decisions about next steps. Though this student-level data can be found in other district data systems, principals requested that it be integrated into the dashboard. And because the dashboard purposely contains system-level data, it displays trends that do not reflect day-to-day changes in student enrollment and attendance. Principals wished to see not only the system-level trends but also the day-to-day fluctuations in the data.   

Principals also recommended that the central office send data analysts to meet with principals directly at their school sites to support individual data needs, or make them available through a central office hotline. A handful of individuals within the central office have the programming abilities to update and refine the dashboard, and while their support is appreciated, principals wished for more hands on deck to assist with their individualized needs. Further, they recommended sending these experts out to train other administrators and teachers to use the dashboard as well, easing the burden on principals. On the positive side, these requests demonstrate that principals were engaged and very much wanted to use the dashboard.   

#4: Build trust in data-driven decision making. 

DDDM can be intimidating for those without much data analysis experience, and providing feedback on DDDM can also be intimidating, as it might reveal some of these vulnerabilities. Knowing this, the district provided regular, anonymous opportunities for principals to provide their feedback on the effectiveness of the workshops.  

Still, several of our respondents told us that there were principals who withheld their feedback because they didn’t want to expose their own weaknesses with data analysis or technology. Other principals were skeptical that their feedback would result in meaningful change. 

It will be difficult to successfully build local capacity for DDDM if principals and other educators are uncomfortable providing input, or if they feel safest staying under the radar (Johnson et al., 2015). Just as principals should strive to make decisions based on accurate data, district leaders should repeatedly communicate that they want and expect genuine feedback, signaling that they “are not simply acting out a symbolic drama for public consumption, but are seeking actual instrumental results” (Thompson et al., 2008, p. 20), and then show how they are acting upon this feedback. 


To build a high-performing school system, district leaders must convince principals that they are committed to learning from and with them (Honig, Lorton, & Copland, 2009). Arranging for a third-party study of a new initiative, as this district did, is a powerful indicator that they are open to feedback to inform continuous improvement. Even so, it may take more examples of this, and clear communication about how they used the feedback to improve their system, to continue to build trust with their constituencies. 

Creating the conditions for capacity-building 

In past decades, research has offered little guidance to central office leaders about how to create effective PD experiences for principals. This is partly because principals have tended to be given few professional learning experiences at all, much less ones that researchers can study. That’s true especially for efforts to build principals’ capacity for data-driven decision making. While DDDM has been mandated for more than 15 years, few pre- or in-service programs focus on the necessary skills (Bowers, 2017; Kowalski et al., 2011). Thus, this opportunity to study and learn from a districtwide DDDM capacity building initiative has been invaluable.  

Undergirding the four lessons we’ve identified is the belief that district leaders can create the conditions for meaningful professional learning in this area. Once a dashboard is created (a massive undertaking in and of itself), districts have the opportunity to shape these conditions in anticipation of some of the challenges of adopting new approaches. We must think carefully about the purposes for convening principals for learning about the dashboard, the ways in which those learning opportunities call for cultural and PD alignment shifts, the technical assistance principals are likely to request, and the kinds of feedback mechanisms that will encourage principals to participate and that will deepen their comfort with organizational change.    

References 

Bowers, A.J. (2017). Quantitative research methods training in education leadership and administration preparation programs as disciplined inquiry for building school improvement capacity. Journal of Research on Leadership Education, 12 (1), 72-96. 

Bryk, A.S., Gomez, L.M., Grunow, A., & LeMahieu, P.G. (2015). Learning to improve: How America’s schools can get better at getting better. Cambridge, MA: Harvard Education Press.  

Coburn, C.E., Penuel, W.R., & Geil, K.E. (2013). Research-practice partnerships: A strategy for leveraging research for educational improvement in school districts. New York, NY: William T. Grant Foundation. 

Darling-Hammond, L., LaPointe, M., Meyerson, D., Orr, M.T., & Cohen, C. (2007). Preparing school leaders for a changing world: Lessons from exemplary leadership development programs. Stanford, CA: Stanford Educational Leadership Institute. 

Few, S. (2013). Information dashboard design (2nd ed.). Burlingame, CA: Analytics Press. 

Finnigan, K.S., Daly, A.J., & Che, J. (2013). Systemwide reform in districts under pressure: The role of social networks in defining, acquiring, using, and diffusing research evidence. Journal of Educational Administration, 51 (4), 476-497. 

Fullan, M., Rincón-Gallardo, S., & Hargreaves, A. (2015). Professional capital as accountability. Education Policy Analysis Archives, 23, 15. 

Goldring, E., Grissom, J.A., Rubin, M., Neumerski, C.M., Cannata, M., Drake, T., & Schuermann, P. (2015). Make room for value added: Principals’ human capital decisions and the emergence of teacher observation data. Educational Researcher, 44 (2), 96-104. 

Goldring, E. & Schuermann, P. (2009). The changing context of K-12 education administration: Consequences for Ed.D. program design and delivery. Peabody Journal of Education, 84 (1), 9-43. 

Grissom, J.A., Rubin, M., Neumerski, C.M., Cannata, M., Drake, T.A., Goldring, E., & Schuermann, P. (2017). Central office supports for data-driven talent management decisions: Evidence from the implementation of new systems for measuring teacher effectiveness. Educational Researcher, 46 (1), 21-32. 

Grogan, M. & Andrews, R. (2002). Defining preparation and professional development for the future. Educational Administration Quarterly, 38 (2), 233-256. 

Gulson, K.N. & Webb, P.T. (2017). Mapping an emergent field of ‘computational education policy’: Policy rationalities, prediction and data in the age of Artificial Intelligence. Research in Education, 98 (1), 14-26. 

Honig, M.I. (2012). District central office leadership as teaching: How central office administrators support principals’ development as instructional leaders. Educational Administration Quarterly, 48 (4), 733-774. 

Honig, M.I., Lorton, J.S., & Copland, M.A. (2009). Urban district central office transformation for teaching and learning improvement: Beyond a zero sum game. Yearbook of the National Society for the Study of Education, 108 (1), 21-40. 

Johnson, S.M., Marietta, G., Higgins, M.C., Mapp, K.L., & Grossman, A. (2015). Achieving coherence in district improvement: Managing the relationship between the central office and schools. Cambridge, MA: Harvard Education Press.  

Kowalski, T.J., McCord, R.S., Peterson, G.J., Young, P.I., & Ellerson, N.M. (2011). The American school superintendent: 2010 decennial study. Lanham, MD: Rowman & Littlefield. 

Player, D., Hambrick Hitt, D., & Robinson, W. (2014). District readiness to support school turnaround: A users’ guide to inform the work of state education agencies and districts. San Francisco, CA: Center on School Turnaround at WestEd. 

Spillane, J., Reiser, B., & Reimer, T. (2002). Policy implementation and cognition: Reframing and refocusing implementation research. Review of Educational Research, 72 (3), 387-431. 

Thompson, C.L., Sykes, G., & Skrla, L. (2008). Coherent, instructionally-focused district leadership: Toward a theoretical account. East Lansing, MI: The Education Policy Center at Michigan State University.  

Wayman, J.C., Spikes, D.D., & Volonnino, M.R. (2013). Implementation of a data initiative in the NCLB era. In K. Schildkamp et al. (Eds.), Data-based decision making in education (pp. 135-153). Dordrecht: Springer. 

Wayman, J.C., Stringfield, S., & Yakimowski, M. (2004). Software enabling school improvement through analysis of student data (Report No. 67). Baltimore, MD: Center for Research on the Education of Students Placed at Risk. 

Williamson, B. (2017). Learning in the “platform society”: Disassembling an educational data assemblage. Research in Education, 98 (1), 59-82. 

 

Citation: Pak, K. & Desimone, L.M. (2019). Developing principals’ data-driven decision-making capacity: Lessons from one urban district. Phi Delta Kappan, 100 (7), 37-42. 

ABOUT THE AUTHORS


Katie Pak

KATIE PAK is a doctoral candidate at the University of Pennsylvania Graduate School of Education, Philadelphia.


Laura M. Desimone

LAURA M. DESIMONE is the director of the School of Education and director of research at the College of Education and Human Development,  University of Delaware, Newark.