Supporting State Department of Education Staff with Internal Decision Making
The results of educational assessments have never been more visible. Over the past 20 years, the reporting of state assessment results has shifted from oft-ignored printed handouts to publicly available online dashboards and report cards. Parents, schools, and the general public have an almost unheard-of level of access to data describing school performance.
States are also responding to an emerging consensus: simply providing access to data is not enough. Scaffolding reports so that users are walked through important questions, and so that stories are told, is key to ensuring assessment results are better used (e.g., Few, 2013; Nussbaumer Knaflic, 2015). Dashboards and report cards need to be carefully tailored to the audience they are meant to serve, guiding that audience through its most important questions.
However, one key audience often overlooked in the reporting of state assessments is the state department itself. We forget that a primary purpose of state assessment is to collect data from schools that can inform state policy and help improve state-supported programs and initiatives. Under the immense pressure of the annual assessment cycle, state staff often have little time to find or develop tools that give them the insight into the data they need. While outward-facing tools have advanced, tools internal to state departments all too often have not.
Some of the work I have recently been involved in has aimed at developing these kinds of internal tools.
In thinking about the type of questions states need to answer and the best way to develop tools to support them, it may be helpful to group approaches into three classes of solutions:
- Ad Hoc Solutions. One-off, or repeated, analyses based on data pulls from the state or vendor data warehouse(s), created in a software package like R, SAS, or SPSS, and disseminated as static reports.
- Commercial Integration. Implement a commercial software solution (e.g., Tableau, Power BI) by building on an existing data pipeline or by developing one alongside the integration.
- Open Solutions. Use open tools to develop analyses summarized by data-driven documents that are portable across states, supported by data pulls or interfaces with the data pipeline.
For many years, the norm has been ad hoc solutions – write a program or develop a routine as needed to answer a specific question. Commercial integration provides greater flexibility through interactive dashboards and reports, and these types of solutions continue to evolve. However, like ad hoc solutions, this approach still involves implementing tools that require a substantial investment of time, not to mention the ongoing subscription costs that accompany them. In my work at the Center, I have found the open solutions approach particularly powerful and promising. The core idea is that questions of interest to one state will likely be of interest to another. For example:
- How can we better understand the growth of students identified as having a disability?
- How much progress are we making toward our goal of eliminating achievement gaps?
- In which schools are students on track to college and career readiness?
In answering questions such as these, there is no need for state after state to reinvent the wheel. Analyses and reporting tools built to answer these questions in one state can readily be ported to another. This portability is key, as analyses and visualizations are time-intensive to develop. By working in an open way, states can share tools, even if the results of those tools are used internally. In future posts and presentations, my colleagues and I will share thoughts on building an infrastructure to support and encourage the development of open tools and offer examples of tools and resources that can be shared across states.
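To make the idea of a portable analysis concrete, here is a minimal sketch in Python of what such a shared tool might look like: a small, state-agnostic function that summarizes achievement gaps from any student-level extract. The column names, the sample data, and the max-minus-min gap definition are all assumptions for illustration, not a description of any state's actual data model.

```python
# Hypothetical sketch of a portable, state-agnostic analysis.
# Any state could point this at its own student-level extract;
# the field names and gap definition here are illustrative assumptions.
from statistics import mean

def gap_summary(records, group_field="group", score_field="scale_score"):
    """Return mean scale score per student group and the max-minus-min gap."""
    by_group = {}
    for row in records:
        by_group.setdefault(row[group_field], []).append(row[score_field])
    means = {g: mean(scores) for g, scores in by_group.items()}
    gap = max(means.values()) - min(means.values())
    return means, gap

# Made-up example data standing in for one state's assessment file.
sample = [
    {"group": "A", "scale_score": 210},
    {"group": "A", "scale_score": 230},
    {"group": "B", "scale_score": 200},
    {"group": "B", "scale_score": 204},
]
means, gap = gap_summary(sample)
print(means)  # {'A': 220, 'B': 202}
print(gap)    # 18
```

Because nothing in the function is tied to one state's schema beyond the (configurable) field names, the same code could be dropped into another state's reporting pipeline unchanged, which is the portability argument in miniature.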