Everywhere these days, schools and non-profits are being asked to prove that they are getting results by gathering and presenting data… tons of data. Foundations want to know that their investments are paying off. State and federal education agencies want to ensure that all children are learning. Businesses want minute-to-minute analytics and dashboards for their shareholders. We rarely question this increasingly urgent push to have access to more and more data about every aspect of our work. And yet, as the data pile up, we feel inundated, facing a veritable tsunami of data, and we wonder whether we can really make sense of it all in any meaningful way.
Many non-profits and schools face an increasing feeling, and reality, of data overload. Teachers feel beaten up with data. Particularly in public education, we seem to use data more as a hammer than as anything else. And yet almost everyone wants more data. Almost no one is asking, “To answer what questions?” Even harder: “What data do we really need to answer those questions?” And, “How would we know if those data actually answered our questions?” Let me make an emphatic point: that we have access to more data than ever does not mean that we can answer important questions about the effectiveness of our schools and other organizations any better, nor does it mean that we must use, or even can use, all of those data for some important purpose.
As we now know (and as many were saying all along), more than a decade and huge sums of money were invested in No Child Left Behind, mostly on the questionable strategy that state standardized tests used for accountability would improve student achievement. This massive effort produced only modest gains in some places. At the same time, it demonized and demoralized teachers and schools, setting them up as targets of reform rather than putting that time and money into developing their professionalism and their professional associations’ capacity to raise their own standards of practice (as a recent article by Jal Mehta in the Harvard Educational Review notes). A similar frenzy seems to be gripping the foundation and non-profit world: a push to produce more and more data about the programs that foundations fund. In a parody of this frenzy that is not far from the truth, non-profits could end up spending more time reporting on their work than actually doing it. It is certainly true that many felt the hours and hours of test prep and testing we did for NCLB, not to mention the narrowing of the curriculum to focus only on what the tests were supposedly measuring, represented a sad waste of time and a distraction from real learning. This is not a sustainable system.
So am I saying that data are not useful? Absolutely not! Data can help provide a compass, guiding us with a north star and a bearing toward where we want to go. Data can act as a roadmap (as long as we remember that the map is not the territory!). Most important, data can serve as part of a reflective cycle of inquiry, a cycle of continuous improvement, at all levels of the educational system and in our non-profits. However, as we shift toward a more balanced use of data, we will want to consider carefully what questions we want to address and what data will meaningfully and effectively address them. We will want to explore what sorts of evidence we really need to address those questions. We will want to expand our notions of what counts as data even as we prune back our massively overgrown data “tree.” And we will want to reconsider how we think about and engage with data. It’s not just a technical question we are addressing.
So, first of all, we need clarity on our questions, and on who is asking them, and on their purposes. Are we exploring a classroom or other learning experience, or a whole school’s effectiveness? Are we looking at program effectiveness? Are we determining the extent to which organizational systems are well matched to program processes and desired outcomes or accomplishments? Are we examining leadership? Are we prototyping a new process or product?
And then, to borrow from Habermas, we might consider data to have a technical aspect, a social or practical aspect, and a critical aspect.
Some technical questions we will want to ask are:
Some social/practical questions we will want to ask are:
Some critical questions we will want to ask are:
If we work to become clear about the Theory of Action of our program, non-profit, or educational project (how do we see the resources we have, and our choices of what we do, leading to the outcomes or accomplishments we want?), then we have a solid place from which to craft good questions about our work, questions that can drive good choices of data to use in our assessment and in our quest for continuous improvement through a cycle of inquiry. That inquiry provides a setting for us to engage as professionals: collecting useful data about our work, analyzing those data, making meaning of the analysis, choosing practices that will help us improve, fine-tuning our work, and building a higher-quality knowledge base to drive our practice. This embedded reflection and knowledge stewardship is at the heart of real improvement.
So, again, I ask: are you a fan of big data? If so, what is your question? Taking into consideration what I have said above, you may find yourself using less data but getting more out of it. That would be sustainable.