One of the most important lessons I learned in graduate school is that evidence and data matter. So, when there is a problem to be solved, a challenge to be met, or a strategy or policy to be designed, thinking should be informed, shaped and guided by the best evidence, data and information available. Regrettably, in my opinion, this lesson is too seldom evident or influential in the higher education sector. In my experience, decisions made in the higher education sector – both within institutions and government – are swayed more by stories, anecdotes, personal experiences and gut instincts than by evidence. In many reports, my colleagues and I at the Higher Education Quality Council of Ontario have lamented the lack of good, reliable information and evidence to inform our understanding of a problem or its solution. This is ironic, given that informed, deep analysis and critical thinking are, presumably, the bedrock of higher education.
This sorry state of affairs exists at a time when institutions complain, legitimately in my view, that they are inundated by requests for more and more data. Universities complain about the number of FTEs they devote simply to reporting to government. Everyone complains about survey fatigue. What is going on? How can we have so little evidence-based decision making when more and more numbers are being reported and collected?
I think the answer is this: we confuse collecting numbers with having information. With data, context is everything. Data are collected for a purpose – to overcome a challenge, or to inform the design of practices and policies that solve problems. Yet too often the context or purpose of the data sought is not clearly articulated.
Far too often, without clearly identifying the context or issue, we collect a lot of numbers to demonstrate that we are dealing with the problem. And in many cases, the more numbers the better. I have witnessed government meetings that go like this: a problem is identified – e.g. we want more sustainable institutions. A group of people meet and everyone suggests a number they would like to see. Hey, how about a five-year trend or, better yet, a 10-year trend? How about this ratio? How about reporting students as both headcounts and FTEs? The logic appears to be that the bigger the spreadsheet or survey, the greater the likelihood that the information they need is somewhere in there.
The trouble is that larger spreadsheets and reams of numbers don't illuminate; they obfuscate. Sometimes the data set collected is so large that no one really knows what to do with it, so it is relegated to a shelf or, in its more modern form, archived in a password-protected electronic file on some server in the building's basement.
Here are my recommendations for how we can move from a world drowning in questionable data to good information supporting evidence-based decision making.
- Before any data are collected, the purpose of the data collection exercise should be clearly stated. If a reasonable person cannot understand the rationale, or if it is so vague that there is no way to see how the data collected could solve the problem, then no one should fill out the survey or spreadsheet.
- Prior to any data collection, there should be evidence that knowledgeable individuals have been engaged to identify the most critical, relevant or necessary data to inform or solve the problem, or to design the policy. The guiding philosophy: ask for the least amount of data necessary.
- The request for data should be accompanied by clear statements about how the data will be used, by whom and by when.
- Whatever the motivation behind the data request, there should be an obligatory evaluation of whether the program, practice or policy shaped by the data is working. The results of these evaluations should be made public.
Since coming to HEQCO, I have been impressed (more accurately, depressed) by how often our efforts to improve postsecondary education are limited by the absence of meaningful information that would help us understand what is actually going on and where promising solutions may exist. Almost everyone would agree that changes are needed in Canada's higher education sector. But you can't manage what you don't measure. The good news is that what gets measured gets done. So let's start collecting data – but only those data and evidence that illuminate and inform the most pressing problems of higher education in Canada. This is the critical first step toward effective solutions.
This commentary originally appeared in the fall 2016 edition of University Manager, published by the Canadian Association of University Business Officers.