If you were to read all that is out there about quality assurance in higher education, you might be left with the impression that quality assurance is a complicated, nuanced, arduous concept and process that defies rigorous measurement, and one that will take a decade or more to get right.
I am fond of saying that there are few mysteries in life and measuring and assuring the quality of a higher education is no exception. To the contrary — it’s really quite simple. Let me explain.
We develop academic programs so students will learn the things we think are important — things we think they should know and be able to do. What is quality and quality assurance? Simply measuring whether students who take these programs have, in fact, acquired the information and skills the programs purport to teach.
It would be presumptuous of me to tell curriculum designers what content and skills are important. I leave that to the experts in the discipline. But if I want to know or assure myself that they have developed a quality program, then all I really need to know is whether the program successfully instilled this knowledge and skill set in the students who graduate from it.
We have absolutely no difficulty with this simple quality-assurance approach when we talk about content. Instructors spend a lot of time teaching content they deem important, evaluating whether students have absorbed this content and credentialing how much content students have learned by assigning them a course grade that appears on an official transcript. For reasons unclear to me, all of this gets hopelessly muddled and complicated when we start considering skills like critical thinking, communication, literacy, etc. These are skills that we say we want students to learn, and we have no dearth of educators and administrators quite prepared to assert, often with little evidence, that these skills and competencies are acquired. But assertions are, or at least should be, insufficient for sensible and meaningful quality-assurance systems. The essence of quality assurance is to assess — with real measurements — whether these skills and competencies are, in fact, acquired.
Instead of this simple and direct approach to quality assurance, we have designed bureaucratic, administratively burdensome processes that measure a whole bunch of things that may or may not reflect the quality of a program — things like library resources, the number of instructors with a PhD in a program, the number of courses, the presence or absence of an experiential component, the length of the program, etc. This has created a sizable bureaucracy that kills many trees and employs many people but does little to really assure anyone — certainly not a skeptic with a critical mind — about the quality of the program under scrutiny. The current approach to quality assurance would be akin to evaluating whether students have acquired the concepts and content a program is designed to teach by measuring the length of their answers on an exam or the type of pen they used.
I am particularly sensitive to this quality-assurance issue because my colleague, Martin Hicks, and I recently returned from a Bologna Process Researchers’ Conference, where we presented a paper on performance measurement and heard a lot about quality assurance. To me, it was striking how much attention has been paid to this matter in Europe and how little tangible progress has been made (at least as reflected in the papers presented at this conference). There are still many active discussions about what “quality” in higher education means (with some sober commentators prepared to acknowledge that no one really knows what it means — or at least that there isn’t any consensus), endless analyses bordering on the Talmudic of how to classify different quality-assurance regimes in different countries and what the right measures of quality are, and continuing work on the development of qualifications frameworks and learning-outcomes inventories. In my view, the discussion in North America has thankfully moved a little beyond this to focus on the critical role of assessment; specifically, how we know whether these desired qualifications and outcomes are actually achieved. I would direct readers to the approach of the National Institute for Learning Outcomes Assessment in the US and the one we at HEQCO have taken as examples of direct assessment. There are appropriate criticisms of the way we do quality assurance in Ontario. But my sense is that the province at least recognizes that proper quality assessment requires actually measuring whether desired learning outcomes have been achieved. In the spring, we will report the results of a large trial we conducted in Ontario that used an online version of the OECD’s PIAAC test to measure learning gains in literacy, numeracy and problem-solving skills in college and university students from the time they start their programs to when they graduate.
So, as is customary this time of year, here are two of my New Year’s resolutions regarding quality assurance in higher education.
First, I will continue to advocate the simple idea that quality assurance is no more than measuring whether the knowledge, skills and competencies a program was designed to foster and develop have actually been acquired. And I will argue that quality-assurance processes and bodies should focus on this assessment and purge themselves of the voluminous paperwork and administration that go into collecting indirect and surrogate information that provides little evidence of quality.
Second, I am going to stop using the term “learning outcomes.” Many people, quite legitimately, are turned off by the term because they associate it with the extensive cottage industry that has emerged to develop qualification frameworks and mapping exercises that link qualifications to courses and programs. Instead, I will stick to the essential point: how to measure and credential the skills we think students should acquire as part of a postsecondary program. So, expect to hear me say “skills measurement” and continue to expect HEQCO to devote considerable attention and resources to working with others to figure out how to do this well.
Thanks for reading.