
Harvey P. Weingarten – Suggestions for Better Performance Management of Postsecondary Funding

Today, the Higher Education Quality Council of Ontario (HEQCO) released a major evaluation of the outcomes of the investment that Canada's federal and provincial governments have made in graduate education. This report, Intentions For and Outcomes Following a Decade of Government Investment in Graduate Education, continues a series of papers we have published on graduate education.  The previous papers were restricted to Ontario.  Prepared by Fred L. Hall, former dean of graduate studies at McMaster University and the University of Calgary, and HEQCO researcher Hillary Arnold, the current paper represents a capstone publication because it extends these analyses to all of Canada.

In contrast to the famous New Yorker cartoon in which two people argue about a movie that neither has seen, though both have read the reviews, I did read the whole report, which is long, detailed and as comprehensive as the data permit.  It is not a casual read and I will not review it here.  However, the paper led me to several overarching conclusions about how Canadian governments invest in higher education and how they monitor the consequences of these public investments.

First, the investment in graduate education in Canada over the last decade has been significant.  Since 2003, the federal government has invested an incremental (beyond what it was investing in 2002) $806M in graduate studies in Canadian universities.  Collectively, the provinces have invested an incremental $1.7B.  And these are only the funds that these governments targeted specifically to graduate studies.  Presumably, the universities also directed additional funding to graduate studies through internal scholarships and philanthropic contributions.

These investments are associated with an increase in graduate student enrolment.  Not surprisingly, the greatest enrolment increases occurred in Canada's more research-intensive universities, and provinces that invested more in graduate expansion saw greater enrolment increases.  However, as the authors note, the positive trajectory of graduate student expansion that was evident prior to 2003 makes it unclear how much of this increased graduate enrolment is directly attributable to the increased dollars.

So, if one of the outcomes we wanted was more graduate students, the data suggest we got it.  But, presumably, this large investment was intended to yield benefits other than simply “more.”  For example, was the increased investment intended to increase the quality of Canada’s graduate student cohort?  Was it intended to increase graduate numbers specifically in programs and disciplines that might fuel innovation, entrepreneurship or industries most relevant to Canada’s economy?  Was the investment intended to facilitate and improve technology and knowledge transfer between universities and the private sector?  Was it intended to make graduates more “job ready”?  Was it intended to shorten the time it would take students to complete their degrees or increase the number of entering students who actually graduated?   The question is: if these were intended outcomes, were they actually achieved?

The problem is – and this is where the report is revealing and troubling to me – we don’t know the answers to these questions.  Why?  Because as the authors note:  “Despite our efforts at data collection, with the cooperation of various agencies and governments and our detailed analysis, we can identify with certainty only a very limited set of outcomes that these significant investments achieved.  In some cases, it was difficult to identify exactly what they were intended to achieve.”  Simply put, we were not always articulate about what we hoped to achieve with our investment (except perhaps for wanting “more”) and even when some intention was identified, we did not express it in a way that made it clear or obvious how to measure whether the objective was achieved.

One of the tenets of good management is to evaluate the success or outcome of any program one initiates.  It is disconcerting, then, to think that we can make significant investments of public funds and initiate long-term programs we think are important without designing them in ways that allow us to monitor performance and achievements adequately.  A cynical colleague of mine suggests that sometimes this ambiguity may be deliberate.  He points out the difficulty of getting any government program launched and argues that "ambiguity helps get things approved."  Perhaps true.  But it is ambiguity about desired outcomes and a lack of clarity in the measures used to monitor performance that make evaluation and accountability difficult.

It is not useful simply to carp.  To be constructive, I offer the following suggestions for future occasions when we launch major investments in higher education or other public sectors.

First, at the time that an investment is made, the investor (government) should articulate the purpose of the investment, intended outcomes, and metrics that will be used to assess progress.

Second, some percentage (let’s say 3%) of any public investment should be set aside at the outset for monitoring and evaluating the outcomes of the investment.  There should be requirements for periodic progress reports and a comprehensive performance assessment at relevant intervals.  At the time of investment, the investor should identify who is responsible for conducting these assessments.

Third, the normative principle should be that public institutions, such as colleges and universities, are required to disclose the data they collect about programs and projects supported by public funds.

These are not deep public policy recommendations.  They are just sound performance management techniques.  The beauty of good performance management is that it often leads to more successful achievement of desired outcomes and goals.  One might even harbour the view that more rigorous assessment of performance and greater disclosure of such evaluations might result in greater public support for public postsecondary education.

Thanks for reading.

-Harvey P. Weingarten

One reply on “Harvey P. Weingarten – Suggestions for Better Performance Management of Postsecondary Funding”

It would be a lot of work to specify in detail the intended outcomes of every component of public (or private) spending, and even more work to evaluate those outcomes; the results would often be inconclusive. Which, in addition to the benefits of ambiguity, may explain why such discipline is rare.

But Dr. Weingarten is right that more rigor in specifying the intended outcomes of public investment and more evidence demonstrating results would likely increase public support for postsecondary education. Such discipline would go a long way toward justifying our oft-asserted claims of moral and intellectual leadership. The discipline of visibly setting ambitious objectives, even when we fall short, followed by a renewed commitment to achieving them, would be an improvement over vague objectives and soft evidence. This could be a powerful way to build public confidence in our sector.
