National Survey of Student Engagement Workshop

The National Survey of Student Engagement (NSSE) Workshop was an important event for HEQCO. It was our first public research venture and focused on two key components of our research plan – learning quality and accountability. It signaled our intent to work closely with established experts to build on existing research and analysis. Finally, it was a valuable opportunity to “test drive” a workshop model that we hope to use on future occasions.

Feedback from the workshop on both content and format was uniformly positive. In no small measure, this outcome reflects the efforts of OISE’s Glen Jones. Glen organized the conference, chaired the proceedings and prepared the rapporteur’s report that forms part of this Web publication. He organized the day around four questions:

  • What have Ontario universities learned from NSSE?
  • How have NSSE findings been used within universities?
  • How have universities responded to these findings?
  • Should NSSE form part of the quality framework in Ontario?

His report is a concise and highly readable account of the presentations and discussion around each question.

Following welcome remarks by the Council’s Chair, Frank Iacobucci, and President, James Downey, the workshop began with presentations by four individuals who have been leaders in bringing NSSE to Ontario universities and interpreting the results. Collectively, as well as individually, the panelists were uniquely qualified to lead the discussion. Two presenters, Tony Chambers and Phil Wood, are faculty members and academic administrators, and two, Chris Conway and Louis Mayrand, head their respective universities’ institutional analysis divisions.

Several themes emerged from these presentations:

  • NSSE owes much of its currency to the fact that it is rooted in solid scholarly research on student learning.
  • The real value of the NSSE survey lies in the individual questions rather than the summary or benchmark results.
  • Drilling down into the NSSE results reveals interesting patterns of engagement among Faculties and programs.
  • Institutions need to be very careful in interpreting the drill-down results, as sample sizes can be quite small.
  • The very act of producing and presenting the data generates considerable interest among faculty and students.
  • Institutions are only beginning to consider how to use NSSE results to adjust academic practices and student services.
  • When linked to an institution’s strategic plan, NSSE can be a useful performance indicator, but it is not intended, and should not be used, for ranking purposes.

A key objective of the workshop was to identify fruitful avenues for further research, and in this we were not disappointed: the day’s discussions generated several ideas for follow-up research on learning quality.

The key to using NSSE in learning quality research is to appreciate what the survey is and what it is not. NSSE is a measure of student engagement; it is not a gauge of actual learning outcomes. A considerable body of scholarly research over many years, however, suggests that the two are highly and consistently correlated. Thus engagement can serve as a proxy, or more formally as an instrumental variable, for what we are truly interested in knowing, namely actual learning outcomes. Since engagement is much easier and cheaper to measure, NSSE results provide a convenient starting point for understanding student learning.

As already noted, panelists reported that disaggregating NSSE results revealed interesting variations among Faculties and programs. This finding raises the obvious question of how to explain the variations in NSSE results in terms of what we think we know about student learning. The real potential of NSSE, however, lies in the fact that hypotheses generated in the course of these deliberations can be tested experimentally. Specifically, teaching and learning experts can suggest interesting interventions, and the NSSE survey can then be designed to track engagement in these interventions and compare the results to those for the rest of the student population. We will soon be announcing a collaborative, multi-institution project of this sort, and we are discussing a number of additional research ideas with teaching and learning and institutional analysis leaders.

HEQCO wishes to thank Glen Jones and his very capable graduate students for their valuable work, the panelists for their presentations and for leading the afternoon break-out groups, and all those who attended and participated in the discussion.

We wish to note in closing that we will be holding a similar event for the college sector on October 17, and will be posting the proceedings of that event shortly thereafter.

Ken Norrie
Vice President, Research