Piloting the CLA in Ontario

Research Summary:

Large-scale Learning Outcomes Assessments Should Focus on Student Needs to Increase Participation

The Council for Aid to Education’s (CAE) Collegiate Learning Assessment (CLA) and its college-sector equivalent, the Community College Learning Assessment (CCLA), are standardized tests that assess a postsecondary institution’s contribution to the development of students’ key competencies, including critical thinking, analytic problem solving and written communication.

However, a new study by the Higher Education Quality Council of Ontario (HEQCO) finds that the CLA/CCLA’s short testing windows and lack of direct feedback to participants make it difficult to attract enough participants to gather meaningful data. The CAE has recently developed the CLA+, which could provide valuable information and feedback to students. This student-centred approach may be more effective in recruiting participants and yielding better data.

Project Description

The 90-minute written test consists of three online assessment components: two analytical writing tasks and a performance task, which require students to analyze complex materials and construct a well-reasoned argument. The CLA/CCLA also assess the “value added” of education by accounting for several external factors, such as socio-economic status or prior educational attainment, that may affect performance but do not reflect an institution’s impact.

HEQCO supported eight Ontario colleges and universities in piloting the CLA/CCLA to assess their value to institutions and, subsequently, to determine their usefulness and effectiveness for government, employers and students. The participating institutions were: Fanshawe College, Humber College Institute of Technology and Advanced Learning, McMaster University, Mohawk College and McMaster University, Queen’s University, Sheridan College Institute of Technology and Advanced Learning, University of Guelph and University of Windsor. While the approach and methodology differed across institutions, most of the testing took place in spring and fall 2012.

Findings

Despite significant institutional efforts, participation rates were low, which limited the data analysis that could be conducted. Most participating institutions sought voluntary participation from students outside of class time. Recruiting students and scheduling test sessions proved extremely challenging, especially given the limited testing windows (February 1 to April 15 and August 15 to October 31), which conflicted with several institutions’ academic calendars. Two of the pilots, Queen’s and Mohawk/McMaster, improved participation rates by embedding the test in class time, but this created planning challenges and required numerous levels of approval.

In addition to the small sample sizes, institutions raised data concerns, including the self-selection bias of volunteer participants and uncertainty about student effort, since the CLA/CCLA had no grade impact or future value for participants. These issues are common to most large-scale, low-stakes assessments and can often be mitigated through careful research design. Because the CLA/CCLA are proprietary products, individual student responses are not made available to participants, faculty or institutions. As a result, the assessments do not support additional institutional uses, such as early identification of how individual students are performing.

The few institutions that did find the data valuable suggested the results could be used to benchmark critical thinking and writing against other institutions, identify institutional value added and evaluate the effectiveness of particular teaching strategies.

HEQCO has convened a consortium of six Ontario colleges and universities to work on the assessment of general learning and cognitive skills. Additionally, HEQCO will publish a practitioner’s handbook on learning outcomes assessment.

Piloting the CLA in Ontario was written by Mary Catharine Lennon, Higher Education Quality Council of Ontario (HEQCO).