Thoughtful USRI overhaul could encourage effective student feedback
November 2, 2018 —
If you’ve spent a year or two in university, you probably know that an instructor can make or break a class. Are you lucky enough to have a slate of professors with captivating lectures and reasonable assessment measures? It’ll likely be a fulfilling semester. Do you have one or two instructors who make you dread dragging yourself out of bed to get to class? You may be in for a tough couple of months.
Despite the substantial impact good professors have on the quality of the post-secondary experience, it’s difficult for students to assess the quality of instruction when selecting their courses. But potential changes to the Universal Student Ratings of Instruction (USRI) could soon make it easier for students to decide who to spend their semesters with.
University of Calgary vice-provost teaching and learning Leslie Reid presented possible USRI changes at the Oct. 16 meeting of Students’ Legislative Council. The revisions include altering the wording on questions, creating more faculty- and department-specific questions, shifting to online responses and making feedback more accessible to students — all positive steps. But what would USRIs look like in a perfect world?
For students, the biggest problem with using previous USRIs is their accessibility. Viewing USRI results from previous terms requires navigating through the U of C’s terrible course-search interface to another archaic database — “The current USRI software does not support Chrome,” notes the U of C’s website. Once there, data for some course sections is inexplicably missing. Much like the rest of myUofC, USRIs would benefit greatly from a technical overhaul.
Having more accessible USRI results would make them a more appealing source of information compared to RateMyProfessor (RMP), the third-party site many students use to look into professors before choosing courses.
However, the USRIs and RMP face contrasting problems. Due to their in-class format, USRIs can face low response rates, creating a positive selection bias. Those still attending class by the end of the term who respond to the USRI are more likely to have a positive view of their instruction, compared to those who have decided to stop attending lectures. On the other hand, RMP exhibits a negative selection bias — students who had poor experiences are more inclined to voice their disapproval on the unmoderated and anonymous public platform.
The reality of the quality of instruction likely lies somewhere in between what the USRI and RMP indicate. One way to foster more accurate USRI results is to make it easier for all students to complete evaluations. Shifting data collection from in-class to online would help capture a wider sample. Still, having dedicated class time to inform students about the importance of their feedback is also crucial.
One thing that USRIs and RMP both currently get right is respondents’ anonymity. For students to provide candid evaluations, they must know that they will face neither consequence nor benefit for their comments.
We’re paying hundreds of dollars for our courses. If a class sucks, we should feel able to voice our concerns. Though it can, unfortunately, elicit inappropriate and unconstructive comments, anonymity is the only way to ensure students feel confident enough to provide honest feedback.
Worryingly, Reid said during her presentation to SLC that the U of C may consider removing anonymity from USRIs so students would be accountable for their comments. But doing this would immediately strip the USRIs of any credibility, as only opinions expressed without fear of reprisal hold any value. Removing anonymity would be a major misstep for the U of C.
Another major problem with USRIs is their timeline. Instructors can’t read student comments until after they’ve submitted final course marks, due to the risk of compromising the anonymity of a student who is still being evaluated by the instructor. However, this also means that professors can’t integrate student feedback into the remainder of the semester, perpetuating subpar instruction.
Allowing for an official but informal mid-semester feedback session, as Reid suggested, would be a great step toward remedying this disconnect, giving current students a chance to have their comments addressed before the class ends.
While they provide valuable information, it’s clear that the USRIs need some alterations to better serve students. Hopefully, the U of C will place emphasis on making USRIs more accessible and meaningful for students during its revision.
— Jason Herring, Gauntlet editorial board