A look at how students grade (or degrade) instructors
At the end of every quarter, many students take the opportunity to complete course evaluations, either on paper or online, but how their responses are actually used may be a mystery to many of them.
In 2013, UC Davis launched the online Academic Course Evaluation (ACE) system, and most departments have since switched to it from paper evaluation forms.
Scott Kirkland is the application architect for the College of Agricultural and Environmental Sciences Dean’s Office programming team. The Online Course Evaluations system was one of the projects of the development team he manages, which writes custom applications for UC Davis.
Kirkland discussed how the majority of departments on campus have opted to make the switch from paper to online course evaluations.
“Even though the system is completely optional, most academic departments are using the online system, well over 75 percent of all campus and med center departments, and a higher percentage than that of core campus departments,” Kirkland said. “Occasionally we’ll see a department where every professor except for one uses the system, and that’s ok too.”
Professors who stick with paper miss out on the efficiency of the online system and create extra work for staff, since responses that could be processed automatically online must instead be transcribed by hand.
Kirkland explained how the responses from students are passed to the instructors, noting the information that individual professors and TAs have access to.
“Professors and TAs are treated similarly in online evaluations,” Kirkland said. “They can see the responses to any course-level question and then only the person-level questions that apply to them. So a TA cannot see what someone said about a professor and vice versa. Basically it’s some nice bar charts of how students responded to each question asked, plus the text responses if applicable.”
Tracy Lade, the chief administrative officer and department manager for the Department of Physics, explained that the default for her department is to use the online evaluations, but that some instructors still opt to use paper evaluations.
“The decision to switch to online evaluations was driven by convenience for the students in the courses and by staff workload,” Lade said. “Processing online evaluations for high enrollment classes is more efficient.”
Anya Gibson, an administrator for the history, economics and East Asian studies programs, explained why the economics department chose to switch to the online system while the history department opted to continue using the paper forms.
“From a staff standpoint, the economics department wanted to move because it would save funds and people hours needed to get the hard copies of the evaluations, print them, and get the envelopes and pencils,” Gibson said. “It costs much less online, but the history department just isn’t sold on it.”
Adam Getchell, the director of information technology for the Computing Resources Unit for the College of Agricultural and Environmental Sciences Dean’s Office, said that some professors worry about lower response rates if evaluations are moved online.
“The perception remains that online surveys get a lower response rate than paper ones,” Getchell said. “Strategies such as extra credit or early viewing of grades have been proposed, but to date we are not allowed by the Academic Senate to provide these types of incentives for completing evaluations.”
In addition to the possibility of lower response rates with online forms, faculty may worry that their scores will be artificially lowered. Gibson disputed this, citing studies that are also mentioned on one of ACE’s information pages.
“Studies have been completed at several UCs showing that scores do not change even if fewer students respond,” Gibson said.
As for the questions asked on the evaluations themselves, UC Davis as a whole requires two questions, but the faculty of each department also decide which additional questions they would like to include.
Many students may be curious how the information they provide through course evaluations is used and who has access to it. Does only the instructor read them, or are criticisms and negative trends monitored by the department? Does the information provided actually lead to change?
“Each department is allowed to choose its permissions, but almost every department just has one or two staff members who control setting up and releasing evaluations,” Kirkland said.
Lade, along with people from the economics and geology departments, discussed the process of reviewing and analyzing the course evaluations.
“Student evaluations of faculty and lecturers are used during their periodic advancement review or during the reappointment process in the case of a temporary lecturer,” Lade said. “TA evaluations are monitored and they inform future TA appointments.”
Lade spoke about the importance of properly handling criticisms and negative trends about instructors.
“Negative feedback is taken seriously and is addressed in a way that’s appropriate to the situation,” Lade said. “Perhaps a mentoring conversation is indicated, or perhaps referral to the Center for Educational Effectiveness to take advantage of their services would be helpful. In the case of a temporary lecturer, the matter would be addressed in the review and reappointment process. Persistently low student evaluations may result in non-reappointment.”
The online system is designed to ensure that responses remain confidential. Kirkland also made it clear that responses from students are in no way adjusted or moderated. Departments do take care to distinguish between unjustified negative comments and valid criticisms of an instructor’s behavior or teaching style that should be addressed.
“The instructor sees everything that was said for them and about them, so a student could say whatever they want, even something profane,” said Mandy Hanou, an advisor for the department of earth and planetary sciences and the manager of course evaluations for geology. “That goes straight to the instructors, and they know what their students are saying about them.”
If there is a concern, the department chair is notified. Giovanni Peri, the chair of the economics department, explained how the department usually deals with negative evaluations of instructors and the role that this student-provided information plays in a professor’s merit evaluation, which occurs every two years.
“Faculty are evaluated on research, teaching and service,” Peri said. “Student evaluations are considered important (but not the only) inputs in evaluating the teaching of a professor.”
While a few negative evaluations typically do not cause alarm, patterns and trends are monitored.
“A pattern of poor evaluations in one course that points to a concern that seems legitimate (lack of attention to students’ needs, neglect of preparation, sloppy classes) is looked at very carefully,” Peri said.
As a department chair, Peri may talk with the professor to determine if specific circumstances led to negative performance evaluations in certain classes. Chairs also make a plan with the professor for how to fix the problem.
“If the problem persists there may be consequences, including the possibility of not obtaining a merit promotion,” Peri said. “The problem needs to be pervasive, well-established and persistent for there to be consequences. My experience is that evaluations which are not good are enough of an alarm bell for a professor to focus and figure out what is not working and improve.”
The hard work that went into developing the online system has paid off in a faster turnaround, so much less time passes between the end of a course and instructors receiving the results.
“We haven’t gotten a ton of feedback from professors, I think most of them just sort of ‘get it’ and wonder why we didn’t have it on campus earlier,” Kirkland said. “They really like being able to see evaluations days after finals are over instead of months later.”
One of the unexpected benefits of the system has been that students can easily access course evaluations via their mobile devices.
“Over one third of all website views are from mobile devices, which were pretty novel back when this system was written, and I’m glad we spent the time to support them well,” Kirkland said.
There have still been some minor unexpected snags along the way.
“On the technical side, I didn’t anticipate the difficulty of dealing with 40,000-plus students all hitting your website during a limited time window,” Kirkland said. “We have auto-scaling platforms set up now, so as demand ramps up, our pool of available servers automatically grows to handle the load.”
Kirkland added that one of the best aspects of the online system is that it is oriented toward privacy, anonymity and student expression.
“There is absolutely no link between a student and their responses, and there is no data that connects a given response with the time, location, browser, etc., so evaluations can be completely truthful and anonymous, just like they should be,” Kirkland said.
Written by: Benjamin Porter — email@example.com