
Evaluating Tufts

Campus | March 10, 2010

When asked about this seemingly unfair situation, Dean of Undergraduate Education James Glaser said “if students think that they don’t have access to student evaluations, they are incorrect to a degree.” How is this possible?

Technically, the Senate has the power to request the quantitative data (not the comments), process it, and post it on Jumbo Access, which is actually Tufts Reviews. While some student comments may be useful, the quantitative data under “Official” Reviews is woefully outdated and the site is barely managed.

For individual students to have access to official professor evaluations, “it requires the student senate to acquire the data and to post it,” said Glaser. “But they have not asked in a number of years…it’s a big chore.”  In fact, according to Glaser, the last time official course evaluations were released to students was in 2005.

What do we need, how do we get it?

The cumulative impact of filling out evaluations every semester without seeing any benefits seems to be that “students don’t really think that their course evaluations make any differences in the long run,” said senior Emily Maretsky, Student Trustee Representative on the Senate’s Academic Affairs Committee. This is readily apparent in the TCU Fall 2009 Survey.

Evaluations, which for all practical purposes are totally inaccessible to individual undergraduate students struggling to pick classes, even popped up several times in the freeform "suggestions" portion of the survey.

Students notice this gap, but so do professors. Professor Edith Balbach, Director of Community Health, recalls a student who was hesitant to take a class based on word-of-mouth. After seeing the 2005 reviews, however, he changed his mind and “got a lot out of it.”

This frustration is not shared at every school. Maretsky cites Northwestern University as a prime example of a great course evaluation program. Maria DiBenedetto, Senior Assistant Registrar at Northwestern, detailed how the school has collected evaluations online since 2004 and has integrated them with its student records system to eliminate redundancies and streamline information.

Advantages include “a much quicker turn-around of the reports…more accurate course and enrollment information, and more extensive comments from our students.”

Furthermore, the Northwestern system encourages participation because student access to the results “is based upon their participation in the evaluation process,” and nonparticipation gets students blocked from the system for the next quarter. “Numerical data is published for every class” with no exceptions, and professors can even allow comments to be published.

Maretsky also finds a better example much closer to home. She praises the Tufts Medical School evaluations system for closing "the evaluation loop." Not only does the Medical School hire a dedicated staffer to process evaluations, its administrators also sit down individually with professors to make recommendations based on the data, so that a connection is made between suggestions and solutions.

What’s the deal right now?

None of this is currently possible at Tufts, where even Dean Glaser describes the existing system of “paper and pencil evaluations” as “barely adequate.” A planned overhaul would 1) bring it online and 2) post scores for students to see.

No sooner had he acknowledged these plans than he fell back on the familiar administrative safety phrases: it won't happen soon, there are "a variety of policy issues," and some sort of resolution is being formulated by the Educational Policy Committee, which will be taken to the faculty for approval. Soon. In April.

Nevertheless, any improvement will be a drastic one, even if only half of the current student body will be around to potentially enjoy a comprehensive online learning management system that will replace "a very old, antiquated machine and program" in the next year or two.

While the system isn’t hard to navigate, Balbach and Maretsky both expressed the need to update and improve the actual surveys as well. “I think we collect a lot of information that isn’t that helpful,” said Balbach, who wants to see “how much students think they learned from a course and how effective the faculty member was.”

The current archaic system borders on hilarious. In her research, Maretsky was directed by Dean Glaser to an administrative department that gave her 40 or 50 floppy disks' worth of quantitative evaluations data from Spring 2009. As a side note, this means that every department at Tufts must keep a computer that can still read floppy disks for this very purpose. Furthermore, the prospect of sifting through 50 floppy disks probably discourages any cross-departmental and school-wide analysis.

Joann Jack, Tufts’ Registrar, didn’t have as much to say as her Northwestern counterpart. “[They] are scanned and the data is given to the department chairs to review,” she said. “They use the information for tenure cases.”

Balbach adds that evaluations "make a difference in retention, promotion, and salary decisions." However, according to Maretsky, who has actually seen the final report of evaluation results, departments only take four of the 20-or-so questions into consideration.

Finally, there is little incentive to take evaluations seriously—“I just fill out all fives,” shrugged an anonymous senior. Balbach notes the potential harm of lacking access to evaluation data, saying that it worries her that students base their decisions solely on word of mouth.

A brighter future. Soon. Eventually.

Tufts’ eventual “move to an online course evaluation system and process,” as described by Glaser, involves several changes to the status quo, which have been detailed elsewhere. Optimistically, Glaser predicts that “we will be posting the quantitative evaluations” for students to see, though not the comments, which the administration feels would be inappropriate to release. The Class of 2012 or 2013 may one day have a chance to benefit from the experience of their peers, who may also face stronger incentives to thoughtfully complete evaluations.

The subjective, handwritten portion of the surveys seems unlikely to find a home in any new system, even though it is crucial to the evaluation process. Despite finding this portion the “most useful,” Balbach admits that comments are sometimes mean, and she “worries about making them broadly available” without editing.

In its constant struggle to maintain and increase teaching quality while also pursuing research goals, Tufts’ planned move to a more transparent online evaluations system is a big step forward. Yet even the low-tech, bound, paper evaluations available to Fletcher students in Ginn Library already achieve the administrative transparency and student collaboration essential to improving the Tufts experience and reputation. O