Course evaluations may change

The last day of classes tends to mean two things to students: food from the professor and Course Evaluation Questionnaires, or CEQs. These evaluation forms have been used for a number of years by students to evaluate their professors and give them feedback on their classes. Soon, however, the forms might look very different.

Currently, the CEQs consist of two documents: a quantitative form, which allows students to evaluate the class and the professor by selecting numbered boxes, and a written evaluation that is sent directly to the professor. The consensus among faculty members is that the evaluation forms are unsatisfactory and need to be changed.

Professor of Education Chris Bjork noted his dissatisfaction with the forms. “I personally do not find the data from the CEQs very helpful when it comes to evaluating my teaching and coming up with strategies for improvement,” he wrote in an emailed statement. “The questions are fairly generic, and we receive little information to substantiate the quantitative feedback generated by the evaluations.”

Bjork is part of a CEQ sub-committee made up of faculty members charged with evaluating the current forms and taking the necessary steps to design a new one. The committee was created amid concerns that the CEQ did not properly evaluate professors’ teaching methods and that the questions on the form were too vague.

Professor of Political Science Steven Rock, the head of this committee, summarized the problems with the CEQs. “There is considerable unhappiness with the current CEQ among members of the faculty,” he wrote in an emailed statement. “Some members of the faculty feel that the questions or prompts are problematic because they are unclear, because they do not ask about things that are necessarily connected to effective teaching, and for other reasons. Some feel that rather than having a set of questions requiring numerical responses, followed by a separate narrative page seen only by the faculty member, there should be narrative boxes among the questions that would allow students to elaborate on their responses.”

The CEQ sub-committee worked with the current forms for several months, drawing on research from other institutions and faculty feedback to examine the forms critically. Committee members also met regularly with students from the VSA Academics Committee to receive student input on the CEQ.

Outgoing VP for Academics Shruti Manian ’14 provided the faculty committee with feedback and helped to create the new evaluation forms. She and the rest of the Academics Committee also helped to organize pilot testing to evaluate the questionnaire that might serve as the template for the new evaluation forms. “The Academics Committee has provided much needed and valuable student input to the process,” she wrote in an emailed statement. “…[T]he Academics committee provided a sample of around 40 students who were administered the new test before it was tested in certain classes in May. This group of 40 students (who were not from the VSA, they were uninvolved friends of VSA members and random volunteers who answered a campus-wide email) filled out the new CEQs for an imaginary class and were asked for their opinions on the instrument…They gave the committee their thoughts and suggestions on the instruments. This ensured that student input, beyond just VSA council and committee members, was very prominent in the whole process of crafting these new evaluation questionnaires.”

This sample testing provided the CEQ committee with data to assess the effectiveness of the new form, which is currently being called the Student Course Evaluation (SCE). Professor Rock outlined the process for its creation. “Phase one of the pilot study employed a number of student focus groups and phase two, just completed, had students in approximately 50 course sections complete both the SCE and the CEQ,” he explained. “We will use the results to compare the two instruments and to continue to refine the new SCE as seems necessary and appropriate.”

The second phase of pilot testing took place during the last days of classes, May 5 and 6. Professor of Psychology Ken Livingston detailed the goal of this testing.

“The pilot conducted this spring will provide some limited quantitative information in addition to the pattern of responses on the forms themselves,” he explained in an emailed statement. “By having students complete both forms we can establish whether the two forms (the existing CEQ and the new SCE) differ in their sensitivity to these sorts of factors, as well as looking at some technical properties of the question sets (e.g., are there questions that always pull the same answers so that they are in effect redundant).”
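Livingston did not describe the committee’s statistical tools, but a redundancy check of the kind he mentions could, in principle, look something like the following sketch. The input file and question column names here are hypothetical; the script simply flags pairs of questions whose numerical answers are almost perfectly correlated.

```python
# A minimal sketch (not the committee's actual analysis) of flagging
# "redundant" questions -- ones that always pull the same answers.
# Assumes a hypothetical CSV with one row per student and one numeric
# column per question (e.g., q1 ... q10 on a 1-5 scale).
import pandas as pd

responses = pd.read_csv("sce_pilot_responses.csv")   # hypothetical file
corr = responses.corr()                               # pairwise correlations

# Report question pairs whose answers move together almost perfectly.
THRESHOLD = 0.9
for i, a in enumerate(corr.columns):
    for b in corr.columns[i + 1:]:
        if abs(corr.loc[a, b]) >= THRESHOLD:
            print(f"{a} and {b} may be redundant (r = {corr.loc[a, b]:.2f})")
```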

One of the main differences between the CEQ and the SCE is the handling of narrative responses. On the CEQ, students write their comments on a separate page from the quantitative responses, whereas the SCE includes optional comment boxes where students can elaborate on their answers.

Professor of History Miriam Cohen, who is part of the CEQ committee, described the reasons behind this change. “One concern was the fact that right now, any written comments that students make are not connected, at least in any systematic way, to the multiple choice questionnaire. By integrating short answer questions into the new evaluation form, the faculty hoped to learn more specifically the reasoning behind student choices in the quantitative section.”

The new SCE was administered online in the classes of professors who volunteered to be part of the pilot testing. Professor Rock explained the decision to use an electronic rather than a paper format.

“The committee decided to test online administration of the new SCE because the type of questionnaire that we are considering—a series of questions requiring numerical responses, with boxes for narrative elaboration at the close of each section and at the end of the survey—could not be handled by the optical scanning technology that the College currently uses to compile the CEQ results. Online administration saves paper and is more efficient. It was not a factor in our decision, but it is probably worth noting that most of our peer institutions administer their course evaluations online.”

Students were asked to bring their laptops or tablets to class for the pilot testing. All students had access to the SCE, which was part of the Moodle site for the class. Vassar’s Computing and Information Services (CIS) helped to create the new evaluation form and provided loaner laptops for students who did not have them.

Director of Academic Computing Services Steven Taylor was responsible for formatting the SCE. “My initial task was to use Moodle’s questionnaire function to create online versions of both forms,” he explained in an emailed statement. “There was a separate form for each of the two parts of the CEQ, plus the new SCE, plus a short questionnaire about gender and ethnicity. So four questionnaires had to be added to each of the 52 participating courses. They were set up in such a way that the student input would be collected in a single location, which was not accessible by the instructors. Each was timed to become available just before the class session in which they would be used, and to become unavailable shortly afterwards.”

Taylor’s work, however, was not over when the pilot testing was complete. “Afterwards, CIS had to collect all the data and make it available to the subcommittee so that they could compare the effectiveness of the two forms,” he noted. “We also have to generate SCE reports on each course, to share with the individual instructors.”
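Taylor did not specify what tools CIS uses for that reporting, but as a rough illustration, per-course summaries of the pooled pilot data could be produced with a short script along these lines; the input file and column names are hypothetical.

```python
# A rough sketch (not CIS's actual tooling) of generating per-course SCE
# summaries from pooled pilot data. The file and column names are hypothetical.
import pandas as pd

data = pd.read_csv("sce_pilot_responses.csv")          # hypothetical pooled export
question_cols = [c for c in data.columns if c.startswith("q")]

# Mean response and response count for each question, per course section.
summary = data.groupby("course_section")[question_cols].agg(["mean", "count"])

# One combined file for the subcommittee, plus one report per section
# to share with that section's instructor.
summary.to_csv("sce_summary_all_sections.csv")
for section, rows in data.groupby("course_section"):
    rows[question_cols].mean().to_csv(f"sce_report_{section}.csv")
```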

The official outcome of the pilot testing is still unclear. With the data, the committee will evaluate the effectiveness of the SCE in the hopes of sending a final proposal to the faculty. If the faculty votes to accept it, the questionnaire will be implemented in coming semesters.

Taylor noted that changes in evaluation forms will mean changes for CIS. “The subcommittee will probably explore the results of the pilot study for some time and if it makes a recommendation to switch to the new evaluation, that will need to be discussed among the full faculty. If and when the new evaluation is adopted, CIS will have to come up with some new tools for doing the data analysis and reporting. We’ll also have to scale up our procedures for lending laptops to students who don’t own them.”

Faculty evaluation may not be limited to forms like the SCE, however. Professor Bjork, who has spent time observing in local schools, spoke to the importance of using different measures of evaluation. “Although it [is] officially known as the CEQ sub-committee, the group has actually spent a lot of time discussing other tools that can be used to evaluate professors,” he noted. “It’s not clear what those tools will be, but I would say there is a strong consensus that the College needs to take a more comprehensive approach to the evaluation of teaching.”

Professor Rock agreed, and provided some concrete measures of evaluation the committee might explore. “I might add that many faculty members feel that too much emphasis is placed on the CEQ, so we are also planning to look at other mechanisms that we might use to evaluate teaching, such as classroom observation, surveys of former students, and so forth.”
