ICES responses only partially helpful to instructors

A kettle full of steaming chamomile tea, soothing ambient music on the radio, a full bottle of aspirin tablets, and — if worst comes to worst — my therapist on speed dial.

Yes, it’s once again time to kick back and attempt to interpret the ICES forms.

If you’ve forgotten all about the Instructor and Course Evaluation System forms you filled out near the end of last semester, I don’t blame you. Even we teachers have almost put them out of sight and out of mind. But we recently found our mailboxes crammed full of every last one of those blue-tinted sheets, a couple of pages of statistical data and a helpful reminder for the confused and despairing instructor that interpretive guides are available upon request.

The interpretive guides really are necessary.

For example, I want good feedback about the amount of homework I gave last semester. Unfortunately, the question that asks students about the amount of work is the only one whose most positive answer lies in the middle column rather than on the far left. Two students, either extremely happy with my class or extremely happy to be let out of my class early, just filled straight down the left column, skewing the results toward “an excessive amount of work.” How do I take that into account?

And more problems abound. How do I interpret the responses of A students, who blast straight through any assignment, versus those of F students? Should I compare myself to other classes with the same five-days-a-week format, or just other classes at the same introductory level?

And most important of all, can I get any statistically meaningful data from a single class of 20 students?

Sure. It just depends what meaning I want.

If I want to ask questions about the mechanics of my class — did I present clearly, was I well-prepared, did I give the right amount of work — then the ICES forms can provide an indication of the answer. Not the precise answer, clearly spelled out, but an indication. And they provide that indication about as well as any method; studies suggest that despite the vagaries of student evaluations, students’ reviews of teacher performance correspond quite well with peer reviews conducted among teachers themselves.

That’s great if all I want to know is how I personally taught this one class, as if my calculus course were the only thing my students had come to college to take. But there’s one vital question the ICES forms fail to give any information about: How well did my course prepare students for their later classes?

It’s not enough for a calculus course to be good, as a calculus course, teaching the ways of integration and differentiation. It must also adequately prepare students for future classes in calculus, engineering, architecture or what have you.

And if ICES forms did ask about how well the course prepared students for the future, I suspect most would respond with a cheeky, “I’ll let you know after a quick spin in my DeLorean.”

So let me make a modest proposal: In addition to the current ICES forms, we should include a similar, but still brief, questionnaire to be given to students near the end of their junior year, or perhaps midway through their senior year. This form would ask two or three questions about how a given lower division class helped prepare them for their upper division courses.

Ideally, departments would gather this kind of data on their own, but doing so takes sustained coordination. Professors rarely teach courses in sequence, especially when the sequence starts in one department and ends in another.

Students themselves, then, are one of the best resources for understanding this problem. They took the course. They then took other courses. It would be greatly informative to know whether the computer scientists found my methods extremely useful while the biologists needed different topics emphasized.

At the very least, it would be nice to know that, if I did overwork my students last semester, at least I did so for a good cause.

_Joseph is a graduate student._