In an article entitled “The walking dead in higher Ed” (whatever that means; the article never says), Geoff
Irvine tees off on what passes for assessment in higher education, and at the
institutional level it is indeed woeful. So what the ever-loving hell does this guy mean when he says “they
[colleges and universities] can’t prove that students are learning”? That is
what this faculty member has done for the past 35 years! TPP constantly assesses student learning: they
learn content; they learn to make connections with other fields of knowledge, a
hallmark of a liberal education; they learn to understand concepts in more
sophisticated ways; they learn to think, to observe, to frame questions and
test them; they learn how to learn. This all can be demonstrated to just about
anybody if they have the time to wade through all the materials collected
during a typical semester-long course. However, multiply this by the number of
courses, and the mountain of material becomes insurmountable, so the university
hired this faculty member to provide a single-letter summary, a quite limited metric to be sure, and then they take his word for it that
students have learned what the course was intended to teach them. Now of course
what administrators really want, and what the non-educators who impose their
views on higher education really want is some nice easy metric that says this
university is this much better or worse than that university (nothing new here). Oh, there are
lots of metrics that don’t have anything to do with learning, e.g., graduation
rate. It is important to graduate students at a high rate, and it may mean you
are a good university, or an average university with very good students, or an
easy university; it all can look the same. Teaching and learning are
complicated things to assess, and at higher, institutional levels they never really are assessed, because the essential interaction takes place at the grassroots level, between faculty and students. TPP has tried more new things, more new approaches, and
more new techniques than you can imagine; some work well and others are quickly
discarded. But really, do you think students can pretend to learn and this faculty member won't notice?
What does this guy mean when he says, “the primary source
of evidence for a positive impact of instruction has come from tools like
course evaluation surveys”? Course evaluation surveys tell you one thing and
one thing only; they tell you what students liked and what they didn’t like. As
a long-time professor, TPP can assure you that students can learn a great deal
from things they don’t like, but fortunately as a pretty creative instructor,
he has found a lot of interesting, and yes, fun things from which they also
learn. And of this learning, he has direct evidence in many forms. Still, you
must take his word for it, because even with all the materials in hand you won’t know
what is evidence of learning and what isn’t without his input. To demonstrate this to an administrator once,
TPP wrote three short justifications for a research project and told them that one was
a total fabrication, an out-and-out fib, and challenged them to pick it out. Of
course, in a distant field, someone could do the same to me.
Geoff says, “the problem is that faculty and program leaders
are indeed content experts, but they are no more versed in effective assessment
of student outcomes than anyone else on campus.” Say what? Who else is going to assess
whether students learned anything in my class? Who else would know? You must challenge students with situations,
a few of which are called exams, where they must demonstrate learning by first
drawing the dots, labeling the dots with correct terminology and examples, and then connecting the dots, all on blank pieces of paper.
What Geoff was trying to accomplish with his article was
pretty easy to figure out; he’s an entrepreneur, a huckster for his own company’s
assessment product. He’s just one more assessment consultant trying to
win the hearts and minds of clueless administrators who believe his assessment
critique BS. Yes, that’s right, Geoff Irvine is
the CEO of Chalk and Wire, where you will find “all you need, to achieve your assessment vision”. TPP envisions people learning the correct use
of commas, but here we have evidence that they don’t: an illustrative assessment.
Inside
Higher Ed should provide a disclaimer before they let a guy with a
vested monetary interest in criticizing assessment use their publication as a
bully pulpit. In case it isn't obvious, faculty generally dislike guys like Geoff because they seem to
lack a basic understanding of the educational process and to harbor a general contempt
for what faculty think of as assessment and how faculty assess learning. At the very end of the article, Geoff says that institutions of higher education need “one essential unified goal: to make certain students are
really learning”. TPP always thought that was our single reason for existing, and to say we need such a goal rather indicates Geoff's disconnect from what he wishes to assess.