This is shaping up to be a solid, rather lengthy rant about the nature of instructional technology, though I will endeavor to avoid foaming at the mouth. Feel free to go elsewhere.
I’ve been percolating, so to speak, this particular rant for a long time. You can see my early eruptions in Thomas Wortham’s 1998 article for the ADE Bulletin, “After the Fall: Teaching ‘English’ on the Internet at UCLA,” which quotes me at some length, but I’ll get back to that in a minute. A couple of weeks ago I sat in a meeting where a much-respected professional in Instructional Technology kept referring to technology and teaching in terms that made it very clear that “technology” meant digital technology exclusively, nothing else. Finally, last week I saw this “Live Colloquy” in the Chronicle of Higher Education: “When Good Technology Means Bad Teaching.” Here’s a sample:
Colleges have spent millions on “smart classrooms” packed with the latest gadgets to assist teaching—computerized projection systems, Internet ports at every seat, even video cameras with motion detectors that can track the movements of a lecturer. But colleges have spent far less time and money giving professors the skills to use even the simplest technology effectively.
The result: Students say technology actually makes some of their professors less effective than they would be if they stuck to a lecture at the chalkboard.
There’s a lot of truth to this. It really doesn’t make much sense to spend hundreds of thousands of dollars on digital technology if you don’t have support people to take care of it (always budget for maintenance, updates, and support, including training), and staff to help faculty and students use it. As I say on my “About This Blog” page:
As much as I am personally fond of things geeky and digital, quite often I see technology in instruction being used simply because it’s there, rather than because it’s a better way, or because the technology enhances student learning. I also see a lot of silliness in terms of technology and instruction, where a particular technology is forced on end users and faculty because some manager or administrative person thinks it’s cool, or will draw fame and fortune to his career, rather than because it’s effective or appropriate. Frequently faculty who would like to use technology are bewildered by the jargon and by the unfortunate arrogance of the technical experts they must work with, who, for all their technical expertise, are, not surprisingly, sometimes woefully ignorant about pedagogy, and have no interest in or understanding of the humanities.
Let’s just step back a minute and think about the phrase “Instructional Technology.” The whole point of the technology is to enhance instruction. If the students are frustrated, if the faculty are frustrated, if the technology interferes with learning rather than enhancing it, then there’s very little point to the technology. It’s the instruction and its efficacy that matter, not the means used to implement the instruction.
Back in 1997, when the world was new and I was young, UCLA brought forth the Instructional Enhancement Initiative. Among other things, the initiative funds and mandates a class web site for every undergraduate class in the humanities and social sciences. The I.E.I. was largely established by fiat (I suspect if it hadn’t been, we’d still be forming committees and task forces to discuss the possibility of class web sites). Here’s part of what I wrote to my departmental chair in 1998 about that initiative and digital instructional technology, quoted in the ADE Bulletin article I mentioned up-rant:
The problem with the way digital technology is being implemented is that the university has put the cart before the course. In the “real world” of commercial software and technology implementation, you start with the data, the “content,” and then you look for the most suitable technology to use with it. You do not start with the technology and then tell the content expert (jargon for scholars and teachers) to find some use for it. That strategy is completely ineffective and any commercial enterprise that proceeded in that fashion would quickly be in Chapter 11. . . . Someone needs to evangelize, so that the concerns and needs of faculty and students are met. The content must be emphasized, and there need to be reasons for using digital technology. It isn’t enough to put something on the Web or on a CD-ROM just because you can. The point is to enhance scholarship and pedagogy, not to take scholars and teachers away from what they do best so that they can learn to use constantly changing technology.
Faculty are hired for their skills as teachers and scholars. I’m happy when they see a use for digital instructional technology, and delighted to help them incorporate it in their courses, or teach them to create their own digital content. But I don’t see mastering digital instructional technology as something they should feel compelled to do. They are content experts; they were hired because they’re content experts, and not (until fairly recently) because of their technical skills.
Here’s another quotation from that Chronicle Colloquy:
“The support systems are not in place right now to really promote effective use of technology,” says Mr. Loomis. “I basically waited until I was a tenured full professor until I started getting into this kind of stuff.”
There’s a solution. Hire experienced instructional designers and developers with a background in instruction and in the specific content areas to support faculty. Hire graduate students to assist faculty with content development. We’re generally fairly comfortable with technology; some of us are downright proficient. We often have teaching experience, or look forward to gaining it, and working with faculty to develop content for their classes will help us learn about pedagogy from experienced scholars and teachers. Train these graduate student assistants in using the technology to create content. I know this will work: Dr. Wayne Miller set up just such a program for the humanities at UCLA in 1996. In addition to hiring and training graduate students to support technology content development for instruction, include teaching with technology in all the pedagogy experiences graduate students have. Teach future teachers so that they understand the effective use of all the technologies available to them, not just the really cool and rapidly outdated digital ones.
Finally, don’t make the mistake of thinking that the only good technology for teaching is silicon-based. That’s all too common, and it’s just plain wrong. Think about the word technology; the Indo-European root *teks- is the same one that gives us the words text and textile, among a host of others. Writing itself is a technology. How we write is less important than what we write. Writing on an (analog) whiteboard, blackboard, or overhead transparency, or showing a video—these all still work, they’re low maintenance, and sometimes they may be more effective than a multimedia production or yet-another-PowerPoint-presentation. Think about the purpose of the class, or exercise, or lesson, and the technology best suited to express that purpose; it may end up being a living teacher, pacing up and down while leading a discussion of involved, thinking students. The appropriate technology may be a brainstorming session on a board—white, black, or digital; it may be a MOO, a discussion forum, or an email list, even a wiki or blog.
The best technology might possibly even be PowerPoint, if you’re including a variety of media, or Keynote for the aesthetically sensitive. It might even be that marvel of two thousand years of technology, the codex book.