Does Course Content Matter for Instructional Design?


I used to be involved in a project that created Flash animations and graphics for different courses. One question I was asked was how applicable the work was across disciplines.

For instance, do I really expect a philosophy instructor to be interested in an animation of supercritical fluids? Actually, I don't... But would a philosophy course focusing on Greco-Roman schools of philosophy be interested in a set of historical maps, like the ones we made for a Jewish history course? Maybe it would.

This leads to the larger question of whether academic discipline matters when considering tools. On the one hand, it doesn't matter. All courses have target learning outcomes (changes in skills/attitudes you want to see in your students), and the process for mapping objectives and tools should be the same no matter which course you are designing.

But here's a caveat - courses vary widely in their objectives. Even in the philosophy department, a course that focuses on ancient philosophy may share objectives with a history course as well as a course in modern policy, while a formal logic course may have goals similar to an algebra course.

I think that expecting all courses to use tools in the same ways does them a disservice. So instructors naturally benchmark themselves against courses similar to theirs (i.e., a logic instructor is probably interested in other logic courses).

There are many tools, like blogs, images, and audio, that can be applied across many disciplines, but the uses may have different nuances. Podcasting is a great way for students to create their own interviews (journalism), but it's also a great way to capture the sounds of a natural environment (biology) or compare dialect samples (linguistics).

I can truly see three different courses in which students are creating audio, but it's not the same audio. I can also see courses where students aren't necessarily creating audio (maybe blogging is better because you need to learn the craft of writing concrete poetry, include phonetic symbols or explain still photos).

As an instructional designer, I like to look at examples from different disciplines because I do learn more about the capabilities and possibilities of a new tool. And maybe I will find that a technique from math can also work in a philosophy course like logic.

5 Comments

It is a great question ... my only comment is related to higher educations' investment in course management systems. We don't seem to worry if they are the right tool to use, we just use them -- sometimes, even for what they are good at. All of these tools are just that -- tools in a toolbox. At PSU we are lucky to have a fully stocked workbench to select from. Why do we feel the need to critically question blogs, wikis, podcasts, and other emerging tools when they are really no different than any of the other tools we rely on? Why not post content to a blog, an audio clip as a podcast, and send students into the learning management system to take a quiz?

It is a great post and a great line of thinking ... I am just struggling with what makes us view this "new stuff" as being an "or" statement ... as designers we need to select with an "and" attitude. Unless of course I am completely off base. Again, I am digging the line of thinking here, just trying to push the conversation forward (that is one of the things blogs are great at, right ;-) )

Shouldn't we critically question blogs, wikis, podcasts and other emerging tools because they are really no different than any of the other tools we rely on?

ELIZABETH J PYATT said:

Interesting comments from Cole. First, I'm a believer in "and" thinking and not "or" thinking - http://www.personal.psu.edu/ejp10/blogs/thinking/2007/12/bothand-vs-eitheror-thinking.html

When I teach, I use multiple tools - last semester I used blogs and ANGEL (to host lecture notes in a password protected area), but another semester I might add dropboxes. It all depends on the course.

I think my point may be a traditional instructional design point that when you design instruction, you do have to take learning objectives into account and pick the right tools.

It's interesting that you mention that we "don't worry" if a course management system should be used, but it wasn't always that way. When ANGEL first came to Penn State, the instructional designers spent a lot of time exploring the tool and working out best practices to share with faculty. And many faculty asked us why they should bother with ANGEL.

Now ANGEL is taken for granted (and maybe we sometimes forget why). I think blogs, podcasts and wikis will also be part of the "normal" toolset, but like all tools (even the chalkboard), you have to consider the pluses and the minuses...for your learning objectives.

I'm pretty sure both chalkboards and ANGEL are used differently across different disciplines - and I don't expect the new tools to be any different. Figuring this all out is what excites me.

EP ... your point is well made and taken. I agree there are very notable differences in why you'd use a chalkboard in a class as opposed to an overhead projector (or any other tool for that matter). I had this exact discussion with a group of students just two weeks ago -- we were discussing how long it took for the chalkboard to catch on and how it changed the classroom experience for the teacher and students ... chalkboards make you slow down as you write and require you to be much more of a master teacher than does PowerPoint. Teachers had a hard time with it -- when to use it, what to use it for, etc. It changed a lot about classroom dynamics -- for really the first time, teachers put their backs to students (as an example). Technology is disruptive, and as Dave points out, we do need to assess its value with a very critical eye.

My only point is that we tend to hold new tools in a much more skeptical hand than we do the ones we take for granted ... and that may seem obvious as we are attempting to understand the value and usage of the new stuff, but unless we listen to our guts and go for it (maybe only in small doses) we are missing opportunities. I am all for critically questioning the tools Dave mentions, but I am also all for working with faculty to see how they actually work in practice. We learn more through the failures most times than via the successes ... it is very cool that we do have faculty partners who are willing to walk out on that tightrope with us.

I also agree that learning objectives should be the critical decider ... but I will also contend that some faculty bring a very different perspective on how to meet those objectives. Some are interested in students only learning the facts, while others want to take a much more open, discussion-based approach ... and I know that the way we write those objectives should dictate that -- we use words like "discuss" to indicate the need to gain a deeper appreciation for the content, but not all faculty get that. Some see objectives and instantly want different things from terms we understand to indicate levels or domain depth. The only approach I know is working with the faculty so I understand their needs in delivering the content ... so at the end of the day (for me), it is about understanding how they want to approach the learning and offering the right tool for a combination of their desired outcomes and those of the objectives. Did that make any sense at all?

ELIZABETH J PYATT said:

Ah - now we're getting into the interesting area of instructor style as well as your theory of learning.

I have to admit I've gotten better at incorporating active learning into the classroom, but I still notice a time lag between learning the low-level content and being able to analyze it coherently.

For instance, in my last class, I would teach syllable analysis in one week, and that week's assignment would ask students to analyze syllable types based on data from an exotic-language textbook. Results were sketchy in terms of analysis. I don't think the students had the experience or skill base to make a coherent analysis.

On the other hand, I did a "capstone" assignment of having students report a pronunciation quirk during Thanksgiving break, and they did quite well - by then they'd been hearing this stuff for weeks. Learning to time the boring memorization (and there are times when memorization MUST happen) or some procedural skill with higher-order learning skills is something I'm still working out.

Again, there may be courses where this is not as much of an issue.
