July 2009 Archives

Remembering 1979


Here at ETS we are reflecting somewhat on the meaning of 1969 on its 40th anniversary and remembering where we were. To be honest, I was a little young for 1969 (although my mother swears I saw the moon landing). My first "9" year I remember was 1979.

This is an interesting year because, quite frankly, it was not a good one. Inflation was around 12%, gold was way up (maybe higher in absolute dollars than it is now), and gas lines were forming in the summer. This was when the Rust Belt really began to earn its nickname.

And there was the Iran hostage crisis, which is rarely discussed these days (and which I have blanked out a bit) but was like an overwhelming wave of absolute darkness back then. This is when Nightline got started, and in our household it was required viewing at least once a week...for the year-plus that the crisis went on. It was truly, truly awful.

I was also not enthralled with the artistic output from either radio ("Heartache Tonight" is depressing, and neither "The Logical Song" nor "Reunited" has ever worked for me) or TV. Even 70s TV classics like Happy Days and Charlie's Angels were losing steam. All that was left was reruns of Star Trek, Get Smart and the OLD Battlestar Galactica (which was quite excellent later in the season)...and WKRP in Cincinnati, which had a running theme of Johnny Fever bemoaning the death of rock and roll (we hear you, man).

But there was one shining moment. One Saturday evening, as we were preparing to watch Saturday Night Live (which was in its first glory period), we saw the dreaded news logo meaning that SNL would be interrupted to bring us the latest on the crisis.

Just as we were preparing to change the channel, someone yelled "Wait!" It wasn't the Iran crisis, but the "Tokyo Crisis" (when Paul McCartney was arrested in Japan for possession of weed). Finally, someone willing to snort in the face of danger. Saturday Night Live also gave us the "Pepsi Syndrome" (when a can of Pepsi caused a nuclear meltdown), Steve Martin, "Jane, you ignorant slut" and John Belushi as Captain Kirk (there...was...none...better). Good memories. Precious memories.

Looking back at 1979, I see other seeds of change that were beginning to sprout, waiting for the 80s to arrive. 1979 was the year of "Heart of Glass" (Blondie) and others like "Cruel to be Kind" (Nick Lowe), "Rock Lobster" (The B-52s) and "Let's Go" (The Cars). Somewhere in a practice rink, the U.S. Olympic hockey team was learning to work together to win the gold in 1980 (sweet). And apparently there are some musical classics I forgot were from 1979, like Cheap Trick's "I Want You to Want Me."

Is this a lesson that I should wait for better days or see the brighter side? Sort of. But more importantly, I think it's a reminder that while life can rob you of hope, confidence, the will to fight and your creativity (all while still in graduate school)...it can never rob you of your sense of humor. At least not mine. As long as channels like Bravo TV and VH1 show the ridiculous and The Soup and The Dish are there to lampoon it...I feel confident we will somehow survive as a nation.

I don't think I want to repeat 1979, but I do thank it for the Tokyo Crisis and some of those obscure singles I bought on iTunes while I wait for the unknown future of 2010.

Please Turn Off the Cell Phone When Driving


The New York Times has a fairly comprehensive article on the research showing that driving while phoning is pretty darn dangerous. How dangerous? It's as bad as driving drunk...literally. You're about 4 times more likely to cause an accident while talking on the phone - even if it's a hands-free device (bummer).

I like this article because it talks about WHY it's so bad. The upshot is that the brain can't multitask efficiently enough, but it's also a visual processing issue. You would think that you could look at the road and talk on the phone at the same time, but in fact eye-tracking studies show that your eyes DO wander off the road while talking on the phone (eye movements can be related to processing linguistic data - fascinating...unless you're running a red light). As you can imagine, texting (where you consciously look away from the road) is even worse.

I admit that I am intrigued that talking to passengers isn't as distracting (at least not in the research I've seen). I can't tell if it's because the audio is better (i.e. live vs. over the wire) or if the other passenger is able to watch the road too.

The other aspect, interesting or not depending on your perspective, is that some researchers feel there's enough of a dopamine kick in using these devices to fuel the practice...even though most people realize it's a bad idea when others do it. Who knew being connected could give you such a buzz?

But believe it or not, I don't want to take the cell phone away - I'm a user myself. I also don't want to disparage multitasking. I am writing this post with 8 applications open and iTunes playing classic pop in my headphones. No, I have nothing against multitasking in the proper context of an office cubicle. I just don't want to be the person you hit while you were talking on the cell phone...

Red Button - Please Don't Quit Flash CS4


This is an entry in which I finally realize why Dave was having a rant about Flash CS4. Adobe CS4 has some annoying new "features" with which Adobe seems to be hoping that Mac users will see the beauty of the Windows-style document interface and mainstream ourselves (not bloody likely - I want windows, not tabs). It's a good thing that Adobe left us Classic mode and Preferences to disable the new features (thank you, Adobe).

However, there is still a serious problem in Flash CS4. On the Mac, the little red button in most applications means Close Document (Command-W) but leave the application running...except in Flash CS4. There, Flash thinks it's in Windows and quits and closes all at once (Arggh!).

There are solutions to avoid this issue if you don't like the Flash red button. Mine is Command-W to close and Command-Q to quit. Dave's is Command-H to hide. In any case, DON'T CLICK THE RED BUTTON until it's time to leave for the day...unless you feel like grabbing a soda while the application reboots for your second document.

FYI - Most Windows apps also include the Control+W command for closing a document without quitting the application. This is handy if your life is too short to watch IE boot up AGAIN...just because you closed a window after visiting a Web site.

A Year of Twitter


As we celebrate the 1st anniversary of the ETS Learning Design Summer Camp, I remember that I have another anniversary to celebrate - a year of Twitter. It was at a dinner last year that Robin2Go finally convinced me to give Twitter a spin.

One year later, I still have my account, but what do I think of it? I have to disappoint some people and admit that I still don't LOVE it, but I have come to appreciate it. As with many media (including, I suspect, the phone), the initial user base managed to put up the silliest of messages, and we all know Twitter was no exception. Quite frankly, with a restriction of 140 characters, I didn't think it could ever evolve beyond the trivial.

But an amazingly short year proved me wrong. First it was a way for people to track the movement of tornadoes, then it became a way for a people to get out news of a government crackdown despite the shutdown of other media channels. Here at ETS it's a great way for people to quickly pass on interesting articles and traffic alerts, and, yes, a little gossip.

So why don't I love it? For the same reason I still don't love the phone or mail - the signal-to-noise ratio. There are ways to manage the ratio in other media, but I haven't quite managed it here yet.

There's another issue which I haven't been sure I should bring up, but maybe I will today. I think one reason people find Twitter and other forms of "chatting" silly or annoying is that they may be observing a set of conversations that are "forced" upon them, yet which they cannot really participate in.

Consider something I think we all find annoying - overhearing someone on the cell phone. You may be forced to listen to intimate details of grocery lists, plans for the evening, or a review of last night's game you didn't watch. Not only is it banal, but it disrupts whatever internal thought process you may prefer instead (reading, meditation, blogging...).

Twitter shouldn't be the same, but what if it feels "mandatory"? Sure, there are messages on Twitter that are relevant to me, but there are also a lot of not-so-relevant messages about what people are planning, games they watched or places they are going. Maybe people do feel a genuine sense of community, but it reminds me that...well, I really am a dedicated introvert.

Don't get me wrong. I really enjoy conversations over coffee and in the hall. But they take up only a small fraction of my day, and generally involve a much smaller pool of people. I actually am able to absorb what I hear and appreciate it so much more.

Am I saying you shouldn't love Twitter? Not really. I think I'm saying that I wish I could filter out business Twitter from personal Twitter (like we can filter out personal e-mail from work e-mail). At that point I may really LOVE both sides of Twitter.

Postscript: Alternate Views (Jul 23)

Apparently I wasn't the only one reflecting on the use of Twitter at this event. Here are some interesting posts from Jeff Swain (who likes the community aspect of Twitter) and TK Lee, who reflects on his Twitter note-taking in a blog. TK comments that "communication has its cost" - we need communication to learn more information, yet paradoxically it's often a distraction to both produce and consume. I guess that's why "poor communication" remains a perennial complaint in many work environments.

Finding Images in the Google Life Magazine Archive


Google announced recently that they are making photos from Life Magazine (and actually Time-Life) available to the public. You can search the archive itself, or you can go to the Google Image search page and add "source:life" to your search string. There are usually lots of hits, and each has excellent metadata on who took the picture and when.
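
Mechanically, that search string is just part of an ordinary Google Images query. Here's a minimal sketch in Python of what the resulting URL looks like - the helper function and the example search terms are mine, purely for illustration:

    from urllib.parse import urlencode

    def life_image_search_url(terms):
        """Build a Google Images search URL with "source:life" appended to the terms."""
        return "http://images.google.com/images?" + urlencode({"q": terms + " source:life"})

    print(life_image_search_url("charlie's angels"))
    # -> http://images.google.com/images?q=charlie%27s+angels+source%3Alife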

Although I am excited to see what photos from historic events will be available, I think I am more excited at the possibility of finding legal images of celebrities, and maybe classic covers like the 1970s Time cover of the actresses from Charlie's Angels (it's there under "charlie's angels source:life"). As odd as it sounds, there are several course projects in which celebrity photos are a plus.

There was one multimedia quiz from African American history that we never released in the MTO archives because we could not efficiently clear celebrity images (e.g. James Brown, Tracy Chapman). Now we could get many more legal images quickly. In addition, many foreign language classes use celebrity photos (American and non-American) as a talking point.

There are other photos in the archive, including the explosion of the Hindenburg, the taking of Iwo Jima (including the flag image), the various Olympics and many cultural touchstones from the 20th century.

All in all, it's a great complement to the news photo archives available from the University Libraries.

Trivia Quizzes for Learning


Can a quiz help you learn content? Yes...if it's low stakes and you can repeat it. I was reminded of this when my colleague Brett Bixler sent this link to a European geography game from Lufthansa. This one is really fun because you have to point to a given city on the map before the plane "lands".

The challenge builds from giving you cities and national borders, to just national borders, then finally a blank map of Europe in the background. I can guarantee that if you play this enough, not only will you learn where Hanover is, but also where it is in relation to Cologne and Hamburg. It's about the quickest way to learn German geography without actually flying to Germany.

But I have to say that you don't need Flash or sophisticated game play to take advantage of this. For instance, I've learned a lot about African American history by testing an interactive quiz designed for that course. I knew I could get a perfect 10 pretty quickly by memorizing the answers.
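
In fact, the basic mechanic - low stakes, unlimited retries, missed items coming back around until you get them - fits in a few lines of code. Here's a minimal sketch in Python (the capital-city list is my own toy example, not anything from the Lufthansa game or our quizzes):

    import random

    # A toy low-stakes quiz: no grades, and a missed item simply returns
    # to the pool until you finally answer it correctly.
    CAPITALS = {"Germany": "Berlin", "France": "Paris", "Spain": "Madrid"}

    def run_quiz():
        remaining = list(CAPITALS)
        while remaining:  # repeat until everything has been answered right
            country = random.choice(remaining)
            guess = input(f"Capital of {country}? ").strip()
            if guess.lower() == CAPITALS[country].lower():
                print("Right!")
                remaining.remove(country)  # mastered items drop out
            else:
                print(f"Nope - it's {CAPITALS[country]}. It will come back around.")
        print("Perfect score - play again anytime.")

    run_quiz()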

The more I see games, the more I appreciate the factor of motivation in learning. Sometimes you learn not in order to learn, but in order to beat the system and gain points. It's somewhere between pure intrinsic and extrinsic motivation for me. I would be curious about retention, but I bet something does stick. I definitely remember pivotal answers (both right and wrong) from my high school trivia competitions. Some were learned from my teammates, who knew sports better than I did (there is peer-to-peer learning here), but many were actually mistakes I made (D'oh!).

I would add that I think there's an art to writing trivia questions. Questions asking just for canned answers (e.g. What's the French word for "again"?) are boring, but those that ask you to guess based on dropped clues can be interesting (What French word for "again" is used to refer to a request for a repeat performance? Answer: Encore).

I think that's one reason why Jeopardy is so watchable. Even if you don't know the answer, you have a fair chance of guessing in many cases, and guessing often involves synthesizing information. When I do remember my correct answers from that long ago, they were usually the good guesses (because I triumphed over pure memorization).

As I'm writing this, I do realize that there are limitations. First, trivia games generally appeal to a certain kind of "trivia geek", who usually DOES have lots of random information memorized and builds on it for more complicated answers. The European map game is fun for me because I have a certain knowledge of European geography, but it could be totally frustrating to someone with absolutely no knowledge of (or no interest in) European geography whatsoever.

I think there are a lot of people who will enjoy this game, but I can imagine someone in a course uninterested in the content who will be uninterested in the game. Or will they? I did actually use a "game" where I asked students to guess what language a blue patch of non-English speakers somewhere in the U.S. might represent. Even though the class didn't always get the right answer, I think most found the challenge interesting (especially since no grades were attached). The game also showed where some low-level knowledge (e.g. knowing state borders) was useful.

Could the spirit of competition (or challenge) help students beyond the trivia buffs find the thrill of academic victory? A lot of educational theory speaks to the benefit of cooperation, but maybe a little friendly competition isn't so bad either (especially between teams). It could be more interesting than listening to another 5th grade "My Presentation About A Random State" presentation.

Why Can't We Define "Instructional Design"?


Every now and again, our instructional designer community has a discussion about how we define ourselves to the world at large - people who haven't had any courses in INSYS yet. A lot of reasons have been suggested, including that we're a relatively new field and that our duties are variable. I think those are both factors, but I also wonder if it's because most people can't imagine what we do.

When you get down to it, our title IS what we do - we design instruction (or learning environments)...but in a systematic manner. For most of our collective history, the "design of instruction" has been intuitive and followed a set of cultural norms. One norm is apprenticeship (popular in many crafts), in which a student learns skills and builds towards mastery of an art form. Another norm is to have an instructor/guru/wiseperson lecture or tell a story. We now call this "sage on the stage", but I maintain that it can be effective in certain situations. Other models include case studies (business school), discussion (law school) and others.

There have been many traditions but, until recently, very little empirical research on what works and, if it did work, WHY it worked. Teaching was not really considered a science, and still isn't outside the College of Education. How else do you explain the fact that prestigious universities charging very high tuition require little to no formal pedagogical training of their faculty? Even in something like crafting (embroidery, cooking...), there is very little thought on what would work best - most instructors imitate the methods they were taught with or grasp intuitively at strategies which may or may not work. Teaching is really considered more an art than a science.

It's easy to blame non-professionals' lack of awareness on ignorance, but I do think there's more to it. Maybe the real problem is that we as humans are so used to learning (or trying to learn) that we don't understand how complicated the process is. That is, we don't realize that instruction should be designed until an instructional designer points it out. Most of us (especially those of us in a university setting) take learning for granted, like walking or digestion. It just happens (or doesn't...in which case we feel totally stupid). Conversely, a student may know that instruction is poor, but may not be able to articulate WHY it's poor.

I think the problem is similar to one in my other field - linguistics. Most people do not realize how complex language is...because it's something most of us do without thinking. The only way to realize its complexity is either to take a linguistics class or...try to program software that uses real language input (I think you programmers can relate).

The idea that you should consider designing instruction systematically may really be counterintuitive...kind of like the idea that walking is controlled falling or that laughter is a "species-specific cry", as one anthropology book described it. I think instructional design requires a certain amount of anthropological/zoological thinking about what humans do when they're not thinking about it. Personally, I find this aspect of instructional design very appealing...but it can be weird and counterintuitive to the general public.