My Walk in Wonderland: ER

February 1, 2011

Ok, ok, ok, ok, ok, ok, ok.  Enough with the literary stuff.

What do we know—most ironically reported by the media about the media? Life support and last (w)rites are imminent.  Rush the patient to the ER.

The siren song of sirens as the gurney bursts through the doors to bright lights and cutting edge technology.  Why, it’s wonderland, the place to be, in any sort of e-mer-gency.

Too much? Thought so.

I’ve had to settle for a much smaller version of ER.  I call it Eroding Resistance.  A method I’ve had to adopt for the digital natives (print journalists) who inhabit my classes but who seem unable and/or unwilling to bridge the chasm between their everyday online lives on Facebook, Twitter, and the Web and the application of such to journalism.

Most of this resistance is to producing original multimedia, particularly sound and video. Their refrain remains simple: I know how to post a video, but I’m bamboozled by the thought of creating one. (Yes, it’s the dreaded “student mode,” where intelligent people revert to blanked-out slate status. I bet you’ve seen it, too, where perfectly good writers and reporters can’t seem to identify the lead for the Second Coming.)

Thus, the intervention.  ER baby steps. (Or, in academic-ese, effective pedagogical strategies for knowledge transfer.  Gotta love it.) Build analogies that connect what they can do with what you want them to do. Mark Briggs does a fine job of doing just that in Journalism 2.0, the once-upon-a-time free monograph, available as a PDF download thanks to Knight Foundation funding, that’s still out there if you look really hard. Tired already? Ok, go here: Journalism 2.0.  Or, you can opt for the big brother edition (JournalismNEXT) published in 2009.

Briggs points out that editing video and sound relies pretty much on an operation as simple as cut and paste. That is, if you’ve used a computer at all, you have the basic technical skill for editing video and sound.  And if you’ve bought a computer recently, then the software to edit video is already installed or is available as a free download.  Same for audio.

So I give my digital natives the ultimate challenge: If someone as old and un-hip (would a hip replacement cure this?) as I am can do this …

They get the point and short audio and video clips from me to play with.  Then I usually bring in someone with lots and lots of experience editing audio and video to tell them exactly the same things I’ve just told them (meaning, wow, if only she had said this).  Thus, we build to the crescendo—so that I can deflate them by observing (thanks to a certain run of Microsoft commercials) that a 5-year-old can do what they just did.

But what the 5-year-old can’t do is understand the rhetorical application of sound and video to tell journalistically acceptable stories that are, at heart, text-based. And I remind them that when they tell those stories really, really well, they truly change people’s lives.

Cogitate on that, I say.

Maybe Wonderland ain’t so bad after all.

Pat Miller

Valdosta State University


Why do we continue to teach journalism?

January 25, 2011

The time elapsed since the last post on this blog is due to the start of the spring semester, no doubt. As a relatively new professor, I find it overwhelming but fun to prepare new syllabi. Overwhelming because one’s plans for the entire semester must be in place at the first class – that’s a tall order! And fun because there’s a chance to try new things, repeat things that worked well last time and generally become a better teacher.

How to do that last thing is the challenge: In this period of flux and flummox in the journalism world one sometimes wonders, “Why am I teaching journalism at all?” Here is the answer I give to myself, my students and my friends in the business who think it’s not fair to encourage young people to pursue this dream. The dream is over, they insist. And I say it can’t be. That simply cannot be true.

It cannot be “over” to keep an eye on government, schools, businesses and the (in)justice system to ensure they are serving the public with integrity, grappling with the problems we face, and ensuring our future rights and prosperity. Surely people who know how to do this well – even, dare I say, professionally – are still needed?

It cannot be “over” to have the skills to translate government and scientific jargon into plain language that any semi-literate person can understand. Can it?

It cannot be “over” to engage in a discipline of verification that holds public officials accountable for what they say and do. Do we not need this now, more than ever?

It cannot be “over” to know how to capture a moment on film or video or in words, preserving it for posterity, explaining it for those present in this day and time – in short, writing “the first draft of history,” as someone famous once said. Oh wait, I know how to check that and find out! I have those skills. Surely they still matter. Don’t they?

That is what I teach, and what inspires my passion. We have a financial model that is no longer working, that used to pay people adequately to allow them to do this important work. But the loss of that financial model does not mean the work itself is not essential. It’s the Fourth Estate: Edmund Burke was quoted by Thomas Carlyle as having said that “there were Three Estates in Parliament; but, in the Reporters’ Gallery yonder, there sat a Fourth Estate more important far than they all.” If you’re interested in running down the origin of that expression, I found a great write-up on Wikipedia that looks quite well sourced. It is possible that Burke did not originate this phrase!

By the way, the correct form of the quotation mentioned a paragraph earlier is “the first rough draft of history,” and columnist Jack Shafer did a column on his search for the original utterer of the phrase. It is widely attributed to Phil Graham, a brilliant publisher of the Washington Post who died too young but was succeeded by his very capable widow, Katharine Graham. But no one is absolutely sure.

No one is sure, either, about the future of journalism. Clear writing, capturing the moment in all media, holding public leaders to account and verifying what people say: These are the essentials of journalism and these are what I teach. They are as important now as ever. We need people who know these things, who care about them and practice them. We’re just not sure if we can pay them for doing it.

I have faith that given the obvious need for it, we will find a way to make professional journalism possible once again. Like J.M. Barrie’s Tinker Bell, journalism requires that we continue to believe in it.

Carrie Buchanan, John Carroll University

Tim Russert Department of Communication and Theatre Arts

My Walk in Wonderland: Alas, poor Yorick

January 9, 2011

Alas, poor Yorick, I must admit to a birth defect:  I was not born a digital native.

I, too, like Yorick, am a fellow of infinite jest, of most excellent fancy, but my birth defect makes extrinsic what is intrinsic to the current generation of news consumers.  So what do my wit and imagination matter?

Plenty, as it turns out, especially as a teacher.  Wit provides perspective; imagination, possibility.  Both are more important than technology, per se, because technology simply serves as the tool for communicating fact and narrative.

So what does this digital immigrant have to offer students with digital in their cognitive and social DNA?  How about an understanding of audience.  For example, I can assume that most people reading this blog—based both on age and level of education—know who Yorick is.  Some, I know, will understand the allusion (after all, in dog years I’m dead; in computer years, I’m ancient history). And some will ask why I’ve referred to a programming language used to create scientific simulations. (Yorick as a bit player? Sorry, couldn’t resist.) Allusion, after all, relies on a shared frame of reference.  Technology changes that.

But keep this in mind: Our digital natives will likely face a similar influx of technology after their DNA has set. (Granted, the brain is plastic, but it’s not silly putty—with perhaps the exception of the current crop of politicians and celebrities, but, again, I digress.) We must model how to handle that level of change.

Moreover, if (big if) someone clicks on the link to Yorick (technology in action), he or she (or they, depending on your politics of grammar*) will grimace when he/she/they find it links to Wikipedia. Wikipedia, for Pete’s sake, where anybody (with a capital anybody) can purport to be an authority. It’s not academically acceptable.  It’s not journalistically acceptable. Except collaborative knowledge is acceptable (not to mention democratically inclusive) to the contemporary news audience, which acknowledges the value (but perhaps not the authority) of information as process rather than product.

So what do digital immigrants have to offer the digital natives? An understanding of the changing rhetorical paradigm that explains (if not emphasizes) how audience engagement changes the communication process, and, in the process, every step in how we define and communicate “news.”

*Check out Grammar Girl’s take on this controversy.

A Very Short Teaching Experiment:


1.  Give your students 15 minutes to use whatever technology they have at hand in the classroom (computers, smart phones, iPads, etc.) to get the information that will allow them to answer this question: “Who is this Yorick dude in Shakespeare and why is he important?” Make sure they cite the sources of their information.

2.  List all major uncontested assertions on the blackboard (or its equivalent—say a discussion board in a classroom management system).

3.  Check those uncontested assertions against the entry in Wikipedia. Note discrepancies.

4.  As the professor, run the list of assertions by your local Shakespeare expert and note his or her comments. (Or perhaps you could offer this task as extra credit.)

5.  Report your findings to the class as the basis for a short discussion on the value and pitfalls of collaborative knowledge. Pay special attention to the factual part of the question (“who is Yorick”) versus the interpretive part of the question (“why is he important”).

Pat Miller

Valdosta State University

Once More Unto the Digital Breach

January 8, 2011

Does a cluttered desktop lead to computer phobia?

Teaching journalism in an English department makes for some interesting cross-fertilization. I’ve been putting together a panel proposal on literacy in a digital age for the November 2011 National Council of Teachers of English conference in Chicago.

And some of the material I’ve been reading is by doomsayers (or celebrants – depending upon their perspective) predicting the death of the written word by 2050 (see, for example, “The Dawn of the Postliterate Age” in the Nov.-Dec. 2009 issue of The Futurist [OK, so all my research isn’t deeply academic]). The article’s author, Patrick Tucker, writes of the advent of an oral/visual age in which we speak into computers, which will do our writing, and share thoughts with each other by a kind of ESP, courtesy of the microchips planted in our brains. Many writers foresee a less radical road leading to the future but speak of their fears that IM and digital technology are drastically changing our language. Naomi S. Baron, however, writing in the March 2009 Educational Leadership, describes a study she did of 11,718 words of text messaging by college students in which only 31 words were “‘online lingo’ abbreviations” (42) and “only 90 were acronyms (of which 76 were LOL)” (42).

According to Baron, the list of how “electronically mediated communication” is changing writing “is relatively short” (43).  Her list includes:

  • Acronyms like “lol” (laughing out loud) and “brb” (be right back) are creeping into oral and written language (43).
  • Writers seem less sure about relatively minor grammar rules, such as when to hyphenate a word, when to use an apostrophe, and how to spell. (When Spell Check does the heavy lifting, a writer’s correct spelling becomes less important) (43-44).
  • Many writers seem to be adopting a “whatever” attitude toward linguistic rules so that “a wide swatch of educated speakers of English (at least American English) simply don’t worry about the niceties of such rules any more” (44). Baron, however, says the shift in caring about grammar rules “predates personal computers” (44).
  • Digital devices make it possible for readers and writers to control whether they talk or text, and with whom, and when, as well as allow readers to determine what information they receive. Individuals have more control over the communication they receive and send than ever before. This, in turn, “has shifted our attitude about who holds the power in linguistic exchange” (44-45).

Gunther Kress, in a 2005 interview in Discourse: Studies in the Cultural Politics of Education, picks up on this notion of the shift in power in the linguistic exchange. He says students using the Internet are  “looking for information, which that person will turn into knowledge in relation to the issue that engages them at the moment” (294). But many schools and educators are operating from the idea that “we provide the order, pedagogic order, we also provide the epistemological order, the order of knowledge” (294). These differing epistemological viewpoints between students and educators lead to tuned-out students, according to Kress.

What’s a poor pedagogue, who cut her journalistic teeth on 20th century technology, supposed to do? Well, adopting the “don’t worry, be happy” approach to what lies ahead is one approach. After all, if we all line up for microchip brain implants, we won’t have to worry about getting Alzheimer’s disease, will we?

There’s also the “If you can’t beat them, join them” approach. In her Winter 2011 article, “Then & Now” in the literary journal, Creative Nonfiction, Sarah Z. Wexler writes about the trying times she spent at New York magazines trying to persuade her editors, “the tech-clueless,” to add blogs and social media to the magazine Web sites. She admits she has learned from such editors “how to shape a story, how to edit a piece while maintaining the writer’s voice, how to navigate industry politics and much more” (9). But she says those “experienced editors also have some things to learn, from SEO to how to Tweet. Don’t know what I mean? Ask one of your subordinates ….” (9). She’s right, although her attitude grates on one who’s done a good bit of hard time trying to bone up so she can launch herself into digital warp speed. (A typo just made me realize attitude and grrrr-attitude are closely related.)

Obviously, we all must work together because none of us knows everything or even enough of anything to keep this complex world of ours humming. We just need to keep on keeping on with a smile, a hope, a prayer, good intentions, interesting colleagues, hard work, willing students, loving relatives, chocolate, and our trusty megatetragigabigabyte computers. We should all do just fine.

Margo Wilson
California University of Pennsylvania

My Walk in Wonderland: Sound

December 10, 2010

I’m sitting in a classroom listening to the sustained “clitter” (more onomatopoeically accurate than “clatter,” I would argue) of students hovering over keyboards as they negotiate the final exam for “Feature Writing.”  The final product will be a table outlining and justifying at least three critical concepts (ah, the sound of alliteration) about reporting, structure, style and revision they have learned from writing features, which they will then translate into a short, personalized feature writing guide they can walk out of the room with.

To set this up, I have them complete a short survey the last day of class that asks them to identify the most useful exercise we’ve completed this semester.  One interesting reaction:  “I hate to admit it, but the exercise where we read our stories out loud actually opened my eyes to certain things about my writing that needed to be changed.”  Why, I wondered, would she hate to say it?

(Really important detail:  Before we tackle the sound assignment, students have analyzed Jon Franklin’s “Mrs. Kelly’s Monster,” the story that won the first Pulitzer for feature writing and remains an exemplar for how to use onomatopoeia for structure and pacing.)

Perhaps she hated to say it because it’s an exercise that makes students uncomfortable.  It requires a bit of performance in that students record a 500-word segment (sans quotes, if possible) of their story and listen to it.  Then they evaluate for

  1. Onomatopoeia (If you use it, how often and for what purpose?)
  2. Alliteration/assonance (If you use it, how often and for what purpose?)
  3. Repetition (words, phrases or sentence structures. For what purpose?)
  4. Pacing (How fast is the action moving?)
  5. Rhythm.
  6. Sentence lengths. (What’s the relationship between the lengths of your sentences and content, for example?  Or rhythm? Or pacing?)


Finally, they must apply:

  1. Based on the questions you’ve just answered, write a short paragraph articulating what effect you want your style to have on readers.
  2. Go back to your original story.  Turn on track changes in Microsoft Word.  Edit at least the first 500 words of your story based on what you have just learned about your style.  For every style change, insert a comment explaining the function of that change.


More likely, students are uncomfortable because they’re forced to articulate concepts about sound as style, which (highly ironically) we almost never talk about, which (even more ironically) means that they never hear it: neither concept nor example.

(Typographical devices to re-create sound are annoying, don’t you think? Makes you wonder what gets lost in the translation from sound to text.  Equally interesting is the shift in voice—textual, in this case—and how that redefines the relationship to the reader.  But I digress. Or perhaps I don’t. Darn those deconstructionists—not to mention the alliteration alert.)

So here’s the point—the sound advice, for the punsters among us:  Teach the sound of language as a stylistic tool to your writers and then use it as the bridge for introducing sound as multimedia.  Once they write it, they can hear it.  Once they hear it, they can reproduce it.

Just a thought.  But, I would argue, a good one.

Pat Miller
Valdosta State University

My Walk in Wonderland

December 1, 2010

It’s only if I take a step back that I feel like Alice.

That’s how long I’ve been in the rabbit hole. Long enough for the semi-miraculous to feel normal.

Long enough to be habituated to seeing the world pass by at 60 miles an hour (all right, 70) as though that’s how the world’s designed.

Long enough, finally, to be annoyed by laggard electrons in my classroom where I depend on multimedia to communicate ideas once the singular inhabitants of my imagination made presentable only by the resonance of the language I could clothe them in.

Long enough, perhaps.

But then it’s time to cash the reality check. I have students who want to be journalists.  That is, I have students who want to tell the truth, who want to hold those in power accountable, who want to make meaning by telling stories about what it’s like to be in this time and this place.

Wonderland makes that possible.

The latest foray has been to Second Life, a virtual world where almost anything you can imagine is possible.  I’ve been in the company of my JOUR 4560 Converged Journalism class where we are building (at least metaphorically) the newsroom of the future that engages and enlightens our NUCS—news users and consumers.  We have toyed with a community interaction desk, a place for community members to chill and talk and tell us what’s important in their world.  We have toyed with recreating the traditional structure (news, features, sports) by redefining news by its time element (breaking, developing, enterprise) and tying each to a desk that determines what tools (text, image, sound, interactivity) and means (mobile, Internet, paper) to use to tell each.

More important, we have toyed with articulating how we retain core journalistic values as we redefine what it means to be a journalist in the 21st century.

And we have played.  We have played in order to innovate.  We have played in order to evolve.  We have played in order to glimpse the possible revolution in how news gets defined, refined and communicated.

We have played, quite seriously, because it’s wonderland.

We hope others invested in journalism as a democratic force join us.

Tactical Teaching Tips:

If you’re contemplating using Second Life with uninitiated students, note that the learning curve is steep, so you may find you’ll spend an inordinate amount of time teaching software rather than content.  In addition, you and your students must have computers equipped with enough power and video card capability to run Second Life so that you don’t fall victim to the frustration funk.

What Second Life is really good at, however, is letting you make the metaphorical (virtually) physical.  My students, for example, had to confront the management problem of how the actual design of the newsroom communicated the hierarchical relationship between editors, reporters, graphic artists and multimedia experts. That is, how should staffers with primarily conceptual and primarily technical skills relate? They had to figure out how to maintain open and efficient lines of communication, especially since their newsroom was actually a two-story floating pod above VSU’s virtual campus. (They did design a really neat vertical escalator, however, which made them realize that they assumed everyone in the newsroom would be young and agile enough to use it. Kind of an eye-opener about their assumptions.)

Finally, digital immigrants like me might already be overwhelmed by a “first” life that doesn’t allow much time to learn how to create and maintain a virtual world.  But I would also argue that Second Life is worth a look if, like the clock in the Tower of Pisa, you now have both the time and the inclination.

— Pat Miller

Is Prayer of Thanksgiving a Shout-Out for Body Scanners?

November 26, 2010

Sure, Sweetie, Your Uncle Sam Knows What's Best for You.

I had my opportunity for my 15 minutes of fame over the weekend, but I decided that someone, anyone, else might want it more. I didn’t speak up, I didn’t rebel in any way when the Tampa, Fla., TSA agent selected me for the full body scan on my trip back to Pittsburgh. What’s a little more radiation? What’s a little more humiliation if my body scan can make the skies safe for democracy? Ask not what your body scan can do for you. Ask what your scan can do for your country.

Well, what can it do? At least some travelers over the Thanksgiving holiday seemed to feel their scans and those of others might help keep flying secure from terrorists. See for example, the Washington Post’s stories, photos and video about the lack of scanner protests.

Perhaps I wouldn’t be dwelling on this topic if a colleague and I hadn’t taken a side trip to Disneyworld while attending the National Council of Teachers of English annual conference. When the park gestapo insisted we be fingerprinted, I pondered whether I should choose to make a scene with my colleague or sheepishly slip inside Epcot.

“What do you mean, slide my finger in there?” my friend snarled at the “cast member” when the turnstile refused to admit my colleague, even with our pricey ticket.

“Just put it right there, Ma’am. Right to the end,” the cast member told her, pointing to a sleek plastic finger-sized scanning device. Children were eagerly giving up their fingerprints to the futuristic beam.

My friend tentatively poked the scanner’s translucent trough. No luck. We weren’t going anywhere unless we fully committed our digits to the scanner’s light. No matter where you go, someone or some machine thinks you are terrorist material.

“Why do you need my fingerprint?” my former reporter friend inquired.

“Because, if you lose your pass in the park, it will make it easier to identify you and get a new one,” the cast member chirped.

Say what?

Neither my friend nor I was buying this explanation for one little Disneyliscious minute. Here was another defining moment. We could either stand up to the tyranny of The Mouse and refuse to enter the whorls and swirls on our index fingers into the rodent’s databases or we could submit to the fear that some terrorist, somewhere, right now was headed to the park with canisters of bubonic plague toxins to spray all over the American Adventure Pavilion. Would we do the patriotic thing and stand up against terrorism and be fingerprinted, or would we do the brave thing and stand up for our Fourth Amendment rights and refuse to submit to this unusual search and seizure of our pointer fingers? Would we refuse to have the privacy of our own fingers invaded?

Sometimes I teach media law, and I know that when a person is in a public place — and actually, we were in a public place but on private property — well, then, most privacy rights are forfeited. How badly did we want to ride the Maelstrom Adventure Cruise in the Norway Pavilion?  How badly did we want to buy a glow-in-the-dark light saber?

Badly enough, I guess. We caved. Or we signed up to serve with the forces of righteousness. It depends on how you look at it. The Mouse now has our fingerprints. We gave him permission to keep samples of our DNA. During the laser show, he secretly implanted computer chips in our temporal lobes to remind us that we live in the Happiest Place on Earth. (Just kidding about the last two — I think.)

When Disneyworld makes fingerprinting ordinary citizens seem like playtime and TSA makes going anywhere in the air an undress rehearsal for disaster, it’s hard to get one’s bearings and know what it is we believe in, stand up for, and respect these days.

I’m glad I made it home without a hijacking of my jet. I’m grateful for the turkey who gave his life so I might celebrate the freedom and abundance that still makes the U.S. a land I love. But really, how do we balance our need for security with our need for dignity and privacy? To paraphrase/parody Dylan Thomas, should we willingly push the turnstile gently into that good 24-hour searchlight of our future, or should we rage, rage against the dying of our former way of life? I wish I knew.

— Margo Wilson