Wednesday, April 30, 2008

Listening


In the most famous, most useful, and most unheeded of his essays, George Orwell remarked that "the slovenliness of our language makes it easier for us to have foolish thoughts", the way a man will turn to drink because he thinks he's a failure, and then actually become a failure because he has turned to drink. Among the more slovenly phrases to enter the language in the last decade is "continuous partial attention" — corporate doublespeak for not paying attention — and it has led to the foolish thought that it's neither rude nor stupid to surf the web when people are trying to talk to you.
In many universities, professors are beginning to voice serious concerns about the consequences of the "wired" classroom, where lecturers must compete with the lure of the Internet for students' attention. I can confirm that it is demoralizing and distressing to look up at an amphitheater and realize that perhaps one laptop user in five is paying attention. Relax, says one of my colleagues. "A student's freedom to think their (sic) own thoughts, to structure their own mental activity, is a far greater good than trying to compel some semblance to attention."

Well, I share Churchill's conclusion that one volunteer is worth ten recruits, and I consider it axiomatic that how you think is more important than what you think. But I wonder how one of my students would feel if he were giving a presentation and looked up to discover my laptop and me structuring our own mental activity. With justification, he'd call my professional conduct into question, and this raises the crucial point. We talk a great deal about teaching excellence, as indeed we should, but not nearly enough about student excellence. Why is the balance tilted so heavily to one side of the scale? Why do I have the obligation to speak, and to speak well, while the apostles of digital education inform me that students have no obligation to listen, or to listen well? I'm being paid, of course, but make no mistake: the students are being paid, too, even if they're paying for it. As Bertrand Russell wrote in 1923,
"A boy or girl of eighteen, who has a good school education, is capable of doing useful work. If he or she is to be exempted for a further period of three or four years, the community has a right to expect that the time will be profitably employed."

It is for this reason that the degrees granted by my institution refer not just to rights and privileges but also to obligations. In Ontario alone, more than ten billion dollars of public funding flows into the universities every year. Nationwide, the figure exceeds thirty billion dollars. For every tuition dollar students slap down, taxpayers pay three or four more, and people who work to send others to school can reasonably expect that they're subsidizing students to learn the curriculum they signed up for, as opposed to underwriting the cost of seeing what's new on Facebook.

As always, the question with any teaching innovation is this: are students smarter because of it? We have expended countless billions of dollars on the digital classroom. What proof do we have that those dollars have produced better students? These are empirical questions. Prove to me that students who surf the web during my lectures are better students (and telling me that they think they are isn't proof) and I'll consider it my professional obligation to advise them to do so.

Recently I was confronted with the rather smug argument that if I'm concerned about students surfing the web during my lectures, it's a sign that I need to be more interesting. Well, I'm considered a good lecturer, and have the awards and evaluations to prove it. I can compete with daydreamers and doodlers and with students reading the newspaper. I can't compete with Facebook and YouTube and Call of Duty 4, and neither can you. If I could do that, I'd create a website called grahambroad.com, upload my lectures, and be a billionaire by the end of the year. My challenge to any professor who thinks that she can compete with the wireless web is this: put your lectures on YouTube and see how many hits you get. If it's one-tenth of one percent of the number received by a simpering lunatic telling people to leave Britney alone, I'll buy you a beer. A case of beer.

Tuesday, April 22, 2008

Progress

I turned 38 today, and birthdays, like climbing stairs, are opportunities for reflection. An on-line calculator informs me that my estimated life expectancy is a rather mediocre 77 years. If I drop some weight, exercise more, and eat better, I can push it to 79, but the biggest problem stems from something totally beyond my control: my mother and two of my grandparents died before the age of 70, which suggests a less-than-stellar genetic predisposition to longevity. Pick your parents carefully, children. More drastic changes (more exercise, dietary supplements, less stress, less alcohol, though explain to me how that last one is going to help with the stress) can give me another couple of years. One calculator says that more sleep can add another year, but I did the math and I see no particular advantage in gaining a year if I have to spend it sleeping.
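For the curious, the whole exercise reduces to schoolyard arithmetic. Here is a back-of-the-envelope sketch in Python; every figure is illustrative, echoing the numbers above rather than any real actuarial table:

```python
# Toy version of the calculator's arithmetic. All numbers are illustrative.
AGE = 38
BASELINE = 77          # the calculator's estimate
DIET_AND_EXERCISE = 2  # drop some weight, exercise more, eat better: 79
DRASTIC_CHANGES = 2    # supplements, less stress, less alcohol
MORE_SLEEP = 1         # one calculator's bonus for an extra hour a night

estimate = BASELINE + DIET_AND_EXERCISE + DRASTIC_CHANGES + MORE_SLEEP
years_left = estimate - AGE

# The math on sleep: an extra hour a night, 1/24th of each remaining day,
# adds up to years spent unconscious in exchange for the one year gained.
extra_sleep_years = years_left / 24
print(f"revised estimate: {estimate}")
print(f"years spent asleep to earn the extra year: {extra_sleep_years:.1f}")
```

Nearly two years asleep to gain one. I rest my case, and soon, apparently, everything else.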

Hence, my plan to live forever, which had been going so well, is looking less and less tenable. In fact, I'm confronted by the rather depressing realization that there may be only about as many days ahead as behind. Still, better the days to come than most of the days in the past. One question I put to my students, year after year, is this: given the choice between living today and living at any time in the past, which would you choose? Most women are rather grateful to live in the modern era, when they can attain higher education, vote, own property, and not die painfully in childbirth. As for men, most in their late teens and early twenties can easily be persuaded that, all things being equal, it was better not to die in the mud and blood and on the wire along the Somme, for instance. And all are very pleased to live in an era of soap and iPods and cell phones and painless dentistry and multi-ethnic cuisine. In short, if there's a lesson of history, it's that there's no time like the present.

So, I personally am hoping for some medical breakthroughs in the next two or three decades that can prolong my life: I have a lot to do, after all. It could happen. In fact, given the medical and scientific progress over the last few decades, it's eminently probable. Advocates of holistic, herbal, homoeopathic, and other forms of alternative and faith-based medicine can never quite reconcile themselves to the fact that the life expectancy of the average Canadian nearly doubled in the 20th century, that smallpox and polio and tuberculosis were either eradicated or cured, that the infant mortality rate fell from ten percent in 1920 to half a percent in 2005, and not because of herbal tea and chiropractic and magnetic wrist bracelets. It happened because of the application of science and technology to medical practice.

I have postmodernist friends who tell me that progress doesn't exist, that it's just another social construct, a paradigm waiting to be overthrown like any other. In fact, they'll hop on jet liners guided by the global positioning system, cross the Atlantic in seven hours, check into a hotel they booked over the Internet, phone and e-mail their loved ones to tell them that they arrived safely; and then, the next day, speaking over a microphone to a roomful of scholars who have not been burned at the stake by their own governments, they'll say that progress does not exist. And have the PowerPoint slides to prove it.

Wednesday, April 16, 2008

Surveillance

I began my first blog four years ago but aborted it after three weeks. I had some notion that my class — it was the first class I ever taught, actually — might start blogging about American history. I abandoned the idea in favour of an on-line discussion board to supplement classroom discussion, which we had depressingly little time for. My other reason for abandoning the blog was my impression that most blogs are not merely trivial but banal and badly written, and of no conceivable interest to anyone other than their authors, and I didn't want to jump on the latest digital bandwagon just because I could. I learned my lesson about tech-junk when I rushed out and bought a Commodore 128 in 1986.

A great deal has happened in the blogosphere since 2004, and it's no exaggeration to say that some of the best journalism — and most useful muckraking — in the world today is being produced by bloggers. But the fact remains that the overwhelming majority of blogs are time wasters for readers, and time is the one resource we have less of with every passing second. And it feels good to say, once more, and with feeling: information does not equal knowledge. I have absolutely no interest in the mundane musings of the typical blogging suburbanite — nor indeed those of otherwise decent journalists who blog because they're forced to — and this goes doubly for the legions of semiliterate bloggers whose only encounters with books seem to be in the form of Facebook. Add the huge numbers of blogs by the denizens of the lunatic fringe and I find little that can justify keeping me from Northanger Abbey for much longer.

I always suspected that blogging often serves as an outlet for some of the worst qualities of the exhibitionist mentality, but I never suspected that it could be a form of voyeurism, too. How wrong I was. When I wondered aloud one day how many people might actually read my blog, I was informed that it was possible to know, and in fact to know a great deal more than that. Programs such as Statcounter make it possible, in effect, for the writer to surveil the reader. By the simple expedient of cutting and pasting a line of code, bloggers can learn their readers' IP addresses, the time and length of their visits, the keyword searches they used to find the page, and — I don't exaggerate when I say that my heart skipped a beat upon this discovery — in some cases the name of the person to whom the connecting computer belongs.
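For skeptics of the "one line of code" claim, here is a toy sketch in Python of what such a service sees on its end. This is my own illustration, not Statcounter's actual code; the point is that every ordinary page request volunteers this much unasked:

```python
# A minimal hit logger, sketched with Python's standard library only.
# Not Statcounter's code: an illustration of what any web server can record.
from datetime import datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

class HitLogger(BaseHTTPRequestHandler):
    def do_GET(self):
        visit = {
            "ip": self.client_address[0],                 # the reader's IP address
            "time": datetime.now().isoformat(),           # when they arrived
            "referrer": self.headers.get("Referer", ""),  # often a search-engine URL, keywords included
            "browser": self.headers.get("User-Agent", ""),
        }
        print(visit)  # a real service stores this and can map the IP to a registered owner
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    HTTPServer(("", 8000), HitLogger).serve_forever()
```

Length of visit, for the record, is inferred by comparing timestamps across requests. No magic, no consent, no notice.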

So this gives me one further reason to like books and used books in particular. The moral and ethical implications of Internet monitoring are the subject of serious study in certain fields, but in the case of most bloggers I've talked to, their defense of the practice amounts to nothing more sophisticated than arguing that their curiosity is more important than my privacy, which seems both shady and seedy to me. I don't wish to sound naive: I know it's the same almost everywhere on the Internet, and I understand that gathering such data can be useful for commercial purposes, where ethics is usually a secondary consideration.

But I fail to see how surveilling the reader serves a useful purpose for serious writers, by which I mean people who write for themselves and for the sake of their craft and not to pander to a known constituency. Last night I finished Doris Lessing's extraordinary short story The Temptation of Jack Orkney and — this is the crucial point — Lessing had no idea. I hate to think of the story being different than it was, but I can't quite see past the fact that it might have been, had Lessing been checking up on her potential readership on a daily basis while writing it.

I always tell my students that if they want to write, that if they feel that they must write, then they should never, never, never give it up, even if their work goes unpublished. I have a friend who keeps half-finished plays and short stories tucked away in drawers, where they await a better day, and this has always struck me as being immensely admirable. This blog is my digital drawer for tucking things away: whether or not it is being read is of no consequence to me. I write because I must, because I wouldn't be me if I didn't.

So, gentle reader, you can take your anonymity for granted. I know not who you are or from whence you came. But elsewhere you must contend with the rather frightening possibility that Big Blogger is watching you.

Tuesday, April 8, 2008

Writing

Annie Dillard, who won the Pulitzer Prize at age twenty-nine, once said that writing a book is like sitting up with a dying friend. George Orwell internalized the process a further degree when he likened it to a long, painful illness – something with which he had actual experience. I have never quite grasped the existential agonies that some writers put themselves through but, then, I'm not much of a writer – at least as writers go. I will grant them this, though – writing is hard. It is not as hard as teaching (readers don't interrupt while you're writing, asking you to go over that again) but it is hard enough.


Difficult things (e.g. cooking, dieting, exercising, travelling, lovemaking, learning German, deciding what to read next, dying) spawn books that promise to make them easier, and so it's not surprising that there is a whole industry producing books on how to write. Stephen King wrote one, and so did Margaret Atwood, and even, in an uncharacteristically helpful mood, so did Norman Mailer. After the classics — Strunk and White's The Elements of Style and William Zinsser's On Writing Well — the best such work, in my estimation, is Richard Rhodes's How to Write. An unbeatable title, you must agree. At least it was an unbeatable title until Paul J. Silvia came up with How to Write a Lot. Imagine two books, side by side in the bookstore: How to Lose Weight and How to Lose a Lot of Weight. Which would you choose?


But my enthusiasm for the book ends there. How do you write a lot? According to Silvia — and brace yourself here — the key is to write a lot. It does not, I'm afraid, amount to much more than that. (How could it? The book is 126 pages.) Schedule time, every day, to write. Make it a fixed part of the daily routine, and it will become habitual. Word counts will mount, and these must be meticulously recorded, preferably on a spreadsheet. Don't worry too much about style, because academic peer reviewers certainly don't. By this expedient, articles and reviews will get finished and submitted, and so will a book every couple of years or so. Silvia himself claims to average something over 700 words per day.
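Indeed, the whole method fits in fewer lines than the book's table of contents. A minimal sketch of the prescribed bookkeeping, with a file name and numbers of my own invention, nothing from Silvia himself:

```python
# Silvia-style bookkeeping, sketched: append today's word count to a CSV
# "spreadsheet" and report the running daily average.
import csv
import os
from datetime import date

LOG = "writing_log.csv"  # invented file name
TARGET = 700             # Silvia's reported daily average

def log_words(count: int) -> float:
    """Record today's count and return the running daily average."""
    new_file = not os.path.exists(LOG)
    with open(LOG, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "words"])
        writer.writerow([date.today().isoformat(), count])
    with open(LOG, newline="") as f:
        counts = [int(row["words"]) for row in csv.DictReader(f)]
    return sum(counts) / len(counts)

if __name__ == "__main__":
    average = log_words(650)  # invented example count
    print(f"running average: {average:.0f} words/day (target: {TARGET})")
```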


And there you have it: the distinction between writing and typing not just blurred but erased; the ancient, honoured, and indeed sacred craft of composition played out statistically on spreadsheets, with daily mounting tolls, like victories (or perhaps losses) in a U-boat campaign.


Silvia is a psychologist with an impressive slate of publications, especially for someone so young. You can't argue with results: the method works for him. Perhaps it will, too, for other social scientists who don't regard their discipline as a literary one. I prefer to chart a middle course. I don't much like the idea of bleeding myself to write, but I can see the point in devoting some care to one's craft. And there are evenings when, after a tiring day's work that has yielded just one good paragraph, I feel slightly glad to sit back, exhale, and thank goodness for small miseries.

Wednesday, April 2, 2008

Karate

I took my first lesson when I was thirteen, around the time that The Karate Kid came out. (It is very strange to reflect that I once considered Ralph Macchio a role model.) I have practiced martial arts on and off ever since, but I've never been sure why. It was one thing in my teens or early twenties, but as I begin to approach a rather jiggly middle age, my doubts about it all have been steadily mounting. I can think of much better kinds of exercise, and the claim to be learning self-defense seems hollow somehow. Thirty-something professionals living in the suburbs don't get into a lot of fights. My chances of being the victim of a random act of violence are about as great as my chances of driving into a lake, but I don't spend hours every week practicing underwater evacuations from my car in case I ever need to. In my adult life, I've never been in a fight and have never felt remotely threatened by physical violence, except once while learning martial arts. So I'll take my chances. If anyone ever did break into our house, there are plenty of things I could brain him with before it became necessary to pull out the old karate moves: wine bottles, frying pans, the Oxford Companion to Canadian History.

"Why do we do this?" I once asked a friend, both of us wheezing and battered after a session in which some young jock without a lick of training had positively mopped the floor with us. He looked at me incredulously. "Because it's fun," he said. And there you have it: two suburban martial-arts consumers with fantasies of invincibility, stripped positively bare.

Probably just as well. Unarmed combat gets you killed. While most martial arts claim to have their roots in ancient battlefield techniques, the fact is that no army has ever invested much in unarmed combat training, for the simple reason that people have always had weapons. You could train for decades in the Kung-fu Mantis Death Touch and still have less than an even chance against a peasant conscript with a pointy stick — let alone one with an assault rifle. In Tokugawa Japan, the socially leveling implications of firearms were so great that most such weapons were rounded up and destroyed by the samurai.

Later, I took up kung-fu, in an effort to hugely increase the mumbo-jumbo quotient in my life. People who've never done it but who have seen the TV show tell me that they're interested in the "philosophy" of kung-fu. About the only philosophy I encountered was slightly less profound than what you'd find stuffed into a fortune cookie. The odd time you'd meet a master who thought he was Yoda, but when it boiled right down to it, most of them were just Mark Hamill.

I did meet a lot of very funny people — adults who spent a large portion of their waking hours stomping around barefoot in their pajamas and shouting out the only ten words of Chinese they knew — and they were all the funnier because they took themselves Very Seriously. One even made murmurs about dire retribution if anything unkind was said about the Grand Master. "I'll hit anyone who disrespects him," he said. (In the rational world, that's called assault, and it's a Criminal Code offense.) Well, the Grand Master, as it turns out, was an immensely overweight crackpot whose power in the martial arts was so vast that he passed from this Earthly realm a full sixteen years short of the predicted lifespan of an average North American. This particular club met pretty much every definition of a cult, including the attribution of supernatural powers to the Grand Master. I exited rather abruptly, only to receive a nasty phone call a couple of months later — when they finally clued in that I was gone, I suppose — informing me that I wasn't welcome back.

Anyway, attributing supernatural powers to "the master" is actually much more common in the martial arts than you might think. Yup, you heard it here first, folks: the accumulated scientific knowledge of Western civilization — and that stooge Newton's 2nd Law in particular — has been overthrown by kung-fu masters in strip malls across America. News at 11.

One instructor I knew swore that his teacher could convey the day's lessons through touch, transmitting knowledge through his chi. (In my profession, teachers who try to convey lessons through touch get into serious trouble, and rightly so.) And then there were the various claims about healing powers and the ability to move objects without touching them.

Take this group, for instance:

http://www.yellowbamboo.net/levels.htm

Notice that the level 3 training video includes lessons in "telepathy", which you have to admit could be handy in a fight. The person offering the video is himself at level 6. God knows what he can do. Sell videos to suckers, probably.

Oh, don't get me wrong. In my time in the martial arts, I also met some lovely people, made some friends, got some exercise, and had some fun. Perhaps I'm just feeling my age and need to go with it. As a friend of mine used to say, "You're just skeptical because you've never been on the receiving end of the Chinese Death Kick. I have, and it worked every time."