This is the personal weblog of Craig Melhoff. I'm a Canadian teacher, writer, reader, composer of website-bio bathos, and hominid.

Tuesday, September 27, 2011

Answering Some of Classic Popular Music's Great Questions

Van Halen - "Why Can't This Be Love?"

To begin with, you're Sammy Hagar. Even if you'd been David Lee Roth, I don't think this could have been love—what chance could you really have? Also, you're the lead singer of Van Halen, which means your romantic life involves a winding procession of dazed dipstick groupies. Not much room for love in that arrangement, sir. And then, of course, there's the fact that we're both heterosexual men. To be fair, though, you didn't specify whether it was eros, philia, agape, or storge that you had in mind, so I shouldn't jump to conclusions. In alphabetical order, then:

 

  • "Why Can't This Be Agape?": Because you aren't David Lee Roth.
  • "Why Can't This Be Eros?": Because we're both heterosexual men.
  • "Why Can't This Be Philia?": Well, I suppose it could be, in a fairly loose sense. All right, Sammy: this can be philia.
  • "Why Can't This Be Storge?": What the hell is storge?

 

The Jimi Hendrix Experience - "Are You Experienced?"

I'm sure you and your drug-fuddled hippie friends wouldn't think I was especially experienced, but yes, I suppose I've had a few experiences. What kind of thing do you have in mind? Are we talking like in the résumé sense? I suppose if you count watching the sunrise from the bottom of the sea as an experience, then no, I'm not that experienced. How does that even work, from a couple of thousand fathoms down? The average ocean depth is more than two miles, Jimi. To see that kind of sunrise—well, I think you mean "Not necessarily beautiful, but stoned."

 

Elvis Costello - "(What's So Funny 'Bout) Peace, Love and Understanding?"

*snicker* Oh, Elvis. You're so naive.

 

David Bowie - "Life on Mars?"

Not according to recent data, no. NASA stumbles on suggestive evidence from time to time, but Mars has a pretty thin atmosphere, so liquid water can't persist on its surface. There's probably no life on Mars, Dave. And if there is, it's probably just microbes, and not mice, Ibizans, socialists, Beatles, or sailors fighting in dance halls.

 

Frank Zappa - "Why Does It Hurt When I Pee?"

Considering the folks you hang around with, Frankie, I couldn't even begin to guess.

 

Sunday, September 18, 2011

Interface to Face

Bridging Digital Natives and Digital Immigrants

 

Ten years ago this fall, Marc Prensky announced a revolution in education. In his 2001 essay, "Digital Natives, Digital Immigrants," he argued that as Western culture shifted to digital communication in the last years of the twentieth century, a divide opened between the older "Digital Immigrants" newly arrived in the wired world and the young "Digital Natives" born and raised in this new frontier. With the rise of the internet, Prensky argued, came a generation gap wider than the usual one that exists between the young and their befuddled parents; as technology, media culture, and the internet ruled the day, young people who grew up in the new environment began to "think and process information fundamentally differently from their predecessors." According to Prensky, the Digital Natives are steeped in and perfectly adapted to life in the new world, which for their parents, the Digital Immigrants, remains to some extent an unfamiliar environment. Prensky's famous essay is a call to educators to respond to the observation that "Digital-Immigrant instructors, who speak an outdated language (that of the pre-digital age), are struggling to teach a population that speaks an entirely new language"—and that education, as a consequence, is an "old-world" construct incapable of seizing the attention or stimulating the minds of a population reared in the "new world."

Whether this has truly been the sea change Prensky describes is a point of some disagreement, of course. Some argue that it is in our nature to lack historical perspective, to enlarge what is immediately in front of us, and to believe that our own moment eclipses all that came before it. But what may be different in this case is that everybody seems to be saying this about the past twenty years, not just young people caught up in the excitement of their hour, and so in any case it might as well be true. As the story goes, the cultural landscape has been radically altered within a very short time, the standard generational twenty years having brought the internet, smartphones, social networking, YouTube celebrity, etc. As a result, there is a wider gulf than usual between the native environments of today's high-school and university students and those of the late Baby Boomer and Generation X parents who raised them. It really does seem sometimes like a singularity, a revolutionary break from the old world and the sudden imposition of an unrecognizable new one.

Technological change is nothing new, of course (we have been here before, courtesy of the Gutenbergs, the Edisons, the Bells, etc.), and neither are revolutionary shifts in values and prevailing ideas. But the two together seem to have scrambled intergenerational communication a bit more than usual; Prensky's immigrants, especially when they are educators, often stare backward mournfully at the vanishing world of the past, while the natives stare only forward, rejecting the lessons and value of history, emboldened in such singularly futurist thinking by a media and advertising culture desperate for their attention and their gadget dollar. There are exceptions to both of these rules, naturally, but by and large these two generations do seem to speak radically different languages, and they value entirely different things.

On closing the generation gaps of history and managing the death of "contemporary forms of social order," Aleksandr Herzen wrote in the nineteenth century that "what is frightening is that the departing world leaves behind it not an heir, but a pregnant widow. Between the death of one and the birth of the other much water will flow by, a long night of chaos and desolation will pass." With revolutionary change, then, comes not a clean, straightforward handoff of dominance from the overthrown to the inheritors of power, but an uneasy interregnum, a period of adaptation to the new order before it is quite fully in place. The call that Prensky made a decade ago to educators to end their "grousing" and accept and begin adapting to the new order was an attempt to close the gap Herzen describes, to induce labour in the pregnant widow and complete the shift to the ostensibly inevitable technological future.

The good news, however, is that in this case there are intermediaries, regents tending to the office as the heir is made ready for the post. In Herzen's terms, they function as midwives and nurses. In Prensky's terms, they work like immigration-officer go-betweens, smoothing the transition for new arrivals, and helping to dovetail old with new. In terms of technology itself, they operate like an interface, the device that can translate the movements of a hand into those of a mouse pointer on a screen.

 

I was born in 1976. The world at the time included computers, but they had only recently shrunk from great warehouse-sized beasts into things consumers could own, and most people didn't have one. I didn't spend my early childhood with a computer in front of me, and I had only a vague science-fiction sense of them. My father brought home a Commodore 64 when I was nine or ten years old, and this was the first computer I got to use myself. I played games on it, and I borrowed books on BASIC programming from the library to help me create primitive programs of my own. It was a thrilling new world for me, to be sure, but it was still a tiny one compared to the wired world a nine- or ten-year-old child encounters today: this was an isolated machine (a modem existed for it, but my family didn't have one, and there were only boring bulletin-board systems to access anyway), and it was anything but user-friendly. Turn on a Commodore 64 and all you see is a "READY" prompt and a blinking cursor—no icons, no tooltips, no help documentation, nothing to orient you. Prompt notwithstanding, it all made you feel quite unready indeed. To make any use of these machines was to be, by contemporary standards, something of a specialist, and most certainly to be on your own.

My first encounter with the internet was when I was seventeen years old, in 1993. I was at the home of an acquaintance whose affluent family were always early adopters of new technologies, and their most recent adoption was a dial-up internet connection on a second telephone line. We looked at a couple of business websites and other things his father had bookmarked in Netscape Navigator, and while none of the information was especially interesting to me, the idea itself seemed momentous: to have a pandect of information just a few keystrokes away was a revelation. I didn't get my own regular access to the internet until a couple of years later at university, but in the meantime, like everybody else at the time, I was spellbound by the idea of an information commons, and of immediate exchange of ideas between the like-minded.

But I saw this advent with the strange combination of both youthfully giddy eyes and a print-culture mindset cultivated by my 1980s education. By the time I saw the internet for the first time in 1993, I'd only ever read books in the original Book 1.0, Dead Trees Edition. I knew how to use a card catalogue in the library. My music collection was on CDs and a few lingering mixes on old TDK cassettes, and the whole lot certainly didn't fit in my pocket. I'd never owned a mobile phone, of course—in fact, nobody I'd ever met while I was growing up had one, and though they existed, they were either the laughable bricks now seen in 1980s period comedies, or they were permanently installed in rich people's cars. I was thus without cell phone or iPod, and apart from an hour here and there doing homework or playing games, I was almost never at a computer. If I was out with friends and wanted to know something, I had no way of looking it up. I had to make a mental note to look into it later, and wait not just until I got home, but until I could ask a teacher or someone else in the know, or else look it up in print sources in the library.

In short, then, I received the same education and ultimately the same values as Prensky's "Digital Immigrants," but the alignment of my date of birth and the rise of the internet meant that I was young and hungry just as the famous technological revolution was happening. I was part of an evanescent generation that had the benefit of both immersion in the new world and a firm grounding in the old one.

Now I'm a thirty-five-year-old university instructor, and I'm confronted with Prensky's controversial challenge of a decade ago. My classroom is full of his Digital Natives, and it is my job to find ways to communicate with them, and to unite the old world and the new, adapting print-culture content for the consumption of net-culture minds so that it won't die on the e-altar. And indeed, this isn't always simple or straightforward.

According to one recent study, my students spend, on average, seven and a half hours a day consuming media content—consuming, in the end, almost eleven hours of content when you consider "multitasked" channels of input separately. One in three of them sends and receives more than 100 text messages a day, and 75% of them have a cell phone. Most of their bedrooms contain a TV, and nearly all of them contain a computer or two. They have turned out more or less exactly as Prensky imagined they would a decade ago, when today's first-year students were just eight years old. Their world is one that the designers of the traditional education system could not have imagined and certainly weren't prepared for, a submoronic hive-mind in which the presiding authority is not authority at all but, as Mark Bauerlein argued in The Dumbest Generation, the intense and inescapable presence of peerthink itself: the social panopticon, the media obsession and other centripetal tugs of youth consciousness, and above all the self-reinforcing obsession with and dependence upon what comes in ones and zeroes.

This is how the natives live in the new tech landscape. Now think about the way the immigrants live in this world, the ones who designed the curricula in place in most schools, who raised the natives and in many cases the natives' parents. They're the fastest-growing segment of the Facebook population, but just think of the way many of them use it. When they post comments on their children's status updates, they often start with "Hi sweetie," and end with "Love, Mom," just as they did when writing letters by hand. We wince as they SOMETIMES WRITE IN ALL CAPS, JUST AS THEIR FIRST COMPUTERS IN THE 1980S ALWAYS DID. We cringe at their candour and guilelessness. We watch as they derail comment threads with interjected reminders to pick up milk on the way home from school. We hear in these immigrants' digital voices what Prensky described as the "accent," the vestiges of old-world language. Some immigrants have adapted rather well to the new speech, but they have all done so long after their own habits, worldview, and values had formed, and as Prensky says, "a language learned later in life, scientists tell us, goes into a different part of the brain."

But the Interface folks were still figuring themselves out when all of this began, and we learned both ways of speaking in parallel, on user-unfriendly computers and with stacks of old books piled on our desks beside them. All adolescents work at trying to integrate themselves into the world that surrounds them, but those of us born a few years on either side of me were tasked with integrating ourselves into two worlds, and in engaging that task, we cultivated two sets of intuitions about language, communication, and culture. Translation between the idioms of stone tablets and iPad tablets is more natural for this "Interface Generation" than it is for those on either side of it, and it shows. My students often find my world baffling, but I'm not baffled by theirs, and I'm still at home in the one I inherited myself—I'm as good on my iPhone as they are on theirs, and yet I'm as committed to the value of the traditional book as the most curmudgeonly antiquarian.

 

The students Prensky saw coming in 2001 have arrived, and so has the world of parallel processes, random access, instant gratification and reward that he predicted. The old social order lies on its deathbed, or at least so many say, and Herzen's long night of chaos and desolation has thus begun. It does look as if we need new ways of speaking and teaching, or at least to consider ways of honing the ones we have, as Prensky said we would a decade ago, and many of us are already well on our way to linking the wires. The people forming the Interface have a vital responsibility in this process, because they represent a bridge between the departing world and that of the unborn heir. There will never be another generation that was young when they were young and the game was changing, and that is info-bilingual in a world of info-monoglots. They will have to complete the act of translation, and soon, before, for better or worse, the old world is taken offline completely.

 

Thursday, September 8, 2011

Dial C for Cliché: A Brief Reflection

Plot elements destroyed by the invention of the cell phone:

 

  1. Mistaken-identity situations that turn on the inability to see who is calling.
  2. Picking up the house phone to make a call, only to discover that someone is already on the line on another extension, whereupon a portion of the conversation is overheard and misunderstood to comic or dramatic effect.
  3. A stratagem: one roommate pretending to be the other when a boyfriend calls.
  4. Teenage siblings intercepting calls from each other's love interests for the sake of cruel embarrassment.
  5. Any uncontrollable spiral of events precipitated by a telephone message left on a piece of paper, with a concierge, etc., and never received.
  6. Romantic tragedies involving missed encounters and the inability to make subsequent rearrangements (she thought it was for 3 p.m. and he thought it was for 4 p.m.; she thought the meeting was in Portland, Oregon, and he thought it was in Portland, Maine; etc.).
  7. Prank calls, pretty well altogether.

 

Plot elements introduced with the invention of the cell phone:

 

  1. Accident victim struck by vehicle while staring down at cell phone.
  2. Romantic tragedy: lovers fail to meet in line at the grocery store, as both are beguiling the tedious minutes of waiting by playing Angry Birds.
  3. Deeper exploration of psychological interiority: long scenes of passive-aggressive silence, while two lovers sit at a table in a bar, texting others not present. Variation: texting one another.
  4. Missing-person case solved shortly after disappearance by using "Find My iPhone" app.

 

Tuesday, August 2, 2011

Letters: solitude in good company

I've been working on a few things lately—a couple of short-fiction pieces, some things for here, and of course the bill-paying gig. I'm also travelling again next week, this time to Ohio to visit family and store up energies in preparation for the next sortie: the fall semester. More to come soon, in other words. In the meantime, though, a little something.

I've been reading a lot of Saul Bellow this summer, with one of his novels in one hand and a copy of last year's magnificent edition of his letters in the other, dipping in and out of the latter as I plough through the former. Bellow's correspondence is charming and often itself quite literary. As usual with this sort of thing, in addition to enchanting notes to his various ex-wives and business exchanges of literary-historical interest, there are loads of wonderful letters to other writers, and these I've found spellbinding.

I have a few old-fashioned qualities—ones that were more noteworthy when I was younger, since I think I'm growing into them as I get older. The zest for literature is one of these, probably the main one. I still read books in their original dead-tree editions, for example, and don't see that changing soon; I'd sooner read kindling than be Kindle-ing, if you like. I often think I'd do fairly well if I had to go back to pre-computerized ways of doing things. But in one way I'm very much a child of the modern world: nobody would ever hear from me if it weren't for electronic means of communication. I've tried with like-minded friends a few times over the years to keep up the habit of writing actual letters—not just writing on paper, but striving for some depth and intrinsic merit. It never takes. This is partly because the postal system isn't what it used to be (in late-19th-century England, for example, you could dispatch letters to your friends first thing in the morning to invite them to afternoon tea, and they'd all be on your doorstep at 3:00; mail someone a letter now and it might not actually arrive until years after your death, like the yellowed missive Doc sent Marty from 1885 in Back to the Future Part II). But I reluctantly admit that laziness and convenience are key factors too, even for those of us who are otherwise committed to spending all our time writing.

Anyway, as both a print-culture nerd and a woefully unsuccessful letter writer, I'm luxuriating in Bellow's correspondence with other American-lit titans (a couple of samples below—and, as I hope folks with good examples at hand will point out, these are not even especially remarkable examples of writers' letters, but rather just a few that have made me smile along the way), and lamenting that here, as everywhere else, modernity has brought about the death of an old art. Don't hold your breath for a beautiful hardcover edition of the message history from Jonathan Lethem's BlackBerry; I'm sure it isn't coming. You couldn't possibly get excited about reading someone's lols and oks and gtfos, and it's a shame, too, that none of the participants in a BBM conversation can get as excited about any one of their messages as one used to get on the way back from the mailbox, carrying a long-anticipated letter from a good friend.

At any rate, I'll send you all postcards from Ohio. You should receive them sometime before your grandchildren finish college.

 

 

From Saul Bellow: Letters, edited by Benjamin Taylor and published by Viking:

 

To John Cheever, November 10, 1976. Cheever had asked Bellow to review proofs of Falconer:

Dear John,

Will I read your book? Would I accept a free trip to Xanadu with Helen of Troy as my valet? I am longing to read the galleys. Since I have to go to New York this weekend, and also to Princeton to see my son Adam playing Antonio, the heavy in The Tempest, I shall get Harriet Wasserman at Russell and Volkening to obtain a set of galleys for me from Knopf. I would like to see you too, but I don't know when I will be free from this mixture of glory and horror [Bellow's Nobel Prize in Literature]. But I will write to you pronto about the book, which I'm sure to read with the greatest pleasure.

Yours,

 

To Bernard Malamud, May 10, 1959.

Dear Bern,

I shy away from all writers' organizations. The PEN is about my limit, and I have doubts about that. No doubt the [Authors] League is fine, but the publisher and the agent aren't the enemy. The enemy (and I'm not horribly hostile towards them, either) is a hundred sixty million people who read nothing. What's the League going to do about them, about Orville Prescott, about TV and Hollywood? It may increase my income by six hundred per annum. I don't care about increasing my income by six hundred per annum. It isn't worth joining an organization for. [...]

Best,

 

To Philip Roth, July 20, 1993. Roth had recently suffered both illness and separation from his wife, Claire Bloom; Bellow wrote the following:

Dear Philip,

Curious how futile good intentions feel in a case like this. The whole of one's personal morality is on the line—a tug-of-war in which I am outweighed a million to one by the imponderables. If you were to ask I'd come down to see you, though I've never seen myself as a bearer of remedies. I can't think of a single cure I ever worked. My idea of a mitzvah was to tell you a joke, which was like offering to install a Ferris wheel in your basement. Certainly not a useful idea.

This may seem to be a greeting from the horizon but I'm really not all that far. I feel anything but distant.

Affectionately,

 

 

Sunday, July 24, 2011

More Machine Now Than Man

In a 1950 article titled "Computing Machinery and Intelligence," English mathematician and computer scientist Alan Turing asked whether there could be "imaginable digital computers" that could imitate human behaviour convincingly enough that one could not tell the difference between them and real human beings. The hypothetical test he described has since come to be known as the Turing test: a human subject carries on an ordinary typed conversation with both a computer and a human being in separate rooms, and the computer is said to have passed the test if the subject is unable to tell which converser is the computer and which is the human being.

Turing acknowledged that such machines were still the stuff of science fiction, but he didn't think we were very far off from creating them—he put their arrival before the end of the century. A couple of generations later, futurist Ray Kurzweil predicted in 1990 that computers would pass the Turing test by 2020 (in 2005 he pushed the date out to 2029). Most seem to agree that the ability of a computer to imitate human behaviour is still some way off, for one of two related reasons: either the computer's capacity to match human traits (not just intelligence and sophisticated language, of course, but also the other things that make us human, such as emotion, error, inconsistency, and so on) is still relatively low, or human behaviour is somehow too "high," beyond the reach of imitative technologies.

Computers have come a long way since Turing died in 1954 (a tragic story well worth reading, by the way—what an embarrassment was the public mistreatment of the man whose cryptanalysis skills helped crack the German Enigma machine in World War II, and all that mistreatment for nothing more than eccentricity and homosexuality). And while our computers have evolved and worked their way into the centre of our lives, we have evolved alongside them. They have attained much more of the sophistication necessary to imitate our language and thought patterns, and so pass the Turing test, and as we have hunkered down more or less permanently at their keyboards, our language and thought patterns themselves seem also to have changed a little: grown a bit more abbreviated, systematic, formulaic, and telegraph-like, simplified like a machine language of subroutines. At our brave posts behind those keyboards, free from at least some of the complex and nuanced immediate social pressure that used to partly fill our days, we have nudged human nature a little more toward the mechanical extremes of hostility, partisanship, and stark binaries. Our computers have become increasingly capable of imitating us, and at the same time, to some extent, we have obligingly begun imitating them in little ways.

The talk of tests and machines calls to mind other diagnostics, such as the fictional Voight-Kampff test from Philip K. Dick's Do Androids Dream of Electric Sheep?, the basis for the 1982 film Blade Runner. The Voight-Kampff is a test for the presence of human empathy, which helps determine whether the subject is a human being or an android. Subjects are asked a series of questions designed to provoke empathetic emotional responses while those administering the test watch for the physiological signs of such responses. The test rests on the presumption that human empathy cannot be faked at this level; if a creature exhibits no signs of empathetic response, it must be a machine.

The trick in Dick's novel, of course, is that no faking is required: only the androids display any genuine human empathy anyway. Dick's postmodern, futuristic world is one in which such human traits are on the way out. The utility of the Voight-Kampff test is challenged partly by advances in android technology, and partly by the fact that the disappearance of human empathy ("flattening of affect") moves the bar down, so that it is a less remarkable achievement to display as much empathy as the average human being. Just about any machine can manage that, so the Voight-Kampff test gets easier to pass.

A computer that passes the Turing test is a victory for silicon and a defeat for flesh and blood, of course, since a passed test means a human failure to detect humanness. If we bring down the bar of human nature and make it easier for the computer to win, just as the "flattening of affect" makes it easier to pass the Voight-Kampff test in Blade Runner, then we lose twice, since we have been beaten by our creations and also by ourselves, having given up what made us marvellous and inimitable in the first place.

I think about tests of our nature like these sometimes when everyone takes to the internet after some tragic event, so naturally they came to mind this past week amidst news of the attacks in Norway and the death of Amy Winehouse. These are the moments, after all, when we glimpse ourselves at our most guileless and authentic, if not necessarily at our best.

One could easily have programmed a computer to produce the predictable responses that flooded people's Twitter feeds after these events—the hollow tributes, the jingoism, the geopolitical oversimplifications, the cynical cracks about the 27 Club, the tasteless too-soon humour, the excoriations of mourners of the latter tragedy for ignoring the greater enormity of the former. These tweeters had all clearly passed the reverse Turing test at least once when they filled in CAPTCHAs while signing up for their Twitter accounts; it wasn't at all clear where they stood with the regular Turing test, or whether they might rise out of the mechanical chatter of the site's equally predictable and formulaic spambot white noise. Above all, and most disconcertingly, there was that "flattening of affect," the lowered bar of empathy against which all machines are suddenly judged as people, and all people suddenly machines. Rather than warmly demonstrating empathy—understanding and sharing the feelings of others—so many simply carried out the usual cold, pre-programmed scripts: Detached Internet Cynicism 1.0, Celebrity Judgment 2.0, Hairtearing & Teethgnashing 3.0, Bitter Recrimination Lite, Panic 2000.

That our computers might one day effortlessly pass the Turing test is an ambitious goal for artificial-intelligence research; if it happens, here's hoping it's because our machines managed to become more human, and not the other way around.