
Write like a man

Hu-man, that is. The March issue of the Atlantic has a piece about the Loebner Prize, essentially a competition for "most human human" and "most human computer" that follows the basic rules of the Turing test. Humans try, through instant messaging, to convince human judges that they're real people and not programs, while the competing programs are designed to convince those same judges that they--the programs--are human.

It's a good article, and it leaves me wondering whether any of these programmers have tried studying literature. After all, if your job is to create a clump of text that gives the illusion of a human consciousness, there have been people trying to do that for millennia--they're called fiction writers. Critics of the structuralist school loved to point out that what we called "characters" are in fact simply repeated words/signs (e.g., "Maggie Tulliver") that tend to be clustered close to other repeated signs ("duty," "brown eyes," "unruly hair," "love for Tom") and skillfully arranged to give the illusion of a real human psyche. If you studied how the idiosyncratic agglomeration of traits and preoccupations makes a character seem real, and how a slightly distracted (i.e., not "on the nose") dialogic style makes conversation feel human, it could help a lot in designing a program that could fool judges for five minutes.
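In fact, the structuralist recipe fits in twenty-odd lines of toy Python. This is a sketch of my own--every name and number is invented, and no actual Loebner entrant works this way, so far as I know--but it shows the two moves: a character is just a cluster of recurring signs, and dialogue feels human when it's slightly off the nose.

```python
import random

# Toy sketch of the structuralist recipe (all names invented here):
# a "character" is a cluster of recurring signs, and dialogue reads
# as human when it's slightly distracted -- sometimes ignoring the
# question to drift back to a preoccupation.
PERSONA = {
    "preoccupations": ["my brother Tom", "the mill by the river",
                       "whether I did the right thing", "my unruly hair"],
    "tics": ["I suppose", "honestly", "well"],
}

def reply(message: str) -> str:
    tic = random.choice(PERSONA["tics"]).capitalize()
    if random.random() < 0.4:
        # Distracted mode: answer past the question entirely.
        return f"{tic}, I keep coming back to {random.choice(PERSONA['preoccupations'])}."
    # On-the-nose mode: echo a fragment back, as people actually do.
    fragment = " ".join(message.rstrip("?.!").split()[-3:])
    return f"{tic}, {fragment}? I'm not sure I'd put it that way."

print(reply("Do you like living in the city?"))
```

Five minutes of judging is short; what matters is that the sign-cluster stays consistent, not that the program understands anything.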

To all you programmers reading this: we can split the prize money.

Women Who Look Extraordinary

Henry James and OK Cupid make for an unusual match. But the online dating site recently released statistical findings that confirm, more than a century later, James's observation of what made certain women particularly beguiling.

In his novel The Portrait of a Lady (1881), James describes his inexpressibly intriguing heroine, Isabel Archer, in comparison to her more traditionally beautiful older sister Edith:
Nineteen persons out of twenty (including the younger sister herself) pronounced Edith infinitely the prettier of the two; but the twentieth, besides reversing this judgment, had the entertainment of thinking all the others aesthetic vulgarians.
In other words, the physical allure of Isabel is not that she's universally considered beautiful, but that she is considered extremely beautiful by a minority that recognizes that most people wouldn't agree--and gets a little puff of confident self-satisfaction from that very fact.

If only more online daters read James. OK Cupid's blog (and informal statistical wing), OK Trends, recently announced results that suggest that the Jamesian prescription holds true for profile pictures: women who post pictures that prompt widespread disagreement about their beauty actually get substantially more messages than traditionally attractive women.
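Here's the Jamesian arithmetic in miniature--a toy sketch with invented ratings, not OK Trends's actual data or model. Hold the average fixed, and the Isabel-type profile is simply the one with the higher variance:

```python
from statistics import mean, pvariance

# Invented ratings on a 1-5 scale -- not OK Trends's data or model.
# Both profiles average out to exactly the same score...
edith = [4] * 10                          # everyone mildly agrees
isabel = [5, 5, 5, 5, 5, 5, 1, 1, 4, 4]   # a minority is besotted, a few repelled

for name, ratings in [("Edith", edith), ("Isabel", isabel)]:
    print(f"{name}: mean {mean(ratings)}, variance {pvariance(ratings):.1f}")

# ...but the OK Trends finding is that the high-variance profile --
# the one the twentieth person adores -- draws the most messages.
```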

Maybe OK Cupid's stats aren't exactly rigorous science, but they're interesting. I'm still willing to classify this as a literary insight belatedly discovered by science...someone call Jonah Lehrer.


Writing As a Spectator Sport

Over at the Atlantic, James Somers reports on software called Etherpad that adds a temporal dimension to writing, tracking every keystroke and allowing users to "play back" the entire writing process. As Somers points out, that functionality--despite the fact that the Etherpad software is more or less defunct--holds a lot of promise for academia, both as a way to track writers' revisions over time and as a way to make sure that students are doing their own work.

I'm sort of partial to the idea that the reason we have rough drafts is so that no one has to see them, not so that everyone will have access to them forever--so I also have my doubts about this catching on among literary types. But it would definitely be useful for academic integrity, for tutoring, and--if actual literary writers used it--as a tool for understanding the cognitive process of writing at a level of granularity that prior technologies never allowed. After all, with pen and ink, typewriters, and computer files, we could only see the drafts that writers officially considered finished, not the writing as it was constructed word by word.
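To make the idea concrete, here's a toy sketch of keystroke-level history--not Etherpad's actual data format or API, just the general shape of the thing: log every insertion and deletion with a timestamp, and "playback" is nothing more than replaying the log up to a chosen moment.

```python
from dataclasses import dataclass

# Toy model of keystroke-level revision history (not Etherpad's
# actual format): every edit is logged, and playback replays the log.
@dataclass
class Edit:
    t: float      # seconds since the session started
    pos: int      # character offset in the document
    insert: str   # text typed at pos ("" for a pure deletion)
    delete: int   # number of characters removed at pos

def replay(edits: list[Edit], until: float) -> str:
    """Reconstruct the document as it stood at time `until`."""
    doc = ""
    for e in sorted(edits, key=lambda e: e.t):
        if e.t > until:
            break
        doc = doc[:e.pos] + e.insert + doc[e.pos + e.delete:]
    return doc

log = [
    Edit(0.0, 0, "The draft is perfect.", 0),
    Edit(4.2, 13, "", 8),           # second thoughts: delete "perfect."
    Edit(5.0, 13, "a mess.", 0),
]
print(replay(log, until=4.5))   # -> "The draft is "
print(replay(log, until=9.9))   # -> "The draft is a mess."
```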

I, for one, was shocked to see the sketchy, outlinish way that Somers himself writes, which he shares here via the open-source descendant of the original product. I don't write my posts that way--they just come out in the first draft the way that they come out, and then I cut/paste/delete as necessary--but that is, actually, how I write research papers, which makes me wonder if we approach writing differently based on how we generically categorize it: "professional/research" vs. "casual/interpersonal".

Anyway, check it out.

The New Republic: Dickens's Writing "Difficult, Obscure"

Over at The New Republic, Hillary Kelly lashes out at Oprah for choosing Dickens as her next book club reading. While obviously I share some of the shock and awe at Oprah's power, Kelly lets it cloud her judgment, ranting that the average reader will have trouble unpacking "Dickens’s obscure dialectical styling and his long-lost euphemisms." Huh? Dickens's euphemisms and ironies are often painfully straightforward, and "dialectical styling" is a phrase that confuses me, even though I just came back from a lunch spent reading Fredric Jameson. My best guess at what she's referencing here is the doubleness of "It was the best of times, it was the worst of times," but honestly, it's Kelly's thoughts on Dickens, not Dickens's own writings, that are difficult to understand.

It's sad, because Kelly's sense of Oprah's ridiculousness is right, but her reasoning is actually backwards. In her fervor to protect the classics from Oprah's rapacious grasp, she perversely bashes Winfrey for "her sentimentalized view of Dickens," who was, in case anyone forgot, an author of sentimental fiction. His books are chock full of dying orphans, for Pete's--or should I say Little Nell's?--sake. Then Kelly worries about Winfrey's "ignorance of Dickens’s authorial intentions," as though those should be the cornerstone of any reading experience.

Weirdest of all, though, is when Kelly says that Dickens wrote "some of the more difficult prose to come out of the nineteenth century." In addition to being false by any standard, this claim contributes to the sense that these old books can't be read without some kind of literary guide. On this point, at least, Kelly and Oprah seem to be united: Oprah's website offers extensive character guides and other paraphernalia for the reading, as though the literature itself is so alien that it can only be approached through some kind of spacesuit-like extra-textual apparatus. In one of the funnier parts of Kelly's piece, she quotes from the Dickens conversation going on at O's website:

A glance at the discussion boards on Oprah’s website confirms my worst fears. “I have read all the print-outs and character materials and the first two pages,” said one reader, referring to supplementary reading guides produced by the Book Club. “The first two pages are laden with political snips and I am trying to grasp what it is saying. I was able to look up cock-lane and figure that out, but where do I go to figure out the innuendos?”
If only this person would read the novel, rather than spending time trying to diagram the meaning of its every word. In a weird way, the problem with Oprah's selection is not her decision to approach a canonical classic from a popular standpoint, but her decision to approach a canonical classic as A Canonical Classic. Sentimentalizing Dickens is only appropriate, and finding your "self" in a Victorian novel, while sort of silly, isn't really at odds with what many critics claim (rightly or wrongly) was one of the functions of the novel in the first place--the creation of the reader as a "unique" "individual" subject, etc., etc.

Book clubs, whether Oprah's or not, often plunge into books with minimal context, and that's fine. It's the idea that readers need to tiptoe around the big boys, and approach them with a semblance of historical or scholarly understanding that they can't possibly attain in the time allotted, that makes this whole thing so painful. Oprah and her followers are foreigners to this particular cultural soil, but they have an embarrassingly sincere desire to show that they know how lucky they are to tread on sacred ground, so they work too hard to behave with what they conceive of as humility and cultural respect, nodding knowingly to show that they understand why This Stuff Is Important. A more candidly naive approach, one less fraught with the sense that there's some mystery Oprah & co. need to show they understand, would make the whole affair a lot less ridiculous.

Oprah loves the Dickensian aspect

Victorians have been making lots of headlines recently. Last week, the NYT reported on the application of search software to Victorian titles as an aid to scholarship, even as Oprah announced that her next book club picks would be two works by Charles Dickens: A Tale of Two Cities and Great Expectations. Members of the audience appear to be filled with nearly childlike delight at the announcement.

Of course, when Oprah preceded her announcement with the hint that she was going with a selection that was "old, OLD school, people!", I immediately began chanting to myself, "Please let it be Pliny, please let it be Pliny." But, in a stunning confirmation of the fact that different people perceive time differently, Oprah unveiled a new Oprah edition of the two classics, nicely packaged together by Penguin for just this purpose.

As Omnivoracious at Amazon points out, the rationale behind the duplex edition might have something to do with the difficulty of capitalizing on an Oprah selection that is entirely in the public domain. Jimmy Fallon put it a little bit more ironically the other night when he noted, with a tinge of jealousy, that Dickens is "gonna get rich."

It's not clear why Oprah picked these two works, but she did admit that she'd never read Dickens, and panted to Jonathan Franzen: "Is A Tale of Two Cities what everyone says it is?" (After claiming to have read all of Dickens, Franzen acknowledged that it's "a real page-turner.") As for my guess about why she picked the novels she picked, two words: high. school. I read AToTC in high school, but other sections of our ninth-grade English class read Great Expectations, which seems to be the most widely read Dickens in the U.S., so far as I can tell. (It inspired Pip on South Park, didn't it?)



I kind of wish she'd done something more adventurous, picking a longer work that doesn't get assigned to 7/10 high schoolers in America, but oh well. One upside of this, hopefully, will be renewed interest in Great Expectations in time for the Dickens Universe conference at Santa Cruz next summer, which is spotlighting GE as it kicks off an early celebration of the Dickens bicentennial.

It made me think about Oprah a lot, though, and what she could do with her publishing power--not so much in terms of championing new authors, but in actually shifting conceptions of the canon, or affecting what titles are kept in print. (Of course, if you look at the back pages of a Penguin edition of a book from the 1980s, you realize just how arbitrary the choices of what's in print at any given time seem to be.) Could Oprah return to George Meredith the stature he had at the end of the nineteenth century? Could she bring Marius the Epicurean back into print for the next 25 years?

But perhaps it's better not to ask such questions. It leaves one desperately craving her power...

In Praise of Voyeurism

At what age do you wake up and become crotchety?

I'm afraid it's happened. In response to my last post, a friend asked whether I thought all these new-fangled machines left nothing to worry about. On the contrary, I think there are tons of problems with these dagnabbed kids and their dagnabbed Tamagotchi-ma-call-ums. It's just that Facebook is not a serious threat. It's so popular, in fact, largely because it imitates the way social interactions were already taking place or being conceived.

(Incidentally, I was glad to see that I wasn't alone in my grumbling about Zadie Smith's article--bigger, badder responders can be found here, here, and here, some with data to back up what the others argue more deductively and intuitively.)

What really strikes me about the growing popularity of electronic forms of entertainment is how boring they are. They're good for an awkward minute--but how someone can spend longer than half an hour wandering through landscapes and shooting at things is beyond me. Even the games I remember loving growing up now feel dull--in general, I'd rather read.

I think the problem is interactivity. A gaming culture expects to be able to provide input and to have that input acknowledged. But games, in their current state, don't have the virtual robustness to set up interesting possibilities with unpredictable results. It's like one of those Choose-Your-Own-Adventure novels that were popular when I was young: it was exciting to have choice, but the novels never lasted long or went into much complexity, and the choice was necessarily constrained. Every opportunity the reader/player is given to make a real decision uses up informational space that could have been devoted to greater richness or complexity of the world depicted. And those decisions are hardly real--all they're sensitive to is "shoot/don't shoot," "touch object/don't touch object," etc.
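The arithmetic behind that trade-off is brutal. A toy model (invented numbers, no real book in mind): if every page ends in a binary choice, the pages form a tree, and any single read-through follows one root-to-leaf path--a logarithmic sliver of everything written.

```python
import math

# Toy model of the choice-vs-richness trade-off in branching stories
# (invented numbers, not a real book): with a binary choice on every
# page, the pages form a complete binary tree, and a single reading
# follows one root-to-leaf path.
def pages_per_reading(total_pages: int, choices_per_page: int = 2) -> int:
    return math.floor(math.log(total_pages, choices_per_page)) + 1

for total in (127, 1023):
    print(f"{total}-page book -> each reading is only "
          f"{pages_per_reading(total)} pages long")
# 127-page book -> each reading is only 7 pages long
# 1023-page book -> each reading is only 10 pages long
```

Every extra decision along the path doubles the number of pages someone has to write; richness never stands a chance.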

It makes me feel like some cantankerous Joad Cressbeckler to say so, but I think that these kinds of interactive entertainment generate a certain set of expectations and a certain skill set in people. People who grow accustomed to them become very good at exploring the strengths and weaknesses of an impoverished set of choices given to them, and they expect their every decision or input to generate an immediate reaction. But they become unable to think of new or more complex options, and impatient with forms of entertainment or information that do not provide room for immediate feedback.

What are the results? People want to express and communicate and exchange messages before they build up enough complex information and ideas to provide significant feedback. A lot of learning takes place in idleness or passivity, when we're accepting information, ideas, and words without responding to them yet, or when we're digesting the information that we have taken in and are evaluating it. The problem, in other words, is not excessive stimulus, but excessive response. There needs to be a lag time, time to evaluate, regroup, and realize what does and doesn't make sense. The idea that technology is robbing us of this kind of time isn't new, but I think it has developmental effects on people in terms of the kind of information and interactions they get used to participating in.

I don't like to think of the solution to this kind of dwindling downtime as reflection. Reflection already seems so purposive: something you do in response to a stimulus. (Spend a couple of minutes reflecting! Then, you'll get to give your feedback!) Idleness is better, because there's a sense of total lack of activity, which is what I'm really talking about. Passive intake, mandatory idleness, and then--if ever--the possibility of some kind of response. Maybe surprisingly, I think the kind of voyeurism associated with Facebook is a step in the right direction--it involves passively absorbing other people's lives without any necessary expectation of interaction. It's different from the Internet more generally; the kind of personhood it encourages is older and more thoughtful. Compare it to, say, blogs, which prioritize speedy stimulus-and-response, both on their own comment threads and in exchanges with each other.

That's my two cents. Absorb it, don't comment on it--I don't let you, after all--and go be idle somewhere for a while.

Facebook, Death, and Literature

Facebook is killing our souls. Or so culture pundits are claiming, as usual. Zadie Smith's review essay in the most recent NYRB trots out all the usual fears, as she reviews The Social Network and You Are Not a Gadget, the new book by Jaron Lanier, who has been discussed in these pages before.

Smith's take on The Social Network is interesting--it's when she wades into cultural critique of the impact of Facebook (buttressed of course by Lanier) that she goes astray. Smith's fears for the coming generation are somewhat incoherent--she worries, on the one hand, that Generation Facebook is more wonderfully gooey and rich than any of our measly technology allows us to show:

[T]he more time I spend with the tail end of Generation Facebook (in the shape of my students) the more convinced I become that some of the software currently shaping their generation is unworthy of them. They are more interesting than it is. They deserve better.
Sweet of her, I guess. But on the other hand, she fears that she already sees how boring our technology is making us, as she compares her old (Harvard) students with her comparatively less interesting new (NYU) crop, and concludes that "it's absolutely clear to me that the students I teach now are not like the student I once was or even the students I taught seven short years ago at Harvard"--because the new students have a "denuded selfhood" unrecognizable to her. (Sucks to be her student and to read this article, don't you think? But I digress.)

What are we losing with Facebook? Principally, depth. The most compelling part of Lanier's argument (as Smith summarizes it) is that technological systems are essentially kinds of representation, and that they can only represent or encode a very small bit of what it means to be human. As Smith rightly points out, literature does this, too, but--and here's the key "difference"--it does it less. What's really on the table here is not a difference of kind, but a difference of degree, and Smith is smart enough to know this, but still unconnected enough (only connect, Zadie!) to want it to be a more dramatic and devastating difference than it is.

At a crucial turning point in the essay, Smith broaches that most literary of topics: death. Death (as we all know from the profound novels we try to imitate) is something that broods over us, coming home to roost in sudden epiphanies like Gabriel Conroy's at the end of "The Dead". Smith models this approach to death nicely as she walks casually to a movie theater to see The Social Network. "Soon I will be forty," she realizes, "then fifty, then soon after dead; I broke out in a Zuckerberg sweat, my heart went crazy, I had to stop and lean against a trashcan." Then she brings home the fundamental question: "Can you have that feeling, on Facebook?"

That's a rhetorical question, of course, and the obvious/ominous answer is "OMG no u tots cant!!!" Having primed us for this answer, Smith looks at how death does manifest itself on Facebook, glancing over the wall of a dead girl and puzzling over the strange new sort of people who choose to leave messages there: "Do they genuinely believe, because the girl's wall is still up, that she is still, in some sense, alive?" Does Smith ask the same question when she sees flowers on tombstones, or when she hears speakers at a funeral addressing the dead? I'm going to guess no, but because these equivalent activities take place within social rituals Smith is used to, she doesn't feel the need to tear her hair or beat her breast over them.

The irony is that a substantial portion of Smith's fame comes from a very public conversation she herself has had with the dead. I'd like to ask her how she feels about E. M. Forster. Does she think he's dead, completely? Or, when she engages in a literary conversation with him, does she genuinely believe, because his novels continue to exist and be read, that he is still, in some sense, alive? And if so, is that a bad thing? And if we can't all become famous novelists, is it so wrong that some little bit of us should remain, as a memorial and a site of remembrance, in the space of representation? Or does everyone have to make a masterpiece to be worth that?

A radically different approach to death on Facebook appeared only a short while ago in the Atlantic, and it shows how, in a strange way, the voyeurism of Facebook allows people to feel close to each other in a fashion that would have otherwise been impossible--an example of the manner in which Facebook can actually humanize, rather than dehumanize, relationships that otherwise never would have existed at all. These are the sorts of things that paranoid pundits can't see, because they simply don't want to. People don't live in Facebook, so the lack of certain possibilities in Facebook does not mean the end of those possibilities. People may not have dizzying death epiphanies while logged onto Facebook. But people can't have sex on Facebook, either, and yet Smith doesn't seem to be concerned that we'll stop breeding.

While Facebook certainly conditions how we think of ourselves, its modes of conditioning thought aren't new; they're borrowed from older technological forms (the facebook, the visiting card, the trading card) that haven't destroyed society yet. If there are some people who are shaping themselves around Facebook personae and thereby "becoming" two-dimensional, so be it. There were people who shaped themselves around two-dimensional literary personae before Facebook; there have always been two-dimensional people who have tried to make themselves more interesting by shaping their personalities and social roles in relation to current forms of media.

That, I think, is the truth that a believer in the profundity of every human soul has a hard time accepting. It's harder to see that humanity is full of flat characters when you live in a highly literate culture, because people who borrow their quirks from Joyce and Freud appear more profound and interesting than those who borrow them from Facebook. So in a way, this breezy technology is a boon--it lets us see more of the shallowness of people than before, a vision that's only frightening if you were living in denial of it.

Literary sages are always good at denouncing the new, because it's so obviously different from what they're used to. Ruskin railed against railways in his lectures of the 1860s; by the time Forster was writing Howards End in the early 20th century, it was the railways that seemed to demonstrate a true, old, and stable connection with the land, while motorcars were fast, unpredictable, and destructive. And so Smith steps into the role of sage here, denouncing the technology she isn't yet totally familiar with and peppering it with the paranoid fears of commerce in which sages have always, somewhat paradoxically, traded. But while sages are great at finding flaws, they're incredibly bad at predicting the upside of change. The Atlantic article is one small example of that upside (although it maintains the fashionable skepticism that has to be fit into Facebook stories to make them sell); other upsides will reveal themselves over time. After all, writing itself was once a dramatic technological change, one that meant the end of oral memory--and yet somehow, we seem to have survived, even though our civilization doesn't look like an ancient bard might have predicted. Do any of us regret that?

Now that the election frenzy is over...

...it seems like a perfect time to point out that the clearest and most coherent comment on American democracy to come out in recent years is entitled "Douche and Turd." I thought about it--and yea, in moments of darkness and doubt, drew strength from it--every time I cowered beneath the onslaught of "I voted" stickers and Facebook messages that poured past us all last week. Here's a choice snippet, and the full link:


Or, if you like your critiques a little more highbrow, there's always Matthew Arnold on the subject:
[The democrat] complains with a sorrowful indignation of people who "appear to have no proper estimate of the value of the franchise"; he leads his disciples to believe,--what the Englishman is always too ready to believe,--that the having a vote, like the having a large family, or a large business, or large muscles, has in itself some edifying and perfecting effect upon human nature.
In short, I didn't vote. Anyone know where I can get a sticker for that?

Lady Gaga Meets Aubrey Beardsley

...in this drawing by John Allison. If there were cultural studies collectibles-of-the-season, I'd say something like, "This is bound to be the cultural studies collectible of the season."

Image © John Allison, obviously.

Cyborg Anniversary

September marked the 50th birthday of the term "cyborg." Who knew? I certainly didn't, until I saw this post at the Atlantic.

More interesting than that story, though, is the larger project of which it's a part: 50 Posts About Cyborgs.


It is what it sounds like. The thinkers who spent September writing in honor of cyborgs range from the Atlantic's Alexis Madrigal to Ryan North, creator of Dinosaur Comics. It should provide food for thought for quite a while, and from all ideological camps: a quick skim shows opinions ranging from "eek-stop-the-machines" to "we-are-all-cyborgs-now." Happy reading.

P.S.--In addition to being the title of this post, "Cyborg Anniversary" would be a good name for a band.