Writing As a Spectator Sport
Over at the Atlantic, James Somers reports on software called Etherpad that adds a temporal dimension to writing, tracking every keystroke and allowing users to "play back" the entire writing process. As Somers points out, that functionality--despite the fact that the Etherpad software is more or less defunct--holds a lot of promise for academia, both as a way to track writers' revisions over time and as a way to make sure that students are doing their own work.
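(For the technically curious: the trick requires surprisingly little machinery. Here's a toy sketch in Python--emphatically not Etherpad's actual data format, just the general idea of logging timestamped edits and replaying them:)

# A toy sketch (not Etherpad's real format): record every keystroke as a
# timestamped edit, then "play back" the document by replaying the log.
import time
from dataclasses import dataclass

@dataclass
class Edit:
    seconds: float   # time since the writing session started
    position: int    # character offset where the edit happens
    inserted: str    # text typed ("" for a pure deletion)
    deleted: int     # number of characters removed

def apply_edit(doc, e):
    # Splice the insertion in and drop any deleted characters.
    return doc[:e.position] + e.inserted + doc[e.position + e.deleted:]

def playback(edits, speed=10.0):
    doc, last = "", 0.0
    for e in edits:
        time.sleep(max(e.seconds - last, 0.0) / speed)  # compress real time
        doc = apply_edit(doc, e)
        print(repr(doc))  # a real viewer would redraw the editor pane instead
        last = e.seconds
    return doc

log = [Edit(0.0, 0, "Helo", 0), Edit(1.2, 3, "l", 0), Edit(2.5, 5, ", world", 0)]
playback(log)

Everything a playback interface does--scrubbing, speeding up, color-coding authors--is, one way or another, built on top of a log like that.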
I'm sort of partial to the idea that the reason we have rough drafts is so that no one has to see them, not so that everyone will have access to them forever--so I also have my doubts about this catching on among literary types. But it would definitely be useful for academic integrity, for tutoring, and--if actual literary writers used it--as a tool for understanding the cognitive process of writing at a level of granularity that prior technologies never allowed. After all, with pen and ink, typewriters, and computer files, we could only see the drafts that writers considered finished enough to count as drafts, not the writing as it was constructed at a word-by-word level.
I, for one, was shocked to see the sketchy, outlinish way that Somers himself writes, which he shares here via the open-source descendant of the original product. I don't write my posts that way--they just come out in the first draft the way that they come out, and then I cut/paste/delete as necessary--but that is, actually, how I write research papers, which makes me wonder if we approach writing differently based on how we generically categorize it: "professional/research" vs. "casual/interpersonal".
Anyway, check it out.
The New Republic: Dickens's Writing "Difficult, Obscure"
Over at The New Republic, Hillary Kelly lashes out at Oprah for choosing Dickens as her next book club reading. While obviously I share some of the shock and awe at Oprah's power, Kelly lets it cloud her judgment, ranting that the average reader will have trouble unpacking "Dickens’s obscure dialectical styling and his long-lost euphemisms." Huh? Dickens's euphemisms and ironies are often painfully straightforward, and "dialectical stylings" is a phrase that confuses me, even though I just came back from a lunch spent reading Fredric Jameson. My best guess at what she's referencing here is the doubleness of "It was the best of times, it was the worst of times," but honestly, it's Kelly's thoughts on Dickens, not Dickens's own writings, that are difficult to understand.
It's sad, because Kelly's sense of Oprah's ridiculousness is right, but her reasoning is actually backwards. In her fervor to protect the classics from Oprah's rapacious grasp, she perversely bashes Winfrey for "her sentimentalized view of Dickens," who was, in case anyone forgot, an author of sentimental fiction. His books are chock full of dying orphans, for Pete's--or should I say Little Nell's?--sake. Then Kelly worries about Winfrey's "ignorance of Dickens’s authorial intentions," as though those should be the cornerstone of any reading experience.
Weirdest of all, though, is when Kelly says that Dickens wrote "some of the more difficult prose to come out of the nineteenth century." In addition to being false by any standard, this claim contributes to the sense that these old books can't be read without some kind of literary guide. On this point, at least, Kelly and Oprah seem to be united: Oprah offers extensive character guides and other paraphernalia for the reading through her website, as though the literature itself is so alien that it can only be approached through some kind of spacesuit-like extra-textual apparatus. In one of the funnier parts of Kelly's piece, she quotes from the Dickens conversation going on at O's website:
A glance at the discussion boards on Oprah’s website confirms my worst fears. “I have read all the print-outs and character materials and the first two pages,” said one reader, referring to supplementary reading guides produced by the Book Club. “The first two pages are laden with political snips and I am trying to grasp what it is saying. I was able to look up cock-lane and figure that out, but where do I go to figure out the innuendos?”
If only this person would read the novel, rather than spending time trying to diagram the meaning of its every word. In a weird way, the problem with Oprah's selection is not her decision to approach a canonical classic from a popular standpoint, but her decision to approach a canonical classic as A Canonical Classic. Sentimentalizing Dickens is only appropriate, and finding your "self" in a Victorian novel, while sort of silly, isn't really at odds with what many critics claim (rightly or wrongly) was one of the functions of the novel in the first place--the creation of the reader as a "unique" "individual" subject, etc., etc.
Book clubs, whether Oprah's or not, often plunge into books with minimal context, and that's fine. It's the idea that readers need to tiptoe around the big boys, and approach them with a semblance of historical or scholarly understanding that they can't possibly attain in the time allotted, that makes this whole thing so painful. Oprah and her followers are foreigners to this particular cultural soil, but they have an embarrassingly sincere desire to show that they know how lucky they are to tread on sacred ground, so they work too hard to behave with what they conceive of as humility and cultural respect, nodding knowingly to show that they understand why This Stuff Is Important. A more candidly naive approach, one less fraught with the sense that there's some mystery Oprah & co. need to show they understand, would make the whole affair a lot less ridiculous.
Oprah loves the Dickensian aspect
Victorians have been making lots of headlines recently. Last week, the NYT reported on the application of search software to Victorian titles as an aid to scholarship, even as Oprah announced that her next book club picks will be two works by Charles Dickens: A Tale of Two Cities and Great Expectations. Members of the audience appear to be filled with nearly childlike delight at the announcement.
Of course, when Oprah preceded her announcement with the hint that she was going with a selection that was "old, OLD school, people!", I immediately began chanting to myself, "Please let it be Pliny, please let it be Pliny." But, in a stunning confirmation of the fact that different people perceive time differently, Oprah unveiled a new Oprah edition of the two classics, nicely packaged together by Penguin for just this purpose.
As Omnivoracious at Amazon points out, the rationale behind the duplex edition might have something to do with the difficulty of capitalizing on an Oprah selection that is entirely in the public domain. Jimmy Fallon put it a little bit more ironically the other night when he noted, with a tinge of jealousy, that Dickens is "gonna get rich."
It's not clear why Oprah picked these two works, but she did admit that she'd never read Dickens, and breathlessly asked Jonathan Franzen: "Is A Tale of Two Cities what everyone says it is?" (After claiming to have read all of Dickens, Franzen acknowledged that it's "a real page-turner.") As for my guess about why she picked the novels she picked, two words: high. school. I read AToTC in high school, but other sections of our ninth-grade English class read Great Expectations, which seems to be the most widely read Dickens in the U.S., so far as I can tell. (It inspired Pip on South Park, didn't it?)
I kind of wish she'd done something more adventurous, picking a longer work that doesn't get assigned to 7/10 high schoolers in America, but oh well. One upside of this, hopefully, will be renewed interest in Great Expectations in time for the Dickens Universe conference at Santa Cruz next summer, which is spotlighting GE as it kicks off an early celebration of the Dickens bicentennial.
It made me think about Oprah a lot, though, and what she could do with her publishing power--not so much in terms of championing new authors, but in actually shifting conceptions of the canon, or affecting what titles are kept in print. (Of course, if you look at the back pages of a Penguin edition of a book from the 1980s, you realize just how arbitrary the choices of what's in print at any given time seem to be.) Could Oprah return to George Meredith the stature he had at the end of the nineteenth century? Could she bring Marius the Epicurean back into print for the next 25 years?
But perhaps it's better not to ask such questions. It leaves one desperately craving her power...
In Praise of Voyeurism
At what age do you wake up and become crotchety?
I'm afraid it's happened. In response to my last post, a friend asked whether I thought all these new-fangled machines left nothing to worry about. On the contrary, I think there are tons of problems with these dagnabbed kids and their dagnabbed Tamagotchi-ma-call-ums. It's just that Facebook is not a serious threat. It's so popular, in fact, largely because it imitates the way social interactions were already taking place or being conceived.
(Incidentally, I was glad to see that I wasn't alone in my grumbling about Zadie Smith's article--bigger, badder responders can be found here, here, and here, some with data to back up what the others argue more deductively and intuitively.)
What really strikes me about the growing popularity of electronic forms of entertainment is how boring they are. They're good for an awkward minute--but how someone can spend longer than half an hour wandering through landscapes and shooting at things is beyond me. Even the games I remember loving growing up now feel dull--in general, I'd rather read.
I think the problem is interactivity. A gaming culture expects to be able to provide input and to have that input be acknowledged. The problem is that games don't, in their current state, have the virtual robustness to set up interesting possibilities with unpredictable results. It's like one of those Choose-Your-Own-Adventure novels that were popular when I was young: it was exciting to have choice, but the novels never lasted long or went into much complexity, and the choice was necessarily constrained. Every opportunity the reader/player is given to make a real decision uses up informational space that could have been devoted to greater richness or complexity of the world depicted. And those decisions are hardly real--all they're sensitive to is "shoot/don't shoot," "touch object/don't touch object," etc.
It makes me feel like some cantankerous Joad Cressbeckler to say so, but I think that these kinds of interactive entertainment generate a certain set of expectations and a certain skill set in people. People who grow accustomed to them become very good at exploring the strengths and weaknesses of an impoverished set of choices given to them, and they expect their every decision or input to generate an immediate reaction. But they become unable to think of new or more complex options, and impatient with forms of entertainment or information that do not provide room for immediate feedback.
What are the results? People want to express and communicate and exchange messages before they build up enough complex information and ideas to provide significant feedback. A lot of learning takes place in idleness or passivity, when we're accepting information, ideas, and words without responding to them yet, or when we're digesting the information that we have taken in and are evaluating it. The problem, in other words, is not excessive stimulus, but excessive response. There needs to be a lag time, time to evaluate, regroup, and realize what does and doesn't make sense. The idea that technology is robbing us of this kind of time isn't new, but I think that loss has developmental effects on people in terms of the kinds of information and interaction they get used to participating in.
I don't like to think of the solution to this kind of dwindling downtime as reflection. Reflection already seems so purposive: something you do in response to a stimulus. (Spend a couple of minutes reflecting! Then, you'll get to give your feedback!) Idleness is better, because there's a sense of total lack of activity, which is what I'm really talking about. Passive intake, mandatory idleness, and then--if ever--the possibility of some kind of response. Maybe surprisingly, I think the kind of voyeurism associated with Facebook is a step in the right direction--it involves passively absorbing other people's lives without any necessary expectation of interaction. It's different from the Internet more generally; the kind of personhood it encourages is older and more thoughtful. Compare it to, say, blogs, which prioritize speedy stimulus-and-response, both on their own comment threads and in exchanges with each other.
That's my two cents. Absorb it, don't comment on it--I don't let you, after all--and go be idle somewhere for a while.
Facebook, Death, and Literature
Facebook is killing our souls. Or so culture pundits are claiming, as usual. Zadie Smith's review essay in the most recent NYRB trots out all the usual fears, as she reviews The Social Network and You Are Not a Gadget, the new book by Jaron Lanier, who has been discussed in these pages before.
Smith's take on The Social Network is interesting--it's when she wades into cultural critique of the impact of Facebook (buttressed of course by Lanier) that she goes astray. Smith's fears for the coming generation are somewhat incoherent--she worries, on the one hand, that Generation Facebook is more wonderfully gooey and rich than any of our measly technology allows us to show:
[T]he more time I spend with the tail end of Generation Facebook (in the shape of my students) the more convinced I become that some of the software currently shaping their generation is unworthy of them. They are more interesting than it is. They deserve better.
Sweet of her, I guess. But on the other hand, she fears that she already sees how boring our technology is making us, as she compares her old (Harvard) students with her comparatively less interesting new (NYU) crop, and concludes that "it's absolutely clear to me that the students I teach now are not like the student I once was or even the students I taught seven short years ago at Harvard"--because the new students have a "denuded selfhood" unrecognizable to her. (Sucks to be her student and to read this article, don't you think? But I digress.)
What are we losing with Facebook? Principally, depth. The most compelling part of Lanier's argument (as Smith summarizes it) is that technological systems are essentially kinds of representation, and that they can only represent or encode a very small bit of what it means to be human. As Smith rightly points out, literature does this, too, but--and here's the key "difference"--it does it less. What's really on the table here is not a difference of kind, but a difference of degree, and Smith is smart enough to know this, but still unconnected enough (only connect, Zadie!) to want it to be a more dramatic and devastating difference than it is.
At a crucial turning point in the essay, Smith broaches that most literary of topics: death. Death (as we all know from the profound novels we try to imitate) is something that broods over us, coming home to roost in sudden epiphanies like Gabriel Conroy's at the end of "The Dead". Smith models this approach to death nicely as she walks casually to a movie theater to see The Social Network. "Soon I will be forty," she realizes, "then fifty, then soon after dead; I broke out in a Zuckerberg sweat, my heart went crazy, I had to stop and lean against a trashcan." Then she brings home the fundamental question: "Can you have that feeling, on Facebook?"
That's a rhetorical question, of course, and the obvious/ominous answer is "OMG no u tots cant!!!" Having primed us for this answer, Smith looks at how death does manifest itself on Facebook, glancing over the wall of a dead girl and puzzling over the strange new sort of people who choose to leave messages there: "Do they genuinely believe, because the girl's wall is still up, that she is still, in some sense, alive?" Does Smith ask the same question when she sees flowers on tombstones, or when she hears speakers, at a funeral, addressing the dead? I'm going to guess no, but because these equivalent activities take place within social rituals Smith is used to, she doesn't feel the need to tear her hair or beat her breast over them.
The irony is that a substantial portion of Smith's fame comes from a very public conversation she herself has had with the dead. I'd like to ask her how she feels about E. M. Forster. Does she think he's dead, completely? Or, when she engages in a literary conversation with him, does she genuinely believe, because his novels continue to exist and be read, that he is still, in some sense, alive? And if so, is that a bad thing? And if we can't all become famous novelists, is it so wrong that some little bit of us should remain, as a memorial and a site of remembrance, in the space of representation? Or does everyone have to make a masterpiece to be worth that?
A radically different approach to death on Facebook appeared only a short while ago in the Atlantic, and it shows how, in a strange way, the voyeurism of Facebook allows people to feel close to each other in a fashion that would have otherwise been impossible--an example of the manner in which Facebook can actually humanize, rather than dehumanize, relationships that otherwise never would have existed at all. These are the sorts of things that paranoid pundits can't see, because they simply don't want to. People don't live in Facebook, so the lack of certain possibilities in Facebook does not mean the end of those possibilities. People may not have dizzying death epiphanies while logged onto Facebook. But people can't have sex on Facebook, either, and yet Smith doesn't seem to be concerned that we'll stop breeding.
While Facebook certainly conditions how we think of ourselves, its modes of conditioning thought aren't new; they're borrowed from older technological forms (the facebook, the visiting card, the trading card) that haven't destroyed society yet. If there are some people who are shaping themselves around Facebook personae and thereby "becoming" two-dimensional, so be it. There were people who shaped themselves around two-dimensional literary personae before Facebook; there have always been two-dimensional people who have tried to make themselves more interesting by shaping their personalities and social roles in relation to current forms of media.
That, I think, is the truth that a believer in the profundity of every human soul has a hard time accepting. It's harder to see that humanity is full of flat characters when you live in a highly literate culture, because people who borrow their quirks from Joyce and Freud appear more profound and interesting than those who borrow them from Facebook. So in a way, this breezy technology is a boon--it lets us see more of the shallowness of people than before, a vision that's only frightening if you were living in denial of it.
Literary sages are always good at denouncing the new, because it's so obviously different from what they're used to. Ruskin railed against railways in his lectures of the 1860s; by the time Forster was writing Howards End in the early 20th century, it was the railways that seemed to demonstrate a true, old, and stable connection with the land, while motorcars were fast, unpredictable, and destructive. And so Smith steps into the role of sage here, denouncing the technology she isn't yet totally familiar with and peppering it with the paranoid fears of commerce in which sages have always, somewhat paradoxically, traded. But while sages are great at finding flaws, they're incredibly bad at predicting the upside of change. The Atlantic article is one small example of that upside (although it maintains the fashionable skepticism that has to be fit into Facebook stories to make them sell); other upsides will reveal themselves over time. After all, writing itself was once a dramatic technological change, one that meant the end of oral memory--and yet somehow, we seem to have survived, even though our civilization doesn't look like an ancient bard might have predicted. Do any of us regret that?
Now that the election frenzy is over...
...it seems like a perfect time to point out that the clearest and most coherent comment on American democracy to come out in recent years is entitled "Douche and Turd." I thought about it--and yea, in moments of darkness and doubt, drew strength from it--every time I cowered beneath the onslaught of "I voted" stickers and Facebook messages that poured past us all last week. Here's a choice snippet, and the full link:
Or, if you like your critiques a little more highbrow, there's always Matthew Arnold on the subject:
[The democrat] complains with a sorrowful indignation of people who "appear to have no proper estimate of the value of the franchise"; he leads his disciples to believe,--what the Englishman is always too ready to believe,--that the having a vote, like the having a large family, or a large business, or large muscles, has in itself some edifying and perfecting effect upon human nature.
In short, I didn't vote. Anyone know where I can get a sticker for that?
Lady Gaga Meets Aubrey Beardsley
...in this drawing by John Allison. If there were cultural studies collectibles-of-the-season, I'd say something like, "This is bound to be the cultural studies collectible of the season."
Cyborg Anniversary
September marked the 50th birthday of the term "cyborg." Who knew? I certainly didn't, until I saw this post at the Atlantic.
More interesting than that story, though, is the larger project of which it's a part: 50 Posts About Cyborgs.
It is what it sounds like. The thinkers who spent September writing in honor of cyborgs range from the Atlantic's Alexis Madrigal to Ryan North, creator of Dinosaur Comics. It should provide food for thought for quite a while, and from all ideological camps: a quick skim shows opinions ranging from "eek-stop-the-machines" to "we-are-all-cyborgs-now." Happy reading.
P.S.--In addition to being the title of this post, "Cyborg Anniversary" would be a good name for a band.
The Wire and Treme: David Simon Keeps It "Real"
Every once in a while, I open to an article in the New York Review of Books that makes me feel like I've accidentally stumbled into a guest lecture at some kind of intellectual retirement home. The subject matter seems exciting and relevant, the author seems promising--then it devolves into an explanation of basic cultural phenomena to an audience assumed to be hopelessly out of touch.
That's how I felt when I read Charles Petersen's "In the World of Facebook" earlier this year. Its rants describe "the mechanics" of superficial Web-era friendships in a way that seems almost calculated to excite "why-I-never" cane-shaking among anyone still fighting the good fight against the Internet. That's also how I felt recently when reading Lorrie Moore's piece on The Wire.
I always get excited when I see writing about The Wire. That excitement comes partially from hometown pride, and partially from the fact that I think we're living in the golden age of television, and we should be talking about TV shows a lot more than we talk about, oh, Jonathan Franzen for instance. Add to that the fact that Lorrie Moore is a great writer, and you'd seem to have the perfect formula for a great article. (I don't know if Charles Petersen is great, because according to my superficial Web-era scholarship, he appears to be a deceased Danish boxer.)
Imagine my disappointment, then, when I discovered that Moore's piece more or less describes the entire series to an audience who is assumed not to have heard about it, or to have heard of it only to call out "HWAT'S THAT DEARIE?!" and turn up the volume on their hearing aids. I kept waiting for Moore's incisive account of the series to materialize, but instead I got summary sprinkled with helpful glosses that remind us that "the game" is what "the drug business is called." Thanks?
Most surprisingly, Moore never questions the blue-collar snobbery implicit in much of what The Wire thinks it's teaching us. Instead--perhaps out of guilt for her own relatively white and relatively middle-class subject matter?--she argues that the series's use of the inner city
might be seen as a quiet rebuke to its own great living novelists, Anne Tyler and John Barth, both of whose exquisitely styled prose could be accused of having turned its back on the deep inner workings of the city that executive producer David Simon, a former Baltimore reporter, and producer Ed Burns, a former Baltimore schoolteacher and cop, have excavated with such daring and success.
Well, sure, the series certainly wants to be seen that way. Its form of realism is one that denies "reality" to anyone who isn't taking part in a life-or-death struggle of some kind--which makes for great drama, but leads to characters whose three-dimensionality fades as their socio-economic status climbs higher and higher above the poverty line. As a result, there's a serious omission from the otherwise breathtaking sweep of The Wire's social survey of Baltimore: the residents of leafy neighborhoods like Guilford, Roland Park, and farther north into the county, the kinds of places where I spent my first 18 years.
All of that gets classed as "Leave-It-to-Beaver Land" in the phrase from The Wire that Moore admires, but I keep asking myself what to make of that exclusion, and what it means for The Wire's ability to accurately represent the kinds of social and economic transactions that lead to urban problems like Baltimore's. In Simon's work, the upper classes are offstage presences who serve either to mark social immobility (like when D'Angelo Barksdale looks despairingly around the Prime Rib in season 1) or to initiate and benefit from the misery of "real" Baltimoreans (like when Nick Sobotka heckles Carcetti's conversion of the waterfront into condos, or bemoans the re-classification of his family's rowhouse as a "Federal Hill" property). The politics of what counts as real in The Wire is one of the most insistent and problematic parts of its social vision, and it's disappointing to watch Moore let it pass with nothing but a polite golf clap.
Compare Moore's largely congratulatory descriptions of The Wire with Nicholas Lemann's take on Simon's latest project, Treme. Lemann's analysis appeared only a week earlier, but his descriptions of what goes right, and wrong, with Treme are wonderful. He recognizes the series's insecurity about proving itself as a "real" depiction of New Orleans, and the awkwardness that results as Treme repetitively flashes its street cred to audiences. Lemann also rightly identifies Simon's inability to portray wealthy families clearly. He makes a mistake, however, in assuming that issues of reality and authenticity weren't a problem for The Wire, where Simon and his writers could more easily access a local perspective. The issues are there, but they're muffled by their alignment with the typical conflation of the low with the real.
There is, in other words, a big portion of society--in fact, the portion that used to be called Society--that never really makes it into Simon's view of the world. You have to wonder how that impairs his ability, and his projects' ability, to tackle the issues they claim to depict so accurately. It's a weird reverse ghettoization: wealthy North Baltimoreans see the origins of social ills in the form of the occasional member of the underclass who comes into their neighborhoods to steal from them; advocates of the underclass see the origins of social ills in the form of the wealthy who come into their neighborhoods to steal from them. Simon is a master of the second version of the narrative, and he does it beautifully, powerfully, and unforgettably. But at some point it seems as if maybe, to effect any real kind of change, we need a new story.
Two Phrases We Should Use More
Given how much we love to build abstractions from reality, and how much we later tend to confuse those abstractions with reality, it seems like we should use these two concepts from N. Katherine Hayles a lot more often:
Platonic Backhand
Hayles's term for the tendency to judge reality according to abstractions that were derived from it--i.e., the tendency to imagine that abstract forms of the real somehow preceded the real. From this perspective, reality appears like a collection of ideal forms that has gone noisy or static-y from the unfortunate, but essentially meaningless, clutter of individual circumstance.
Taken to an extreme, this habit leads to a worldview that prevents people from seeking new explanations for phenomena or data that don't fit pre-existing theoretical models of the world, because by definition anything that does not adhere to the model is just noise.
Platonic Forehand
This is a newer and less common problem, but an interesting one nonetheless. The Platonic Forehand is Hayles's term for the mistaken belief that any simplified model that can generate "noisy" results can be considered an accurate or realistic model: since reality is full of a bunch of noise, dirt, and static we don't understand, an abstract model that can also generate similar things we can't understand must be doing the same kind of work as the world, generating an equivalent world of its own.
Both moves are essentially idealistic--that is, they privilege abstractions over the irreducible complexity of the world itself. The former denigrates any complexity that doesn't fit the abstraction as not worth acknowledging; the latter sees that complexity, but deals with it by assuming that the only difference between its ideal world and the real world is some incomprehensible messiness, so that adding that messiness to its ideal thereby approximates "the real."
(That second one seems analogous to at least some of what goes on in certain kinds of gritty realism--the addition of "grit" to an abstract model of the world's workings essentially creates an acceptably "realistic" portrayal of the world.)
Now go forth, and accuse others of these nicely named errors!
Pop Culture Brontës
I wish I could post the skit from season 30 of Saturday Night Live where Rochester (Jude Law) keeps going up to the attic for sex with Bertha (Maya Rudolph) while Jane Eyre (Rachel Dratch) listens through the door in prudish Victorian confusion. But alas, NBC has taken down the YouTube video of the skit without actually posting the clip to their own site.
Still, enjoy the following:
Dude Watchin' with the Brontës
And of course:
UPDATE: Another classic contribution, "The Semaphore Version of Wuthering Heights," from Monty Python--begins at 1:05. (h/t Nathan Peterson!)
Book: Villette by Charlotte Brontë (1853)
"Quietly devastating" is one of those elegant journalistic clichés that pop up regularly in book and movie reviews to applaud a particular kind of stylistic understatement. It's like the hyperliterate and depressing counterpart to "feel-good comedy of the year"--a phrase that seems made for blurbing a certain form of recognizably literary literature. Regardless of how well-worn the phrase is, I can't think of any more apt way to summarize Villette, Charlotte Brontë's beloved final novel.
Like Jane Eyre, Villette is a coming-of-age story told in the first person by a plain-looking young woman struggling for financial independence. And like Jane Eyre, it is slowly absorbing. For the most part, though, plain Jane is a straightforward narrator, whereas Villette's Lucy Snowe gradually reveals to the reader that she is a puzzle, and intentionally so. Rather than describing difficult passages in her life, she wanders off into elaborate metaphors, like the famous shipwreck image that she uses in place of an actual explanation for why she leaves England for Belgium:
For many days and nights neither sun nor stars appeared; we cast with our hands the tackling out of the ship; a heavy tempest lay on us; all hope that we should be saved was taken away. In fine, the ship was lost, the crew perished.
We never find out what exactly happened here, and it's the first of many times that Lucy Snowe obscures or dodges events happening in her life. Eventually the disruption of these narrative cover-ups becomes more predictable and almost pleasurable: there is a sense of getting to know Lucy Snowe, and guessing what kind of information she may be withholding from us.
If all that sounds pretty psychologically sophisticated for a novel from 1853, I think that's because it is--and it's probably why Villette has come up several times in recent conversations with professors about the greatest novels of all time. The delight in learning to second-guess a fictional speaker is something that seems more common in Victorian poetry (think the dramatic monologue, e.g. Tennyson's "Ulysses"--even if a lot of people do read it straight--and of course R. Browning) than in the period's novels. In Villette, though, narratorial unreliability becomes more of an endearing quirk than a damning "gotcha"--we get close to Lucy Snowe by learning how she tries to put distance between herself and us. The pleasure of that growing intimacy, and the final turns by which it's shattered, make the conclusion almost unbearable. There isn't much in the way of redemption here, but there is a lot in the way of power. This really is one of the greats. Highly recommended.
Vocablog: umbrageous
umbrageous, adj.
Of a thing, shady; of a person, easily offended.
(Found in Charlotte Brontë's Villette.)
Usage example: I walked through umbrageous alleys, dodging umbrageous men.
Book: What Is Posthumanism? by Cary Wolfe (2010)
The title of What Is Posthumanism? may be a little bit misleading. Wolfe's book isn't exactly a primer in the idea or methodologies of posthumanism. (With the possible exception of the introduction, I have a hard time imagining assigning this book, or most selections from it, in an undergrad course on the topic.) What it is instead is a collection of essays roughly divided into two sections.
The first section examines the way that posthumanism and systems theory fit within academic and philosophical debates of the last fifty years or so. I can't emphasize enough how useful this was to me. I don't come from a theory-heavy background, and most of the thinkers branded "post-structuralist" and associated with deconstruction--e.g., Derrida, Deleuze, etc.--have always left a bad taste in my mouth. They are obscurantists; they use densely packed jargon and flirtatious French puns to camouflage the generally unremarkable character of their thought.
Wolfe has me seriously questioning that dismissal. Derrida is a cornerstone of the first half of his book, and he successfully shows how Derrida's formulations, which seem so inaccessible, are actually struggling with the same problems that systems theory is trying to tackle. It's hard to paraphrase these problems briefly, but I'll try. They center on the question of how, in a world of largely arbitrary distinctions (the arbitrariness of language, for example, or the arbitrariness of species divisions), meaning can be generated in anything like a stable fashion. In the process of showing how Derrida and systems theorists are both working toward the same answer from opposite directions, Wolfe implies--I think--that so-called post-structuralists and systems theorists are both still striving after the dream of a theory of everything, a theory that might unite all fields of study along some common principles. That idea is one I associate with structuralism, as Robert Scholes outlines it in his 1975 book Structuralism in Literature: An Introduction. All of which means that cultural studies since 1970--which is often attacked for growing increasingly chaotic and directionless, always chasing after the "next big thing"--may actually have a more coherent and defensible story of development than is normally thought.
Wolfe's other meditations on disciplinarity are also useful. His division of various famous critics and philosophers in terms of their concerns (humanist or posthumanist) and their disciplinary relations/knowledge organization (humanist or posthumanist) is really thought-provoking. According to this rubric (which comes with its own handy chart), it's possible to tackle supposedly revolutionary posthumanist subjects--the ethics of our treatment of animals, for instance--in humanist ways that dampen or nullify what might be most interesting about the topic. Thus, someone like Peter Singer practices a humanist take on posthumanist concerns (a humanist posthumanism), whereas someone like Bruno Latour practices a posthumanist posthumanism. This seems like a crucial distinction to me, and I don't know of anyone else theorizing it as clearly as Wolfe. My one major complaint here centers on Wolfe's ideas about animal studies. He never even mentions what seems, to me, like an obvious question: if we're going to question the way we use arbitrary distinctions like species barriers, why does that questioning implicitly stop at the animal kingdom? What about our understanding of plants, fungi, etc.? And if those seem like absurd questions, I'd like to hear someone as clear-minded as Wolfe explain why that should be the case.
The second half of the book consists of studies of individual works of art--the theory of the first section put into practice. Most of Wolfe's readings consist of syntheses of a few key thinkers (Derrida, Luhmann) to overturn previous consensus about the meaning of, say, Dancer in the Dark or Emerson's idea of the self. These applications often rehash the same theoretical ideas, but not in a bad way: posthumanism and systems theory can be abstract and counterintuitive, so it helps that the ideas are repeatedly hammered home. Especially nice is the critique of visuality that bubbles, submerged, through most of these readings. Key insights or definitions pop up in these readings in ways that might be better organized, but that just makes reading the book in its entirety more vital. For example, here's a definition of systems theory (which is always hard to explain) from a chapter on architectural proposals:
[T]he conceptual apparatus of systems theory . . . is based on the central innovation of replacing the familiar ontological dualities of the philosophical tradition--chiefly, for our purposes, culture and nature--with the functional distinction between system and environment. (205)
Yes, it's still wordy, and it requires examples to explain why it's so useful in understanding the world--but it's a great start. So is the rest of this book. It may not be organized as an ideal introduction, but it works as a good point of entry for someone roughly familiar with current ideas in literary study. I expect I'll be combing through its references and footnotes for a long time to come.
Computers...taking over...must...make...soulful art...
I've been thinking a lot about this recent op-ed by Jaron Lanier, which claims that we commit a "devaluation of human thought" when we treat computers like minds.
Lanier gets a number of the facts right: the artificial intelligence spin is a sure boost to any technological story, and the analogies between computer "intelligence" and human intelligence are largely overblown in order to win press coverage. He's also right that it's dangerous to imagine that we'll ever know for sure if it's possible to download human minds into software; any tests we can devise to "make sure" that consciousness has been effectively digitized will be the exact tests that programmers and designers are building systems to satisfy. (We all know that it's possible to train kids to ace a test without educating them; the same risk exists in the design of computer intelligence systems.)
For the most part, I side with N. Katherine Hayles, whose book How We Became Posthuman explains how we came to believe in the idea of a machine intelligence that could be equal to or greater than our own. Hayles has serious doubts about that possibility, chiefly because it depends upon a belief that the nature of human (or even animal) intelligence could be divorced from its material structure. The mind, she points out, doesn't operate on simple binary principles; until we can create structures that exactly mimic the material workings of the human brain--and to do that, they must be incorporated into a nervous system and "body" that closely parallels our own--we won't ever create an intelligence that works like ours, but better.
That said, I think Lanier has overly simplistic and idealistic images of the bottomless wonder of human intelligence and creativity. He wants to preserve our gasping admiration of "personhood," and feels threatened by the possibility that we may "begin to think of people more and more as computers."
This concern--that human individualism is under fire--has been around for about as long as the idea of human individualism, and I don't see what's compelling or new about Lanier's concern about A.I. It could just as easily be Lanier panicking about the Industrial Revolution turning people into machine parts, about capitalism turning people into "the bottom line"--hell, he could be panicking about how the invention of writing devalues the presence of human teachers, advisors, and memory. His worry that people are passively letting computers mechanically determine their aesthetic choices through Netflix or Amazon is particularly offensive, since it imagines that
(1.) anyone is actually mindlessly doing such a thing [!?] and that
(2.) the aesthetic is the realm of individual expression and taste, which--prior to the existence of evil mechanical algorithms--was some sort of sacred realm in which untarnished individuality blossomed.
I guess those of us who study the arts should be grateful for this bone being tossed our way by a member of one of the more "practical," "sensible" fields. But this is what scientists and engineers always do--they make pleas for the arts and the aesthetic by turning art into a wonderful world that defies logic and keeps us grounded, soulful, human. If this is the price to be paid to get public respect for students and producers of the arts--if we're allowed to be kept around only as the infantilized and inexplicably necessary darlings of scientists, as their extracurricular activities, toddlers and lap dogs--I think most of us would rather be put down.
Let me offer my grounded, soulful, human rebuttal to Lanier's paranoia. Any useful technological breakthrough acts as an extension or distribution of mental processing, so it does work as part of a mind--one that includes human minds in its circuit. It's ridiculous to imagine that the human mind is somehow separate from these systems--as if it weren't altered by the use of a technology like language or writing. All these technologies interact with human minds in reciprocal ways. Recognizing the importance of the human brain in this system is crucial, but we should also recognize that the line between "active mind" and "passive tool" is more arbitrary than someone like Lanier--who wants us to think of computers as "inert, passive tools" to keep his idea of the ineffable human intact--is willing to admit.
Book: Barchester Towers by Anthony Trollope (1857)
One of my first reviews on Nifty Rictus was of Anthony Trollope's The Warden. I recently had the mixed pleasure of reading its sequel, Barchester Towers--the second in the six-novel Chronicles of Barsetshire series--with an informal Victorian Fiction reading group, so it seems only appropriate that I post some thoughts on it.
The first half of Barchester Towers feels familiar. We're reintroduced to Septimus Harding, his daughter Eleanor, his son-in-law Archdeacon Grantly, and an expanded cast of characters that includes many of the minor players in The Warden. In fact, the setup is almost like an extended remix of The Warden: without giving too much away, Eleanor is once again a desirable single woman, and Mr. Harding is once again in a position where he can be tortured by the ethical implications of accepting a controversial clerical appointment.
Stylistically, the early parts of Barchester Towers show us an author who is unable to control one of the most horrible habits of nineteenth-century novelists, a habit that I'm convinced Trollope picked up from Thackeray: the mock-epic tone. Instead of giving straightforward descriptions of, say, a dinner party, Trollope endlessly ridicules his characters by comparing their every act to some heroic deed the act doesn't remotely resemble. While this is supposed to amuse us, it is clumsy and overused here, and it feels juvenile.
The second half of the book picks up substantially. The mock-epic treatment drops away as Trollope grows emotionally invested in the complex tangle of his characters' personal and professional lives. (Even this pattern of mock epic giving way to more sophisticated seriousness feels Thackerayan--I have almost exactly the same experience reading Vanity Fair.) Best of all, Eleanor goes from a daddy-loving cipher to a stubborn and realistic character--a welcome change that makes the novel's emotional climax all the more touching and satisfying.
Still, it's a bit of a slog to get there. Trollope's sketches of characters like the Thornes and the Stanhopes are brilliant throughout--the Stanhopes, our group decided, are straight out of a Wes Anderson film--but the author stumbles in his early attempts to put them into action. If you can wade through that (or if you chortle every time a dinner comment is compared to a volley of arrows), it's a worthy read. Otherwise, I might pick up something else...maybe even (as I'll discuss in another post) Trollope's Autobiography.
Misbehaving Like a Professional
Katie Roiphe, who seems to be making an impressive career out of writing thought-provoking essays and books about sex, has a nice piece in the NYT that asks whether we aren't too obsessed with health and goal-oriented activity in comparison to the brilliant slackers of the past.
There does seem to be an almost unbearable focus on "professionalism" on the job today. In addition to Roiphe's piece, the NYT regularly runs features about how, if your Facebook account references or--gasp!--shows pictures of yourself enjoying your alcohol, you can pretty much say sayonara to any chance of getting that job you were hoping for--even though your recruiter has probably detagged countless similar pictures of him-/herself.
I wonder if there isn't something about the immediacy of digital media that makes us more delicate and more shocked by evidence of behavior that people have always engaged in and always told stories about. It's one thing to hear "X had too much to drink at that office party" and another to see images of X passed out on a bathroom floor with someone giving a thumbs up above her head. One provokes a chuckle, and the other a much more visceral, judgmental reaction.
The kind of devil-may-care behavior Roiphe describes hasn't exactly vanished, I don't think--it sounds a lot like most people's idea of college. But nowadays we put a lid on it after those four years, or at least demand that adults publicly pretend to have put a lid on it in professional circumstances.
Some evidence downright contradicts Roiphe's argument. Even "back then," in the early '60s that Roiphe eulogizes here, contemporaries were reporting a sense of falsehood or disingenuousness to their supposedly carefree sinning, a sense that even their craziness was strictly scheduled. Here's Walker Percy, from his National Book Award-winning 1961 novel The Moviegoer, talking about the "malaisian," his word for modern man:
[Christians] keep talking as if everyone were a great sinner, when the truth is that nowadays one is hardly up to it . . . The highest moment of a malaisian's life can be that moment when he manages to sin like a proper human (Look at us, Binx--my vagabond friends as good as cried out to me--we're sinning! We're succeeding! We're human after all!).
Movie: "Fur: An Imaginary Portrait of Diane Arbus" (dir. Steven Shainberg, 2006)
"Fur" is intended, not as an accurate biopic, but as "an imaginary portrait" of the photographer's "inner experience," a text warns at the beginning of the film. I've never read Patricia Bosworth's biography of Diane Arbus--which supposedly inspired Erin Cressida Wilson to write "Fur"--so it's hard to judge how accurately the photographer's psyche is rendered here. But even apart from any standard of truth, the story told by "Fur" has its advantages--and shortcomings.
The biggest liberty taken in the film is the invention of Lionel Sweeney (Robert Downey, Jr.), a man who moves into an apartment above Arbus (Nicole Kidman) and her family. Lionel, who suffers from hypertrichosis--a condition that covers his entire body and face in hair--introduces Arbus to his social circle of freaks and outcasts, which gives her the courage to take up photography outside the oppressive constraints of her husband's photography studio.
The first thing to note about "Fur" is that it is gorgeous--visually and aurally, it is richly sensual, and every performance in the film is striking. The writer-director team of Wilson and Shainberg also deserves credit for pushing the envelope on themes of fear, violence, and sexuality, as they did in their earlier collaboration, "Secretary." Arbus's pleasurable and even sexual fascination with things that frighten her makes "Fur" interesting. That interest is partially undercut, however, by a storyline that we all know too well: female free spirit suffers under domestic oppression until a surprising, risky experience reveals the magical world she has been missing. It doesn't help that Kidman played Virginia Woolf in "The Hours," which also capitalized on the same old story of which Woolf--with her repressive upbringing, marriage, artistic awakening, and final conflicted suicide--is such a convenient example. It's the stuff of a simplified Woolf biography, or of Kate Chopin's "The Awakening," or of Gilbert and Gubar's Madwoman in the Attic: the story of female originality suffering under restraint is beginning to feel unoriginal and, in fact, restraining.
There's also something unrealistic about the way "Fur" portrays freakishness--as if abnormality, like normality, were an actual "thing," a common code that could unite a community. To paraphrase Tolstoy, normal people are all alike, but every abnormal person is abnormal in their own way. What makes Arbus's body of work so interesting is that she managed to make everyone look freakish, an accomplishment that said very little about the world and very much about the workings of her own mind. By embodying freakishness in the figure of Lionel and his community of freaks who all somehow know and like each other, "Fur" turns Arbus from an expressionist--someone who shared her own nightmarish ideas of the world--into a reporter on a lovably off-kilter world already in existence.
What I can't decide is whether or not this transformation was intentional. As Susan Sontag notes in On Photography, Arbus herself talked about her work this way, acting as though her subjects were always as strange as they are in her portrayals, talking about them with "the childlike wonder of the pop mentality." So far as I can tell, then, "Fur" does seem to paint an accurate picture of the way Arbus described her "inner experience"; the problem is that Arbus's descriptions don't match up with her actual project. So to see Arbus's work from the inside is almost necessarily to misunderstand it as a kind of delightful vacation to a strange world. This movie beautifully renders that misunderstanding, but without context, it threatens to propagate and popularize an unrealistically uplifting ideal of her life and photographs. The most obvious limitation of this take on her work is that it renders her eventual suicide almost incomprehensible--and indeed, "Fur" ends without mentioning it.
The Mind Is Not the Brain?
Recommended: a couple of good online reflections on Marilynne Robinson's new book, Absence of Mind: The Dispelling of Inwardness from the Modern Myth of Self, from D.G. Myers and Amelia Atlas.
While both these reviews are positive, I don't think I'll be picking up a copy anytime soon. Robinson is writing against what she calls "parascientific writing," a handy term that leaves respect for science intact while disparaging the recent trend of popular, scientifically-informed books that tend to argue against a spiritual realm, or that tend to denigrate individual experiences of consciousness.
Agreed: a lot of modern empiricism (and so-called empiricism) tends to denigrate individual accounts of what life or experience consists of. It wants to tell people what they're "really" doing, "really" thinking, and what the world is "really" like, and refuses to listen to what they themselves have to say. Robinson identifies the problem wonderfully. But the solution to this problem doesn't have to be a retrenchment in spirituality, or jaw-dropping wonder at the imponderable nature of existence/the human mind, or a dismissal of neuroscience's materialistic findings about the brain.
In fact, a number of thinkers in a variety of fields have been troubled by the same problem that disturbs Robinson, and have, I think, solved it better. For instance, Robinson's concern about the way "parascientific" writing downplays human experience has also been expressed by the sociologist Bruno Latour. Latour encourages researchers to take people seriously when they describe their sense that superhuman forces, for instance, are compelling them to do things. For someone like Latour, this information is a valuable clue to how the human mind, and how social activity, functions--it isn't some kind of charade to be exposed. (Latour makes this argument in Reassembling the Social.)
Easily my favorite book in this vein--and one of my favorite books of all time--is Gregory Bateson's Steps to an Ecology of Mind. Bateson was a polymath who dabbled in anthropology, sociology, animal communication, and a number of other fields. Steps is an anthology of his writing from the 1940s through about 1970, showing the way he worked through questions of information and communication to form a new picture of what "mind" might mean--a picture that respects human experience and tries to place it appropriately in the framework of a larger system. His view manages to include both the mystical awe that comes with true appreciation for mental processes, and an empirical view that respects and deploys scientific understanding.
Later systems theorists and posthumanists have followed up on this work in various ways that--while equally fascinating--are more academic. (I'm thinking of people like Niklas Luhmann, or Cary Wolfe, whose What is Posthumanism? I'm currently devouring.) If you're interested in this problem--the poverty of popular scientific explanation, the difficulty of reconciling certain scientific claims with felt human experience--I strongly, strongly recommend Bateson's work. But my broader point is that we're at a kind of explanatory impasse, and Robinson has done a fine job identifying it--but the impasse doesn't mean we need to retreat to one camp or another. It means it's time to scout for new paths, and I think systems theory, in its various forms, offers them.
Vocablog: megrim, megrims
megrim, megrims, n.
In the singular (megrim), a bad headache or migraine; in the plural (megrims), depression or melancholy.
(Found in George Eliot's Middlemarch.)
Usage example: Recent news about the trade deficit has given many economists the megrims. I don't have a head for numbers, though, so trying to understand it all gives me a megrim.
Other useful forms: megrimical, adj.; megrimish, adj.
Phrenology returns?
Good news for people with giant heads: the journal Neurology reports that people with bigger heads are less likely to show signs of Alzheimer's disease--even when they have the same percentage of brain cell death as people with smaller heads.
Scientists involved say that head size can reflect brain size, which in turn may be correlated with the "reserve" brainpower that subjects may be able to draw on when the brain is otherwise damaged.
I'd love to see the proposal for this study--it seems so comically old-fashioned. ("Well, see, we're goin' ta look at the egg-heads, see, and the pin-heads--ya follow?--an' we're gonna compare 'em, see...") It's actually surprising that scientists are willing to correlate brain size to head size, given that relative brain sizes between demographic groups are at the forefront of controversial debates about "intelligence." I guess the idea that character traits can be read from the appearance of a person's head will always be seductive--the question is whether it's too seductive to be realistic.
Agnostics Battle Atheists: Sabbath Spectacular
Andrew Sullivan points out an online scuffle going on between agnostics, represented by Ron Rosenbaum at Slate, and atheists, represented by Julian Sanchez, currently a fellow at the Cato Institute.
As a former agnostic, I find it hard not to agree with much of what Ron Rosenbaum says. The "New Atheists," as they're called--Richard Dawkins, Christopher Hitchens, and that whole loose camp--are a little too vocal, a little too certain, and a little too abrasive to be attractive. That said, Julian Sanchez is right-on in his attack on Rosenbaum's logic: the fact that empirically-minded atheists can't explain creation doesn't mean they have to throw up their hands and vocally admit defeat. Just because you can't prove that something DOESN'T exist doesn't mean you can never commit to a position about its existence with reasonable certainty.
Sometimes I wish that atheism hadn't been perverted by this Enlightenment-obsessed New Atheism. My personal trip to atheism went along substantially different lines. I remember learning the word "agnostic" from a classmate in junior high, and (having been raised outside of any religious tradition) thinking something silly along the lines of, "Oh. That's a neat word for what I am." I stayed in the "I-dunno-about-God" camp--even when I immaturely delighted, as the New Atheists do now, in skewering believers--until a couple of years ago, when a sudden realization changed my mind.
The realization was that agnosticism is a kind of cheating. Here I was, not giving a damn about the Ten Commandments, not even really bothering to look into religious issues, but still claiming that I was somehow on the fence about the existence of a God. I asked myself: "If I really believed there might be a God, would I be living the way that I am?"
The answer was obvious: No.
If you are open to the possibility of a God in the only sense that seems worth arguing about--an all-knowing, all-powerful being (or group of them) that made the universe and mankind--then your very first task should be to figure out what that God might be like, in order to understand how to live in accordance with Its (His? Her? Their?) program. The stakes are unbelievably high, and by acting as if they weren't, I had in effect made my choice. In practice, I was an atheist--but when confronted, I was making excuses that, in the presence of an actual God, weren't likely to hold up.
I wasn't about to dedicate my life to religious inquiry--you can't just whip up belief like you whip up a batch of cookies--so I "converted" to atheism. (One perk: no paperwork involved.) But my atheism is really about a lack of belief in any god, unlike that of the New Atheists, who are, as their critics everywhere note, fundamentalist believers, worshippers of the greatness of No God. According to these faithful atheists, No God (sometimes known, arguably incorrectly, as "Science") will step in and solve all the world's problems, if only we would all listen to the Good News.
It's the latest rebranding of humanism, which is, pretty obviously, a religion--but one that, as far as I can tell, replaces the worship of one beloved imaginary being ("God") with another ("Mankind"). Agnosticism is essentially another brand of humanism, too--it's just not sure whether it should give God a quick nod before it returns to its primary, human-centered concerns.
Outdated and dangerous ideas about "Man" and "the human" are, I think, the real problem with most current belief systems. As a result, I don't have a lot of love for either side of this New Atheist/New Agnostic coin.
Isolationism and American Writers
Not too long ago, Horace Engdahl--permanent secretary of the Nobel prize jury--attacked American writers, arguing that no American had a chance of winning a Nobel in literature until we stopped drinking our own urine and marrying our siblings. (At least he'd read his Faulkner.) Many American luminaries responded in kind. Personally, I was shocked that the Swedes, who are known primarily for their cooking skills and hairy, eyeless faces, would suddenly turn on our culture, which has looked so lovingly upon them for so long.
Anyway, it took about two years, but now we have a weightier, more thoughtful response to Engdahl, in the form of a recent review essay by Tim Parks in the New York Review of Books. Parks, an internationally-based writer of British birth, reviews a few recent publications designed to redress concerns about American writing and the novel in general. He comes to a number of great conclusions that ring true, among them the fact that it's actually very difficult, if you remove place names and other cultural markers, to tell the difference between writers of any nationality. Writing about the recent publication Best European Fiction 2010, he observes a broader trend:
Each writer appeals confidently to an international liberal readership at the expense of provincial bigotry and hypocrisy . . . Across the globe, the literary frame of mind is growing more homogeneous.
I doubt Parks means this as the damning statement that I hear when I read it. In any case, his review does a fantastic job of addressing both the state of contemporary European writing and the statistics that would seem to support European internationalism over American isolationism. It's worth a read.
Thoughts on Ren and Stimpy
There's an awkward hour or so between the time I normally get off the phone with my fiancée and the time I go to bed--a stretch between about 11pm and 12:30am when I'm too tired to read but which is too short for a movie. I used to fill this time with whatever shows were on TV, which is dangerous--both because I tend to watch them for longer than I mean to, and because they tend to feature, in the words of Jerry Seinfeld, celebrities telling bad stories about their plumbing.
Then I discovered old television series on the Netflix streaming Wii disc, and my quality of life soared.
If that's not exactly true (I recently made the mistake of watching a horrible Zach Galifianakis special), Netflix's ever-increasing library of worthy TV shows has definitely helped. That's how I just ended up watching the first episode of Ren & Stimpy--a show I hadn't seen since I was 9 or 10.
Even now, The Ren & Stimpy Show looks different from other shows. For one thing, it had much longer intervals in which characters were shown expressing over-the-top emotions while music played in the background. But even more unusual is the way the show moves back and forth between cartoon flexibility, where characters can stretch or undergo violence in impossible ways, and a disgustingly detailed fleshiness. In one shot, Ren looks like a droopy little dog; in the next, his strangely human butt is being paraded in front of the camera. Likewise, Stimpy sits on a mound of glittering color in one shot that then materializes into Gritty Kitty Litter that the characters squeeze between their toes or crunch repulsively between their (now gigantic and gummy) teeth.
I can see how Ren & Stimpy eventually got old by failing to adapt, the same way that South Park would have gotten old if it hadn't moved from being a show about children saying "f*ck" into the realm of social satire. But there's still a lot to it. Ren & Stimpy's interest in interrupting the pure, imaginary realm of cartoon interactions with reminders of gross bodily function feels very familiar, not from other shows but from life itself--it does a great job of representing the embarrassing intrusion of dirt and grime and other crap into the mental and emotional world that we all like to imagine we occupy. That gives the show its own strange brand of realism in a famously unrealistic medium--an admirable achievement.
Book: The Seymour Tapes by Tim Lott (2005)
I tore through The Seymour Tapes (2005) in about two days. It was one of my purchases at the "Exposed" gift shop at the Tate Modern, but it doesn't seem, for whatever reason, to have made it into paperback in the U.S.
The novel is written as a series of transcripts of interviews between "Tim Lott" and members of Dr. Alex Seymour's circle of family and friends in the wake of Seymour's death. Seymour, a physician living in London, has become a media sensation after a video related to his murder was leaked online. As the story unravels, we learn of Seymour's increasingly elaborate installations of surveillance equipment in his own home--an operation that he hopes will restore the kind of domestic order that he feels is his fatherly responsibility. His foray into surveillance, however, puts him beyond the pale of normal social and sexual boundaries as he becomes increasingly close to the disturbed American Sherry Thomas, the owner of the company that lends him equipment.
Lott's set-up here is intriguing and topical, and the format of the book is a stroke of genius--the transcripts bring us as close as a text can to the issues of voyeurism, truth, and immediacy that underpin the story. Lott plays nicely with questions of narrative point-of-view and journalistic ethics, adding extra turns to an already twisted tale. The turns never feel labored, but I did find some of them disappointing--without giving anything away, I'll say that I think the ending makes this a much more traditional story of ethical violation than it feels at first, a return-to-normalcy that sacrifices some of the moral complexity that makes the set-up so exciting. The final abandonment of the transcript form in the novel's coda reveals some of this disappointing information, and that coda should have been cut. As it stands, it tries to frighten in an uncharacteristically simple way, kind of like someone jumping out of a closet and yelling "OOGA BOOGA BOOGA" at the end of an otherwise mature and upsetting psychological drama.
That drama is worth the forgettable ending. I'd recommend The Seymour Tapes pretty highly to any kind of reader, especially if you're interested in suspense novels, surveillance, or the difficulty of separating truth from fiction.
Art Review: "Exposed" @ Tate Modern (London)
"Exposed" is sort of overwhelming--I had to visit twice to see the whole thing thoroughly, because the museum closed 3/4 of the way through my first visit. This sprawling, 14-room show, conceived by Sandra Phillips of SFMoMA, is a look at the ways photography has taken on issues of "exposure," particularly issues of voyeurism and surveillance. It is broadly divided into sections devoted to "The Unseen Photographer," "Celebrity and the Public Gaze," "Voyeurism & Desire," "Witnessing Violence," and "Surveillance."
The sheer amount of space devoted to "Exposed" means that the show can present some material in historically accurate ways that I, for one, have not had the chance to see before. Case in point: anyone reasonably familiar with photography as a tool for exploring sexuality has seen images from Nan Goldin's classic "Ballad of Sexual Dependency," but "Exposed" goes to the trouble of showing them in something like their original form--as a slideshow in a darkened room, set to the music of the Velvet Underground and others. Likewise, Japanese photographer Kohei Yoshiyuki's infrared images of couples and groups secretly having sex in parks in the 1970s are displayed in a darkened room with spotlights--a reference to their original gallery presentation in 1979, where guests in a darkened space were given flashlights to view them.
While the spread and treatment are nice, the show sometimes feels like too much. It needs some cutting and narrowing. The sporadic insertion of images by well-known streetshooters (Henri Cartier-Bresson, Robert Frank) feels especially intrusive, like big names unnecessarily larding a show that might have been more targeted. I found myself confused: were streetshooters featured because they exposed their subjects in everyday life, or because they hid their cameras for fear of their own exposure as thieves of others' images? That unexplored ambiguity seems, to me, typical of the show as a whole. It feels like the vast range of images gathered here is united only by the many things "Exposed" could mean. I would have liked a tighter organizing principle, both historical and thematic, rather than a show that feels guided by the ambiguous pun of its title.
Nevertheless, there's something to be said for capacious messiness. This is a rich show with something to delight (and, I think, surprise) almost anyone, from the photographic beginner to the more seasoned enthusiast. Alongside what I would consider "over"-exposed art-historical names, it manages to include a number of less familiar images and artists--it was especially impressive, I thought, on issues of celebrity fandom and sex. Cammie Toloui's pictures from the "Lusty Lady" series are a particularly fantastic choice. Shot from her point of view as she worked as a stripper in a San Francisco peepshow, Toloui's grainy black-and-white images occupy some unclear space in the network of gazes that forms between an erotic dancer and her (often exposed and masturbating) audience--a reflexive experience made visible as the reflection of Toloui's nude body in the glass divider blends uneasily with the images of her voyeurs on the other side.
Unfortunately, "Exposed" ends on a weak note, as the last section, which covers the hot-button issue of surveillance, is disappointingly cold. (The exceptions--Shizuka Yokomizo, Laurie Long--prove the rule.) That's less a curatorial blunder than a reflection of what I see as the stagnation of a lot of contemporary political photography. Historical selections can be excused for unfetching aerial images of, say, Normandy on D-Day. But contemporary photographers also seem unable to make surveillance images interesting. They tend towards either dull, grainy pictures of current surveillance tools, or conceptual work that justifies its bland visuals with chunks of text--text which, by the time you get to this room, you will probably be too tired to read. Both approaches seem like poor ways of capturing public attention on issues of pressing social importance.
The whimper that ends this show is driven home by an installation located outside the (surprisingly excellent) gift shop. While it's unrelated to "Exposed," it captures what's broadly wrong with the surveillance part of the exhibit. Called "Night Watch," by Francis Alÿs, the installation consists of a wall-sized array of closed-circuit televisions showing different rooms in the National Gallery. It appears as ponderous (and as boring) as many of the works in the surveillance section, but you stand before it a while anyway, just to look thoughtful--then something moves in one of the monitors.
It's a fox! This incongruous little animal, dashing from room to room of the museum and from television to television in the grid, embodies the kind of surprise and mystery you expect when you take on the role of voyeur. The playful artificiality of this introduction of a wild animal to the museum may offend the high moral seriousness of much of the surveillance work, but it makes you think a lot more about your expectations and desires, as a watcher and as a human being, than the more overtly political work that ends the exhibit…and isn't that, after all, what good art is supposed to do?
"Exposed" runs at the Tate Modern in London through October 3, 2010.
Other takes on the show: 3quarksdaily, the Guardian.
Images, from top, copyright Kohei Yoshiyuki, Cammie Toloui, and Francis Alÿs.
Links: London Literary Edition
I'm in London, doing research, which is fantastic--but it means that I feel guilty blogging when I should be writing something more weighty and academic.
So, until I get a free moment, enjoy the following links--with commentary:
The "Slow Reading" Movement
There's been an explosion of "anti-web-reading" articles/books recently, but the (at least attempted) movement here is interesting because it tries to link basic textual analysis ("close reading") with the sensual enjoyment of food advocated by the "slow food" movement. I like it--it addresses a serious problem I've seen in my own classes of students trying to skim texts that can't be skimmed--but I don't think any of this stuff will work unless we beef up requirements for the number of arts and literature courses at colleges and universities everywhere. (And therefore, the number of teaching jobs available…)
Digested Read
Now I contradict myself with a link that provides digests of recent books. But really, these two links cohere into a unified theme: the only thing worse than our collective inability to understand written material is our desperate urge to produce written material. Only a tiny percentage of us (here I should just blow my cover and say, "them") manage to do this and get it published, but that's still a lot of new vampire books and celebrity tell-alls per year. If you're reading the right (slow) way, you can't keep up with the flood. Luckily there's John Crace, who writes beautifully cruel, 700-word digests of recent publications. Are they fair? I don't know, because I haven't read the books…and what I love about Crace is that he convinces me that I was right not to.
"Top 100 Books of All Time"
This is old news, I guess, but I saw it under the Guardian's "Most Viewed" while browsing "Digested Read". (It's still in "Most Viewed," 8 years on!) Sadly, it appears that the people who compiled the list accidentally swapped it with another list they were compiling at the same time: "Books Literary Types Should Recommend to Others, As If They Had Read and Enjoyed Them".
Book: Wuthering Heights by Emily Brontë (1847)
Wuthering Heights has enjoyed a recent return to popularity, thanks to some intertextual references in Stephenie Meyer's Twilight series--so it seems only appropriate that I finally got around to reading it.
Since I haven't read the Twilight series, and I don't consider myself particularly sappy, I was surprised to find myself siding with (randomly selected) Twilight fan Hayley Mears, whom the Guardian quotes in an online review saying: "I was really disappointed when reading this book, it's made to believe [sic] to be one of the greatest love stories ever told and I found only five pages out of the whole book about there [sic] love and the rest filled with bitterness and pain and other peoples [sic] stories".
"Bitterness and pain" is right. (And all the [sic]s in there are right, too, because this book is pretty [sic].) Essentially, this is the story of two snarling, brutish, semi-wild children who love each other in their animalistic way, but don't get together because one of them, Cathy, thinks she needs to marry someone with class and money. Her dismissal of Heathcliff, the vicious foundling she loves, for Edgar, the wimp who loves her, sets off a chain of about three-hundred pages of slow, conniving cruelty, as Heathcliff labors to destroy everyone involved in the affair, as well as their children, houses, and distant relations.
Academically, there are a few interesting things going on here. Wuthering Heights can be read as an early Victorian response to (or reworking of) Romanticism, and its approaches to nature and love are surprising for the way they show sympathy for a form of rawness that's otherwise unappealing (and uncharacteristic of the period). As a devotee of both the Burkean sublime and Nietzsche, I was interested in the interplay of rough, calamitous nature, the animal, and Heathcliff's slow-burning vengefulness…which all fits partially, but not very neatly (that's the interesting part), under those two moral/aesthetic frameworks.
But while a book's being "interesting" is normally enough to get me to like it, it wasn't enough here. At one point in the novel, Heathcliff expresses a wish to vivisect the other characters, and vivisection is a good metaphor for the novel as a whole. Reading about Heathcliff is like watching a troubled teen slowly cut a wriggling fish apart--you get upset at the cruelty of the thing, but the whole affair doesn't have much dramatic interest; you walk away feeling bored, sad, and wondering what the point of it all was. The novel doesn't have the psychological insight of a Jane Eyre, or the social complexity of a Bleak House. Curious readers, I give this one an "avoid."