Book: Villette by Charlotte Brontë (1853)

"Quietly devastating" is one of those elegant journalistic clichés that pop up regularly in book and movie reviews to applaud a particular kind of stylistic understatement. It's like the hyperliterate and depressing counterpart to "feel-good comedy of the year"--a phrase that seems made for blurbing a certain form of recognizably literary literature. Regardless of how well-worn the phrase is, I can't think of any more apt way to summarize Villette, Charlotte Brontë's beloved final novel.

Like Jane Eyre, Villette is a coming-of-age story told in the first person by a plain-looking young woman struggling for financial independence. And like Jane Eyre, it is slowly absorbing. For the most part, though, plain Jane is a straightforward narrator, whereas Villette's Lucy Snowe gradually reveals to the reader that she is a puzzle, and intentionally so. Rather than describing difficult passages in her life, she wanders off into elaborate metaphors, like the famous shipwreck image that she uses in place of an actual explanation for why she leaves England for Belgium:

For many days and nights neither sun nor stars appeared; we cast with our hands the tackling out of the ship; a heavy tempest lay on us; all hope that we should be saved was taken away. In fine, the ship was lost, the crew perished.
We never find out what exactly happened here, and it's the first of many times that Lucy Snowe obscures or dodges events happening in her life. Eventually the disruptions caused by these narrative cover-ups become predictable and almost pleasurable: there is a sense of getting to know Lucy Snowe, of guessing what kind of information she may be withholding from us.

If all that sounds pretty psychologically sophisticated for a novel from 1853, I think that's because it is--and it's probably why Villette has come up several times in recent conversations with professors about the greatest novels of all time. The delight in learning to second-guess a fictional speaker is something that seems more common in Victorian poetry (think the dramatic monologue, e.g. Tennyson's "Ulysses"--even if a lot of people do read it straight--and of course Robert Browning) than in the period's novels. In Villette, though, narratorial unreliability becomes more of an endearing quirk than a damning "gotcha"--we get close to Lucy Snowe by learning how she tries to put distance between herself and us. The pleasure of that growing intimacy, and the final turns by which it's shattered, make the conclusion almost unbearable. There isn't much in the way of redemption here, but there is a lot in the way of power. This really is one of the greats. Highly recommended.

Vocablog: umbrageous

umbrageous, adj.

Of a thing, shady; of a person, easily offended.

(Found in Charlotte Brontë's Villette.)

Usage example: I walked through umbrageous alleys, dodging umbrageous men.

Book: What Is Posthumanism? by Cary Wolfe (2010)

The title of What Is Posthumanism? may be a little bit misleading. Wolfe's book isn't exactly a primer in the idea or methodologies of posthumanism. (With the possible exception of the introduction, I have a hard time imagining assigning this book, or most selections from it, in an undergrad course on the topic.) What it is instead is a collection of essays roughly divided into two sections.

The first section examines the way that posthumanism and systems theory fit within academic and philosophical debates of the last fifty years or so. I can't emphasize enough how useful this was to me. I don't come from a theory-heavy background, and most of the thinkers branded "post-structuralist" and associated with deconstruction--Derrida, Deleuze, and the like--have always left a bad taste in my mouth. They are obscurantists; they use densely packed jargon and flirtatious French puns to camouflage the generally unremarkable character of their thought.

Wolfe has me seriously questioning that dismissal. Derrida is a cornerstone of the first half of his book, and Wolfe successfully shows how Derrida's formulations, which seem so inaccessible, are actually struggling with the same problems that systems theory is trying to tackle. It's hard to paraphrase these problems briefly, but I'll try. They center on the question of how, in a world of largely arbitrary distinctions (the arbitrariness of language, for example, or the arbitrariness of species divisions), meaning can be generated in anything like a stable fashion. In the process of showing how Derrida and systems theorists are both working toward the same answer from opposite directions, Wolfe implies--I think--that so-called post-structuralists and systems theorists are both still striving after the dream of a theory of everything, a theory that might unite all fields of study along some common principles. That idea is one I associate with structuralism, as Robert Scholes outlines it in his 1975 book Structuralism in Literature: An Introduction. All of which means that cultural studies since 1970--which is often attacked for growing increasingly chaotic and directionless, always chasing after the "next big thing"--may actually have a more coherent and defensible story of development than is normally thought.

Wolfe's other meditations on disciplinarity are also useful. His division of various famous critics and philosophers in terms of their concerns (humanist or posthumanist) and their disciplinary relations/knowledge organization (humanist or posthumanist) is really thought-provoking. According to this rubric (which comes with its own handy chart), it's possible to tackle supposedly revolutionary posthumanist subjects--the ethics of our treatment of animals, for instance--in humanist ways that dampen or nullify what might be most interesting about the topic. Thus, someone like Peter Singer practices a humanist take on posthumanist concerns (a humanist posthumanism), whereas someone like Bruno Latour practices a posthumanist posthumanism. This seems like a crucial distinction to me, and I don't know of anyone else theorizing it as clearly as Wolfe. My one major complaint here centers on Wolfe's ideas about animal studies. He never even mentions what seems, to me, like an obvious question: if we're going to question the way we use arbitrary distinctions like species barriers, why does that questioning implicitly stop at the animal kingdom? What about our understanding of plants, fungi, etc.? And if those seem like absurd questions, I'd like to hear someone as clear-minded as Wolfe explain why that should be the case.

The second half of the book consists of studies of individual works of art--the theory of the first section put into practice. Most of Wolfe's readings consist of syntheses of a few key thinkers (Derrida, Luhmann) to overturn previous consensus about the meaning of, say, Dancer in the Dark or Emerson's idea of the self. These applications often rehash the same theoretical ideas, but not in a bad way: posthumanism and systems theory can be abstract and counterintuitive, so it helps that the ideas are repeatedly hammered home. Especially nice is the critique of visuality that bubbles, submerged, through most of these readings. Key insights and definitions pop up scattered through these readings where they might have been organized more systematically, but that just makes reading the book in its entirety more vital. For example, here's a definition of systems theory (which is always hard to explain) from a chapter on architectural proposals:

[T]he conceptual apparatus of systems theory . . . is based on the central innovation of replacing the familiar ontological dualities of the philosophical tradition--chiefly, for our purposes, culture and nature--with the functional distinction between system and environment. (205)
Yes, it's still wordy, and it requires examples to explain why it's so useful in understanding the world--but it's a great start. So is the rest of this book. It may not be organized as an ideal introduction, but it works as a good point of entry for someone roughly familiar with current ideas in literary study. I expect I'll be combing through its references and footnotes for a long time to come.

Computers...taking over...must...make...soulful art...

I've been thinking a lot about this recent op-ed by Jaron Lanier, which claims that we commit a "devaluation of human thought" when we treat computers like minds.

Lanier gets a number of the facts right: the artificial intelligence spin is a sure boost to any technological story, and the analogies between computer "intelligence" and human intelligence are largely overblown in order to win press coverage. He's also right that it's dangerous to imagine that we'll ever know for sure if it's possible to download human minds into software; any tests we can devise to "make sure" that consciousness has been effectively digitized will be the exact tests that programmers and designers are building systems to satisfy. (We all know that it's possible to train kids to ace a test without educating them; the same risk exists in the design of computer intelligence systems.)

For the most part, I side with N. Katherine Hayles, whose book How We Became Posthuman explains how we came to believe in the idea of a machine intelligence that could be equal to or greater than our own. Hayles has serious doubts about that possibility, chiefly because it depends upon a belief that the nature of human (or even animal) intelligence could be divorced from its material structure. The mind, she points out, doesn't operate on simple binary principles; until we can create structures that exactly mimic the material workings of the human brain--and to do that, they must be incorporated into a nervous system and "body" that closely parallels our own--we won't ever create an intelligence that works like ours, but better.

That said, I think Lanier has overly simplistic and idealistic images of the bottomless wonder of human intelligence and creativity. He wants to preserve our gasping admiration of "personhood," and feels threatened by the possibility that we may "begin to think of people more and more as computers."

This concern--that human individualism is under fire--has been around for about as long as the idea of human individualism, and I don't see what's compelling or new about Lanier's concern about A.I. It could just as easily be Lanier panicking about the Industrial Revolution turning people into machine parts, about capitalism turning people into "the bottom line"--hell, he could be panicking about how the invention of writing devalues the presence of human teachers, advisors, and memory. His worry that people are passively letting computers mechanically determine their aesthetic choices through Netflix or Amazon is particularly offensive, since it imagines that

(1.) anyone is actually mindlessly doing such a thing [!?] and that
(2.) the aesthetic is the realm of individual expression and taste, which--prior to the existence of evil mechanical algorithms--was some sort of sacred realm in which untarnished individuality blossomed.

I guess those of us who study the arts should be grateful for this bone being tossed our way by a member of one of the more "practical," "sensible" fields. But this is what scientists and engineers always do--they make pleas for the arts and the aesthetic by turning art into a wonderful world that defies logic and keeps us grounded, soulful, human. If this is the price to be paid to get public respect for students and producers of the arts--if we're allowed to be kept around only as the infantilized and inexplicably necessary darlings of scientists, as their extracurricular activities, toddlers and lap dogs--I think most of us would rather be put down.

Let me offer my grounded, soulful, human rebuttal to Lanier's paranoia. Any useful technological breakthrough acts as an extension or distribution of mental processing, so it does work as part of a mind--one that includes human minds in its circuit. It's ridiculous to imagine that the human mind is somehow separate from these systems--as if it weren't altered by the use of a technology like language or writing. All these technologies interact with human minds in reciprocal ways. Recognizing the importance of the human brain in this system is crucial, but we should also recognize that the line between "active mind" and "passive tool" is more arbitrary than someone like Lanier--who wants us to think of computers as "inert, passive tools" to keep his idea of the ineffable human intact--is willing to admit.

Book: Barchester Towers by Anthony Trollope (1857)

One of my first reviews on Nifty Rictus was of Anthony Trollope's The Warden. I recently had the mixed pleasure of reading its sequel, Barchester Towers--the second in the six-novel Chronicles of Barsetshire series--with an informal Victorian Fiction reading group, so it seems only appropriate that I post some thoughts on it.

The first half of Barchester Towers feels familiar. We're reintroduced to Septimus Harding, his daughter Eleanor, his son-in-law Archdeacon Grantly, and an expanded cast of characters that includes many of the minor players in The Warden. In fact, the setup is almost like an extended remix of The Warden: without giving too much away, Eleanor is once again a desirable single woman, and Mr. Harding is once again in a position where he can be tortured by the ethical implications of accepting a controversial clerical appointment.

Stylistically, the early parts of Barchester Towers show us an author who is unable to control one of the most horrible habits of nineteenth-century novelists, a habit that I'm convinced Trollope picked up from Thackeray: the mock-epic tone. Instead of giving straightforward descriptions of, say, a dinner party, Trollope endlessly ridicules his characters by comparing their every act to some heroic deed the act doesn't remotely resemble. While this is supposed to amuse us, it is clumsy and overused here, and it feels juvenile.

The second half of the book picks up substantially. The mock-epic treatment drops away as Trollope grows emotionally invested in the complex tangle of his characters' personal and professional lives. (Even this pattern of mock epic giving way to more sophisticated seriousness feels Thackerayan--I have almost exactly the same experience reading Vanity Fair.) Best of all, Eleanor goes from a daddy-loving cipher to a stubborn and realistic character--a welcome change that makes the novel's emotional climax all the more touching and satisfying.

Still, it's a bit of a slog to get there. Trollope's sketches of characters like the Thornes and the Stanhopes are brilliant throughout--the Stanhopes, our group decided, are straight out of a Wes Anderson film--but the author stumbles in his early attempts to put them into action. If you can wade through that (or if you chortle every time a dinner comment is compared to a volley of arrows), it's a worthy read. Otherwise, I might pick up something else...maybe even (as I'll discuss in another post) Trollope's Autobiography.

Misbehaving Like a Professional

Katie Roiphe, who seems to be making an impressive career out of writing thought-provoking essays and books about sex, has a nice piece in the NYT that asks whether we aren't too obsessed with health and goal-oriented activity in comparison to the brilliant slackers of the past.

There does seem to be an almost unbearable focus on "professionalism" on the job today. In addition to Roiphe's piece, the NYT regularly runs features about how, if your Facebook account references or--gasp!--shows pictures of yourself enjoying your alcohol, you can pretty much say sayonara to any chance of getting that job you were hoping for--even though your recruiter has probably detagged countless similar pictures of him-/herself.

I wonder if there isn't something about the immediacy of digital media that makes us more delicate and more shocked by evidence of behavior that people have always engaged in and always told stories about. It's one thing to hear "X had too much to drink at that office party" and another to see images of X passed out on a bathroom floor with someone giving a thumbs up above her head. One provokes a chuckle, and the other a much more visceral, judgmental reaction.

The kind of devil-may-care behavior Roiphe describes hasn't exactly vanished, I don't think--it sounds a lot like most people's idea of college. But nowadays we put a lid on it after those four years, or at least demand that adults publicly pretend to have put a lid on it in professional circumstances.

There is even some evidence that downright contradicts Roiphe's argument. Even "back then," in the early '60s Roiphe eulogizes here, contemporaries were reporting a sense of falsehood or disingenuousness to their supposedly carefree sinning, a sense that even their craziness was strictly scheduled. Here's Walker Percy, from his National Book Award-winning 1961 novel The Moviegoer, talking about the "malaisian," his word for modern man:

[Christians] keep talking as if everyone were a great sinner, when the truth is that nowadays one is hardly up to it . . . The highest moment of a malaisian's life can be that moment when he manages to sin like a proper human (Look at us, Binx--my vagabond friends as good as cried out to me--we're sinning! We're succeeding! We're human after all!).