
Movie: "Fur: An Imaginary Portrait of Diane Arbus" (dir. Steven Shainberg, 2006)

"Fur" is intended, not as an accurate biopic, but as "an imaginary portrait" of the photographer's "inner experience," a text warns at the beginning of the film. I've never read Patricia Bosworth's biography of Diane Arbus--which supposedly inspired Erin Cressida Wilson to write "Fur"--so it's hard to judge how accurately the photographer's psyche is rendered here. But even apart from any standard of truth, the story told by "Fur" has its advantages--and shortcomings.

The biggest liberty taken in the film is the invention of Lionel Sweeney (Robert Downey, Jr.), a man who moves into an apartment above Arbus (Nicole Kidman) and her family. Lionel, who suffers from hypertrichosis--a condition that covers his face and entire body in hair--introduces Arbus to his social circle of freaks and outcasts, giving her the courage to take up photography outside the oppressive constraints of her husband's photography studio.

The first thing to note about "Fur" is that it is gorgeous--visually and aurally, it is richly sensual, and every performance in the film is striking. The writer-director team of Wilson and Shainberg also deserves credit for pushing the envelope on themes of fear, violence, and sexuality, as they did in their earlier collaboration, "Secretary." Arbus's pleasurable and even sexual fascination with things that frighten her makes "Fur" interesting. That interest is partially undercut, however, by a storyline that we all know too well: female free spirit suffers under domestic oppression until a surprising, risky experience reveals the magical world she has been missing. It doesn't help that Kidman played Virginia Woolf in "The Hours," which also capitalized on the same old story of which Woolf--with her repressive upbringing, marriage, artistic awakening, and final conflicted suicide--is such a convenient example. It's the stuff of a simplified Woolf biography, or of Kate Chopin's "The Awakening," or of Gilbert and Gubar's The Madwoman in the Attic: the story of female originality suffering under restraint is beginning to feel unoriginal and, in fact, restraining.

There's also something unrealistic about the way "Fur" portrays freakishness--as if abnormality, like normality, were an actual "thing," a common code that could unite a community. To paraphrase Tolstoy, normal people are all alike, but every abnormal person is abnormal in their own way. What makes Arbus's body of work so interesting is that she managed to make everyone look freakish, an accomplishment that said very little about the world and very much about the workings of her own mind. By embodying freakishness in the figure of Lionel and his community of freaks who all somehow know and like each other, "Fur" turns Arbus from an expressionist--someone who shared her own nightmarish ideas of the world--into a reporter on a lovably off-kilter world already in existence.

What I can't decide is whether or not this transformation was intentional. As Susan Sontag notes in On Photography, Arbus herself talked about her work this way, acting as though her subjects were always as strange as they are in her portrayals, talking about them with "the childlike wonder of the pop mentality." So far as I can tell, then, "Fur" does seem to paint an accurate picture of the way Arbus described her "inner experience"; the problem is that Arbus's descriptions don't match up with her actual project. So to see Arbus's work from the inside is almost necessarily to misunderstand it as a kind of delightful vacation to a strange world. This movie beautifully renders that misunderstanding, but without context, it threatens to propagate and popularize an unrealistically uplifting ideal of her life and photographs. The most obvious limitation of this take on her work is that it renders her eventual suicide almost incomprehensible--and indeed, "Fur" ends without mentioning it.

The Mind Is Not the Brain?

Recommended: a couple of good online reflections on Marilynne Robinson's new book, Absence of Mind: The Dispelling of Inwardness from the Modern Myth of the Self, from D.G. Myers and Amelia Atlas.

While both these reviews are positive, I don't think I'll be picking up a copy anytime soon. Robinson is writing against what she calls "parascientific writing," a handy term that leaves respect for science intact while disparaging the recent trend of popular, scientifically informed books that argue against a spiritual realm or denigrate individual experiences of consciousness.

Agreed: a lot of modern empiricism (and so-called empiricism) tends to denigrate individual accounts of what life or experience consists of. It wants to tell people what they're "really" doing, "really" thinking, and what the world is "really" like, and refuses to listen to what they themselves have to say. Robinson identifies the problem wonderfully. But the solution to this problem doesn't have to be a retrenchment in spirituality, or jaw-dropping wonder at the imponderable nature of existence/the human mind, or a dismissal of neuroscience's materialistic findings about the brain.

In fact, a number of thinkers in a variety of fields have been troubled by the same problem that disturbs Robinson, and have, I think, solved it better. For instance, Robinson's concern about the way "parascientific" writing downplays human experience has also been expressed by the sociologist Bruno Latour. Latour encourages researchers to take people seriously when they describe their sense that superhuman forces, for instance, are compelling them to do things. For someone like Latour, this information is a valuable clue to how the human mind and social activity function--it isn't some kind of charade to be exposed. (Latour makes this argument in Reassembling the Social.)

Easily my favorite book in this vein--and one of my favorite books of all time--is Gregory Bateson's Steps to an Ecology of Mind. Bateson was a polymath who dabbled in anthropology, sociology, animal communication, and a number of other fields. Steps is an anthology of his writing from the 1940s through about 1970, showing the way he worked through questions of information and communication to form a new picture of what "mind" might mean--a picture that respects human experience and tries to place it appropriately in the framework of a larger system. His view manages to include both the mystical awe that comes with true appreciation for mental processes, and an empirical view that respects and deploys scientific understanding.

Later systems theorists and posthumanists have followed up on this work in various ways that--while equally fascinating--are more academic. (I'm thinking of people like Niklas Luhmann, or Cary Wolfe, whose What Is Posthumanism? I'm currently devouring.) If you're interested in this problem--the poverty of popular scientific explanation, the difficulty of reconciling certain scientific claims with felt human experience--I strongly, strongly recommend Bateson's work. My broader point, though, is that we're at a kind of explanatory impasse; Robinson has done a fine job identifying it, but the impasse doesn't mean we need to retreat to one camp or another. It means it's time to scout for new paths, and I think systems theory, in its various forms, offers them.

Vocablog: megrim, megrims

megrim, megrims, n.

In the singular (megrim), a bad headache or migraine; in the plural (megrims), depression or melancholy.

(Found in George Eliot's Middlemarch.)

Usage example: Recent news about the trade deficit has given many economists the megrims. I don't have a head for numbers, though, so trying to understand it all gives me a megrim.

Other useful forms: megrimical, adj.; megrimish, adj.

Phrenology returns?

Good news for people with giant heads: the journal Neurology reports that people with bigger heads are less likely to show signs of Alzheimer's disease--even when they have the same percentage of brain cell death as people with smaller heads.

Scientists involved say that head size can reflect brain size, which in turn may be correlated with the "reserve" brainpower that subjects can draw on when the brain is otherwise damaged.

I'd love to see the proposal for this study--it seems so comically old-fashioned. ("Well, see, we're goin' ta look at the egg-heads, see, and the pin-heads--ya follow?--an' we're gonna compare 'em, see...") It's actually surprising that scientists are willing to correlate brain size with head size, given that relative brain sizes between demographic groups are at the forefront of controversial debates about "intelligence." I guess the idea that character traits can be read from the appearance of a person's head will always be seductive--the question is whether it's too seductive to be true.

Agnostics Battle Atheists: Sabbath Spectacular

Andrew Sullivan points out an online scuffle going on between agnostics, represented by Ron Rosenbaum at Slate, and atheists, represented by Julian Sanchez, currently a fellow at the Cato Institute.

As a former agnostic, I find it hard not to agree with much of what Ron Rosenbaum says. The "New Atheists," as they're called--Richard Dawkins, Christopher Hitchens, and that whole loose camp--are a little too vocal, a little too certain, and a little too abrasive to be attractive. That said, Julian Sanchez is right on in his attack on Rosenbaum's logic: the fact that empirically minded atheists can't explain creation doesn't mean they have to throw up their hands and vocally admit defeat. Just because you can't prove that something DOESN'T exist doesn't mean you can never commit to a position about its existence with reasonable certainty.

Sometimes I wish that atheism hadn't been perverted by this Enlightenment-obsessed New Atheism. My own path to atheism ran along substantially different lines. I remember learning the word "agnostic" from a classmate in junior high, and (having been raised outside of any religious tradition) thinking something silly along the lines of, "Oh. That's a neat word for what I am." I stayed in the "I-dunno-about-God" camp--even when I immaturely delighted, as the New Atheists do now, in skewering believers--until a couple of years ago, when a sudden realization changed my mind.

The realization was that agnosticism is a kind of cheating. Here I was, not giving a damn about the Ten Commandments, not even really bothering to look into religious issues, but still claiming that I was somehow on the fence about the existence of a God. I asked myself: "If I really believed there might be a God, would I be living the way that I am?"

The answer was obvious: No.

If you are open to the possibility of a God in the only sense that seems worth arguing about--an all-knowing, all-powerful being (or group of them) that made the universe and mankind--then your very first task should be to figure out what that God might be like, in order to understand how to live in accordance with Its (His? Her? Their?) program. The stakes are unbelievably high, and by acting as if they weren't, I had in effect made my choice. In practice, I was an atheist--but when confronted, I was making excuses that, in the presence of an actual God, weren't likely to hold up.

I wasn't about to dedicate my life to religious inquiry--you can't just whip up belief like you whip up a batch of cookies--so I "converted" to atheism. (One perk: no paperwork involved.) But my atheism is really a lack of belief in any god; the New Atheists, as their critics everywhere note, are fundamentalist believers, worshippers of the greatness of No God. According to these faithful atheists, No God (sometimes known, arguably incorrectly, as "Science") will step in and solve all the world's problems, if only we would all listen to the Good News.

It's the latest rebranding of humanism, which is, pretty obviously, a religion--but one that, as far as I can tell, replaces the worship of one beloved imaginary being ("God") with another ("Mankind"). Agnosticism is essentially another brand of humanism, too--it's just not sure whether it should give God a quick nod before it returns to its primary, human-centered concerns.

Outdated and dangerous ideas about "Man" and "the human" are, I think, the real problem with most current belief systems. As a result, I don't have a lot of love for either side of this New Atheist/New Agnostic coin.

Isolationism and American Writers

Not too long ago, Horace Engdahl--permanent secretary of the Swedish Academy, which awards the Nobel Prize in Literature--attacked American writers, arguing that no American had a chance of winning a Nobel in literature until we stopped drinking our own urine and marrying our siblings. (At least he'd read his Faulkner.) Many American luminaries responded in kind. Personally, I was shocked that the Swedes, who are known primarily for their cooking skills and hairy, eyeless faces, would suddenly turn on our culture, which has looked so lovingly upon them for so long.

Anyway, it took about two years, but now we have a weightier, more thoughtful response to Engdahl, in the form of a recent review essay by Tim Parks in the New York Review of Books. Parks, an internationally based writer of British birth, reviews a few recent publications designed to redress concerns about American writing and the novel in general. He comes to a number of conclusions that ring true, among them that it's actually very difficult, if you remove place names and other cultural markers, to tell the difference between writers of any nationality. Writing about the recent publication Best European Fiction 2010, he observes a broader trend:

Each writer appeals confidently to an international liberal readership at the expense of provincial bigotry and hypocrisy . . . Across the globe, the literary frame of mind is growing more homogeneous.

I doubt Parks means this as the damning statement that I hear when I read it. In any case, his review does a fantastic job of addressing both the state of contemporary European writing and the statistics that would seem to support European internationalism over American isolationism. It's worth a read.

Thoughts on Ren & Stimpy

There's an awkward hour or so between the time I normally get off the phone with my fiancée and the time I go to bed--a stretch between about 11pm and 12:30am when I'm too tired to read but don't have time for a movie. I used to fill this time with whatever shows were on TV, which is dangerous--both because I tend to watch them for longer than I mean to, and because they tend to feature, in the words of Jerry Seinfeld, celebrities telling bad stories about their plumbing.

Then I discovered old television series on Netflix's streaming disc for the Wii, and my quality of life soared.

If that's not exactly true (I recently made the mistake of watching a horrible Zach Galifianakis special), Netflix's ever-increasing library of worthy TV shows has definitely helped. That's how I just ended up watching the first episode of Ren & Stimpy--a show I hadn't seen since I was 9 or 10.

Even now, The Ren & Stimpy Show looks different from other shows. For one thing, it has much longer intervals in which characters are shown expressing over-the-top emotions while music plays in the background. But even more unusual is the way the show moves back and forth between cartoon flexibility, where characters can stretch or undergo violence in impossible ways, and a disgustingly detailed fleshiness. In one shot, Ren looks like a droopy little dog; in the next, his strangely human butt is being paraded in front of the camera. Likewise, Stimpy sits on a mound of glittering color in one shot that then materializes into Gritty Kitty Litter that the characters squeeze between their toes or crunch repulsively between their (now gigantic and gummy) teeth.

I can see how Ren & Stimpy eventually got old by failing to adapt, the same way that South Park would have gotten old if it hadn't moved from being a show about children saying "f*ck" into the realm of social satire. But there's still a lot to it. Ren & Stimpy's interest in interrupting the pure, imaginary realm of cartoon interactions with reminders of gross bodily function feels very familiar, not from other shows but from life itself--it does a great job of representing the embarrassing intrusion of dirt and grime and other crap into the mental and emotional world that we all like to imagine we occupy. That gives the show its own strange brand of realism in a famously unrealistic medium--an admirable achievement.

Book: The Seymour Tapes by Tim Lott (2005)

I tore through The Seymour Tapes (2005) in about two days. It was one of my purchases at the "Exposed" gift shop at the Tate Modern, but it doesn't seem, for whatever reason, to have made it into paperback in the U.S.

The novel is written as a series of transcripts of interviews between "Tim Lott" and members of Dr. Alex Seymour's circle of family and friends in the wake of Seymour's death. Seymour, a physician living in London, has become a media sensation after a video related to his murder was leaked online. As the story unravels, we learn of Seymour's increasingly elaborate installations of surveillance equipment in his own home--an operation that he hopes will restore the kind of domestic order that he feels is his fatherly responsibility. His foray into surveillance, however, puts him beyond the pale of normal social and sexual boundaries as he becomes increasingly close to the disturbed American Sherry Thomas, the owner of the company that lends him equipment.

Lott's set-up here is intriguing and topical, and the format of the book is a stroke of genius--the transcripts bring us as close as a text can to the issues of voyeurism, truth, and immediacy that underpin the story. Lott plays nicely with questions of narrative point-of-view and journalistic ethics, adding extra turns to an already twisted tale. The turns never feel labored, but I did find some of them disappointing--without giving anything away, I'll say that I think the ending makes this a much more traditional story of ethical violation than it feels at first, a return-to-normalcy that sacrifices some of the moral complexity that makes the set-up so exciting. The novel's coda, which abandons the transcript form, reveals some of this disappointing information and should have been cut. As it stands, the coda tries to frighten in an uncharacteristically simple way, kind of like someone jumping out of a closet and yelling "OOGA BOOGA BOOGA" at the end of an otherwise mature and upsetting psychological drama.

That drama is worth the forgettable ending. I'd recommend The Seymour Tapes pretty highly to any kind of reader, especially if you're interested in suspense novels, surveillance, or the difficulty of separating truth from fiction.

Art Review: "Exposed" @ Tate Modern (London)

"Exposed" is sort of overwhelming--I had to visit twice to see the whole thing thoroughly, because the museum closed 3/4 of the way through my first visit. This sprawling, 14-room show, conceived by Sandra Phillips of SFMoMA, is a look at the ways photography has taken on issues of "exposure," particularly issues of voyeurism and surveillance. It is broadly divided into sections devoted to "The Unseen Photographer," "Celebrity and the Public Gaze," "Voyeurism & Desire," "Witnessing Violence," and "Surveillance."

The sheer amount of space devoted to "Exposed" means that the show can present some material in historically accurate ways that I, for one, have not had the chance to see before. Case in point: anyone reasonably familiar with photography as a tool for exploring sexuality has seen images from Nan Goldin's classic "Ballad of Sexual Dependency," but "Exposed" goes to the trouble of showing them in something like their original form--as a slideshow in a darkened room, set to the music of the Velvet Underground and others. Likewise, Japanese photographer Kohei Yoshiyuki's infrared images of couples and groups secretly having sex in parks in the 1970s are displayed in a darkened room with spotlights--a reference to their original gallery presentation in 1979, when visitors in the darkened space were given flashlights to view them.

While the spread and treatment are nice, the show sometimes feels like too much. It needs some cutting and narrowing. The sporadic insertion of images by well-known streetshooters (Henri Cartier-Bresson, Robert Frank) feels especially intrusive, like big names unnecessarily larding a show that might have been more targeted. I found myself confused: were streetshooters featured because they exposed their subjects in everyday life, or because they hid their cameras for fear of their own exposure as thieves of others' images? That unexplored ambiguity seems, to me, typical of the show as a whole. It feels like the vast range of images gathered here is united only by the many things "Exposed" could mean. I would have liked to see a tighter organizing principle, both historically and thematically, rather than a show that felt guided by the ambiguous pun of its title.

Nevertheless, there's something to be said for capacious messiness. This is a rich show with something to delight (and, I think, surprise) almost anyone, from the photographic beginner to the more seasoned enthusiast. Alongside what I would consider "over"-exposed art historical names, it manages to include a number of less popular images and artists--it was especially impressive, I thought, on issues of celebrity fandom and sex. Cammie Toloui's pictures from the "Lusty Lady" series are a particularly fantastic choice. Shot from her point of view as she worked as a stripper in a San Francisco peepshow, Toloui's grainy black-and-white images occupy some unclear space in the network of gazes that forms between an erotic dancer and her (often exposed and masturbating) audience--a reflexive experience made visible as the reflection of Toloui's nude body in the glass divider blends uneasily with the images of her voyeurs on the other side.

Unfortunately, "Exposed" ends on a weak note, as the last section, which covers the hot-button issue of surveillance, is disappointingly cold. (The exceptions--Shizuka Yokomizo, Laurie Long--prove the rule.) That's less a curatorial blunder than a reflection of what I see as the stagnation of a lot of contemporary political photography. Historical selections can be excused for unfetching aerial images of, say, Normandy on D-Day. But contemporary photographers also seem unable to make surveillance images interesting. They tend towards either dull, grainy pictures of current surveillance tools, or conceptual work that justifies its bland visuals with chunks of text--text which, by the time you get to this room, you will probably be too tired to read. Both approaches seem like poor ways of capturing public attention on issues of pressing social importance.

The whimper that ends this show is driven home by an installation located outside the (surprisingly excellent) gift shop. While it's unrelated to "Exposed," it captures what's broadly wrong with the surveillance part of the exhibit. The installation, Francis Alÿs's "Night Watch," consists of a wall-sized array of closed-circuit televisions showing different rooms in the National Gallery. It appears as ponderous (and as boring) as many of the works in the surveillance section, but you stand before it a while anyway, just to look thoughtful--then something moves in one of the monitors.

It's a fox! This incongruous little animal, dashing from room to room of the museum and from television to television in the grid, embodies the kind of surprise and mystery you expect when you take on the role of voyeur. The playful artificiality of this introduction of a wild animal to the museum may offend the high moral seriousness of much of the surveillance work, but it makes you think a lot more about your expectations and desires, as a watcher and as a human being, than the more overtly political work that ends the exhibit…and isn't that, after all, what good art is supposed to do?

"Exposed" runs at the Tate Modern in London through October 3, 2010.

Other takes on the show:
3quarksdaily, the Guardian.

Images, from top, copyright Kohei Yoshiyuki, Cammie Toloui, and Francis Alÿs.