Workers Never Act, But Are Merely Acted Upon
February 19, 2012

Today’s New York Times reports that conditions are improving at China’s infamous Foxconn plant. For this, they credit: Foxconn management for raising salaries and cutting overtime; anonymous “critics” of Foxconn management; “labor rights groups”; an audit by the Fair Labor Association; and, by the transitive property, Apple, for requesting the audit.

Oddly enough, the only people to not get any credit at all are the workers at the plant. This despite the fact that we’re only talking about Foxconn right now because hundreds of the plant’s employees threatened mass suicide in protest of appalling labor conditions.

In other words, the higher pay and reduced overtime are concessions that the workers won through a remarkable act of defiance and solidarity. That sounds like a pretty good story! How odd that the Times decided to tell a different story, in which the workers are merely passive objects. (Even the article’s single oblique acknowledgement of worker agency is framed in the passive voice: “Foxconn facilities in China have experienced a series of worker suicides.” Poor Foxconn facilities!)


Our Serious Intellectuals
March 8, 2011

There are intellectuals, and there are Intellectuals. There’s no real consensus over who qualifies for the former category, but a capital-I Intellectual is way easier to spot. These are people situated roughly within the mid-to-upper brow of mainstream American culture who expound on the important matters of the day and are often referred to in public as intellectuals.

David Brooks is such an Intellectual. In fact, he’s an extraordinarily accomplished Intellectual. And if PZ Myers’ scorching review is any indication, then Brooks’ new book — called The Social Animal — might be his greatest accomplishment yet. Consider this passage from the review:

The plot is deadly dull: Erica, for instance, ascends smoothly from private school to business management to business leader to significant government functionary to the inner circles of Davos to a blissful retirement spent wallowing in high culture, with only brief stutters — losing a tennis match, a failed business, a brief marital infidelity — which she powers through with the discipline of her will, pausing only long enough for David Brooks to lecture the reader on how the mind overcomes adversity. What story there is here is pure mainlined bourgeois wish fulfillment, a kind of yuppie Mary Sue for the whole of the trust-fund set. There aren’t even any losers to contrast with Erica’s unending winningness, because everyone around them seems to be rising on the same cheerful bubble of privilege.

Nothing changes. In the introduction, Brooks even mentions this, that the story “takes place perpetually in the current moment, the early twenty-first century,” so the characters are born in this decade, grow up in this decade, work in this decade, die in this decade. Brooks has created a world where history doesn’t matter and there are no troubling external intrusions on the blithe reality of Harold and Erica. If ever you are in the market for the antithesis of the Great Russian Novel, here it is, the petty provincial string of anecdotes about only two characters who never experience a moment of self-doubt or inner turmoil.

That’s as good an encapsulation as you’ll find of the Intellectual master narrative: affluent yuppies succeed in their endeavors and feel generally good about themselves, forever. If Myers’ review is to be believed, then Brooks tells a particularly naked, archetypal version of that narrative in The Social Animal. But just as the salvation narrative reveals itself in so much Christian philosophy, the foundational myth of Harold and Erica provides a conceptual skeleton for nearly every work in the Intellectual canon.

Take another example from the work of David Brooks, a recent New York Times column called “Make Everybody Hurt.” In the limited column space given him, Brooks (1) reaffirms soothing truisms his audience will recognize; (2) presents a not-particularly-dire problem that won’t alarm the audience (alarm would make them shrill) but will concern them (concern makes them serious); and (3) offers up a solution that costs the audience nothing but will make them feel brave and clear-eyed for demanding sacrifices from others. In other words, Erica and Harold will have once again saved the world without breaking a sweat.

And then there’s a less obvious example of Intellectual dogma, courtesy of Simon Critchley and, yes, The New York Times (which seems to be a central organ of modern Intellectual thought). In “What Is a Philosopher?” Critchley dazzles the audience with some clever parables from ancient times and a couple of self-deprecating remarks about his profession, but never gets anywhere substantial or challenging. That’s because the point of his essay has absolutely nothing to do with explaining philosophy and everything to do with flattering the intelligence of his readers. Erica and Harold are intended to walk away from “What Is a Philosopher?” having learned just enough to feel educated and witty, but not enough to feel troubled or challenged.

And here we begin to see that the project of the Intellectual is in many ways antithetical to anything approaching a genuine intellectual endeavor. Though few of their kind remain anywhere near the cultural mainstream, many of the intellectuals of yore were pugnacious radicals, rightly reviled by the Ericas and Harolds of their time. Their politics and philosophies were often extreme, their writing sometimes offensive. But their work bled, which was the important part. It sank its burrs into your spine and clung to you long after you thought you had walked away.

David Brooks is the quintessential Intellectual in that he promises you, in PZ Myers’ words, a whole life without “a moment of self-doubt or inner turmoil.” But those moments of self-doubt and inner turmoil are where the real intellectual work begins. The intellectuals we need — always, but I suspect now more than ever — are the ones who stir up that turmoil and then kick their readers in the ass just hard enough to make them pull themselves out of it. To those people, the Intellectuals are the enemy.


Culture Crit War
August 24, 2010


My friend Emily has a thoughtful response to my earlier kvetching about the state of left-leaning criticism, but I think it’s less a reply to the actual charges I made than a treatment of issues Emily was already thinking about and wanted an opportunity to discuss. Either way, we’re clearly talking about different things.

For one thing, I think there’s some confusion over what constitutes left-leaning criticism. In defending its current state, Emily points to “the New York Review of Books, the London Review of Books, the Observer, Harper’s, the New Yorker, the Atlantic, the New York Times Magazine, the Village Voice, the Paris Review and countless other publications” that, she says, offer up “very good cultural criticism coming more-or-less from the left.”

I’m not exactly sure what “more-or-less from the left” means, but most of the good arts criticism I’ve read from these sources (and make no mistake, I’m a fan of most of the publications she listed) doesn’t have much of a political slant at all. In almost all cases there’s a very good chance that a given review was written by someone who holds liberal views, but the privately held opinions of the author alone don’t necessarily make all of her work “from the left.” My critique was targeted at some of those pieces that are, first and foremost, left-leaning political readings; they were from the Atlantic and the New York Times Magazine, and they were really, really bad.

I should also note that they were not, as Emily implies, written by Boomers—Katie Roiphe, Sady Doyle and Justin Miller are all members of Generation Y. None of them can critique Mad Men’s accuracy based on firsthand experience with its setting. (Aside: Funnily enough, Roiphe’s whole shtick seems to be based around a conscious rejection of her mother’s second-wave feminism.) This isn’t about some young whippersnapper spurning the wisdom of his elders. If anything, I’m arguing for a more traditionalist approach to literary and pop culture criticism—an approach that refocuses on some of the actual components of storytelling. To paraphrase Dara, co-author of my original kvetch, I would be delighted if Tony Judt were still alive and writing about Mad Men. But of all the people I’ve kvetched about, none of them come remotely close to being Tony Judt.


The Left’s Poverty of Good Cultural Criticism
August 8, 2010


There’s a reason I’ve been dedicating so much Internet space to pushing back on bad, or just plain lazy, commentary on this season of Mad Men, and it’s not because I’m a slobbering fanboy (well, not solely because I’m a slobbering fanboy). Because the show is such a locus for cultural criticism right now, it’s a good jumping-off point for discussing the condition of pop culture crit at large, especially left-oriented pop culture crit. And as you might be able to guess from my previous posts on the subject, I would evaluate that condition as “pretty poor.”

Of course, not all the criticism I’ve been hitting comes from the left. Katie Roiphe is, well, Katie Roiphe; her whole shtick revolves around being “counterintuitive” enough to hook in her New York Times-reading audience, but still bland enough that her writing won’t actually challenge them on any level. As for the National Review piece, conservative criticism is just too easy a target (Big Hollywood, anyone?).

(Aside: I am being slightly unfair here. As with conservatism at large, there is some smart and interesting stuff going on around the margins, albeit from people who have been explicitly exiled from the tent. For example, I do enjoy right-leaning libertarian Peter Suderman’s movie reviews.)

But the left could stand to learn some cautionary lessons from the right’s excesses. For example: their complete subjugation of art to ideology, so that every work of film and literature is evaluated on the basis of how conservative it is. This is little more than aesthetic Stalinism, and it’s why evidently sentient, self-aware human beings will sometimes end up championing Red Dawn as a cinematic masterpiece.

On the left, we have not done much better, and Mad Men is a perfect example of what went wrong; rather than being dug into as the rich subject for literary criticism it is, it’s been batted around like an ideological tetherball. Are the female characters sufficiently feminist? Does this show glorify drinking and adultery? Is Christina Hendricks too sexy or not sexy enough? And so on. It’s criticism by checklist.

There is obvious merit to evaluating the gender politics of a work of art, but only if we acknowledge shades of tone a little more subtle than sexist/anti-sexist. And treating any work of art as if its approach to these questions were the whole conversation is ludicrous. The anti-Semitism on display in The Great Gatsby, The Sun Also Rises, and The Merchant of Venice is, in all three cases, contemptible; but no one who claims to respect and appreciate literature should presume to classify The Merchant of Venice as “an anti-Semitic play,” as if every single line of dialogue were just another morsel of crude, anti-Jewish invective. Doing so would constrict our understanding of the play’s considerable merits and, indeed, our understanding of ourselves and of art itself.

The same goes for popular culture. It’s not enough to say, “This movie is good because it features strong female characters.” That may be a part of what makes it good, yes—but as I wrote in my previous post, real art is less about answers than it is about impossible questions. Good criticism grapples with those questions, and, in doing so, challenges the critic’s most deeply held principles. Maybe one day we’ll see more of that on the left and fewer of these dull-as-rocks reaffirmations of our policy positions.


Racism is the New “Enhanced Interrogation”
July 31, 2010


I take no joy in writing this, but if the latest column from the always-frustrating Charles Blow is an indication, then Glenn Beck, Andrew Breitbart and their comrades are winning the racism debate.

Well, that’s not exactly true. They’re winning insofar as there’s a debate in the first place, one which, predictably, plays out like so in Blow’s column:

Americans are engaged in a war over a word: racism.

Mature commentary on the subject has descended into tribal tirades, hypersensitive defenses and rapid-fire finger-pointing. The very definition of the word seems under assault, being bent and twisted back on itself and stretched and pulled beyond recognition.

Many on the left have taken an absolutist stance, that the anti-Obama sentiment reeks of racism and denial only served to confirm guilt. Many on the right feel as though they have been convicted without proof — that tossing “racism” their way is itself racist.

And so on. This is how it plays out, and will continue to play out, in every major newspaper and on every major television network: “Both sides are calling each other racist! What a crazy debate! And who am I, just a humble columnist for the most prestigious Op-Ed page on the planet, to evaluate their claims against one another? All I know is that they’re both being very, very indecorous.”

If that pungent aroma you smell is bringing back memories of the Bush era, it’s because the right has used this exact same tactic before—most famously when they successfully obfuscated the meaning of the word “torture,” and passive, compliant news agencies played along. Now they’re doing the same thing with “racism,” and even “lynching.” (For those keeping score at home, “to lynch” now means “to criticize a white person on the Internet.”)

I have slightly more respect for the nihilists who at least admit their complete lack of moral principles. This is something different: Rather than ever admit to violating a moral principle, or even engaging in a debate over whether or not they violated a moral principle, they instead argue over the meaning of the words used to articulate that principle.

It’s pretty amazing how far they’ve taken it, but I think they could go further. If Breitbart were ever caught beating an unarmed homeless man to death, he could probably extend the trial by at least a few months by calling it “enhanced robust preemptive self-defense” and accusing liberal bloggers of “high-tech murder” for condemning his actions. Then Charles Blow could write a column about how nobody can agree on the definition of the word “murder,” and how we should just agree that no American is a murderer anymore, ever.

 


Objective Journalism and the Politics of Language
July 2, 2010


I’ll stop annotating my column attacking “objective” journalism soon, but there’s one important point in there I want to expand on. I wrote, “Human language is too complex, too subjective, and too ambiguous to express non-mathematical propositions in wholly mathematical, objective terms,” but word-count constraints kept me from presenting a more detailed argument for that claim.

Conveniently, the past couple of days have presented me with the perfect example of what I was talking about: Adam Serwer (whose new solo blog you should bookmark straightaway) writes about a study showing that the New York Times would refer to waterboarding as “torture” outside of its opinion pages only when the waterboarders were not American agents.

The Times’ defense of this position is worth reading, because it makes clear the impossible choice with which they were presented. Impossible, at least, if you have an ideological commitment to “objectivity,” because, in this case, the whole notion of an objective option is ludicrous.

The definitions of words, after all, are not natural facts about the universe. They can shift and mutate. They’re formed by consensus, by context, by speaker and listener. With that in mind, the Times has a point in one respect: yes, to call waterboarding torture is to take an ideological position. Sure, you could marshal all kinds of evidence to suggest that the practice of waterboarding conforms to the definition of the word “torture”; you could point to historical precedent and cite the Oxford English Dictionary. But historical precedent doesn’t mean much given how mutable language actually is, and to argue that the Oxford English Dictionary is the set-in-stone record of the whole English language as it exists right now is, itself, a hotly contested (and, I think, faintly ridiculous) claim.

So the Times finds itself unable to call waterboarding “torture.” They’re objective! And to take a position on the precise definition of a politically electrified word is wholly inappropriate when done from a place of objectivity. There’s simply no objective authority or phenomenon you can refer to in advancing your claim.

On the other hand: waterboarding is torture. It has always been torture. People like Joe Lieberman started shifting their own definition of the word “torture” after it was discovered that the United States waterboards. It was a craven, naked manipulation of the English language, a deliberate attempt to undermine the relative stability of language and meaning in order to cover for, well, torture. And the Times, by couching references to waterboarding in euphemisms like “harsh interrogation techniques,” aided and abetted that process. By refusing to call the practice what it was universally understood to be before the United States started doing it, they were providing cover.

The Times’ defense suggests that this charge doesn’t bother them a great deal. But consider this: by providing cover, and tacitly accepting a warping of the English language also accepted by barely a quarter of English-speaking Americans, they are taking a political position—and, for that matter, a minority position.

So this was the impossible position for the Times: on the one hand, they could take a position that would superficially appear “objective” but would also be morally atrocious. Their other option was a position that would not appear objective, but would at least be morally permissible and conform to the overwhelming consensus on the meaning of English words.

But the Times, facing a choice with no “objective” option, went for the one that would at least maintain the illusion of objectivity. Maintaining that illusion was more important to them than doing the morally right thing.

But in attempting to maintain the illusion, the Times revealed something else: that the doctrine of objective journalism is not some perfectly cool, dispassionate filter through which to view the world, but a dogma of its own that can have just as great a distortive effect on one’s observations as any acknowledged bias.


The Stone Hits a Bullseye
June 28, 2010


I’ve done my fair share of ragging on The Stone, the New York Times’ paved-with-good-intentions attempt at bringing philosophy to a wider audience. But this pair of essays, which responds to the all-too-common complaint that philosophy is too abstract and esoteric to have anything to do with the interests and concerns of real, non-PhD-holding people, is quite good. At the very least, it’s a nice antidote to Simon Critchley’s embarrassing nonsense. And a couple of parts in each column nearly had me pumping my fist in the air. To wit, here’s a great excerpt from AskPhilosophers.org founder Alexander George’s entry:

It certainly doesn’t help that philosophy is rarely taught or read in schools. Despite the fact that children have an intense interest in philosophical issues, and that a training in philosophy sharpens one’s analytical abilities, with few exceptions our schools are de-philosophized zones. This has as a knock-on effect that students entering college shy away from philosophy courses. Bookstores — those that remain — boast philosophy sections cluttered with self-help guides. It is no wonder that the educated public shows no interest in, or perhaps even finds alien, the fully ripened fruits of philosophy.

Yes!

And here’s Frieda Klotz:

Plutarch thought philosophy should be taught at dinner parties. It should be taught through literature, or written in letters giving advice to friends. Good philosophy does not occur in isolation; it is about friendship, inherently social and shared. The philosopher should engage in politics, and he should be busy, for he knows, as Plutarch sternly puts it, that idleness is no remedy for distress.

Also yes!

The point that philosophers should be involved in politics is, I think, a particularly good one; if you believe that the study of ethics and political philosophy has any merit at all, then surely you believe that the people who devote their lives to studying it have something to contribute to, and gain from, the political sphere.


Peter Singer Asks If It’s Ethical to Reproduce
June 6, 2010


Here’s a happy surprise: The New York Times’ series of philosophy columns, dubbed The Stone, finally includes one work of actual philosophy, courtesy of the ethical philosopher Peter Singer, author of The Life You Can Save.

In the column, Singer asks whether, given the problems future generations would be sure to face—the fallout from climate change being chief among them—it is ethical to bring those future generations into the world. Would it just be better if we all universally agreed to stop having kids?

Unsurprisingly, he concludes that the answer is “no.” But he takes some interesting detours along the way, including a passage on Schopenhauer that serves as an intriguing contrast to some of the existentialist stuff about projecting towards ends we’ve been discussing:

The 19th-century German philosopher Arthur Schopenhauer held that even the best life possible for humans is one in which we strive for ends that, once achieved, bring only fleeting satisfaction. New desires then lead us on to further futile struggle and the cycle repeats itself.

I think you can see the seeds of some existentialist thought in there, although the existentialist would argue that it’s not about achieving those goals—it’s about defining yourself and seeking fulfillment through the act of projecting towards them.


“Bracketing” and The Second Sex
May 29, 2010


Fortuitously, the day on which I was planning to write about Simone de Beauvoir’s philosophy of love as articulated briefly in The Ethics of Ambiguity also happens to be the day the New York Times decided to run a review of the newest translation of her landmark feminist work, The Second Sex. I haven’t read The Second Sex, but it’s the work Beauvoir is best known for, so I was curious to learn a little about the content.

Here’s what I learned: If the passages excerpted in the review are any indication, then The Ethics of Ambiguity is the better of the two, by far. Here’s a taste from the review:

Females of all living species are “first violated … then alienated” by the process of fertilization. Derogatory phrases like “the servitude of maternity,” “woman’s absurd fertility,” the “exhausting servitude” of breast-feeding, abound. (How could they not, since the author sees heterosexual love in general as “a mortal danger?”) According to Beauvoir, a girl’s first menstruation, which many of us welcomed with excitement and pride, is met instead with “disgust and fear.” It “inspires horror” and “signifies illness, suffering and death.”

Yeeeeah. Wow. But the misstep I find most intriguing is Beauvoir’s insistence that “one is not born, but rather becomes, a woman.” That sounds a lot like Sartre’s argument—which I criticized yesterday—that biology plays no causal role when it comes to sexual desire. Nor, Beauvoir argues, does it play any causal role when it comes to gender identity. Even if you think gender is fluid and malleable (which I do), that seems kind of absurd.

And it’s absurd for the same reason. Beauvoir and Sartre both, I think, ignore the lessons of the existentialist’s best friend: the phenomenologist.

Phenomenology is a philosophical tradition very closely tied to existentialism; so much so that many major works of phenomenology contain strong traces of existentialist thought (such as Heidegger’s Being and Time), and vice versa (such as Sartre’s own Being and Nothingness). This cousin of existentialism concerns itself with the study of being-in-the-world, or the organizational features of our experience of the world. Most works of phenomenology do this without making any judgments about the character of the world itself.

Husserl, one of the fathers of phenomenology, called this “bracketing,” by which he meant setting aside questions about the world around us. For example, a phenomenologist might describe the experience and sensation I have of typing out this blog post on a keyboard and staring at a screen, but he is describing it only as it exists in my head, without making any judgments about the nature of the actual keyboard or screen … or, for that matter, about whether they even exist at all.

Existentialism works best, I think, when it, too, brackets these questions. That’s because, while existentialist notions of the nothingness and infinite freedom of consciousness don’t seem to be borne out by empirical study—human beings are fairly predictable and causally determined, by and large—they are an incredibly potent description of the phenomenological experience of being a person.

So Beauvoir and Sartre are correct to a certain extent. Sartre is correct that one doesn’t experience sexual desire as a purely biological sensation (what would it even mean to have a sensation like that?), and Beauvoir is correct that one experiences the behavior and social conventions we attach to gender identity as completely conscious, voluntary things that we could shed at a moment’s notice. But just because we experience the sensation of freedom doesn’t mean we are free in anything beyond the phenomenological sense.

Update: Okay. Maybe I’m just digging myself deeper, but I feel like I need to clarify some things. So for the record, here’s what I was not doing in the above post:

1.) Offering a critique, in any way, shape, or form, of feminism.

2.) Offering any meaningful critique of the whole of The Second Sex. (It’s no slight to say that it’s not as good as The Ethics of Ambiguity on a conceptual level, because The Ethics of Ambiguity is damn near perfect.)

3.) Dismissing Beauvoir (who, I should stress again, is one of my favorite philosophers).

4.) Suggesting that gendered behavior is solely the result of rigid biological determinism.

This post was meant mostly as a critique of existentialism. For that critique, I took as a premise that while much of gender is fluid and culturally constructed, biological/physiological factors are not completely irrelevant. That doesn’t just apply to gendered behavior but to a whole host of different character traits; I just chose to focus on gender specifically because it seemed timely with the review coming out today.

