Monday, December 20, 2004

Technologies of Knowledge, Pt. 2

The Google Library Shockwave keeps rolling along. See my first "Technologies of Knowledge" post for my preliminary comments, but also read this new essay "Paradise is paper, vellum, and dust" in the London Times Online.

Wednesday, December 15, 2004

The Desert Island Philosopher

Recently, and after a long interval, I renewed my acquaintance with Friedrich Nietzsche. I was instantly reminded why he was my favorite philosopher when I was eighteen -- he probably still is today.

During my freshman year in college, like a lot of other freshmen, I wore out my paperback copy of the Viking Portable Nietzsche. I had already been pretty well snagged by philosophy, through reading Plato, Pascal, Hobbes, and David Hume, among others, but Nietzsche was (and remains) different -- not just for the famous conclusions, although they're formidable: the death of God, the emptiness of liberal morality, a growing distrust of reason, science, and progress, and a consequent wariness of an encroaching nihilism against which some new, positive affirmation must be made. It's the ethos with which those conclusions are reached: a probably self-contradictory scientific drive to eliminate anything resembling predetermined conclusions or received wisdom -- mind-shattering blasphemy as experiment. It's what makes Nietzsche still probably the only avant-garde philosopher.

It's nevertheless surprising just how often Nietzsche refers to experiment and its counterconcept, habit. This is especially noteworthy in Nietzsche's critique of truth, less a postmodern break than the most brilliant and sustained reprisal of the argument Hume had advanced more than a century before -- that what most people (including virtually all philosophers) take to be metaphysical truths are really just our lazy, habitual prejudices and errors of perception, rephrased by way of a linguistic mistake into scholarly-philosophical terms. Hence Nietzsche's practice as the counterconcept to philosophy as practiced hitherto: philosophy as experiment, coupling an abhorrence of systems with a polemical wit designed both to discomfort others (and oneself) and to explore intriguing and explosive alternatives, wherever they might lead.

If there's anything that distinguishes the use or misuse of philosophy in the study of literature from its use in mainstream philosophy departments, it's this desire/willingness to experiment, to (with all due d/reference to Kant) reason outside the limits of reason alone. Often in the study of literary texts, scholarship is nonempirical, if not antiempirical altogether: it's pure argument, with innovative interpretation as the stakes. Just as Kierkegaard argued with respect to faith in God, you have to leap. Which is why Nietzsche, with his light feet and dancing wit, is the philosopher who just may still have the most to offer -- even seven years after I left home, renounced religion, and dreamed of becoming the Übermensch.

Tuesday, December 14, 2004

Technologies of Knowledge

Every so often in a blogger's life, the stars are aligned, and the idea you've been sitting on for a while anyway suddenly becomes newsworthy. While the dynamic duo at Snarkmarket have sent their "EPIC 2014" doomsday predictions scattering across the blogosphere, there was another print-vs.-digital clash well worth a little consequence-exploration.

This week, the Chronicle of Higher Education features an article titled "College Libraries: The Long Goodbye." Apparently computers, having finally made ghastly research libraries manageable, now may be on the way to doing away with them altogether.

The bullet-point summary of the state of affairs might read something like this:

  1. Compared to software, books are expensive;
  2. Likewise, books take up too much space;
  3. And compared to outsourced tech specialists, librarians cost too much.

The author doesn't endorse these positions, but rather gives some arguments for the continued relevance of the research library. Some of these are quite ingenious: for example, the main problem for libraries is much less the obsolescence of print media than the sudden glut of new materials: more than three times as many books are published now as thirty years ago, not to mention new journals, periodicals, etc.

This might be a good moment to mention what some of us are calling the dematerialization thesis -- the argument, whether in valediction or lamentation, that digital media have overcome the meaningful physical limitations that characterized earlier forms. As far as I can tell, the locus classicus of this thesis (with respect to modernity writ large) is German sociologist Georg Simmel's 1900 book The Philosophy of Money, but another good source is Friedrich Kittler's more recent Discourse Networks 1800/1900, and its follow-up Gramophone, Film, Typewriter. There are all sorts of problems with this argument (which I'll detail at another time), but it's undeniable that faced with a space crunch, institutions are opting for what I'll call the greater material flexibility of digital media. Many libraries, perhaps most notably Stanford University's, have made huge investments in converting or copying their collections into digital formats, and publishers have likewise targeted libraries as prime consumers of electronic texts, whether as a backup to or substitute for "the real thing."

Then today, the New York Times made the remarkable, EPIC-esque announcement: "Google Is Adding Major Libraries to Its Database." It's true -- and these are no slouches, either. Stanford, Michigan, Harvard, and the always-difficult New York Public Library are all on board.

Since in-print books are going to be off the table, the real scoop is going to be in integrating these libraries' remarkable collections of rare, out-of-print, and manuscript material. 99% of Americans won't care, but these are pure gold to most scholars, and until recently, most university libraries were known for hanging onto these texts like they were their balls: you used to have to either be a big name or get a big-time research fellowship to even see these babies. (And I'm sure the real cream of the crop will probably continue to be withheld.)

They also happen to be the texts whose conspicuous materiality (there he goes again) actually makes them best suited for popular digitization. Imagine -- now not just scholars, but undergrads and even middle and high schoolers can see and examine rare, delicate, or simply unavailable primary documents from anywhere in the world, without having to travel long distances or actually get their grubby little hands all over them. For my money, the real steal won't be electronic texts as such, but digital facsimiles of the real thing. Not only will books no longer go out of print -- they'll no longer even need to be printed. Yet we'll be able to maintain a significant degree of contact (ha ha) with the now-outmoded print culture of the past.

This is where Google has really surprised me. It might have been expected that Google would enter the spheres of e-mail, blogging, social networks, and the like: these are the sorts of fields that a start-up can start up in, with the now-industry-standard limited exposure among a few dedicated partisans, eventually breaking into a wider, more lucrative market. But Harvard and Stanford are about as establishment and big-time as it gets, and between this venture and the new, hopefully improving Google Scholar, the big G has found a way both to go unexpectedly highbrow and perhaps to entrench itself decisively as the search engine of choice: the monstrous, ultimate technology of knowledge, putting the autonomous nodes of the research library and the limited search engine to rest.

Tuesday, December 07, 2004

Dispatches From the Hell of Electronic Memos

Leonard Ford just sent along a Times article ("What Corporate America Can't Build: A Sentence") documenting the woeful state of writing in the business world. Of course, if the problem were just aesthetic, it wouldn't matter much, but bad writing goes straight to the bottom line. The National Commission on Writing estimates that corporations spend as much as $3.1 billion each year teaching employees how to do better than "i am writing a essay on writing i work for this company and my boss want me to help improve the workers writing skills can yall help me with some information thank you" -- as a manager quoted in the article wrote.

Whatever else high schools and universities are supposed to do, they should be doing better than this. Then again, if corporations were willing to bankroll a $3.1 billion program devoted to English education, maybe high schools might be able to get it right in the first place. (Then again, maybe not.)

It's not all gross incompetence. The article tends to confound these separate threads, but part of the problem is the anything-goes atmosphere surrounding e-mail -- which in turn contaminates memoranda and official reports as well. Once-careful writers let the deceptive immediacy and informality of e-mail turn their writing into a kind of mental diarrhea. "Instead of considering what to say when they write, people now just let thoughts drool out onto the screen," says a writing consultant quoted in the article.

Another problem is technically sound but irredeemably bad writing, here attributed to CEOs and upper-management types. Another consultant reports that "many of these guys write in inflated language that desperately needs a laxative." I don't know why the business world suggests so many digestive metaphors, but it seems appropriately inappropriate. Writers rightly love those metaphors too.

It's always interesting what people are willing to allow themselves to do poorly. Most of us would never admit to being bad drivers or to lacking a sense of humor, while mathematical inability and poor spelling top the list of acceptable incompetencies. Grammar and punctuation are never far behind. I hope that businesses' attempts to get their own house in order speak to a redoubled effort to defend that last line in the sand, rather than another barrier washed away by the tide.

Sunday, December 05, 2004

Visionary Centrism -- An Alternative

Tom Friedman's got an op-ed in today's Times exhorting President Bush to increase science funding. Why? Well, it's not just any old science funding that he has in mind: he mounts a spirited and fairly convincing case that developing a viable alternative energy source can save the frickin' world, and give Americans a genuinely universal issue to rally around.

In my note on visionary centrism, I argued that universal health care was the silver-bullet issue, combining good politics with good policy: genuinely throwing the government's weight behind alternative energy may just be the other way to go.

Saturday, December 04, 2004

Mmm -- that's good grief

Hmm -- should have gotten there before A&L Daily, but...

Read Jonathan Franzen on Charles Schulz's Peanuts in last week's New Yorker. Mixing personal reflections on his troubled family c. 1970 with a thoughtful appreciation of Schulz's literary and aesthetic achievement, Franzen gets what's special about Peanuts: it offered and continues to offer children (especially smart, sensitive children) a simultaneous escape from and recognition of their anxiety-filled lives.

Franzen's nonfiction continually wins me over. I should get on board and read his essays (and The Corrections) before the train leaves the station altogether.

Thursday, December 02, 2004

Fragment 2: On Visionary Centrism

(This post-election schrift just never came together, even after more than two weeks' worth of work. It was originally dated November 12, 2004, then re-dated November 19.)

In brooding about the fate of our nation (see Monday's post), I did what I often do: turned a sketchy idea into a catchy phrase, then tried to make that phrase 1) mean something and 2) make that something make sense.

In this case, the phrase is "visionary centrism," and it may be either a contradiction in terms or exactly what our country needs.

Like most Americans, I generally like moderates from either party. A healthy group of moderates not only makes parties less likely to move to one extreme or the other, but also keeps them more honest: members are freer to criticize party leadership without fear of reprisals, and less likely to stand by one another no matter what.

This year, I voted for a moderate Republican in the Senate: Philadelphia's own Arlen Specter. There was a lot of debate among the Democrats I know as to for whom to vote, especially since it was possible that a vote for Specter could give the GOP a majority in the Senate. But Specter faced a tough primary challenge this year from Pat Toomey, an extreme conservative slightly to the right of Rick Santorum. Specter is a wily one, a master fundraiser who knows how to cozy up to the leading Philly Democrats to get the votes he needs. And he's a pro-labor, moderately pro-choice Republican. I'd like to keep this group from going extinct.

I smiled when I first read that Specter was in line for the chair of the Judiciary Committee, and smiled wider when I read that he admitted publicly what everyone knows: that Democrats and moderates weren't likely to allow extreme pro-life judiciary nominees a free pass. Then I cringed at what I knew would come: public admonishment, calls for another Republican to fill the chair, being forced to promise a speedy confirmation process.

(Here's the problem -- I want to talk about visionary centrism -- that thing that Kennedy, Truman, LBJ, and even Nixon all had, that John McCain has today, however limited in magnitude or scope. Then I want to use this idea to suggest that the next Democratic presidential candidate needs to campaign on a big issue that most Americans can get behind, then convince them that they're the candidate who can get it done. This is what the Republicans currently have with the war on terrorism. For the Democrats, the silver bullet issue could be several things, but I'm betting on universal health care.)

(Instead I'm talking about Arlen Specter -- already old news -- and the advantages and disadvantages of cynically voting across party lines. This is actually close to the opposite of what I mean -- but there's no way back once you're off the rails like this.)

Wednesday, December 01, 2004

Fragment 1: On Speaking and Writing

For nearly all of November, I've been starting posts that I can't quite finish, whether for lack of time or space, or because I just don't know how to write them yet. This week I'll be pumping out those fragments in the hope that their presence on the web will free me up either to bring them to a conclusion or to move on to other things.

...

I like to say that success as a scholar means doing three things well: reading, writing, and speaking. Of course, I'm scholar enough to remember that this is a modification of Nietzsche's fourfold educational prescription in Twilight of the Idols: one must learn how to see, how to think, how to speak, and how to write.

When I was eighteen, I internalized this completely; only later did I realize that in reading, writing, and speaking, seeing and thinking happen, and happen differently there than elsewhere. Reading, writing, and especially speaking are always potentially opportunities for rediscovery -- that is, for seeing and conceptualizing the world anew. This has particular meaning for me, as, at least for the moment, I've devoted my intellectual life to texts' ability to connect with a lost material world. From an early age, nearsightedness and an overactive imagination led me to experience reading spatially as well as linguistically -- for as long as I can remember, I've not only read, but read silently, internally and abstractly. I also closely identify with the tradition of biblical scholarship (Jewish and Christian) in which preaching and written commentary (bound to a bodily ritual) shed light on a text that is somehow more real than reality itself.


Monday, November 22, 2004

Detroit: The New and Improved #2 (and the New #1)

Morgan Quitno has released its list of the 25 most dangerous (and safest) American cities. Normally, I would be inclined to ask critical questions of such a list, probing its methodology and asking (for starters) "who the hell is Morgan Quitno?" But instead I'm struck by something that seems nearly karmic in its implications: Detroit, my hometown and the reigning most-dangerous champ, has lost its throne to Camden, NJ, a city a good swim across the Delaware from my Philly home (and where I momentarily but seriously considered moving this year).

Is it possible that my very presence somehow invites criminality? Less than three years of Tim in Philly has somehow deepened Camden's cesspool status, whereas Detroit, while not exactly recovered from 20-plus years of Tim (with the requisite racial tension, industrial decay, and Eddie Murphy movies), seems at least to be enjoying a respite from the Robocop-style chaos that once reigned in its streets. I have a deleterious effect on even relatively safe cities. When I lived in Oakland County, the lower burbs were spawning Aileen Wuornos, Jack Kevorkian, and Eminem. Now "Home of the Sloans" Troy, MI is the eleventh-safest stateside city.

The alternative -- that I'm somehow drawn, either organically or electromagnetically, to civilization on its last legs -- bears consideration. But I hope Philadelphia, currently in the midst of urban renewal, doesn't take notice and throw this Jonah overboard.

Monday, November 15, 2004

Rethink, Remake, Remodel

It's been more than two weeks since my last schrift, but it isn't like I haven't been busy -- I handed out candy to South Philly's kids and teens last Sunday, voted at the local AME church on Tuesday, turned 25 (with relatively little hoopla) on Wednesday, saw houses on Thursday, made an offer on one on Friday (rejected like the longshot it was), moved to make an offer on a second (smaller but cheaper and in a better neighborhood) on Saturday, and bought furniture today. In between, I saw my doctor (no problems, I just need to lose some weight), presented a conference paper in Texas ("De Sica's Fragile Object World: Reading Bicycle Thief, Umberto D, and Two Women as Films About Things") and prepped an abstract for another ("Refashioned Modernities: New Materialism and Primal History in Renaissance and Modernist Studies"), and co-planned a lecture by a hot Princeton prof who specializes in media studies (Mark Hansen). Oh, and brooded about the fate of our nation.

Immediately after the election, I thought I had the answers as to why things went the way they went, the course American politics was taking, and how alternatives could be charted. I should have written them down, here -- for a goof, at least. (Hey, in August I predicted Bush would win, and except for the bit about turnout, I was mostly right.)

Two weeks later, things don't seem as clear. I don't think it's true that Bush was re-elected strictly because of domestic issues: fear of terrorism and Bush's lingering post-9/11 aura (as Paul Krugman put it) were probably nearly as important. To put it another way -- none of the explanations works as a total, pat judgment, but they all make sense as semi-autonomous tipping points: added up, they were just enough to get Bush over 50% of the voting electorate, which is all he needed.

I'll have more Schrift when I have the answers. Which may be sooner than you think.

Thursday, October 28, 2004

Taking Your Eye Off the Ball

I read The New York Review of Books for the now-scarce reviews and scholarly articles, not the knee-jerk NYC-fashionable lefty politics, but this week's multi-authored "The Election and America's Future" is riveting stuff.

I do have one suggestion: skip Norman Mailer's vitriol and zero in on the two Englishmen, Alan Ryan and Ian Buruma, and one American, Brian Urquhart. Each manages to touch on much more acute problems than those suggested by typical antiwar rhetoric. Buruma writes both lucidly and with feeling on the wane of his (and the world's) "Americophilia," the love of Americans and all things American. More and more, Buruma writes,

I hear the clichés of my own Americophilia being spouted in ways that sound false, as though I'm listening to a favorite tune being distorted by a faulty player. The rhetoric of freedom, fighting tyranny, and liberating the enslaved peoples of the world speaks louder than ever. But too often it is laced with a fear of foreigners, with a nasty edge of chauvinism and a surly belligerence. The US has always had mood swings from active intervention abroad to sour isolation. What appears to be the current mood in Washington is a peculiar mixture of both: a desire to fix the world alone, whether the world likes it or not.
The loss of American prestige in the world may be the inevitable consequence of being the world's only military and economic superpower, but the policies of the last four years have certainly given the countermyth of America -- the religious, uncouth, greedy, warmongering provincials -- new life, not just in the Middle East, but everywhere. I don't know what we can do to supplement this loss -- it might just be, after many temptations and close calls, our country's final fall from grace.

Ryan's and Urquhart's essays might be even more timely. Ryan conveniently summarizes the strongest cases against the Bush administration in a single pithy paragraph:
The claim that reelecting President Bush will make the world safer—any part of the world, including the United States—would be laughable if the Iraqi civilian death toll was not 15,000 and rising, if peace for Israelis and Palestinians was not further away than ever, and if international cooperation on everything from global warming to fighting AIDS had not been deeply damaged by the last four years of a know-nothing presidency. If it is a joke, it is in the worst possible taste.
Urquhart also gets right to the point, adeptly shifting the problem of international terrorism in the name of Islam away from the overfocalized (if not overhyped) Osama bin Laden and Saddam Hussein. The real crisis in the Middle East was not, and may still not be, Iraq, or even Iran or Saudi Arabia, but Israel and Palestine. Urquhart writes:
Allowing that situation to sink further into violence and despair while publicly favoring one side over the other has made the prospect of peace far more remote for both Israelis and Palestinians. It has also provided a powerful anti-American boost for the forces of Islamic fundamentalism and terrorism that are now our most immediate threat.
Later, Urquhart quotes Richard Clarke:
Rather than seeking to work with the majority in the Islamic world to mold Muslim opinion against the radicals' values, we did exactly what al-Qaeda said we would do. We invaded and occupied an oil-rich Arab country that posed no threat to us, while paying scant time and attention to the Israeli-Palestinian problem. We delivered to al-Qaeda the greatest recruitment propaganda imaginable and made it difficult for friendly Islamic governments to be seen working closely with us.
In his op-ed in the Oct. 25 Times, former Carter official Zbigniew Brzezinski makes a similar Iraq-Israel-Arab-Europe connection, and even offers a potential solution:
A grand American-European strategy would have three major prongs. The first would be a joint statement by the United States and the European Union outlining the basic principles of a formula for an Israeli-Palestinian peace, with the details left to negotiations between the parties. Its key elements should include no right of return; no automatic acceptance of the 1967 lines but equivalent territorial compensation for any changes; suburban settlements on the edges of the 1967 lines incorporated into Israel, but those more than a few miles inside the West Bank vacated to make room for the resettlement of some of the Palestinian refugees; a united Jerusalem serving as the capitals of the two states; and a demilitarized Palestinian state with some international peacekeeping presence.
The second and third "prongs" are European involvement in the reconstruction of Iraq and opening talks with Iran to get them to give up their nuclear ambitions, but the securing of a lasting peace in Israel is the fuel that makes the machine go.

However -- and this is a big however -- with Yasser Arafat near death and headed out of his compound for the first time in more than two years, Palestine may be ready to descend into a level of chaos we still lack the imagination to comprehend. The political and security situation there has steadily deteriorated without a word on the subject from the Bush administration; for the Kerry camp, anything even involving the word Israel is a political hot potato too hot to touch. The American public, too, has by and large stopped paying attention to Palestine, resigned to continued -- what?

It's enough to make one wonder, when Kerry talks about the Bush administration having taken its eye off the ball to go after Iraq, whether we've all managed to lose the big picture. And also: if Kerry wins and Arafat dies on November 2 -- what will happen?

Tuesday, October 26, 2004

John Peel, 1939-2004

Very sad news from the UK, with word that BBC disc jockey and alt-music legend John Peel has died of a heart attack.

I came somewhat late to alternative and independent music and even later to Peel fandom, but I have always been impressed by his nearly infallible taste, panache, and willingness to give unknown and unusual bands unheard-of exposure -- in the process, making many of them superstars, and connecting untold others to like-minded fanbases all over the world.

The BBC ran with Peel's death as a front page story. His institutional legacy is less apparent in the States, but I was able to pick up his obit on the AP wire. As radio increasingly turns to political talk and equally mindless music, he may well be remembered by future historians as the last DJ who mattered.

Monday, October 25, 2004

From Reform to Revolution (and the Backlash)

This is less a Schrift than a reading assignment (not that it's always easy to tell the difference). Do read Bruce Bawer's "The Other Sixties," from Wilson Quarterly courtesy of Arts & Letters Daily.

It's a long essay, but well worth it. Bawer does an admirable job of comprehensively surveying the political, religious, and cultural landscape of the early 1960s: the moment of Kennedy-inspired liberal optimism, before the left went counterculture (and Communist) and the right went plain nuts.

Particular highlights include the discussion of Vatican II and the move toward ecumenism in mainstream Protestant churches, the citation from the Nov. 1963 issue of The New Yorker that "when we think of [JFK], he is without a hat," and Bawer's readings of "The Twilight Zone" and Jack Paar's "The Tonight Show." Bawer paints a fine portrait of the time he calls "classical liberalism's last hurrah."

Wednesday, October 20, 2004

Stewartgate

Compared to Bill O'Reilly's sex scandal or Dan Rather's foolish decision to put his face on phony National Guard documents -- really, on 60 Minutes II of all places; couldn't they have put someone up there who didn't have credibility to lose? -- Jon Stewart's pants-ing of Tucker Carlson and Paul Begala on "Crossfire" has gotten some mainstream and cable media attention, but not much. Stewart appeared on "Crossfire" on Friday; the Times just today weighed in and, like most pundits, spent more time circumlocuting Stewart's calling Carlson "a dick" than his exposure of "Crossfire" as, at best, a farcical, entertaining imitation of political debate. (Stewart's full retort to Carlson: "You're just as big of a dick on your show as you are on any show.")

Carlson was trying to take Stewart to task for lobbing softball questions at John Kerry when he was on "The Daily Show." But Stewart isn't that tough with anybody, nor can he be: how is he going to push Ed Koch on his endorsement of Bush on Monday and then have to talk to Marisa Tomei on Tuesday? Stewart's not Bill Maher: he isn't going to drag people who disagree with him onto his show and try to back them into a corner so he can look good in front of his audience. It's a talk show, like "The Tonight Show" or "Oprah." And Kerry certainly did talk candidly about matters of much more political substance on "The Daily Show" than he did with Regis Philbin or Dr. Phil.

Actually, it may be misleading to call "The Daily Show" a talk show. It's not really even political satire, in the way of its best predecessor, Michael Moore's "TV Nation" (from back when I liked Michael Moore). Stewart's "The Daily Show" is a satire on the media itself -- and always has been, since the days of Craig Kilborn's straight-faced imitations of handsome, hairdo talking heads. As cable news, network newsmagazines, and political operatives themselves have upped the level of media absurdity, to use Stewart's term for it, "The Daily Show" has followed right along. Carlson and Begala asked Stewart if a Kerry or Bush win would give him better material, as though he were a jokewriter for Jay Leno. Stewart turned the tables on them -- it's shows like theirs, which purport to provide hard-hitting political analysis in a game show-derived format, that provide the best material. (Was it just me, or was the funniest/saddest moment on "Crossfire" when Carlson yelled, carnival-barker style, "We'll be back with Jon Stewart -- in the RAPID-FIRE!!!")

Political news faces a problem. The vast majority of under-60 voters don't believe in or trust the controlled, somber honesty of the network news anchor: Brokaw, Koppel, Jennings, and Rather are probably the last generation of this venerable type. Even many news-interested and news-savvy people don't like their news distilled into an hour or half-hour. If Yahoo! can give me the story before Tom Brokaw, I couldn't care less. Hence 24-hour cable news, which has two imperatives: to fill the news day and to make its programs as entertaining as daytime and primetime TV.

An easy way to fill both needs is to put political operatives or current/former politicians on the air, either as guests or as hosts of their own shows. Producers rightly believe that people like Carlson and Begala are in the know -- and since they specialize in prepping candidates for the media, they're made for TV. The problem, however, as Stewart revealed, is that people who are part of the political process have every interest in obfuscating the truth, because they want their candidates to win. Begala tried in vain to argue that this wasn't the case, but nobody believes this either. Most people, even media-savvy people, would rather follow the process as it happens -- how each party's guys try to use these shows, directly or indirectly, to steer the populace one way or the other.

However, as Stewart rightly notes, by making the media just another extension of partisan strategies, journalists have effectively abdicated their role as truth-seekers, or at least as spin-filters. I don't believe that "The Daily Show" really seeks truth, but it may be the one show on television that manages to be simultaneously entertaining and, through its satire of the media and those who would make the media their mistress, genuinely anti-spin.

Friday, October 15, 2004

How Green is my Metropolis

Having recently broadened my home search outside of Central Philadelphia to include the Jersey suburbs -- a development about which I'm more than a little ambivalent -- I was fascinated to read David Owen's article "Green Manhattan" in this week's New Yorker (lamentably, it's print-only). Here Owen -- a New Yorker staff writer and (apparently) recent author of a history of the Xerox machine -- argues that while big cities may not be pretty, per capita, they're more eco-friendly than any rural community founded by sprout-eating utopians.

I've always been fascinated by skyscrapers, or high buildings of any kind. From the nineteenth century until the advent of the internet, the skyscraper was the most efficient way to organize people, and especially information, anytime physical proximity was at a premium -- and for most of our industrial history, it was. But once companies deal almost totally in information, and once that information can be transmitted electronically (by telephone, fax, and internet), the skyscraper becomes obsolete, along with the principle of centralization that led to its development. Your company doesn't need to be in New York, Chicago, or San Francisco: you can relocate to pole-barn campuses outside of Phoenix, Denver, and Wilmington and everyone can be a lot happier. (Well, except the folks in Wilmington.)

Or does it? Some of Owen's best moments come when he explains how tall, narrow buildings are inherently more energy-efficient than low, sprawling ones:

Tall buildings have much less exposed exterior surface per square foot of interior space than smaller buildings do, and that means they present relatively less of themselves to the elements, and their small roofs absorb less heat from the sun during cooling season and radiate less heat from inside during heating season. (The beneficial effects are greater still in Manhattan, where one building often directly abuts another.)

Or:

One reason New Yorkers are the most dedicated [mass] transit users in America is that congestion on the city's streets makes driving extraordinarily disagreeable. The average speed of crosstown traffic in Manhattan is little more than that of a brisk walker, and in midtown at certain times of the day the cars on the side streets move so slowly that they appear almost to be parked. Congestion like that urges drivers into the subways, and it makes life easier for pedestrians and bicycle riders by slowing cars to a point where they constitute less of a physical threat.
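
Owen's first point, about exposed surface, is easy to sanity-check with a little geometry. A back-of-the-envelope sketch (my numbers and notation, not Owen's), assuming a square footprint of side s, uniform floor height h, n floors, and a flat roof:

    interior floor area: A = n * s^2
    exterior surface (four walls plus the roof): S = 4s * nh + s^2
    exposed surface per unit of floor area: S/A = 4h/s + 1/n

The roof term, 1/n, vanishes as the building grows taller, and the wall term, 4h/s, shrinks as the footprint widens. With ten-foot floors, a two-story house on a 50-foot footprint exposes about 1.3 square feet of itself per square foot of interior; a 40-story tower on a 200-foot footprint exposes about 0.23. And party walls shared with abutting buildings knock whole faces out of the wall term, which is Owen's Manhattan point.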

So what's Owen's solution to traffic jams? Instead of extending current mass transit lines outside of the city center (which only encourages more people to move farther away) or, worse, adding expressways and widening existing traffic lanes (which only encourages more people to drive), gradually eliminate traffic lanes and parking spaces -- which would give people incentives to move downtown and give up their cars. (Or to reject cities altogether. You can tell that Owen's neither a politician nor on the tourist board.)

Still, Owen's right on when he points out that even what most environmentally conscious Americans do to help is at best a symbolic gesture. "Recycling is popular because it enables people to relieve their gathering anxieties about the future without altering the way they live." Instead of eating organic and fighting to the death to preserve out-of-the-way green spaces, environmentalists really need to get hip to the energy crisis and move to the city. It's not pretty, but it's the best chance we've got.

Thursday, October 14, 2004

The Last Debate, Hazily Recollected

As with the first, I watched the third and final Presidential Debate at a bar. Chaucer's Tabard Inn, a trusty round-the-corner-and-up-the-street literary pub that's now under new ownership, was hosting a bluegrass music night, so a few of my friends and I met up at The Happy Rooster, a similar if smaller Center City bar, not too far from City Hall.

The environs were much less comforting. Half the crowd didn't want to watch the debate at all, including one drunken, Bush-supporting woman sitting right by the television screen. They wanted to drink and eat their dinner. It was hard to hear, and they only had bottled beer, no draught, so I quickly determined that I could do better at home. So I missed a few good chunks of the debate, and was surprised at the end to hear pundits declaring it a draw: with the volume down, Bush seemed to me to be the clear winner. (Of course, I could and should have consulted the full debate transcripts at the Commission on Presidential Debates website here.)

Bush's strategy was simple, physical, and effective: to take away Kerry's advantage at a podium debate by making himself appear as dynamic as possible. Bush has a hard time sitting still. This is why he appeared increasingly uncomfortable in the first debate, and did much better in the second. This isn't about style over substance: Bush thinks and speaks better when he's in motion. When he's sitting still at a desk or podium and fumbles over a thought or phrase, he gets frustrated and begins to look a little bit like a high school student trying to remember the answers to a test. When he can coordinate his thoughts with physical actions, both his mouth and memory work better. As a noted pacer, fidgeter, and hands-talker, I can empathize.

It's astonishing, too, to realize that another reason Bush did so much better in the second and third debates is that he's actually much more effective on domestic issues than on foreign policy. Bush's rhetoric on foreign affairs doesn't pass the smell test. Nobody really believes that life is peachy in Iraq and Afghanistan, and even W's noble sentiments about the power of spreading peace and freedom throughout the world don't really connect with a middle-American electorate that's isolationist, cynical, or both. Kerry's managed to outflank Bush on this issue with right-leaning moderates, since his plan reduces spending and brings more troops home. For these people, it doesn't matter whether the war was a good idea or not.

On domestic issues, however, Bush has the luxury of clarity, while Kerry does not. Faced with questions about religion, abortion, and gay marriage, Kerry is politically compelled to say one version or another of "it's complicated," while Bush is able to say, "no, it's not." With respect to taxes, Kerry is compelled to balance his promises for new programs and tax cuts with his ability to pay for them, while Bush hasn't been compelled to balance anything. Nobody believes that Bush will balance the budget, but very few people really care, since he's credible when he says that he won't raise taxes. (He'll cut programs instead, but again -- few people really care, so long as it isn't social security.) Kerry is not credible in promising not to raise taxes on the middle class, and with our daily deepening debt, may not even be credible in promising to work to balance the budget. (Howard Dean made this point especially effectively in the primaries.) If most money-minded Americans believe, rightly or wrongly, that employment is going to continue to go up at the slow but steady rate it has been, and that the deficits will remain deficits regardless of who is in office, then they're probably more likely to vote for the candidate who they believe will not raise their taxes. Unless they really are paying attention to their health care, property tax, and college tuition bills, and connect these to the federal government's newfound austerity towards the states and generosity towards insurance and drug companies -- in which case, the trial lawyers make a handy villain. Nobody likes lawyers anyway.

I know that in this election, I've waffled between optimism and pessimism about the Kerry campaign and his chances of reaching the White House. Now my pessimism has begun to take the long view. Suppose Kerry does become the first Senator (and northern Democrat) to be elected President since John F. Kennedy. One question is whether his presidency will be a Carter presidency or a Clinton presidency. Carter, too, was a smart, tough-minded guy who emerged out of a period of economic depression and deep dissatisfaction with the party in power, but he was ultimately ineffective in turning around the country's domestic and foreign slump, and was neither liked nor trusted by the American people. The country went Republican (and eventually grew much more conservative) immediately following his watch. Clinton, on the other hand, despite his very public failures and surprisingly private successes, was a remarkably effective steward of the country during his eight years in office, and oversaw a rebirth and revitalization of his party that has, unfortunately, fizzled out in his absence.

Kerry, of course, is neither Carter nor Clinton, nor Kennedy, Johnson, Truman, or FDR, Gore, Dukakis, Mondale, or any other Democrat in Presidential politics you can name. In some ways he's closer to 41, George H.W. Bush, than his own son is. But Bush, too, promised not to raise taxes, and eventually -- despite his international successes -- it murdered him.

Wednesday, October 13, 2004

The Last Debate, Gearing Up

I never posted any comments here on the Presidential town-hall debate a week ago, but one moment in it was, in my opinion, crucial. As far as I can tell, nobody noticed it -- or at least, nobody who was paying attention. It came when a questioner asked Bush whether he could name three mistakes he had made while he had been in office and what he would do differently.

A lot of pundits took note of the first part of Bush's answer -- that when people ask if he's made any mistakes, they mean the major decisions and events of his presidency: not paying close enough attention to Al-Qaeda before September 11, invading Afghanistan, cutting taxes, passing the Medicare extension, pulling out of the Kyoto treaty and the ICC, bypassing diplomatic solutions to the problem in Iraq, the invasion, the post-invasion, Abu Ghraib, Ahmed Chalabi, passing but not funding No Child Left Behind, failing to extend unemployment benefits during the recession, more tax cuts, etc., etc. Here, Bush says, I stand by my decisions.

This was a strong, effective answer. It has the air of demystification: "this is what people mean when they talk about mistakes." And it firmly demonstrates Bush's resolve on the core issues of his presidency and his major differences with the Democratic party. Really, Bush should have said it and shut up. But when he kept talking, he said something remarkable: that if he had to name any mistakes he had made, it had been in his appointments -- but that he didn't want to name the people involved, for fear of embarrassing them on national television.

Lucky for Bush, nobody really noticed it. Of course, he's not talking about the appointments most Bush opponents would have seen as mistakes -- Donald Rumsfeld, Condoleezza Rice, John Ashcroft -- or, in all likelihood, sending secret messages to Colin Powell -- but rather referring to people appointed by Bush who are no longer with the administration. That means -- if we're picking three of Bush's "mistakes" -- Treasury Secretary Paul O'Neill, EPA head Christine Todd Whitman, and then a wild card of either CIA director George Tenet or head of counterterrorism Richard Clarke.

In other words, the only mistake Bush's administration is willing to admit publicly is that, at one time, some of its members publicly admitted that the administration had made mistakes.

Saturday, October 09, 2004

Death and Writing



Jacques Derrida: 1930-2004

The headline of Derrida's obituary in Le Monde was typically restrained and factual, nearly to the point of dismissal:

Jacques Derrida était le philosophe français le plus connu à l'étranger, notamment aux Etats-Unis, pour son concept de "déconstruction".

[Jacques Derrida was the French philosopher best known abroad, notably in the United States, for his concept of "deconstruction."]


A French philosopher, albeit one who had been particularly famous in the United States and had even been somewhat controversial, had died of pancreatic cancer at 74, after writing many books, marrying, fathering children, appearing on television, partnering for a time with Lionel Jospin, etc.

Obituaries necessarily deal in the past tense, but it remains astonishing how the beginning of the sentence -- "Jacques Derrida était" -- declares its own finality from the outset, as though the past had already been declared, the death contained in the birth, life, and action of the man now dead, now joined to adjectives by an imperfect copula. Derrida saw the assembly of two archives of his papers and materials before his death, one in Paris and another in Irvine, California -- it was as though he had already been placed in the past, closed, understood. His obituary had been written in 1989, and occasionally kept warm -- that is to say, re-presenced. The time is out of joint -- or if one prefers Marlowe to Shakespeare:

Thou hast committed -- deconstruction.
But that was in another country,
And now, Derrida is dead.

What "was" deconstruction? Wouldn't a wire service reporter have to embarass themselves in trying to explain what kind of philosophy Derrida practiced? Wouldn't I? I can tell you that Derrida's arguments hinged on the idea of impossibility -- the impossibility of entirely ridding oneself of metaphysical presuppositions, the impossibility of a purely philosophical language, the impossibility of purity as such.

After arguing that so many philosophers' praxis continually contaminates their theoretical commitments, is it surprising that he refused to found a school or definitively state a body of beliefs, instead choosing to practice his new way of reading on philosophical, literary, linguistic, and anthropological texts alike, showing how these works effectively "deconstructed" themselves, especially when faced with foundational oppositions: speech and writing, absence and presence, the same and the other? He coupled intellectual rigor with a style that blended scholarly erudition with writerly spontaneity, and his remarkable mystery made him a superstar for doing so.

While I've argued for Derrida's continued presence -- a ghostly presence through absence, as the ghosts of Ibsen or Joyce testified to the continued domination of the present by the past, seen in his photograph, which continues to materially and iconically bear his trace beyond his own destruction -- there is one sentence in Le Monde that does decisively work to close the past which we, its inheritors, still inhabit:

Il était le dernier survivant de ces penseurs des années 60, catalogués "penseurs de 68", (Althusser, Lacan, Foucault, Barthes, Deleuze, etc..), grands pourfendeurs de la notion de "sujet".

[He was the last survivor of those thinkers of the '60s, catalogued as "thinkers of '68" (Althusser, Lacan, Foucault, Barthes, Deleuze, etc.), great slayers of the notion of the "subject."]

How can one have been the last survivor (le dernier survivant)? And yet yesterday, one could have said "Derrida is the last surviving thinker of the 60s, called 'thinkers of 68.'" Louis Althusser, Jacques Lacan, Michel Foucault, Roland Barthes (who wrote "The Death of the Author" shortly after Derrida published his first books), and Gilles Deleuze have all died. And now Derrida, who was a famous French philosopher, the last survivor of the life and death of literary theory, is nothing. Il était. [He was.]

Friday, October 08, 2004

Diagram This

I don't know who Kitty Burns Florey is, or what her website is all about, but when Arts & Letters Daily links to anything mentioning Gertrude Stein, I follow.

Florey's essay "Sister Bernadette's Barking Dog" has got me thinking: to what extent could the practice of diagramming sentences explain Stein's peculiar, experimental style? Stein's alternation of simple, declarative sentences with wild, grammar-coming-apart-at-the-hinges exercises and variations is utterly crucial for the development of American modernist poetry. Or to leave Stein for a moment, consider William Carlos Williams's famous "This Is Just To Say":


I have eaten
the plums
that were in
the icebox

and which
you were probably
saving
for breakfast

Forgive me
they were delicious
so sweet
and so cold

Williams's poem is taken from a note left for his wife, but the way in which the simple sentences are rearranged -- semantically and syntactically pressurized by their new graphical arrangement, defamiliarized by their new poetic context -- is not unlike the grammatical/graphical gamesmanship of sentence diagramming. (Williams liked to describe his free-verse method as pressing words into pictures.)

Stein, less interested than Williams in white space, was especially fond of games of substitution and rearrangement, as in this short prose poem, "Any one doing something and standing":

Any one doing something and standing is one doing something and standing. Some one was doing something and was standing. Any one doing something and standing is one doing something and standing. Any one doing something and standing is one who is standing and doing something. Some one was doing something and was standing. That one was doing something standing.

My internal historical researcher stood up when I noticed that Williams and Stein, like many of the earliest modernists, were born in the 1870s and 1880s -- in other words, almost exactly when handbooks of sentence diagramming were being widely adopted in American primary schools.

It's always hard to show a direct connection with something like this, but still -- I smell an article coming on.

Saturday, October 02, 2004

The Debate, Two Days After

This schrift originated as a comment on the always-cogent Snarkmarket -- but I thought that the Presidential debate merited a post of its own here.

Like most observers of the debate on either side of the aisle, I thought that Kerry was the clear winner. But "Kerry whomped ass" isn't the end of the analysis; the dynamics of the debate were remarkable. Kerry got better and better as the debate went on, while Bush got worse and worse.

Early on, Kerry seemed (as usual) dull, cerebral, unlikeable -- the generally haughty and aloof patrician image that some Democrats dislike even more than their populist Republican counterparts. His Senatorial habit of dropping names and statistics seemed both overly rehearsed and ineffective when compared to Bush's more imagination-appealing evocations of sacrifice, safety, and the transformative power of freedom.

When Kerry did try to tap into some of that language, it was mostly by feeding off of Bush's own lines: "I also believe the President should be strong and resolute..." etc. Early on, Bush was able to re-create the most effective image of himself as a strong, visionary, likeable commander-in-chief, last best seen in his first-rate convention speech. Kerry's message seemed to be: "If there's anything you like about George Bush, I am those things, and if there's anything you dislike about him, I'm the exact opposite."

But as the questions and responses followed on one another, Bush began to falter. His repetitive, "on message" replies and counterattacks seemed to be his only resort. What's worse, his disdain and discomfort during the debate were obvious. Bush hasn't had to speak publicly in even a semi-hostile room for any length of time more than once or twice since he's been President. (Remember the similarly painful press conference he had on the White House lawn a year or so back? Ouch.)

At one point, I told my friends at the bar, "Bush is about two minutes away from turning into his Dad and looking at his watch." But really, watching him grimace and smirk, I expected him to shake his wrist at Kerry in the universal motion signaling "jerk-off." By the time the questions turned to North Korea, Bush sounded so incredulous that anyone would disagree with his plan for multilateral negotiations, and was so incapable of offering any justification for his position other than "that's just what Kim Jong Il wants!", that he seemed to be raving a bit himself. He came dangerously close to suggesting that resolve counts for more than results -- a sentiment the practically-minded middle doesn't really believe.

Meanwhile, Kerry was like a beautiful middle-distance runner -- sprinting into the stretch. His Senate experience of lengthy, factual, on-your-feet debate, a turn-off in the beginning, paid off in the end.

The turning point was when Bush, in a classic non sequitur -- probably the classic non sequitur of his three and a half years as President -- argued:

I never wanted to commit troops.
But then we were attacked.
[Therefore, I had to invade Iraq.]

Which let Kerry slap him (rhetorically) like a bitch: "Osama bin Laden attacked us -- [not Saddam Hussein]."

And then Bush replied -- "I know Osama bin Laden attacked us! [Don't get smart with me!]"

This is when Bush started to let Kerry define the debate, putting Bush on the defensive, rather than the other way around.

Kerry's scholastic style began to work in his favor, as he turned into a bullet-point alternative to CNN:

* The details of how the Bush administration screwed up North Korea;

* The still-scary state of security at ports, borders, and airports;

* The threat of nuclear proliferation and fissile material going across every border except Iraq's;

* The irresponsibility of increasing nonconventional weapons spending and tax cuts in a time when the greatest deficit in our Armed Services budget is personnel.

Did anyone know before the debates that Kerry had written a book about fighting post-cold war nuclear proliferation -- years before he became a national candidate? For me, it was the crowning moment in a string of similarly beautiful surprises.

A caveat: One question I have coming out of the debate -- my perennial question with the Kerry campaign -- is why Kerry leaves an enormous lacuna in his biography between his experience in Vietnam and his Presidential campaign, especially when he wants to counter the contention that he's soft. The only reason Kerry won the caucuses and primaries over his somewhat half-baked opponents was his image as "a serious man." (For the record -- I always thought that Howard Dean and John Edwards were for real, although I completely underestimated Dean's ability to raise money.)

Why wouldn't Kerry say, especially in the context of a debate on foreign policy, something like this:

I left college. I fought as a combat officer in Vietnam. I was wounded and was sent home. I became an activist against what I thought was a mistaken war. Then -- I became a prosecutor in Boston, tackling organized crime, putting criminals behind bars and securing victims' rights. I was elected lieutenant governor of Massachusetts, then elected four times to represent Massachusetts in the U.S. Senate, where I served on the Foreign Relations Committee. I've spent my entire adult life in public service, fighting injustice, hunting down and punishing those responsible, and trying to secure peaceful relationships between nations across the world. No one will do a better job than I will in accomplishing these goals as President of the United States.

That's the best contrast between Bush and Kerry in terms of biography: while Bush is fooling around with a baseball team, Kerry's writing books on post-Cold War foreign policy. If the security component of this election becomes a referendum on whether Kerry's account of his own injuries can be believed, rather than on the extensive and impressive public record of his actions after Vietnam, then Kerry can only lose.

Tuesday, September 28, 2004

Homeward Bound

At a certain point in a man's life, he begins to gaze covetously at the pages of furniture catalogues, to long for window boxes and small patches of grass he can call his own, and to yearn for the twin boons of equity and tax deductions. He grows weary of landlords who make half-assed repairs, drop in on Saturdays with little to no notice, and whose response to complaints of rodent infestation is "Hey, if you grew up on a farm..." He begins to spend his hours, both waking and sleeping, comparing school districts and crown molding, property taxes and accessibility to shopping and mass transit. It is at this time that he decides to buy a house.

Graduate students are doubly if not trebly screwed over by the impermanence of their own position. Not only do you spend five or more years of your life preparing for jobs nearly as scarce as spots on professional sports teams -- most of them in the middle of nowhere -- you do it while earning a subsistence wage as people your own age with considerably less talent rake in the bucks, and begin buying houses, getting married, and starting their lives. Medical school is rough, but at least people need doctors -- nobody needs a literature professor. We're frankly lucky to have jobs at all.

So you mortgage away what other people take for granted: healthy social and personal lives, material prosperity, and happiness. When your future mother-in-law offers you a down payment in lieu of a wedding, you count yourself inestimably lucky -- even if she holds it over you and her wayward daughter, hems and haws whenever you need her to act, drastically changes the magnitude of her offer without warning, complains that your parents won't pay for half, and plots secretly to lure her daughter back home with the promise of cheap housing for the two of you in the South. You go through three realtors in as many weeks. You begin to think that living in a 600-square-foot house between a vacant lot and a shell across from a power plant is the best idea you've ever had. You listen patiently as your girlfriend creates wild scenarios that lead you out of the city to unknown vistas in Camden, New Jersey or Wilmington, Delaware. You nurse your growing ulcer until it knows you by name.

Your hair turns gray. You begin to cry for no reason. You throw the catalogues away and fish them out of the garbage the next day. You try to get the mortgage in your name, only to discover that your student loans count against your ability to pay and the nosebleed you had in college has ruined your credit rating. You give up and start again. You take pills to make you feel better. You make wild promises and tell wilder lies. You wonder why.

For the past month or so, I've been looking for a house in or around Philadelphia. We're still looking.


Tuesday, September 21, 2004

The Death of Ferdinand de Saussure

My short schrifts have been getting the short shrift this week, as I've been getting ready for a last-minute, pinch-hitting teaching assignment: an upper-level seminar in literary theory for undergraduates. One of my favorite profs recently fell ill and she, along with my department chair, asked me to fill in.

The first class was this afternoon: the first part of Ferdinand de Saussure's Cours de linguistique générale. Very few of the kids had taken any literary theory before, but for the most part they were into it. They asked good questions, and gave good answers. Sometimes I have to remind myself that the 19-year-olds I teach are still 19-year-old Ivy Leaguers: they know how to read a text, and what to do in a classroom. I used the chalkboard too much and got chalk dust all over my carefully selected, "cool prof" outfit, but after a three-year teaching hiatus, I was having fun again.

During a conversation with another Penn professor on Sunday, she and I agreed that not enough attention is paid to the fact that professors -- especially in the humanities -- are really paid to speak and write, not to think hard or rifle through archives doing research. Teaching crystallizes that: nowhere else do you have to put ideas into clearly defined and spoken action.

Tuesday, September 14, 2004

Greenblatt's Shakespeare



I'm a little behind on this one, and sometimes it feels like all I do is recycle New Yorker articles and Arts & Letters Daily links in my Schrifts, but there are two first-rate articles on Shakespeare this week: one excerpted from Stephen Greenblatt's new book on W.S. (published in the NYT Magazine), and another by Adam Gopnik about Greenblatt's book in The New Yorker's review section.

Someday, I would like to be Stephen Greenblatt. Over his career, he's managed to combine historical erudition, theoretical sophistication, an acute and complex political sensibility, and a keen eye for literary detail with a sharp, accessible style and a voice that's often disarmingly personal. He's a Harvard professor and onetime president of the MLA, but his books get reviewed in The New Yorker. Even more so than his colleague Louis Menand, he's managed to bridge academic culture and the nonacademic public. While his writing isn't as good as Menand's (who's dying to be Edmund Wilson), for an academic, Greenblatt's prose is pretty damn sexy -- he's just dying to be Erich Auerbach, which is a different business altogether.

Saturday, September 11, 2004

Elections and the Political Imagination

From Snarkmarket's "Can I Be Secretary of Expectations?": Robin Sloan notes (following Matthew Yglesias) that political elections revolve around the gap between real and imaginary accomplishments, i.e., the difference between what someone has done and what they say they will do.

In an electoral democracy, this is in some sense inescapable: anytime you're considering a candidate who isn't the incumbent, all you can consider are their imaginary achievements, by projecting their political promises, personal character, and past actions in a different field or position onto their potential future performance.

The incumbent's advantage is always that he or she already has the job. This is why elections -- especially presidential elections -- often turn out to be a referendum on the current officeholder's performance -- or, referencing Louis Menand and myself, on the performance of the country, or even that of the voters themselves. When things have been going badly, the principle of hope offered by a political challenger, especially hope for positive change, can be a very powerful (and positive) thing. Or you could put the Nazis into power; it works both ways.

An incumbent, then, especially in difficult times, needs to conquer the imaginative space of the electorate. This is the best way of outflanking a challenger -- any challenger. When tough times hit California, Gray Davis responded in a practical, no-nonsense fashion, raising taxes and cutting spending. Voters wanted someone, anyone else, and wound up picking the candidate who (to put it nicely) had the most to offer the imagination.

This is the mistake pundits made when trying to generalize the California recall to the 2004 presidential election. When times are tough, people don't turn to Republicans or throw out the incumbent. They pick the candidate who appeals to their imagination, to their hope that tomorrow might be a better day. They pick Reagan over Carter, Clinton over George H.W. Bush, and (probably) W over Kerry. For the Democrats, picking Kerry over Dean or Edwards was an isolated blip, a moment of self-doubt and misguided Puritan moderation. In the name of electability, they picked a man who lacked the imaginative appeal to ever be elected.

Thursday, September 09, 2004

Nota Bene, Pt. 2

I finally finished The New Yorker's Food Issue, and like any good meal, its finish lived up to the promise of its start. My early diagnosis of literary-fabulism-masquerading-as-food-writing also held up, especially with Jerry Adler's essay on Lacerba, a restaurant in Milan specializing in Futurist cuisine.

Yes, apparently the unparalleled poet, theorist, and general brick-through-the-window leader of Italy's most famous avant-garde movement, Filippo Marinetti, also wrote a cookbook, "La Cucina Futurista." Many of its recipes are clearly inedible -- for example, chicken stuffed with ball bearings, "meant not to be eaten but to flavor the meat with the fortifying taste of steel" -- and it infamously (and, for an Italian, insanely) calls for a ban on pasta, but it's more about making a revolutionary gesture than a gastronomical revolution. As Adler writes:

The food of the future, as Marinetti envisaged it, would ban spaghetti but include smoked camel meat, raw-onion ice cream, and fried trout stuffed with nuts and wrapped in liver. It marked a whole new way of thinking about food: the cuisine of the absurd.

From there we go to Bill Buford's equally excellent article on homemade Italian pasta. Here Buford triangulates among a small, traditional trattoria in the countryside near Parma, classic Italian cookbooks from the Renaissance to the twentieth century, and a stint at the pasta station of Mario Batali's famous restaurant Babbo in NYC. Of course, he's completely taken in by all of it, from the coarse behind-the-scenes curses and jargon of the station to the simple poetry of shaping pasta by hand: "I was being educated in texture: how you handle a long sheet of pasta like a piece of fabric, how it interacts with the air, the ways you can stretch it and the ways you can't... Postage stamps, little moons, half-moons, and belly buttons. I feel compelled to pause for a moment and ask: What other culture has a tradition of serving up its national cuisine in the form of little toys? There seems never to have been a time when Italians weren't playing with their food." The comfort of pasta, Buford argues, is almost always at least partially the comfort of childhood. It's a fine, warm-hearted defense of the food Marinetti would have had his countrymen do away with.

Perhaps the most beautiful and thoughtful essay in the collection, however, is Malcolm Gladwell's on ketchup. Ketchup is our most popular condiment and one of the few foods for which there is no real gourmet market: for more than a century, there has been essentially one kind of ketchup -- the whole-tomato-and-vinegar formula perfected by Heinz -- with only a few competing brands. By contrast, Grey Poupon shattered French's hold on the mustard market with a gourmet product made with brown mustard seeds and white wine instead of white seeds and vinegar, and Prego did the same thing with its multiple varieties of extra-chunky tomato sauce in the early 1990s.

Prego's breakthrough was engineered by a man named Howard Moskowitz, a psychophysicist who changed the way food products are developed and marketed with his notion of "sensory segmentation" -- basically, the seemingly common-sense idea that not everybody prefers the same things made exactly the same way, and that if you give people a number of choices, everyone can find something they like. Gladwell writes:

It may be hard today, fifteen years later -- when every brand seems to come in multiple varieties -- to appreciate how much of a breakthrough this was. In those years, people in the food industry carried around in their heads the notion of a platonic dish -- the version of a dish that looked and tasted absolutely right. At Ragu and Prego, they had been striving for the platonic spaghetti sauce, and the platonic spaghetti sauce was thin and blended because that's the way they thought it was done in Italy. Cooking, on the industrial level, was consumed with the search for human universals. Once you start looking for the sources of human variability, though, the old orthodoxy goes out the window. Howard Moskowitz stood up to the Platonists and said there are no universals.

This is more than just good copy -- it's really thoughtful, interesting stuff. But what's genius about the article is that after Gladwell convincingly makes the anti-Platonist argument on behalf of food diversity, he just as convincingly backtracks on behalf of ketchup. The article is titled "The Ketchup Conundrum," and with good reason: Heinz tomato ketchup is the single greatest success story of the food industry's Platonists, perfectly blending aspects of all five fundamental tastes of the human palate: salty, sweet, bitter, sour, and umami ("the proteiny, full-bodied taste of chicken soup, or cured meat, or fish stock, or aged cheese, or mother's milk, or soy sauce, or mushrooms, or seaweed, or cooked tomato"). Ketchup's appeal is in some sense elemental, which is partly why children love it so much. Gladwell astutely notes that this also has something to do with the fact that condiments are the only part of a meal over which a child has any control, but his picture of ketchup's appeal is powerful, even to someone like me, who can't stand the stuff:

Small children tend to be neophobic: once they hit two or three, they shrink from new tastes. That makes sense, evolutionarily, because through much of human history that is the age at which children would have first begun to gather and forage for themselves, and those who strayed from what was known and trusted would never have survived. There the three-year-old was, confronted with something strange on his plate -- tuna fish, perhaps, or Brussels sprouts -- and he wanted to alter his food in some way that made the unfamiliar familiar. He wanted to subdue the contents of his plate. And so he turned to ketchup, because, alone among the condiments on the table, ketchup could deliver sweet and sour and salty and bitter and umami, all at once.

I don't know if Gladwell realizes that this is an argument about Platonic recollection couched in Darwinian terms, but it works all the same. As Buford and Adler had already shown, there is something about food that suggests both the effort to conquer the new and strange and the return to the familiar, each of which characterizes both childhood and adulthood. "The terrible twos" might better be called "the first adolescence," since the acquisition of language and the development of better motor skills make independence and agency possible for the first time. In your second adolescence, from 11 to 21, you forget about food: dining and preparing it become occasions for other things -- a chance to work outside the home, a place to meet your friends. It's only in childhood and on reaching maturity that you discover and rediscover the simple, wondrous pleasures of food.


Sunday, September 05, 2004

Nota Bene

Maybe it's only because I finally have time to read it in its entirety, but this week's issue of the New Yorker is simply delightful, albeit in unexpected ways.

Though it's titled "The Food Issue," there's more than just good food writing here. The first piece that really caught my attention was James Surowiecki's fine and timely essay on conventions and other megaevents: specifically, why cities like Boston, New York, and Athens spend tremendous time, money, and other resources to attract big-name conventions when there's little in the way of dollar-for-dollar payoff. (I recently had a chance to watch Michael Moore's Roger & Me -- some of the saddest moments in Flint's history came when, after crippling GM layoffs, it unbelievably and unsuccessfully tried to reinvent itself as a tourist and convention destination.)

The food essays, however, really shine, especially as they very nearly cross the line between ordinary food writing and full-blown, abstract literary fabulism. When Jim Harrison writes of a 37-course lunch he enjoyed with fellow gourmands in the French countryside, complete with late-Renaissance recipes like delicately poached pig snouts and various and sundry other animal parts cooked in other animals' organs, the effect is less Nigella Lawson than Jorge Luis Borges -- or better yet, Poe. Calvin Trillin's terrific essay on snoek (a herring-like fish eaten by black South Africans) has a similar pulling-your-leg quality: I frequently found myself asking, Does this fish even exist?

Here food is less the ineradicable content of writing than the occasion for it: despite (or perhaps because of) the lush details and descriptive acumen on display in these essays, the food itself is a cipher, a placeholder: it doesn't exist. Or at least it doesn't matter whether it exists or not. Madame Bovary has nothing on good food writing: its author knows that while food presents perhaps the ultimate satisfaction of desire, desire itself is better served by its object's absence than by its presence. And what could we want more than the prolongation of desire itself?

Friday, September 03, 2004

Nuisance Value

Arts & Letters Daily is always a mixed bag -- sometimes obnoxious, sometimes helpful, on rare occasions illuminating -- but every so often, it pays off with a cleverly crafted piece that one could never have discovered on one's own. In this case, it's an essay in The Threepenny Review by Adam Phillips, editor of the marvelous new Penguin edition of Freud's works.

In "Nuisance Value," Phillips uses the idea of a nuisance to show off -- to riff on an eclectic mix of some of the twentieth century's most interesting thinkers. He begins and ends with Richard Rorty, but in between, his meditation manages to spark clever readings of Freud, Orwell, and even the pediatrician D. W. Winnicott. Mostly, however, it's Phillips's own thoughtful, meandering exploration of what it means to be a nuisance, or for something to be a nuisance to one. A typical Phillips sentence reads:

If nuisance is need insufficiently transformed—the bad art of wanting—if nuisance, like many repetitions, is the sign of something thwarted or blocked or stalled, then it would be worth wondering what would have to happen for someone to never need to be a nuisance, or, perhaps more interestingly, for them never to experience someone or something else as a nuisance.

But whenever this kind of talk gets to be -- well, a nuisance -- Phillips's writing becomes refreshingly concrete, without losing any of its provocative force:

The beggar who makes a profitable nuisance of himself is Orwell's representative modern person. In this exchange, the beggar gives the nuisance he has made, of himself, for money. It is as though a nuisance is the most minimal thing one can make of oneself. The starkest gift. At the raw end of the spectrum there is being a nuisance; at the cooked end there is being a nuisance without seeming to be one. Criminals, Orwell seems to imply in the book, are the people we punish for being a nuisance; artists are the people we reward for being a nuisance; successful businessmen are criminals disguised as artists.

In short, a fun read on a somewhat drab day.

Saturday, August 28, 2004

I feel so.... published

I've had poems, stories, and even critical essays published before, everywhere from DIY zines to more respectable university press anthologies, but I'm particularly proud that my life as an author has begun its second phase: my short paper "Modernism's Objects" has just been published in a special issue of the Journal of Modern Literature by Indiana University Press.

So this means that you can buy my book at Barnes & Noble, right? Not exactly. It's a four-page review-essay at the back of an academic journal, albeit a particularly prestigious one: a pretty good get for a not-yet-at-dissertation PhD student, but not exactly big headlines. If you have access to a good university library, you can find it (p. 207 of Fall 2003's Journal of Modern Literature, Vol. 27.1). You can also find it on the web if you have access to Johns Hopkins's Project Muse database.

Friday, August 27, 2004

Settling Down

Forbes, hot off its chat-inducing list of the 40 best cities for singles, recently published a list of the top 60 cheap places to live -- smaller and lesser-known towns (populations less than 750,000) that provide the best bargains for people looking to start a career or business.

Each of these articles makes for a pretty good read, but the titles of the "community types" steal the show: "Porch-Swing, Happy Hootervilles, IQ Campuses, Steroid Cities, Bohemian Bargains, or Telecommuting Heavens."

Happy reminders of David Brooks at his very best, and proof that his brand of "comic sociology" is catching on -- at least in glossy magazines that prize catchword memorability over complicated analysis. (Of course, I think catchwords can be just as valuable as statistical regression: see my earlier post on knowledge shortcuts for more info.)


Thursday, August 26, 2004

Summoning Cassandra

A prediction: George W. Bush will win the 2004 presidential election. It won't be a landslide, but it won't be as close as 2000's toss-up between Bush and Gore.

The reason: Republicans have been moving towards the center much more deftly than their Democratic counterparts. Simply put, the Democrats are just getting outplayed.

Consider the recent flap over the Bush-connected Swift Boat Veterans for Truth smear campaign against Kerry. Almost everyone agrees that these ads were personally nasty, probably misleading, and a prime example of the continued potency of soft money post-McCain-Feingold. Bush himself, along with his administration, has finally had to distance himself from the ads, albeit indirectly, by railing against third-party political advertisements by "shadowy groups."

But consider the moves: Kerry wants to move away from his Massachusetts liberal image and capture some middle ground. He -- and the press, the party, everyone -- leans on his experience in Vietnam to do it. They bracket his anti-war activism, at least for mainstream audiences.

Conservative groups, ninja-like, use Kerry's military momentum against him. He wants to run as a Vietnam hero? Fine. They get some friendly people to kick up enough dirt to make things a little less clear-cut. (This is easy enough: was anyone a clear-cut hero in Vietnam?) The Republican base, already primed to believe that Kerry was a coward, now has a mantra: Unfit for Command. Late-to-the-game centrist and independent voters, who hadn't known much about Kerry besides that he was a war hero, now aren't even sure about that.

Kerry's poll numbers drop. Now he has to humiliate himself and essentially admit that he's bleeding by publicly asking Bush to renounce the ads. After waiting for the ads to run their course, Bush does denounce them, but only by denouncing all third-party ads. Not only does he avoid really giving in, but he gets to put a sunny, centrist face on even this concession: mugging with John McCain and appearing to take his stance as a matter of principle.

It's not going to stop. Rudy Giuliani, Mike Bloomberg, George Pataki, Arnold Schwarzenegger, Jeb's good-looking Latino son -- they're all going to be trotted out on New York's stage less than two months before the election. The protests by left-wing groups, always solipsistic, are likely to be counterproductive: they remind mainstream voters why they dislike and distrust the left in the first place. It's 1968 all over again.

I can see the GOP getting a big bounce from this convention, especially since Kerry hasn't been able to get any kind of traction. Unless voter turnout for Kerry goes through the roof, liberals could be weeping this November. And the kind of attacks Kerry's had to fend off aren't really designed to get voters to vote against Kerry -- they're designed to make them stay home.

Monday, August 23, 2004

How We Know What We Think We Know

One of the pseudo-buzzwords I've helped to coin is "indexical knowledge." Well, check out the always-nimble Louis Menand's New Yorker article on voters' decision-making. He doesn't bust out the phrase (otherwise, I'd have to sue) -- but what he's talking about, from soup to nuts, is indexical reasoning.

Most of the time, the things we want to know -- whether we can trust someone, what the weather will be like tomorrow, whether the economy is doing well or not so well -- are things for which there is no simple measurement or even certain knowledge, even after a great deal of research and effort.

So we rely on an index -- a sign -- that gives us a quick and dirty way to get a maximum amount of information with a minimum amount of effort. We're experts at sizing people up quickly by their age, sex, race, dress, demeanor, and behavior; the best way to tell what the weather will be like in Detroit tomorrow is usually to look at what it's like in Chicago today; and we have a variety of sophisticated and not-so-sophisticated economic "indicators" (GDP, per capita income, employment and unemployment, etc.) that provide simple, bullet-point headlines on the state of the union. None of these is 100% accurate, but accuracy isn't their virtue -- they act as a kind of shortcut (and shorthand) to finding out what you want to know. This is especially helpful when more detailed or accurate knowledge is either unavailable or unproductive.

Why "index"? What characterizes an indexical sign is that it bears some sort of physical or empirical connection to its referent. "Where there's smoke, there's fire." (This cliché is actually about the value of indices -- their soundness, their generality, their ability to give protection from danger.) Linguistic signs don't have this kind of relationship -- they're arbitrary. Neither do icons -- pictures or copies of the referent. (Shout-out to C.S. Peirce, who, by inventing this terminology, invented semiotics.)

How does indexical reasoning work? Menand gives the example of picking out a stereo. We could spend hours poring over expert reviews, reading technical information, comparing prices, and so on. Or we could walk into Circuit City and pick the one that looks the coolest, as long as we recognize the brand name. I actually enjoy doing the first, but the odds are pretty good that the other guy's going to get nearly as good a deal, without spending nearly as much time doing it. Also, he doesn't believe that the time and investment in acquiring an object adds to its symbolic and psychological value. But let's not get started.

So how does this relate to politics? In two ways. First, voters identify candidates they like and don't like through branding (political parties), aesthetics (Bush looks good in a flight suit, Dukakis looked dumb in a helmet), endorsements, and, generally, gut judgments rather than ideology or information. In fact, the more information voters have, the more they need good indices in order to make sense of it all. Sometimes these are personal touchstones (elites whom people trust or with whom they identify) and sometimes they're empirical but indirect: Florida voters have done pretty well over the past four years, so they're leaning Bush, while more Ohio voters have had it rough, so they're leaning Kerry. Who knows whether voting for Bush or Kerry will actually make things better or worse; some indices are simply better than others.

What's more, voters (elites and non-elites alike) need these indices both to legitimate their own choices and to signal them to others. We need quick figures, clichés, and nonsense formulae ("I think Kerry seems presidential") just to be able to talk with our friends, families, neighbors, and co-workers.

In short, we need to make the metropolis (the Großstadt) more like the polis (the Kleinstadt). We need to make the things of the electronic age sensible for the stone-age minds we carry around. But this isn't really what I want to say -- it just gives you the gist.

Saturday, August 21, 2004

The Sea and the Rhythm



Indie-rock trends are fickle, even compared to mainstream pop. For a while, everyone was interested in European-inspired, post-rock instrumentals. (Cf. Tortoise, Sigur Rós, Godspeed You Black Emperor!) The garage-rock revival was over almost before it started, collapsing under the weight of mainstream breakthroughs and second-rate imitation. (Cf. Jonathan Fire*Eater, The White Stripes, The Strokes.) Dancepunk sprang phoenix-like from garage-rock's ashes, combining the same angular guitar attacks with '80s-new-wave-inspired keyboard flourishes and atmospherics. (Cf. Interpol, The Rapture, Franz Ferdinand.)

All this makes the newly emerging trend -- a return to folk music -- both refreshing and entirely expected. Iron & Wine's Our Endless Numbered Days, Sufjan Stevens's Greetings from Michigan and Seven Swans, Devendra Banhart's Rejoicing in the Hands, and Joanna Newsom's The Milk-Eyed Mender have all gathered attention and praise from the usual sources -- indie websites and zines, NPR, and critics' lists -- but unlike his hipster counterparts, the decidedly unphotogenic Sam Beam is hard to imagine on TRL anytime soon. As David Bazan of Pedro the Lion sings, "Bands with managers / are going places; / bands with messy hair / and snow-white faces." The lo-fi folk crowd isn't going anywhere, but they make great music about it.

These younger artists join an established base of indie legends, and continue the tradition of using band names for solo artists: Smog (aka Bill Callahan), The Mountain Goats (aka John Darnielle), Hayden (aka Hayden Desser), The Microphones/Mount Eerie (aka Phil Elvrum/Elverum), and Will Oldham -- who's recorded as Palace, Palace Brothers, Palace Music, and lately, Bonnie "Prince" Billy -- among others.

We've had folk revivals in independent music before, with mixed results: Billy Bragg's resurrection of political folk, Ani DiFranco's personal/political singer-songwriter persona, Elliott Smith's haunting fusion of Nick Drake, classic pop, and grunge confessionalism, or the Gillian Welch-by-way-of-O Brother, Where Art Thou? rediscovery of traditional Americana. What distinguishes the new artists is their connection to the indie traditions of DIY, lo-fidelity recording on the one hand and their exploration of near-psychedelic abstraction on the other. (And, of course, a willingness to deviate from both of these generalizations, through surprisingly lush arrangements and personal content.)

Both Stevens's and Oldham's music has deeply religious content: in Stevens's case, the spirituality is genuine, while in Oldham's, it's more a matter of establishing an alternate persona. Banhart, on the other hand, on the brilliant "Michigan State," joyously yelps "The salt keeps the sea from feeling sweet / And my toes have my favorite feet" -- referencing less Blood on the Tracks than the homespun whimsy of The Basement Tapes. Banhart's Marc Bolan impression grates on some, but his songs crystallize the new musical aesthetic, and herald a return to a much-neglected persona in indie and alternative music: the poetic loser.

Friday, August 20, 2004

"What's Right Is Right": A Brief Review of Goodfellas

I lost most of today to the new Goodfellas Special Edition DVD, part of the new Martin Scorsese 6-disc box set of his films with Warner Brothers. Scorsese's always been one of my favorite filmmakers, and this set has two of my all-time favorites (Goodfellas and Mean Streets), plus three other films (Who's That Knocking at My Door, Alice Doesn't Live Here Anymore, and After Hours) that have never before been released on DVD.

The nineties may be remembered as the decade when independent film triumphed, as arty, intelligent, violent films like Pulp Fiction won newfound critical acceptance and success at the box office. It's easy to connect this moment with that of the 1970s, when a similar group of young (and not-so-young), independent (and not-so-independent) directors emerged from the ashes of the studio system, making the films that the directors of the 90s saw as they came of age.

In the early 90s, however, before the independent boom, a few films appeared that directly connected the films, filmmakers, and stars of the 70s to those that would appear a few years later. Perhaps the two most striking are Clint Eastwood's Unforgiven -- another favorite of mine -- and the remarkable Goodfellas.

Scorsese is one of the rare directors who pleases both the theory-quoting critic and the cable-watching fan in me. He makes Fellini-quoting popcorn movies about existential dread. His photography, editing, and scoring always look and sound great -- he manages to do genuinely revolutionary things that don't seem particularly radical, or even unusual. Watching Goodfellas with the commentary track on and the regular sound off, I was surprised at how tightly framed and edited every shot and sequence is. Michael Ballhaus's camera is hardly ever stationary: there's always a nearly imperceptible pan or zoom to force the eye into motion and increase the tension of the scene.

Goodfellas refuses to let you sit still, and yet it rivets your attention to complicated dialogue and often mundane, relatively action-free sequences. One of the best scenes in the movie involves a coked-up Ray Liotta trying to fry veal cutlets in between looking out his window for police helicopters. Another, in which Liotta and Lorraine Bracco's character get a table at the Copacabana -- hardly exciting stuff in a movie filled with multiple murders, drugs, sex, and trials -- is perhaps among the top five scenes in the history of the medium. It's done in a single long Steadicam shot (a stabilizing rig that gives the operator a great range of movement without the jitters of a handheld camera) that runs from the opening of the car door, across the street, down into the basement, through the kitchen, and across the restaurant, where a table appears out of nowhere, closing on a shot of Henny Youngman (yes, the real Henny Youngman) telling a joke. All the while, The Crystals' "Then He Kissed Me" plays in the background. How this film lost the Best Picture Oscar to Kevin Costner's craptastic Dances With Wolves simply escapes me.

Scorsese is one of a handful of geniuses in American cinema, fit to be mentioned alongside D.W. Griffith and Orson Welles. He may even be the best director of his generation (the best generation in the history of cinema): better than Coppola, Spielberg, Altman, Eastwood, De Palma, Woody Allen, and Terrence Malick. The box set is dirt-cheap ($40 for five movies), and the upgrade, especially for Goodfellas, is well worth it. (Goodfellas was originally released on DVD in the early days of the medium: no extras, and you had to flip the disc halfway through the movie, like an old record.) It's worth familiarizing or re-familiarizing yourself with this film, and with the possibility that the coarsest, fastest, and most violent of films can be the most profound and substantial.

Wednesday, August 18, 2004

Checking the Links

When I first advertised this blog to my friends, I asked any of them with HTML knowledge to help me add features to it. Well, my first modest experiment with HTML on my own recognizance is up and running: you'll see a new "Links" section on the sidebar, as well as a permanent link to yesterday's etymology-as-manifesto post.

These are links to sites I read regularly, which provide some sort of useful public information. (The less said about the other sites I frequent the better.)

I've grouped them into three categories: primary news sources (including two links to local news in Philadelphia), news and entertainment magazines, and entertainment and media sites.
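
For fellow amateurs, here's a minimal sketch of what sidebar markup like this might look like -- the class name, headings, and list entries below are illustrative placeholders standing in for the real list, not the actual code in my Blogger template:

    <!-- "Links" sidebar section: three categories, each a plain list -->
    <!-- (class name and entries are placeholders, for illustration only) -->
    <h2 class="sidebar-title">Links</h2>
    <h3>News</h3>
    <ul>
      <li><a href="http://www.philly.com/">Philly.com</a></li>
    </ul>
    <h3>Magazines</h3>
    <ul>
      <li><a href="http://www.newyorker.com/">The New Yorker</a></li>
    </ul>
    <h3>Entertainment &amp; Media</h3>
    <ul>
      <li><a href="http://www.pitchforkmedia.com/">Pitchfork</a></li>
      <li><a href="http://www.chowhound.com/">Chowhound</a></li>
    </ul>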

The sites in the last category are often the most helpful. Keeping up with The New Yorker and the Times Op-Ed page might make your party talk extra-splashy, but Consumersearch.com can tell you which all-weather tires or set of gourmet knives to buy. Chowhound is a great resource for finding restaurants and grocery stores in big metro areas, and Pitchfork keeps my CD collection (and indie-rock cred) minty fresh.

I've noticed that many local news stations devote an inordinate amount of time to covering traffic and weather: local political intrigue could be getting hot national attention, but unless it's a full-blown sex-, corruption-, and influence-peddling scandal (like the one that brought down New Jersey Governor James McGreevey), most people are much more interested in how they'll get to work and whether they'll need to bring an umbrella.

This set of priorities can turn pernicious when people become so apolitical, disengaged, or anti-intellectual that they close themselves off to issues that are genuinely important and require their attention. (I'd much rather watch a good weather report than exposés of movie stars or political process stories -- at least on television.)

At its best, however, Americans' Franklinian pragmatism makes them shrewd, resistant to cant, and keenly attuned to their own interests. Likewise, I wouldn't feel my blog was worthwhile unless it gave its readers something they can use and enjoy. Ut docet, ut delectat, ut permovet: to teach, to delight, to move.