Saturday, February 26, 2005

Radio, Radio

So you're surprised that I've returned to Short Schrift after a full month of silence. I'm surprised, too. And I bet we're both equally flabbergasted that with all that's happened this month -- starting a new article, wrapping up old papers, and moving across town to a new house -- I'm posting two links instead. It's good stuff, from a surprisingly surprising source: Chris Dahlen of indie-rock touchstone Pitchfork Media has dropped two nicely written, intellectually intriguing articles on radio this week, ranging from podcasting to productive battles between big and little public stations, and overall addressing the possibility of reconnecting with audiences in a long-decaying medium. The two articles also trace a nice trajectory: on the one hand, hip music fans who disdain radio altogether in favor of plugging iPods into car stereos; on the other, hip music fans who use the same technology (plus podcasting) to rediscover radio anew.

Wednesday, January 26, 2005

The Superpower Next Door

Tony Judt, whose output tends to be a little uneven for my taste, has a new article in the New York Review of Books called "Europe vs. America" that's sharp, well-written, and timely (qualities the NYRB manages only infrequently). Robin Sloan's Snarkmarket entry even compelled me to compose a new Schrift (something sadly even more infrequent).

After Rob's nice critique of the mildly (but decisively) distorting image Americans attach to "welfare state," I just have this to add: when you read the article, don't miss Judt's footnotes. Some of them veer towards inanity (especially the tone-deaf quips about Janet Jackson and David Beckham), but others are juicy and delicious.

Examples:

[5] "...(T)he steadily rising cost of private medical insurance in the US puts at least as much of a burden on American firms as social taxation and welfare privileges place upon their European counterparts -- while providing none of the attendant social benefits."

[8] "... The collapsing dollar is sustained only by foreigners' willingness to hold it: Americans are currently spending other people's money on other people's products. Were the US any other country it would by now be in the unforgiving hands of the International Monetary Fund."

[16] "... (Garton Ash) cites a popular joke: Britain was promised that Blair's Third Way would bring it American universities and German prisons -- what it is actually getting are American prisons and German universities."

"Spending other people's money on other people's products." I don't know if I've ever seen a better-phrased summary of why simultaneously skyrocketing budget and trade deficits spell disaster for a country's economy.

My favorite passage, however, is the same cited by Robin, albeit for a slightly different reason:
"(T)hese are not deep structural failings of the European way of life: they are difficult policy choices with political consequences. None of them implies the dismantling of the welfare state."

It's the "difficult policy choices with political consequences" line that resonates for me. Both Americans and Europeans today too easily place themselves (and each other) in scenarios they imagine to be both apocalyptic and inevitable. We're all hopelessly cursed by demographics and unmanageable covenants: too many immigrants, too many old people, and too many promises made, with danger (whether global terrorism or environmental and economic collapse) always just around the corner. Instead of making tough choices about adjustments to the system -- politically unpopular, perhaps, but functionally necessary -- both American and European leaders have assigned blame to other quarters and contented themselves with either radically razing the system or going down with the ship, like Ahab lashed to the whale.

This is more than the politics of fear -- it's a kind of insanity. What we need are a thousand cool heads -- debunkers who can show us a third way out.

Saturday, January 15, 2005

Phoning It In

Is it just me, or has NYTimes columnist David Brooks become completely retarded?

That is all.

Friday, January 07, 2005

My Nerdiest Moments

We've all had them: moments of pure nerdly inspiration and expression. Without any assistance they stick to the walls of our memory through sheer idiosyncratic force.

Many of mine came when I dated the nerdiest in a long line of nerdy girlfriends, Laura Anderson. Laura was my first serious girlfriend: we were both math majors and scholarship winners. Together we created new words in an attempt to define and fill lexical gaps: concepts that exist or can be simply explained but lack a word to signify them. My favorite was "lorange," which we declared to signify any one-syllable word that rhymes with no other word in the language in which it is spoken. Since "lorange" creates a rhyme for "orange," its existence would effectively empty its own concept. Therefore the only way it could remain meaningful would be if it stayed merely imaginary and were never used at all. Russell's Paradox it ain't, but it made for quite a nerdy afternoon in 1999.

Some of my other more nerdy moments were also my most rebellious. When I was a kid, I was a hell of a smart-ass, which got me beaten by nuns pretty often. If I didn't like something, I'd come up with high-minded putdowns, thinking they'd get me out of trouble. My third-grade teacher, Sister June, once teased me for always tapping Jenny, the girl I'd had a crush on since first grade, when we played a game called 7-Up. (By grade three, it was pretty easy to tell it was me, given the trembling breath and the slightly-too-quick, slightly-too-lingering touch.) I don't remember what I said at the time, but afterwards, when Sister June remarked that it had been a fun game, I loudly complained, "I thought not!" Beaten again. After all, I'd already been suspended a year earlier after I protested that not even starving children in Bangladesh would eat the school's chicken ravioli. I must have read about it somewhere.

All through my growing up, the truest mark of my nerdiness was the way I would mispronounce words I had frequently read and understood perfectly but never heard spoken. I think I learned almost every word I know from context: I only started seriously using dictionaries maybe a year or so ago. (Thanks to the OED for putting their dictionary online and to Penn for subscribing to it.) This still occasionally gets me into trouble: I had been getting the meaning of "commiserate" slightly wrong for years until I finally looked it up. (It really means to sympathize with or show pity for one another, not just to share mutual complaints.)

Given a long enough timeline, virtually everyone becomes a nerd with respect to something, in the sense of keenly reveling in the minutiae of a field where a degree of special knowledge and/or intelligence is necessary. It might be cars, sports, or rock music instead of poetry, physics, or violin, but there are nerdly delights to be found everywhere. The field of elevated nerddom that later saved sports for me was baseball. No other sport has ever been more rewarding to nerds, combining the essential rational, traditional, and charismatic forms of nerdly authority. And thank goodness there was something that got me outside, meeting older kids and winning their respect. I was enough of an insufferable tub as it was. And still am.

Monday, December 20, 2004

Technologies of Knowledge, Pt. 2

The Google Library Shockwave keeps rolling along. See my first "Technologies of Knowledge" post for my preliminary comments, but also read this new essay "Paradise is paper, vellum, and dust" in the London Times Online.

Wednesday, December 15, 2004

The Desert Island Philosopher

Recently, and after a long interval, I renewed my acquaintance with Friedrich Nietzsche. I was instantly reminded why he was my favorite philosopher when I was eighteen -- and why he probably still is today.

During my freshman year in college, like a lot of other freshmen, I wore out my paperback copy of the Viking Portable Nietzsche. I had already been pretty well snagged by philosophy, through reading Plato, Pascal, Hobbes, and David Hume, among others, but Nietzsche was (and remains) different -- not just for the famous conclusions, although they're formidable: the death of God, the emptiness of liberal morality, a growing distrust of reason, science, and progress, and a consequent wariness of an encroaching nihilism against which some new, positive affirmation must be made. It's the ethos with which those conclusions are reached: a probably self-contradictory scientific drive to eliminate anything resembling predetermined conclusions or received wisdom: mind-shattering blasphemy as experiment. It's what makes Nietzsche still probably the only avant-garde philosopher.

It's nevertheless surprising just how often Nietzsche refers to experiment and its counterconcept, habit. This is especially noteworthy in Nietzsche's critique of truth, less a postmodern break than the most brilliant and sustained reprisal of the argument Hume had advanced more than a century before -- that what most people (including virtually all philosophers) take to be metaphysical truths are really just our lazy, habitual prejudices and errors of perception, rephrased by way of a linguistic mistake into scholarly-philosophical terms. Hence Nietzsche's practice as the counterconcept to philosophy as practiced hitherto: philosophy as experiment, coupling an abhorrence of systems with a polemic wit designed both to discomfort others (and oneself) and explore intriguing and explosive alternatives, wherever they might lead.

If there's anything that distinguishes the use or misuse of philosophy in the study of literature from its use in mainstream philosophy departments, it's this desire/willingness to experiment, to (with all due d/reference to Kant) reason outside the limits of reason alone. Often in the study of literary texts, scholarship is nonempirical, if not antiempirical altogether: it's pure argument, with innovative interpretation as the stakes. Just as Kierkegaard argued with respect to faith in God, you have to leap. Which is why Nietzsche, with his light feet and dancing wit, is the philosopher who just may still have the most to offer -- even seven years after I left home, renounced religion, and dreamed of becoming the Übermensch.

Tuesday, December 14, 2004

Technologies of Knowledge

Every so often in a blogger's life, the stars are aligned, and the idea you've been sitting on for a while anyway suddenly becomes newsworthy. While the dynamic duo at Snarkmarket have sent their "EPIC 2014" doomsday predictions scattering across the blogosphere, there was another print-vs.-digital clash well worth a little consequence-exploration.

This week, the Chronicle of Higher Education features an article titled "College Libraries: The Long Goodbye." Apparently computers, having finally made ghastly research libraries manageable, now may be on the way to doing away with them altogether.

The bullet-point summary of the state of affairs might read something like this:

  1. Compared to software, books are expensive;
  2. Likewise, books take up too much space;
  3. And compared to outsourced tech specialists, so do librarians.

The author doesn't endorse these positions, but rather gives some arguments for the continued relevance of the research library. Some of these are quite ingenious: for example, the main problem for libraries is much less the obsolescence of print media than the sudden glut of new materials: more than three times as many books are published now as were just thirty years ago, to say nothing of new journals, periodicals, etc.

This might be a good moment to mention what some of us are calling the dematerialization thesis -- the argument, whether in valediction or lamentation, that digital media have overcome the meaningful physical limitations that characterized earlier forms. As far as I can tell, the locus classicus of this thesis (with respect to modernity writ large) is German sociologist Georg Simmel's 1900 book The Philosophy of Money, but another good source is Friedrich Kittler's more recent Discourse Networks 1800/1900, and its follow-up Gramophone, Film, Typewriter. There are all sorts of problems with this argument (which I'll detail at another time), but it's undeniable that faced with a space crunch, institutions are opting for what I'll call the greater material flexibility of digital media. Many libraries, perhaps most notably Stanford University's, have made huge investments in converting or copying their collections into digital formats, and publishers have likewise targeted libraries as prime consumers of electronic texts, whether as a backup to or substitute for "the real thing."

Then today, the New York Times made the remarkable EPIC-esque announcement: "Google Is Adding Major Libraries To Its Database." It's true -- and the participating libraries are no slouches, either. Stanford, Michigan, Harvard, and the always-difficult New York Public Library are all on board.

Since in-print books are going to be off the table, the real scoop is going to be in integrating these libraries' remarkable collections of rare, out-of-print, and manuscript material. 99% of Americans won't care, but these are pure gold to most scholars, and until recently, most university libraries were known for hanging onto these texts like they were their balls: you used to have to either be a big name or land a big-time research fellowship even to see these babies. (And I'm sure the real cream of the crop will probably continue to be withheld.)

They also happen to be the texts whose conspicuous materiality (there he goes again) actually makes them best suited for popular digitization. Imagine -- now not just scholars, but undergrads and even middle and high schoolers can see and examine rare, delicate, or simply unavailable primary documents from anywhere in the world without having to travel long distances or actually get their grubby little hands all over them. For my money, the real steal won't be electronic texts as such, but digital facsimiles of the real thing. Not only will books no longer go out of print -- they'll no longer even need to be printed. Yet we'll be able to maintain a significant degree of contact (ha ha) with the now-outmoded print culture of the past.

This is where Google has really surprised me. It may have been expected that Google would enter into the spheres of e-mail, blogging, social networks, and the like: these are the sort of fields that a start-up can start up, with the now-industry-standard limited exposure among a few dedicated partisans, eventually breaking into a wider, more lucrative market. But Harvard and Stanford are about as establishment and big-time as it gets, and between this venture and the new, hopefully improving Google Scholar, the big G has found a way both to go unexpectedly highbrow and perhaps to decisively entrench itself as the search engine of choice: the monstrous, ultimate technology of knowledge, putting the autonomous nodes of the research library and the limited search engine to rest.

Tuesday, December 07, 2004

Dispatches From the Hell of Electronic Memos

Leonard Ford just sent along a Times article ("What Corporate America Can't Build: A Sentence") documenting the woeful state of writing in the business world. Of course, if the problem were just aesthetic, it wouldn't matter much, but bad writing goes straight to the bottom line. The National Commission on Writing estimates that corporations spend as much as $3.1 billion each year teaching employees how to do better than "i am writing a essay on writing i work for this company and my boss want me to help improve the workers writing skills can yall help me with some information thank you" -- as a manager quoted in the article wrote.

Whatever else high schools and universities are supposed to do, they should be doing better than this. Then again, if corporations were willing to bankroll a $3.1 billion program devoted to English education, maybe high schools might be able to get it right in the first place. (Then again, maybe not.)

It's not all gross incompetence. The article tends to confound these separate threads, but part of the problem is the anything-goes atmosphere surrounding e-mail -- which in turn contaminates memoranda and official reports as well. Once-careful writers let the deceptive immediacy and informality of e-mail turn their writing into a kind of mental diarrhea. "Instead of considering what to say when they write, people now just let thoughts drool out onto the screen," says a writing consultant quoted in the article.

Another problem is technically sound but irredeemably bad writing, here attributed to CEOs and upper-management types. Another consultant reports that "many of these guys write in inflated language that desperately needs a laxative." I don't know why the business world suggests so many digestive metaphors, but it seems appropriately inappropriate. Writers rightly love those metaphors too.

It's always interesting what people are willing to allow themselves to do poorly. Most of us would never admit to being bad drivers or to lacking a sense of humor, while mathematical inability and poor spelling top the list of acceptable incompetencies. Grammar and punctuation are never far behind. I hope that businesses' attempts to get their own house in order speak to a redoubled effort to defend that last line in the sand, rather than another barrier washed away by the tide.

Sunday, December 05, 2004

Visionary Centrism -- An Alternative

Tom Friedman's got an op-ed in today's Times exhorting President Bush to increase science funding. Why? Well, it's not just any old science funding that he has in mind: he mounts a spirited and fairly convincing case that developing a viable alternative energy source can save the frickin' world, and give Americans a genuinely universal issue to rally around.

In my note on visionary centrism, I argued that universal health care was the silver-bullet issue, combining good politics with good policy: genuinely throwing the government's weight behind alternative energy may just be the other way to go.

Saturday, December 04, 2004

Mmm -- that's good grief

Hmm -- should have gotten there before A&L Daily, but...

Read Jonathan Franzen on Charles Schulz's Peanuts in last week's New Yorker. Mixing personal reflections on his troubled family c. 1970 with a thoughtful appreciation of Schulz's literary and aesthetic achievement, Franzen gets what's special about Peanuts: it offered and continues to offer children (especially smart, sensitive children) a simultaneous escape from and recognition of their anxiety-filled lives.

Franzen's nonfiction continually wins me over. I should get on board and read his essays (and The Corrections) before the train leaves the station altogether.