Monday, December 20, 2004

Technologies of Knowledge, Pt. 2

The Google Library Shockwave keeps rolling along. See my first "Technologies of Knowledge" post for my preliminary comments, but also read this new essay "Paradise is paper, vellum, and dust" in the London Times Online.

Wednesday, December 15, 2004

The Desert Island Philosopher

Recently, and after a long interval, I renewed my acquaintance with Friedrich Nietzsche. I was instantly reminded why he was my favorite philosopher when I was eighteen -- and why he probably still is today.

During my freshman year in college, like a lot of other freshmen, I wore out my paperback copy of the Viking Portable Nietzsche. I had already been pretty well snagged by philosophy, through reading Plato, Pascal, Hobbes, and David Hume, among others, but Nietzsche was (and remains) different -- not just for the famous conclusions, although they're formidable: the death of God, the emptiness of liberal morality, a growing distrust of reason, science, and progress, and consequently a wariness of an encroaching nihilism against which some new, positive affirmation must be made. It's also the ethos with which those conclusions are reached: a probably self-contradictory scientific drive to eliminate anything resembling predetermined conclusions or received wisdom -- mind-shattering blasphemy as experiment. It's what makes Nietzsche still probably the only avant-garde philosopher.

It's nevertheless surprising just how often Nietzsche refers to experiment and its counterconcept, habit. This is especially noteworthy in his critique of truth, which is less a postmodern break than the most brilliant and sustained reprisal of the argument Hume had advanced more than a century before -- that what most people (including virtually all philosophers) take to be metaphysical truths are really just our lazy, habitual prejudices and errors of perception, rephrased by way of a linguistic mistake into scholarly-philosophical terms. Hence Nietzsche's practice as the counterconcept to philosophy as practiced hitherto: philosophy as experiment, coupling an abhorrence of systems with a polemical wit designed both to discomfort others (and oneself) and to explore intriguing and explosive alternatives, wherever they might lead.

If there's anything that distinguishes the use or misuse of philosophy in the study of literature from its use in mainstream philosophy departments, it's this desire/willingness to experiment, to (with all due d/reference to Kant) reason outside the limits of reason alone. Often in the study of literary texts, scholarship is nonempirical, if not antiempirical altogether: it's pure argument, with innovative interpretation as the stakes. Just as Kierkegaard argued with respect to faith in God, you have to leap. Which is why Nietzsche, with his light feet and dancing wit, is the philosopher who just may still have the most to offer -- even seven years after I left home, renounced religion, and dreamed of becoming the Übermensch.

Tuesday, December 14, 2004

Technologies of Knowledge

Every so often in a blogger's life, the stars are aligned, and the idea you've been sitting on for a while anyway suddenly becomes newsworthy. While the dynamic duo at Snarkmarket have sent their "EPIC 2014" doomsday predictions scattering across the blogosphere, there was another print-vs.-digital clash well worth a little consequence-exploration.

This week, the Chronicle of Higher Education features an article titled "College Libraries: The Long Goodbye." Apparently computers, having finally made ghastly research libraries manageable, may now be on their way to doing away with them altogether.

The bullet-point summary of the state of affairs might read something like this:

  1. Compared to software, books are expensive;
  2. Likewise, books take up too much space;
  3. And compared to outsourced tech specialists, librarians cost too much.

The author doesn't endorse these positions, but rather gives some arguments for the continued relevance of the research library. Some of these are quite ingenious: for example, the main problem for libraries is much less the obsolescence of print media than the sudden glut of new materials -- more than three times as many books are published now as were just thirty years ago, not to mention new journals, periodicals, etc.

This might be a good moment to mention what some of us are calling the dematerialization thesis -- the argument, whether in valediction or lamentation, that digital media has overcome the meaningful physical limitations that characterized earlier forms. As far as I can tell, the locus classicus of this thesis (with respect to modernity writ large) is German sociologist Georg Simmel's 1900 book The Philosophy of Money, but another good source is Friedrich Kittler's more recent Discourse Networks 1800/1900, and its follow-up Gramophone, Film, Typewriter. There are all sorts of problems with this argument (which I'll detail at another time), but it's undeniable that, faced with a space crunch, institutions are opting for what I'll call the greater material flexibility of digital media. Many libraries, perhaps most notably that of Stanford University, have made huge investments in converting or copying their collections into digital formats, and publishers have likewise targeted libraries as prime consumers of electronic texts, whether as a backup to or substitute for "the real thing."

Then today, the New York Times made the remarkable EPIC-esque announcement: "Google Is Adding Major Libraries To Its Database." It's true -- and these are no slouches, either. Stanford, Michigan, Harvard, and the always-difficult New York Public Library are all on board.

Since in-print books are going to be off the table, the real scoop is going to be in integrating these libraries' remarkable collections of rare, out-of-print, and manuscript material. 99% of Americans won't care, but these are pure gold to most scholars, and until recently, most university libraries were known for hanging onto these texts like they were their balls: you used to have to either be a big name or land a big-time research fellowship to even see these babies. (And I'm sure the real cream of the crop will probably continue to be withheld.)

They also happen to be the texts whose conspicuous materiality (there he goes again) actually makes them best suited for popular digitization. Imagine -- now not just scholars, but undergrads and even middle and high schoolers can see and examine rare, delicate, or simply unavailable primary documents from anywhere in the world, without having to travel long distances or actually get their grubby little hands all over them. For my money, the real steal won't be in electronic texts as such, but in digital facsimiles of the real thing. Not only will books no longer go out of print -- they'll no longer even need to be printed. Yet we'll be able to maintain a significant degree of contact (ha ha) with the now-outmoded print culture of the past.

This is where Google has really surprised me. It might have been expected that Google would enter the spheres of e-mail, blogging, social networks, and the like: these are the sorts of fields a start-up can start up, following the now-industry-standard path of limited exposure among a few dedicated partisans before breaking into a wider, more lucrative market. But Harvard and Stanford are about as establishment and big-time as it gets, and between this venture and the new, hopefully improving Google Scholar, the big G has found a way both to go unexpectedly highbrow and perhaps to entrench itself decisively as the search engine of choice: the monstrous, ultimate technology of knowledge, putting the autonomous nodes of the research library and the limited search engine to rest.

Tuesday, December 07, 2004

Dispatches From the Hell of Electronic Memos

Leonard Ford just sent along a Times article ("What Corporate America Can't Build: A Sentence") documenting the woeful state of writing in the business world. Of course, if the problem were just aesthetic, it wouldn't matter much, but bad writing goes straight to the bottom line. The National Commission on Writing estimates that corporations spend as much as $3.1 billion each year teaching employees how to do better than "i am writing a essay on writing i work for this company and my boss want me to help improve the workers writing skills can yall help me with some information thank you" -- as a manager quoted in the article wrote.

Whatever else high schools and universities are supposed to do, they should be doing better than this. Then again, if corporations were willing to bankroll a $3.1 billion program devoted to English education, maybe high schools might be able to get it right in the first place. (Then again, maybe not.)

It's not all gross incompetence. The article tends to confound these separate threads, but part of the problem is the anything-goes atmosphere surrounding e-mail -- which in turn contaminates memoranda and official reports. Once-careful writers let the deceptive immediacy and informality of e-mail turn their writing into a kind of mental diarrhea. "Instead of considering what to say when they write, people now just let thoughts drool out onto the screen," says a writing consultant quoted in the article.

Another problem is technically sound but irredeemably bad writing, here attributed to CEOs and upper-management types. Another consultant reports that "many of these guys write in inflated language that desperately needs a laxative." I don't know why the business world suggests so many digestive metaphors, but it seems appropriately inappropriate. Writers rightly love those metaphors too.

It's always interesting what people are willing to allow themselves to do poorly. Most of us would never admit to being bad drivers or to lacking a sense of humor, while mathematical inability and poor spelling top the list of acceptable incompetencies. Grammar and punctuation are never far behind. I hope that businesses' attempts to get their own house in order speak to a redoubled effort to defend that last line in the sand, rather than to another barrier washed away by the tide.

Sunday, December 05, 2004

Visionary Centrism -- An Alternative

Tom Friedman's got an op-ed in today's Times exhorting President Bush to increase science funding. Why? Well, it's not just any old science funding that he has in mind: he mounts a spirited and fairly convincing case that developing a viable alternative energy source can save the frickin' world, and give Americans a genuinely universal issue to rally around.

In my note on visionary centrism, I argued that universal health care was the silver-bullet issue, combining good politics with good policy: genuinely throwing the government's weight behind alternative energy may just be the other way to go.

Saturday, December 04, 2004

Mmm -- that's good grief

Hmm -- should have gotten there before A&L Daily, but...

Read Jonathan Franzen on Charles Schulz's Peanuts in last week's New Yorker. Mixing personal reflections on his troubled family c. 1970 with a thoughtful appreciation of Schulz's literary and aesthetic achievement, Franzen gets what's special about Peanuts: it offered and continues to offer children (especially smart, sensitive children) a simultaneous escape from and recognition of their anxiety-filled lives.

Franzen's nonfiction continually wins me over. I should get on board and read his essays (and The Corrections) before the train leaves the station altogether.

Thursday, December 02, 2004

Fragment 2: On Visionary Centrism

(This post-election schrift just never came together, even after more than two weeks' worth of work. It was originally dated November 12, 2004, then re-dated November 19.)

In brooding about the fate of our nation (see Monday's post), I did what I often do: turned a sketchy idea into a catchy phrase, then tried to make that phrase 1) mean something and 2) make that something make sense.

In this case, the phrase is "visionary centrism," and it may be either a contradiction in terms or exactly what our country needs.

Like most Americans, I generally like moderates from either party. A healthy group of moderates not only makes a party less likely to move to one extreme or the other, but also keeps it more honest: members are freer to criticize party leadership without fear of reprisal, and less likely to stand by one another no matter what.

This year, I voted for a moderate Republican for the Senate: Philadelphia's own Arlen Specter. There was a lot of debate among the Democrats I know about whom to vote for, especially since a vote for Specter could give the GOP a majority in the Senate. But Specter faced a tough primary challenge this year from Pat Toomey, an extreme conservative slightly to the right of Rick Santorum. Specter is a wily one, a master fundraiser who knows how to cozy up to the leading Philly Democrats to get the votes he needs. And he's a pro-labor, moderately pro-choice Republican. I'd like to keep this group from going extinct.

I smiled when I first read that Specter was in line to chair the Judiciary Committee, and smiled wider when I read that he admitted publicly what everyone knows: that Democrats and moderates weren't likely to allow extreme pro-life judicial nominees a free pass. Then I cringed at what I knew would come: public admonishment, calls for another Republican to fill the chair, and a forced promise of a speedy confirmation process.

(Here's the problem -- I want to talk about visionary centrism -- that thing that Kennedy, Truman, LBJ, and even Nixon all had, and that John McCain has today, however limited in magnitude or scope. Then I want to use this idea to suggest that the next Democratic presidential candidate needs to campaign on a big issue that most Americans can get behind, then convince them that they're the candidate who can get it done. This is what the Republicans currently have with the war on terrorism. For the Democrats, the silver-bullet issue could be several things, but I'm betting on universal health care.)

(Instead I'm talking about Arlen Specter -- already old news -- and the advantages and disadvantages of cynically voting across party lines. This is actually close to the opposite of what I mean -- but there's no way back once you're off the rails like this.)

Wednesday, December 01, 2004

Fragment 1: On Speaking and Writing

For nearly all of November, I've been starting posts that I can't quite finish, whether for lack of time or space, or because I just don't know how to write them yet. This week I'll be pumping out those fragments in the hope that their presence on the web will free me up either to bring them to a conclusion or to move on to other things.

...

I like to say that success as a scholar means doing three things well: reading, writing, and speaking. Of course, I'm scholar enough to remember that this is a modification of Nietzsche's fourfold educational prescription in Twilight of the Idols: one must learn how to see, how to think, how to speak, and how to write.

When I was eighteen, I internalized this completely; only later did I realize that in reading, writing, and speaking, seeing and thinking happen, and happen differently there than elsewhere. Reading, writing, and especially speaking are always potentially opportunities for rediscovery -- that is, for seeing and conceptualizing the world anew. This has particular meaning for me since, at least for the moment, I've devoted my intellectual life to texts' ability to connect us with a lost material world. From an early age, nearsightedness and an overactive imagination led me to experience reading spatially as well as linguistically -- for as long as I can remember, I've not only read, but read silently, internally, and abstractly. I also closely identify with the tradition of biblical scholarship (Jewish and Christian) in which preaching and written commentary (bound to a bodily ritual) shed light on a text that is somehow more real than reality itself.