Wednesday, January 31, 2007

An Extremely Fascinating Hominid

Salon has a good interview with the anthropologist Barbara J. King about the role religion played among early Homo sapiens and our primate cousins.

I especially like two things about King's reading of our primitive past. First, I think her treatment of religion not as a set of propositions about the universe but as a way of making social and symbolic relationships (i.e., meaning) is basically right. Second, this reinterpretation lets her make a very interesting argument: that religion in this sense isn't an accident of our evolutionary development, but may actually have let us outcompete the Neanderthals, who had a sophisticated sense of religious practice -- but apparently not sophisticated enough.

And like King, I think the Neanderthals -- complicated humanoid primates who walked the earth with Homo sapiens but evolved along their own separate line -- are just super-cool.

Silverman is Magic

Man, I wish I still had cable. "The Sarah Silverman Program" debuts on Comedy Central tomorrow; the NYT has a write-up of the show in today's paper.

Among its many other attractions -- and I have a man-sized crush on Sarah Silverman that's impossible for me to lie about -- is its inclusion of "Mr. Show with Bob and David" alumni Brian Posehn and Jay Johnston in the cast. Jay Johnston may have been the least funny person on that astonishingly funny show, but in the absence of a full list of the show's writers, the fact that both are on board is a good sign.

If only they could snap up the Brothers Odenkirk.

Tuesday, January 30, 2007

Mass Transit That Isn't

Portland, Oregon just built an Aerial Tram -- yes, a streetcar suspended from cables in the air -- connecting the Willamette waterfront to the Oregon Health and Science University. The cars and stations look stunning; the trip sounds like a fantastic tourist experience; and in some ways, the fact that the route is so limited and the cars so few -- just two 78-passenger cabins, probably fewer than the number of people I was jammed in with last week in one car on the Market-Frankford line in Philly -- only adds to its charm.

One woman, according to the New York Times, said that she'd "like it even more if it was 20 times slower and they served cocktails." And apparently, so many people wanted to ride it when it opened to the public this weekend that the city decided to make the tram free of charge every Saturday in February.

The regular fare is $4, roundtrip; an annual pass is $100.

The aerial tram seems to solve some of the problems of mass transit in already crowded cities, like how to create a private right-of-way without digging tunnels or closing streets. But in some ways, this little leg of Portland's already copious transit system isn't concerned -- or at least isn't primarily concerned -- with the typical problems of mass transit at all. The tram shaves ten minutes off the commute by car, but shortening a fifteen-minute commute isn't a problem that demands this kind of public outlay from a city and a school. Instead, the goal seems to be principally aesthetic: "Some say it will give eminently livable Portland an aesthetic exclamation point it lacks, something like the Golden Gate Bridge or the Space Needle in Seattle."

Portland may well be the only city I've never visited that I often think I'd very much like to live in someday.

Monday, January 29, 2007

One Man's "Good Government" Is Another Man's "Creeping Feeling of Dread"

This is one for the "Fascism" tag over at Points of Note. It's from a NYT article titled "Bush Directive Increases Sway on Regulation."

In an executive order published last week in the Federal Register, Mr. Bush said that each agency must have a regulatory policy office run by a political appointee, to supervise the development of rules and documents providing guidance to regulated industries. The White House will thus have a gatekeeper in each agency to analyze the costs and the benefits of new rules and to make sure the agencies carry out the president’s priorities.
...
The White House said the executive order was not meant to rein in any one agency. But business executives and consumer advocates said the administration was particularly concerned about rules and guidance issued by the Environmental Protection Agency and the Occupational Safety and Health Administration.


This, on the other hand, is a handy definition from Wikipedia:

Commissar is the English transliteration of an official title (комисса́р) used in Russia after the Bolshevik revolution and in the Soviet Union, as well as some other Communist countries.

It is used to distinguish the title from similar titles in a variety of languages (such as commissaire in French or kommissar in German), which are usually translated into English as commissioner.

In Russia, the title was associated with a number of administrative and military functions in the Bolshevik forces during the Russian Civil War and the Soviet government afterwards. During the war, the White Army widely used the collective term bolsheviks and commissars for their opponents.

There were two well established titles: People's Commissar (government) and political commissar (military).

The term derives from a similar term in French to describe the equivalent of the rank of Major both in the army of the ancien regime and the French revolution. Such officials were not military officers but reported back to the political authorities: the king and the National Assembly, respectively. It is the use by the French revolutionary government which gave the idea to the Russian one.


Let's avoid the fine points about exactly which horrible phase in human history this reminds us of most, or whether it's some brand-new variation of brazen political power-grabbing. I hope we can all agree that this largely-under-the-radar move is six different kinds of messed up.

Just Larking About

The UK and Ireland have their own version of Apple's Justin Long/John Hodgman Mac vs. PC ads. Good show. (Via CNET.)

Reflecting Reading

Just bouncing a couple of really good links along. The first, via Arts and Letters Daily, is a first-rate essay by Joseph Epstein about turning 70. It's smart, and often funny -- I read it out loud to my wife, and it performs very well. The most wistful part, at least for me, is when he lists the books he hopes to re-read before he dies. (He's already knocked out Tolstoy's War and Peace, and hopes to give Proust, Cervantes, Herodotus, and Montaigne one more go.)

The other, via Snarkmarket, is Michael Pollan's essay on "Nutritionism" from The New York Times Magazine. Calling it an essay on "nutritionism" is potentially misleading -- that strange word is one of at least a dozen terrific ideas, observations, and arguments Pollan makes about the history of our eating habits before it's all through. This is perfect long-form journalism -- too short for a book, way too long for a story, op-ed, or blog post. In addition to making you think, it just might make you change your life. It also makes me think that I should give Pollan's The Omnivore's Dilemma a whirl. After all, the vagaries of my diet aside, I'm still only 27 -- Herodotus can wait a little longer.

Thursday, January 25, 2007

Email and Blogging, Copyright and Test Prep

Over at Snarkmarket on January 5, I wrote in a comment about something I found peculiar in my own internet habits. Here's the relevant chunk of my comment:

I'll often blog or comment on somebody else's article, but hardly ever email an author directly to tell him or her what I think. Part of the electronic public wants to continue the conversation with the author, but many of us are having our own conversation about somebody's thoughts, deeds, or words without ANY reference or input from them directly. This is understandable when it's an author, politician, artist, or other public figure -- there's a tradition there of having some kind of divide between the people who talk and those who are talked about. When it's other people who are in a similar situation to our own, it's more than a little weird.


A few days ago, I decided to break out of this habit and email another blogger directly rather than blogging on his post. The blogger happened to be Cory Doctorow at Boing Boing. Doctorow posted an anecdote sent to him by a friend about a copyright-protection clause the friend was made to sign when taking the GRE (Graduate Record Examination). Here's the text of Doctorow's post:

GREs: Cheating == copyright infringement

Friend of Boing Boing sez, "I took the GRE this afternoon, at a computer testing center in Brooklyn. I was forced to sign (and copy in longhand!) a statement about how I wouldn't tell anyone what was on the test. I was particularly upset by the following statement."

If I reproduce test questions in any manner I am subject to a copyright infringement lawsuit and any other action(s) ETS may take.

"I can understand ETS saying that telling someone what's on the GRE is dishonorable behavior, and that if you're found cheating they'll 'take measures' against you -- perhaps invalidating test scores. But 'copyright infringement lawsuit'? That's crazy talk!"


I'm more or less sympathetic to Doctorow and his copyright-abuse hobbyhorse, but it seemed like there was an important part of this story that the friend (and Doctorow, by passing it along without comment) misunderstood. So I found Doctorow's email address by following the link from his Boing Boing site to his personal page, Craphound. Actually, its full title is "Cory Doctorow's Craphound.com."

Here's the text of my email:

Dear Cory,

I read your recent Boing Boing post on GREs and the copyright waiver
your friend was made to sign. Having worked for several test prep
companies and taught the GRE in particular, I can say that ETS is much
less concerned with student-to-student cheating than it is with the
unauthorized reproduction of test questions by companies like Kaplan,
Princeton Review, and other test prep providers.

ETS publishes its own book of sample GRE exams
(http://store.digitalriver.com/store/ets/DisplayCategoryProductListPage/categoryID.3551800)
that are authentic, preadministered tests. Test prep companies have to
pay to develop their own sample tests and questions for their courses,
based on the publicly available ETS materials. Unlike ETS, they're not
allowed to say that their questions are "actual" test questions...
although it's been known to happen that some are slightly modified or
outright lifted versions of ETS-developed questions.

Best,

--Tim


To me, at least, this is a more interesting story than the one Doctorow tells. It's not that the GRE is equating cheating with copyright violation. It's that they're not really concerned with cheating at all, or at least, not as much as they are with protecting their own intellectual property, which just so happens to provide a lucrative publishing business on top of their primary business of writing and administering tests.

I had high hopes that Doctorow would add a chunk of my clarification to Boing Boing, not least because the bloggers there do that sort of thing all the time. But it's been three days, and no luck. I don't know if he hasn't seen the email, if he doesn't trust it, or just isn't interested in clarifying the story. (Instead, he posted on Craphound about how he'd made Forbes's list of the "Top 25 Web Celebrities.")

I don't want to be too hard on Doctorow. I very nearly called another blogger a coward for not posting a critical comment, when it turned out that the blog software just wasn't working the way it was supposed to. But it's left me a little down on the idea of carrying on direct conversations with other bloggers. Blogging may be the public sphere, but it doesn't leave much room for conversations on the side. Unless, next time, I try Xeni Jardin.

Tuesday, January 23, 2007

More Brooklyn Photos

The Brooklyn Museum had two top-notch shows that I got to see with my sister and cousin. It was the last day of the big Annie Leibovitz retrospective, and there were no separate tickets for the special exhibition, so the place was packed: amusement-park-length lines and bodies smashed against each other all around.

Here's a good pic of one of AL's photos of Susan Sontag:



There was also a special exhibition of Ron Mueck's work that I'd read about. It was startling, but after all the hype I was surprised to find myself underwhelmed. Mueck does miniature and gigantic renderings of human bodies, using silicone and other materials to faithfully recreate individual strands of hair, etc. Here's a good image:



I took two photos myself, using the iSight camera on my laptop, which proved to be surprisingly good for this sort of thing. The museum's wifi capability was a real lifesaver too, as I was able to figure out when I could catch a train back to Philly. Here's one of Mueck's sculptures, surrounded by the thronging, curious crowd.



The display room where Mueck's work was shown was strangely empty, especially compared to the packed-to-the-gills, not-entirely-coherent displays in the permanent collection. Immediately across from the large statue I just photographed was a utility door with an electrical outlet on either side. I made a long Duchampian joke about the door being a true (if accidental) work of genius: the "really real" in a room of clever, conspicuous fakes. "The unopened door," I said, "both invites and resists us."



That said, I remain unconvinced that I'm not (accidentally) right. To quote Oscar Wilde: "it is a terrible thing for a man to find out suddenly that all his life he has been speaking nothing but the truth."

Monday, January 22, 2007

Large Paragraphs, Eyes Wandering

Subtitled:
For Current Readers -- One German Answer to Robin's Question

On the off chance that Robin's post at the Current blog (Hey! I'm billed ahead of the Times!) redirects RSS-happy DIY TVlings to this humble but necessarily difficult blog about literature, academia, city life, and indie rock, you should know what you're getting yourselves into.

This is the first paragraph from the foreword to Nietzsche's On the Use and Abuse of History for Life, one of his Untimely Meditations. Substitute "blogs" or "news" for "history" and I think his sentiment is nearly as perfect for 2006 as it was for 1874.

"Incidentally, I despise everything which merely instructs me without increasing or immediately enlivening my activity." These are Goethe's words. With them, as with a heartfelt expression of Ceterum censeo [I judge otherwise], our consideration of the worth and the worthlessness of history may begin. For this work is to set down why, in the spirit of Goethe's saying, we must seriously despise instruction without vitality, knowledge which enervates activity, and history as an expensive surplus of knowledge and a luxury, because we lack what is still most essential to us and because what is superfluous is hostile to what is essential. To be sure, we need history. But we need it in a manner different from the way in which the spoilt idler in the garden of knowledge uses it, no matter how elegantly he may look down on our coarse and graceless needs and distresses. That is, we need it for life and action, not for a comfortable turning away from life and action or merely for glossing over the egotistical life and the cowardly bad act. We wish to use history only insofar as it serves living. But there is a degree of doing history and a valuing of it through which life atrophies and degenerates. To bring this phenomenon to light as a remarkable symptom of our time is every bit as necessary as it may be painful.


Consider this the equivalent of an essay assignment on the first day of class.

(This is the way my conversations with Robin go. He asks a question about the internet, cars, gadgets, or public life, and instead of an answer, I give him the gist of what three German philosophers might have thought about it. Read my first-ever comment on Snarkmarket if you don't believe it. Then read Robin's response.)

And A City Cheered

Philadelphia lost, and Detroit never got close, so now I'm fully free to root for the team from my favorite city in the world.

Go Bears.

P.S.: Not only are the Bears and the Colts just a jog on I-65 away from each other, Lovie Smith and Tony Dungy are, together, the first black coaches to reach the Super Bowl. That's pretty cool.

Friday, January 19, 2007

I Am a Font of Biblical Knowledge

You know the Bible 98%!
 

Wow! You are awesome! You are a true Biblical scholar, not just a hearer but a personal reader! The books, the characters, the events, the verses - you know it all! You are fantastic!

Ultimate Bible Quiz



Do you want to know the craziest part? As far as I can tell, the only question I got wrong was the one about the talking donkey.

Technology and Crowds, or See, Here's the Gist

Snarkmarket's got a good discussion going about the iPhone. Well, it's not really about the iPhone; it's about Iraq. Or it might be about Alexander Hamilton, or Wikipedia. I'm not totally sure any more.

What it's really about is secrecy vs. openness, and the comparative appeal and danger of each. The iPhone was secret; ComputerWorld argues (convincingly, I think) that it should have stayed secret longer; and despite some legitimate concerns and sad haters, it looks totally awesome. This seems at least in part because it was designed in secret. As Robin Sloan says: "The iPhone was most definitely made by professionals. They were locked in a room. No one blogged."

iRaq, on the other hand (or as Matt Thompson calls it, "Desert W. Storm"), was likewise cooked up in secret by a cabal of cooks, and it's a total disaster. And -- perhaps more tellingly -- many of us are troubled in principle by the idea of secrecy, even if the end result is something we would applaud. After all, even Mussolini claimed that he made the trains run on time.

Technology, I would argue, poses another problem, especially when it comes to design. We're pretty much willing to accept a certain amount of mystery when it comes to the production of art on the one hand and, in many cases, the workings of mechanical devices like cars or appliances on the other. But computers, and especially the information on computers, seem different. Information seems like it should be open to public scrutiny, or at least expert scrutiny. The greatest moral strength of open-source projects, whether they're encyclopedias or software, is that information is subject to review, critique, and revision. It's often their greatest practical strength as well, but I'd say their primary appeal is that they square with our already-formed ideals as users about how information in general should work. I deliberately add "as users" because many people who produce or deal in information often think about this differently.

I was the first commenter on this thread at SM, but lots of other smarties have picked it up. (I secretly suspect Robin has edited his original post since I first commented on it -- I remember it being shorter and only slightly less awesome.) Since not everyone who reads Short Schrift also reads Snarkmarket, here's the gist of what I've already said about this topic over at that blog. I'll fill in some of the gaps with parenthetical commentary.

1. God, I'd have been willing to trade a little openness if the war in Iraq had been run monolithically and near-secretly but executed as well as an Apple product.

I'm serious. Not only was the war cooked up by some pretty misguided (and misguiding) people, but the more you find out about it, the more it seems like an open-source project gone all wrong, dragged in too many directions by people with opposed goals, beliefs, and principles.

Rumsfeld wants to win the war with a third of the troops; Bremer decides to dissolve the Iraqi army. The army can't decide (or figure out) whether it's on the way out or staying for good. Part of the government rushes to put Iraqis in charge of everything, while another wants to run the place like a colony and give all the jobs to Halliburton and Bechtel. Some misguided idealists decide to start the plumbing and oil infrastructure from scratch, bypassing the people who know the most about the system and blowing millions on software programs the Iraqis don't know how to use. I mean, the stories you hear are just crazy.

And now you hear people like Richard Perle or John McCain essentially arguing that if the war had been fought their way -- if the software and the hardware had been designed to work together from the start -- then the situation would have been much better than it is today. I don't know whether that's true -- even if we'd gotten Colin Powell's war, or John Kerry's. (Both of which, paradoxically, involved bringing more people into the process.)

Maybe I'm confusing the transparency/secrecy distinction with a pluralistic/monolithic design scheme. There are plenty of hybrid forms. There's no reason why a group of people can't be secret about their activities even as their contradictions pull them apart, and no reason why a single designer can't be open and transparent about his/her actions, even while maintaining the unity of what's created. Likewise, a collective can be very restrictive on what end-users do with their product, and a single integrated design can allow for a lot more customization and alteration. It's all about what works, and to a certain extent about what you really value: efficiency, beauty, universality, a chance to contribute something yourself...


(Robin introduces a note about diffusion/unity and order/chaos, and suggests that what's really important is leadership. Also note that Robin dropped the F-bomb in his original post -- that is, he quoted the Federalist Papers.)

2. Come to think of it, Hamilton might be the perfect instantiation of a different model of transparency: the deliberative executive. Hamilton's best argument against executive-by-committee is that the ability to diffuse blame "tends to conceal faults and destroy responsibility." (Fed 70) An executive needs to be able to explain what he thinks and the meaning of his actions, and to accept praise or blame for them. It might be a model of transparency more sympathetic to notions of noblesse oblige than, say, the socialist hive mind -- but does that make it necessarily less transparent?


(Saheli says that I'm awesome, but gadgets, eh. I heart you too, Saheli.)

(Robin gives the example of Gmail as an especially well-executed form/function synthesis, probably designed by a single designer or small group.)

(Howard distinguishes between great products and great experiences: Apple provides the latter even more so than the former.)

3. Gmail's a good example, because at least part of the success of Gmail has been in exceeding our expectations for what proprietary internet mail [can do]. This isn't just true in terms of storage and search, but also price and accessibility. The killer innovation of Gmail might be the fact that you can access your Gmail account from any mail reader, or forward your mail from the account for free (neither of which you could do w/most free web-based mail accounts). It's just that Gmail's interface is so perfect that the circumstances are rare in which you'd want to. If I could forward my Yahoo! mail to my Gmail account, I would never use my Yahoo mail again.

What we really want, I would say across the board, is for a single designer, or small group, to provide innovative and distinctive experiences and objects. Then we want the right to do whatever we want with them.


I think this last paragraph summarizes my position as well as I can state it. I would go further and argue that this is the position of most people -- those whose idea of a sexy good time is not keeping people from knowing the history of the civil rights movement, drinking Mountain Dew while writing code for a killer disk management app, or preserving the virtue of the "Thundercats" entry on Wikipedia. (Virgo intacta.) Am I wrong?

Tuesday, January 16, 2007

All the HTML and CSS I know...

... went, piece by piece, into this really quite modest redesign of the Short Schrift template.

And most people probably read it with feed readers anyways, if at all. But it makes me happy.

(And hey! I've got more posts in January than in the entire second half of last year. Let's hope this newfound productivity keeps going.)

Monday, January 15, 2007

A Prayer on the Sidewalks

If, today, you're looking for television footage of or about Martin Luther King, Jr., don't watch the History Channel. There are only two hours of MLK- or civil-rights-related programming on the channel today, and they already aired once this morning and again this afternoon. Just for comparison, there's a full two-hour documentary about Harry Houdini that's also airing twice today, including in prime time. The channel spends more time covering The Da Vinci Code in a given week than it has ever spent on the civil rights movement or African-American history in a year, including February.

Last year, there was no King-related coverage on the holiday at all. My wife and I will never watch the History Channel, A&E, or Biography ever again.

PBS, by contrast, is a salvation. I don't know what they're broadcasting on any given channel, but you don't need to go anywhere on the web besides the PBS site for Eyes on the Prize, the seminal documentary on the civil rights movement. The original documentary aired in two parts in the late 1980s and was finally rebroadcast this fall. For years Eyes on the Prize had been held up due to copyright issues with the extensive archival footage it uses. Now it's finally available on DVD, but for nearly $400 -- to me, at least, a clear case of copyright abuse. If you can get your local academic or public library to buy it, it's well worth watching. I hear it's surprisingly (and deliberately) widely available on filesharing networks too.

The Eyes on the Prize site is a genuine treasure. It features video clips from each of the 26 sections of the documentary, primary textual documents written by and about both major and minor historical figures and ordinary people (the letter from freedom rider John Dolan's father shows you just how far outside the mainstream the civil rights movement was held to be), and snappy profiles of a slew of the major figures featured in the documentary. As an example, here's a chunk of the profile of Ronald Reagan, who, according to a horrifying Discovery Channel/AOL poll, beats both King and Abraham Lincoln as "the greatest American":

As governor of California (1967-1975), Reagan clashed with activists like the Black Panthers in Oakland, and encouraged the firing of University of California professor Angela Davis. In the presidential campaign of 1980, he invoked "states' rights" which some heard as a code phrase for racism. As America's 40th president (1981-1989), he tried to eliminate the Department of Education and cut back on loan and assistance programs that helped underprivileged students get a leg up. Throughout his political career, Reagan undercut anti-poverty programs and urban assistance while cutting taxes, widening the gap between rich and poor. In support of business, his administration failed to enforce the Community Reinvestment Act, which outlaws racial discrimination in home loans, and made many federal contractors exempt from Affirmative Action programs.

However, during his presidency, Ronald Reagan signed the bill that declared the third Monday in January a national holiday honoring Dr. Martin Luther King, Jr.


There's one video that particularly stands out to me. It's from Albany, Georgia, widely held to be the civil rights movement's greatest defeat, where the nonviolent tactics of desegregation failed. The local police chief refused to combat the protestors directly with outright brutality, but relegated them to prisons outside of town so his jail never filled up. He paid King's and Abernathy's bail so the two leaders could avoid becoming flash points. The SCLC and SNCC left town without desegregating the city. In the video, the protestors weep and pray on the sidewalks before they are arrested and carried off on stretchers. At the end, they clap and sing, "Ain't Gonna Let Nobody Turn Me Round." It is a truth machine that will not be stopped.

I got into an argument with my wife this morning over the content of my earlier post, and my feeling that most of the local events for MLK squandered the potential of a holiday devoted to a man like this. My wife's position is that the King holiday is the most important holiday we have, since it's the only nonreligious holiday where people are called on to help someone other than themselves or their families. I agree, but I'm frustrated by isolated charity, endless talk about the meaning of King's message, or empty two-block marches that only engage nostalgia, like recreations of George Washington crossing the Delaware. I'm usually the first person to get cynical about political marches or sit-ins or boycotts, but on MLK, I am ready to march anywhere, because I truly believe.

I don't like the normalization of King, or civil rights history: it's like it's been stuffed and mounted, turned into beautiful phrases that express what, it is obvious (isn't it obvious?) that we all think and believe, without having to confront the content of the message, its challenge, its not-yet. It's as though we can't bear to look at King directly, so we direct him to the out-of-town prisons of impoverished schools and afternoon television shows, carried off on the stretchers of empty tributes and hollow homages. Whether it's racial equality, crushing poverty, the deaths of scores of thousands in Iraq, civil rights for gays and lesbians, or the horrifying treatment of detainees in the war on terror, something must give. We may have left the mountaintop, but we haven't crossed Jordan yet.

The Noise of the Long Tail

I'm not always a fan of Joey Sweeney's hipster Philadelphia blog Philebrity, but this is pretty cool: YouTube clips of "50 Fabulous American Freaks." It's mostly musicians like David Byrne, Tom Waits, and Missy Elliott, but it also includes people like Allen Ginsberg, Gilda Radner, John Waters, and Richard Pryor. It doesn't capitalize very well on the "Martin Luther King was a nonconformist" conceit -- where are all the other political freaks? -- and I think Sweeney had it in the can before he decided to use it on MLK. It's a nice reminder that today isn't just about us all getting along, but about some of us having the courage to refuse to fit in. (More thoughts on MLK later.)

Saturday, January 13, 2007

The Grey, Incomprehensible Powder

I accidentally found this Jorge Luis Borges poem while searching on my hard drive for something else. The poem was so beautiful, and the sentiment so appropriate to its discovery, that I had to share it. (The unraveling of that last paradox is left as an exercise for the reader.) The translation is by Robert Mezey and Richard Barnes.



Things

Book fallen back, now hidden by the others
In the deep recesses of the shelf and covered
Slowly and silently by the thick dust
Of many days and nights. Phoenician anchor
The seas of England in their blind abyss
Press hard in on. Looking glass that copies
Nobody's face now that the house is empty.
Fingernail parings that we leave behind
Along the road of time and space. The grey,
Incomprehensible powder that was Shakespeare.
The ever-changing figures of the clouds.
The rose, symmetrical and momentary,
That chance once flashed upon the inward glass
Of a boy's kaleidoscope. The sweating oars
Of Jason's Argo, the first ship. The footsteps
The sleepy, mortal wave washes the sand of.
The radiances of Turner when the lights
Go out in the long gallery one by one
And no step echoes in the lofty dark.
The back of the densely printed map of the world.
The gossamer cobweb in the pyramid.
The blind stone. The inquisitive, tactile hand.
Dreams I have had just before dawn, and lost
When daylight came, clearing it all away.
The beginning and the end of the epic poem
Of Finnesburh, now but a few precious lines
Of iron not worn away by the centuries.
The letter's mirror-image on the blotter.
The turtle lying in the cistern's depths.
What cannot ever be. The other horn
Of the unicorn. The Being both Three and One.
Triangular disc. Point the mind cannot grasp
When Zeno's arrow, motionless in air,
Arrives at last at the target. Flower pressed
Between the pages of a book by Becquer.
The pendulum whose swing time has arrested.
The sword that Odin drove into the tree.
The text on uncut pages. The resounding
Clatter of hoofs in the onslaught at Junin,
A battle which in some eternal way
Has not ended, is still part of the plot.
Sarmiento's shadow on the sunlit walls.
The voice the shepherd heard on the mountainside.
The pile of bones whitening in the desert.
The ball that shot Francisco Borges dead.
The reverse side of the tapestry. The things
That no one sees but Bishop Berkeley's God.

The Other Hague Convention

The NYT has a strong, heartbreaking article on international adoption programs. This article is about programs in Ukraine where children visit with host families in the USA, then are made to return before the families can adopt them. The families then have to travel to Ukraine to finalize the adoption. The process is rife with graft and misinformation, and neither the children nor their prospective adoptive families are guaranteed that they will ever see each other again.

The article has a throwaway mention of an international adoption treaty that would require accreditation of adoption agencies, so I looked it up. The US, after fifteen years, will probably ratify the Hague Convention on International Adoption and implement it in 2008.

Friday, January 12, 2007

The Suicidal Death March of Academic Writing

In this week's Chronicle of Higher Education, Thomas H. Benton writes on publishing and the ivory tower. It's a temp link only, but I'll give the highlights.

First, there's the call-to-arms:

It's time for most of us -- and I am thinking in particular of younger academics -- to abandon the genteel pose of being aloof from the sordid marketplace. We should stop acting as if we were monks, destined for a lifetime of cloistered self-denial. Or romantic poets who die penniless and forgotten in their own time, but whose genius and poignant suffering will, one day, move the world to tears.

If we are going to avoid being blockheads, we are going to have to start writing books that more people will want to buy as something besides remainders.


Then there's the absurdist cautionary tale:

I have a colleague who is working on a visual-history project -- her first book -- and, apparently, few university presses can publish it in the kind of format that would do justice to the material. And few such presses are likely to invest much in promoting the book.

But after hearing my colleague give a lecture, a distinguished trade publisher jumped on the project immediately. It seems like an excellent opportunity for my colleague to produce a beautiful book that might reach a wide audience.

Yet my colleague wondered, "Is this respectable?"

At first I guffawed, but perhaps my colleague was right to worry, depending upon her career aspirations. A serious book, rigorously edited, that is read by 10,000 people rather than by 100 -- and that required no subvention -- is often discounted because it was published by a commercial publisher rather than a university press.

As a graduate student, I remember being warned that writing for mainstream audiences would become a red flag in my Google portfolio that I would never be able to escape. And Ph.D.'s who contribute to the blogosphere have been warned by the famous Ivan Tribble to watch their words. In many quarters -- particularly at research universities -- anything but scholarly articles in refereed journals and university press books indicates a lack of seriousness and commitment to the profession.


Then there's the personal confession:

I think becoming a columnist is the best thing I've done with my academic career. I'm sure it has frightened some prospective employers (pseudonymity seldom lasts), but it has also led to talks with agents and publishers. And, finally, it is beginning to lead to contracts to publish books that I think are as serious as my academic work but aimed at a much larger audience.

So far, the experience has been more rewarding and intellectually exciting than most of the crabbed, obscure writing I did when I was trying to prove I was a competent scholar. I find that thinking about a general audience makes me a better teacher than my academic writing ever did. My theory-soaked younger self must have sounded like a man from Mars to most undergraduates.

I no longer feel beholden to the petty rivalries and resentments that characterize academic life. It's like being born again. Imagine it for yourself: There are people out there -- possibly millions of them -- who are willing to pay for the pleasure of reading your work. Those people could give your ideas, expressed in a single mainstream book, the impact of a lifetime of scholarly writing. You can also earn the posthumous respect of Samuel Johnson, as well as the relatives who warned you that professors tend to be paupers. Poverty and obscurity, as the world sees them, are not necessarily signifiers of academic virtue.


Yes, "Thomas H. Benton" is a pseudonym.

And what he says is true enough that some time ago, I removed anything on this blog that reveals my full name.

Wednesday, January 10, 2007

A Dish Of Effortless Purity

A witty, enthusiastic, wistful, and nearly perfect appreciation of Momofuku Ando, the recently deceased inventor of instant ramen noodles, and especially the fearfully salty, delicious noodles he gave us all. By Lawrence Downes, suburban metro editor for the New York Times.

Tuesday, January 09, 2007

Techno-lust Regained

Apple's new iPhone.

My goodness.

According to ZDNet: "The iPhone will be available in June for $499 with a 2-year contract (4GB model). The 8GB model will be $599. It will be sold through the Apple stores and Cingular stores."

And yet, somehow, I want it now.

The Title Is Unintentionally Apt

Brian Boyd, in a puzzling article in The American Scholar titled "Getting It All Wrong," somehow manages the impossible: to write about science like a literature professor, and about literature like a science professor. I swear, the beef that some science-minded people have with literary theory makes no sense to me at all.

Boyd's argument seems to go something like this: literary theory (which seems to mean either a few quotes from Louis Menand or a hastily drawn caricature of Jacques Derrida's philosophy) is all wrong, because evolution already says the same thing. Huh? Oh, I guess with evolution we know how we know stuff. Because see, we've got braaaaiiinns.

The peddling of evolutionary literary criticism has been bothering me since it emerged around the same time I entered the field five years ago. I'll try to explain why.

As I see it, biology-based literary criticism has at least two fatal flaws. First, it limits the number of interesting (or true) things you can say. Essays about "Madame Bovary's Ovaries" don't tell us anything new about Emma Bovary or the book she appears in. They're just barely dressed-up versions of the argument that Emma acts the way she does because she's horny and Charles isn't much of a catch. What about this don't we know already? If my lit students wrote papers that uninformative, I'd tell them to write the paper again, and to try harder this time.

Historical criticism of the New or old variety and one kind or another of close reading (philosophically informed or otherwise) are, as far as I can tell, the only games in town for saying something new about a literary text. Biological criticism runs into the same problem as the old, aesthetic school of criticism: why, especially if I'm a professional scholar, would I buy a book that tells me something I already know? That's why the old Oxford dons who peddled their sparsely reasoned opinions about whether Pope or Dryden was the more exquisite read died off. (Even if they were reincarnated in the latter-day Harold Bloom -- who does occasionally have other interesting things to say about literature.)

Biology might form a good basis for discussing general epistemology or aesthetics -- say by identifying what part of the brain responds to narrative, how it's different from the part that responds to lyric poetry, and showing how that lobe developed from the part of a chimp's brain that responds to visual memory, danger, and sexual desire. Someone's gotta be working on that, and that would be really interesting. But that's really the proper province of biologists, and maybe some philosophers. Literary criticism, especially criticism about a particular literary text, is a different business. Until I see one of these English profs in a tweed lab coat with leather patches, I'm not buying it.

This brings me to the second flaw of most evolution-informed lit crit: in addition to being bad literary criticism, it's usually bad science. The one that really gets under my skin is the reading of the Aeneid (also in the Barash and Barash volume). According to the "Madame Bovary's Ovaries" authors, we can see evolution at work in the Aeneid, because what's important is that Aeneas founds a city for his descendants to live in for millennia to come. As Barash and Barash put it (in their 2002 Chronicle of Higher Education article):

If Aeneas's genes could spell out their reckoning, it would go somewhat like this: Although staying with Dido is pleasurable, you -- and thus, your genes -- have bigger fish to fry. When the alternative is maximizing your inclusive fitness by founding a dynasty, a sterile dalliance with a middle-aged woman is maladaptive. So Aeneas sets sail once again, revealing, as he departs, an intuitive comprehension of his actions. Thus, as he pleads for Dido's understanding, Aeneas explains, "It is not my own free will that leads me to Italy." In his conscious mind, it is the gods who dictate Aeneas's actions, but deep down, his biological impulses compel him to leave, a kind of ancient "My genes made me do it."


No. First of all, there is no biological imperative to found Rome. If Dido's genetically a bad choice for Aeneas, he could hook up with some other Carthaginian babe, or better yet, not leave Asia at all. (I don't even remember Virgil saying that Dido couldn't bear Aeneas children.) He doesn't have to travel across the Mediterranean, let alone to Hades to converse with the shades of the dead (is there a biological interpretation of that scene?). Odysseus leaves lots of fertile babes (including Helen of Troy herself) to get back to Penelope, who, hot as she is, can't be a spring chicken.

Also, at least if Darwin's right, human beings don't really think about genetic succession that way. We mate with the people we mate with, and if our genes are successful, our traits get passed on. At the micro level, it's an accident: only at the macro level does it really begin to make scientific sense. Saying that Aeneas leaves Dido because he wants to father generation after generation of children is a little like saying giraffes got long necks because they stretched them out trying to reach higher branches. It sounds intuitive, but it really doesn't work that way.

Barash and Barash propose that we test aesthetic works by their evolutionary plausibility -- a character's motives are believable if they make good evolutionary sense. But this presupposes that literature is only concerned with evolutionarily successful behavior. What about evolutionarily unsuccessful behavior? What about homosexuality? For a Darwinian literary critic, is gay desire implausible on its face?

If anything, evolutionary criticism does tell us a little bit about what interests us in literature, but only in a negative sense. Because it's really the genetically suicidal or inscrutable that's aesthetically compelling. Gregor Samsa turns into a monstrous vermin, dying with the thought that he has ruined his family. Proust's narrator, who interrupts his chronic action only for exactingly acute analysis of the foibles of society and the irrationality of human desire. Achilles, who, knowing he will die in Troy, vows not to leave until he has killed Hector in revenge for his friend's death. (Evolutionary biology might explain Priam's horror at Hector's defiled body, but not why Achilles defiled it.) Or Aeneas, who sacrifices sexual pleasure so that he can build the walls to a city for someone else's children to live in. (Romulus might or might not be descended from Aeneas, but the genetically successful ones aren't the Trojans, but the native Latins. That is, Virgil himself, who proudly claims his native Italian blood and declares that Rome will be a city of many and mixed races.)

Sadly, Boyd's entry in these lists, this time taking on literary theory more directly, isn't even this cogent. He relies on virtually nothing but fallacious and hackneyed arguments.

"The idea that there is no universal truth runs into crippling difficulties straightaway, since it claims to be a universal truth."

"In fact, not everything in human lives is difference. Commonalities also exist..."

"One of the most extreme advocates of difference was Hitler..."

"For most of the 20th century, anthropology has stressed the difference between peoples, since anthropologists earn attention by reporting on the exoticism of other ways of life."


Bad science to the end. If you read any of the influential anthropologists of the 20th century, it's all about universal or comparative structures -- the elementary structures of kinship, the structural study of mythemes, comparing the Balinese cockfight to King Lear.

Likewise, that fraction of literary theory that concerns itself primarily with difference doesn't deny the existence of commonality -- it just tries to expose the logical contradiction in certain philosophical and political traditions of universality that try to deny difference. I don't think this guy's ever read Derrida or any other philosopher/literary theorist he can't quite bring himself to name. And I don't think he's ever worn a lab coat or gone on a serious zoological or anthropological research trip either.

I guess what I really can't understand is why anyone would think these are good ideas, or even serious ones. And as someone who really has studied science (ok, mathematics, BA) and literature (MA/PhD) and philosophy (BA) and the social sciences (my one year MA program at Chicago counts!), it troubles me that we all seem to understand each other so poorly, and that no genuine connection is in sight.

Sunday, January 07, 2007

The Hypothetical Z-Axis

There's a good, but hardly complete Science News article on the use of mathematics in The Simpsons and Futurama, surely the brainiest cartoons in the history of television.

Astonishingly enough, I was just explaining to someone the other day what "taxicab numbers" were. (Short answer: the nth "taxicab number" is the smallest number that can be written as the sum of two cubes in n different ways. Find out why they're called "taxicab numbers" here -- it's actually one of the better stories in the history of mathematics.)

I had forgotten the joke about the two robots' serial numbers, but if I remember right, there's another joke about Bender being robot #1729 (sure enough, I found it at this Wolfram site about taxicab numbers).
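
If you want to check the math yourself, here's a quick brute-force sketch in Python (my own toy illustration, not from the Science News article; the search bound of 100 is an assumption, and too small a bound could miss representations for larger n):

    from collections import Counter
    from itertools import combinations_with_replacement

    def taxicab(n, bound=100):
        # Count how many ways each value a^3 + b^3 (1 <= a <= b < bound) arises.
        sums = Counter(a**3 + b**3
                       for a, b in combinations_with_replacement(range(1, bound), 2))
        # The nth taxicab number is the smallest value reachable in at least n ways.
        return min(s for s, ways in sums.items() if ways >= n)

    print(taxicab(2))  # 1729 = 1^3 + 12^3 = 9^3 + 10^3

Sure enough, it spits out 1729 -- Ramanujan's famous answer to Hardy.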

By the way, "Taxicab Numbers"? Great name for a band. Right up there with Mouthfeel.

Friday, January 05, 2007

Bush's Executions

My opposition to the death penalty, like my opposition to school vouchers and free-market economic policies, has softened as I've gotten older. But Will Bunch from the Philadelphia Daily News has a very good blog post on George Bush, Saddam Hussein, and the lack of dignity in executions. A teaser:

And you thought that hooded, revenge-bent Shiite guards in Baghdad invented the taunting of condemned murderers? Turns out that Saddam's hangmen had nothing on George W. Bush.

Tuesday, January 02, 2007

Define Optimism

optimism, n.

Etymology:[< French optimisme (1737 in sense 1a (see note below); 1788 in sense 3) < classical Latin optimus best (see OPTIMUM n.) + French -isme -ISM. With sense 3 cf. earlier OPTIMIST n. 1.
Leibniz, in his Théodicée (1710), uses optimum as a technical term, on the model of maximum and minimum. Hence the Mémoires de Trévoux, in the issue for Feb. 1737, gives the name optimisme to his doctrine:
1737 Mém. de Trévoux (Fév.) 207 En termes de l'art, il l'appelle la raison du meilleur ou plus savamment encore, et Theologiquement autant que Géométriquement, le systême de l'Optimum, ou l'Optimisme.
It owes its general diffusion to the satirical attack upon the doctrine by Voltaire in Candide ou l'Optimisme (1759).]

1. Contrasted with PESSIMISM n. 3.

a. Philos. The doctrine propounded by Leibniz (1710) that the actual world is the best of all possible worlds. Also: any of various similar philosophical doctrines of earlier or later thinkers.

b. A view or belief which assumes the ultimate predominance of good over evil in the universe...

3. Hopefulness and confidence about the future or the successful outcome of something; a tendency to take a favourable or hopeful view. Contrasted with PESSIMISM n. 2.


The brainy brains+philosophy+science+tech+intellectual site Edge.org recently posted 160 responses to its third "annual question" for 2007: What are you optimistic about? It's worth reading, although it's not nearly perfect.

First, you may want to skim a bit. I don't know exactly what kind of editorial process John Brockman goes through with these questions, but halfway through you might wish that he had put a shorter cap on either the number of contributors or the word count. 100 short paragraphs would be easier to scan, and might even (egad!) whet your appetite to find out more. And way too many of the contributors' answers just seem to be whatever project it is that they're working on or have just completed. "I'm totally optimistic about the subject of my new book." Please.

Another criticism I have in general about Edge, particularly with respect to this question, is that a general intellectual salon, even one celebrating the "third culture" of empirical scientists/philosophers, would be well served by more disciplinary diversity. Brockman makes it clear in his 1991 statement of Edge's raison d'être that his editorial impetus comes from an attack on the traditional, or in his phrase, the "literary" intellectual -- but why not include political scientists, economists, or more than one historian or novelist? (I'm counting Cory Doctorow, although I doubt he would have been included if he weren't an editor at Boing Boing.) Not only is it misleading to bill your contributors as "THE WORLD'S LEADING THINKERS," but if you really want to change intellectual culture, these are the kinds of conversations you need the other intellectuals to be getting in on.

My last beef is about the loose parameters of the question and the definition of optimism. My own opinion is that the question "What are you optimistic about? Why? Surprise Us!" is not just an invitation to make a prediction, or to point out something that you feel pretty good about. It's really somewhere in between. What I would be interested in identifying, and the category of things I'd tag with the phrase "I'm optimistic about _______," are things that satisfy these conditions:

1) they have already happened or are happening now;
2) they don't necessarily look like they have turned out, or will turn out, very well;
and 3) we nevertheless have some good reasons to be hopeful about them in the near future.

To take it a little further, things that I'm optimistic about can't just be good but uncertain: there should be some uncertainty about their goodness, too -- or at the very least, their efficacy. A good example of a genuinely optimistic statement might be "I'm optimistic about democracy in the Middle East." This is a statement that's at least debatable -- not just whether democracy will spread, but whether it will be a positive good for people in the region in the near future. Some people have argued that the results of democratic elections in Palestine, Lebanon, Afghanistan, and Iraq have ultimately created too much chaos and tilted towards hardliners and groups with ties to terrorism. Then I would have to back my optimism up with some data: say, transformations in electoral politics in Iran, Palestine, or Egypt. But ideally, I would have to build in some concession to the notion that things might not go well at all.

Another excellent example of this kind of cautious optimism can be found in David Denby's terrific and thorough look at the current and impending technological transformations in cinema in this week's New Yorker, "Hollywood Looks For A Future." Denby examines changes in home screens and media, theater culture and projection technologies, and Hollywood's new business and marketing models, and seems to find something to be optimistic and something to be pessimistic about in every sphere. Very few opinion articles about the film industry feel this thoughtful and well-reported. It's another long article, but very much worth the read.

I don't want to just rag on the Edge respondents, because there are some good examples of this kind of optimistic statement in the Edge poll, too. Walter Isaacson defends the continued value, relevance, and technological genius of print; Howard Rheingold defends the positive possibilities of the fact that "the tools for cultural production and distribution are in the pockets of 14-year-olds"; and Brian Goodwin nicely frames his optimism about the development of new energy technologies in the context of the impending crisis (or as Goodwin calls it, "the challenge") of peak oil.

But my larger point is that there's a wider range of things that we can be optimistic about than the impending triumph of science over religion (I hope the intellectual conversation of 2007 will be less dominated by this old saw) or the imminent triumph of someone's pet project (however beneficial success might be). A genuine salon would offer a wide-open exploration of this kind of optimism. And with good luck, we might see that kind of salon come together sooner than we think.