Sunday, November 01, 2009

My 30th Birthday Present

Greetings, longtime readers of Short Schrift. I hope that all of you have been following my shenanigans over at Snarkmarket, which consumes most of my blogly energy. For a little while, I was using Short Schrift as a links-and-quotes diary, but that function is now largely served by my Twitter account.

What I foresee for Short Schrift is that it will become essentially a blog about my life, featuring things that don't fit into the admittedly wide purview of Snarkmarket. This is such a post.

Tuesday is my 30th birthday. I'd like to buy a Nook e-reader. I'd been setting money aside for it -- okay, I'd been setting it aside for a Kindle -- but I've recently been in a bad accident that's forced me to take the semester off of work. So, I've asked family and friends if they'd be willing to pitch in to collectively buy the Nook as a birthday present. I figure if we can get 13 friends to pitch in 20 dollars each, we're home. My buddy Kelly Bennett suggested setting up a PayPal donation button for this purpose, so that's exactly what I did.

The button is here purely as an offer, for folks who wanted to participate but didn't otherwise have a way to do it. Most readers of this blog have never met me, nor do they have any business buying me a birthday present. But, I figure -- what the heck. Anything is worth a shot.

Thursday, July 09, 2009

Our Neanderthal Neighbors

Svante Pääbo, "Mapping the Neanderthal Genome":

One thing that we're beginning to see is that we are extremely closely related to the Neanderthals. They're our relatives. In a way, they're like a human ancestor 300,000 years ago. Which is something that leads you to think: what about the Neanderthals? What if they had survived a little longer and were with us today? After all, they disappeared only around 30,000 years ago -- or 2,000 generations ago. Had they survived, where would they be today? Would they be in a zoo? Or would they live in suburbia?...

For example, if the Neanderthals were here today, they would certainly be different from us. Would we experience racism against Neanderthals much worse than the racism we experience today amongst ourselves? What if they were only a bit different from us, but similar in many ways — in terms of language, technology, social groups? Would we still have this enormous division that we make today between humans and non-humans? Between animals and ourselves? Would we still have distanced ourselves from animals and made this dichotomy that is so strong in our thinking today? These things we will never know, right? But they are fascinating things to think about.

Tuesday, July 07, 2009

The New Liberal Arts: Photography

New Liberal Arts, a Snarkmarket/Revelator collaboration, is available for sale today. It's 80pp and costs $8.95. Robin reports that after five hours, half of the initial print run of 200 copies has already been sold.

I have three short pieces in the book. I co-wrote what I hope is a cogent Introduction with Robin Sloan and what I know is an absolutely whiz-bang take on Journalism with Matt Thompson. I also wrote a solo essay on Photography, which I really do think is the new liberal art par excellence, the technology that changes the whole meaning of both science and the humanities.

We're going to be doing a lot to promote this book, which I'm sure is going to sell out soon. When it does, it'll be available to everyone as a freely downloadable PDF. (There are plans for Kindle and MobiPocket versions, too.) So when you buy one, you're helping to unlock it for everyone else. Since the logic of freeriding doesn't seem to deter digital humanists, I hope this is seen as a boon and not a rip.

But now I just want to give you an idea of the sort of things we're thinking about. This is what I had to say about "Photography."

FROM THE INAUGURAL ADDRESS FOR THE COLLEGE OF PHOTOGRAPHY

Apart from the exact sciences, nothing has transformed the idea of the liberal arts as profoundly as PHOTOGRAPHY -- which enables not only the recording of still and moving images, but their reproduction, transmission, and projection onto a page or screen.

The classical liberal arts are arts of the word, products of the book, the letter, the lecture. The Renaissance added the plastic arts of painting and sculpture, and modernity those of the laboratory. The new liberal arts are overwhelmingly arts of the DOCUMENT, and the photograph is the document par excellence.

Like the exact sciences, photographic arts are industrial, blurring the line between knowledge and technology. (The earliest photographers were chemists.) Like painting and sculpture, they are visual, aesthetic, based in both intuition and craft. Like writing, photography is both an action and an object: writing makes writing and photography makes photography. And like writing, photographic images have their own version of the trivium -- a logic, grammar, and rhetoric.

We don't only SEE pictures; we LEARN how they're structured and how they become meaningful. Some of our learning is intuitive, gathered from the ways our eyes and brains make sense of the visual world. We have a habitual sense of how photographic meaning is created, taken from our experience watching movies or taking our own photographs. But we also have a critical sense of it, taken from our aesthetic responses to photographs and cinema, and our awareness of how both are edited, enhanced, and manipulated. Photography is the art and science of the real, but also of the fake; of the depth and the surface, and the authentic as well as the inauthentic or nonauthentic appearances of the world.

Rather than "pictures," "film," or even images, PHOTOGRAPHY, the recording of light, is the term to bet on: It's the only category that can describe pictures on metal, glass, paper, celluloid, or flash memory -- whether still or moving, analog or digital, recorded or broadcast, in color or black and white, representative or abstract. It is essential to examine equally the transmission and consumption of photography as well as its production: still images, cinema, television, digital video, and animation all belong to you, as well as photoreproduction, photomontage, image databases, and any possible combination where the still or moving image appears. Even the optical cables that have transmitted this data to you several times over communicate through pulses of light. Photography is the science of the interrelation and specificity of all of these forms, as well as their reproduction, recontextualization, and redefinition. Photography is a comprehensive science; photography is a comparative literature.

It took universities CENTURIES to answer the demand posed by the exact sciences to liberal education; it is your task to pose -- and to answer -- the demand photography makes of us now.

The Ripped Veneer Of Inhumanity

For Andrew Sullivan, AIDS explains why 1990 was the year American attitudes towards gays changed:

Remember: most of these deaths were of young men. If you think that the Vietnam war took around 60,000 young American lives randomly over a decade or more, then imagine the psychic and social impact of 300,000 young Americans dying in a few years. Imagine a Vietnam Memorial five times the size. The victims were from every state and city and town and village. They were part of millions and millions of families. Suddenly, gay men were visible in ways we had never been before. And our humanity - revealed by the awful, terrifying, gruesome deaths of those in the first years of the plague - ripped off the veneer of stereotype and demonization and made us seem as human as we are. More, actually: part of our families.

I think that horrifying period made the difference. It also galvanized gay men and lesbians into fighting more passionately than ever - because our very lives were at stake. There were different strategies - from Act-Up actions to Log Cabin conventions. But more and more of us learned self-respect and refused to tolerate the condescension, double standards, discrimination and violence so many still endured. We were deadly serious. And we fight on in part because of those we had lost. At least I know I do.
See also this appreciation of Bayard Rustin, who wrote "The New Niggers Are Gays" and "From Montgomery to Stonewall" in 1986:

Today, blacks are no longer the litmus paper or the barometer of social change. Blacks are in every segment of society and there are laws that help to protect them from racial discrimination. The new “niggers” are gays. … It is in this sense that gay people are the new barometer for social change. … The question of social change should be framed with the most vulnerable group in mind: gay people. [Rustin, as quoted by Rev. Sekou.]

When you think about Rustin -- who, as an openly gay black civil rights activist, pacifist, and former Communist, was about as vulnerable as you could get amid the sexual, racial, and political paranoia of the 1940s, '50s, and early '60s -- alongside Sullivan's account of the "homocons," I think there was a sense by the end of the '80s, with the decline of communism and the relative achievements of civil rights legislation, that a certain kind of culture war had run its course, and that the time for legal protection and activism for gays had finally come. AIDS gave it an existential urgency, but the shifting politics of "pink" had finally made it possible.

Thursday, July 02, 2009

Hallucinating Sovereignty

Chris Bray:

In the first volume of his biography of Andrew Jackson, Robert Remini neatly captures the strangeness of state sovereignty. It happens in a single quiet paragraph that describes the ceremony on the morning of July 17, 1821, in which Spain relinquished its claim to the Floridas. Jackson handed the Spanish governor "the instruments of his authority to take possession of the territory," and Governor José Callava responded by giving Jackson control of his keys and his archives. Then, finally, having surrendered the symbols of power, Callava "released the inhabitants of West Florida from their allegiance to Spain." The paragraph ends with members of the Spanish crowd -- suddenly finding themselves members of an American crowd -- bursting into tears...

For historians, state power rests on very thin crust. State actors manage imagined communities with invented traditions, but only for as long as the ritual works. States are ephemeral; sovereignty grows out of statements on paper and the performance of symbolic acts -- here are the keys, General Jackson -- and the tenuousness of that recurring project means that it keeps crashing and burning. States disappear, and take the massively powerful apparatus of the state with them; the Stasi archives seem quaint. Floridians were Spanish until some guy read a sentence from a piece of paper that said they weren't.

But how do we bridge that view of the state with the bizarre reality of this thing that owns all the gravity and subsumes everything -- General Motors, AIG, Iraq, the financial industry, and, coming soon, entire broad swaths of the energy and health care fields, and etcetera -- so entirely that we can sit inside its orbit and casually talk about our affairs like Iraq and Lebanon?

I think of the state as a consensual hallucination, and yet somehow the American model turns out to run much of the world like personal property. I don't understand how we get from there to here. The "state" is a guy who shows up with some pieces of paper -- the "instruments of his authority to take possession" -- and then really takes possession.

(From Cliopatria/History News Network.)

Wednesday, July 01, 2009

Erving Is Always On

Miriam Burstein:

Many years ago, I heard a sociologist tell an anecdote about being the only undergraduate at a faculty party.  After a short while, he realized that somebody was watching him from a distance.  Worse still, wherever he went, there his mysterious observer followed.  Understandably anxious, he finally cornered one of his professors to find out what on earth was going on.  "Oh, that's Erving," his professor sighed.  "He's always on." The Erving in question was Erving Goffman, the author of The Presentation of Self in Everyday Life (1959). 

Everyone I know who ever encountered Erving Goffman has a similar story. The one I've heard most often is that he would arrange for his students (at UPenn, natch) to meet for class outside on the lawn in front of the library, then hide and watch, laughing at how they reacted when he didn't show.

(From The Little Professor.)

Tuesday, June 30, 2009

Isaac Hayes And His Marvelous Scalp

Pitchfork:

Think about how crazy this is for a moment: Stax loses Otis Redding and the Bar-Kays to a plane crash and the rights to their back catalog (and, later, Sam & Dave) to Atlantic. Without their biggest stars and their best session group, Stax executive Al Bell takes a desperate but necessary gamble: in an attempt to build an entirely new catalog out of scratch, he schedules dozens of all-new albums and singles to be recorded and released en masse over the course of a few months. And out of all of those records, the album that puts the label back on the map is a followup to a chart dud, recorded by a songwriter/producer who wasn't typically known for singing, where three of its four songs run over nine and a half minutes. And this album sells a million copies. If it weren't for the New York Mets, Isaac Hayes' Hot Buttered Soul would be the most unlikely comeback story of 1969.

Friday, June 26, 2009

Why Michael Jackson's Death Feels Different

Josh Marshall:

I think it's because so much of Michael Jackson's life seemed like make believe. Sometimes farcical. But always like play acting, somehow. So much theatrics. So many costumes. And on various levels the desire -- often frighteningly realized -- to deny or defy his physical self, his age and much more. Even the things that seemed terribly serious, perhaps especially those -- the trials for child molestation which could have landed him in jail for years or decades -- never seemed to stick. Whether he was truly guilty of these accusations or not, it always blew over. All together it conditioned me to think of Jackson as someone whose drama was always just drama -- whether it was the threat of prison or vast debts or bizarre physical tribulations -- all of it would pass or blow over, perhaps not even have been real, leaving him more or less in place, as weird or surreal as ever, but basically unchanged.

In the span of time between when news first broke that Jackson had been rushed to the hospital and when it was reported that he'd died, I actually saw some people speculating on the web that the whole thing might be a stunt to get out of his tour dates or perhaps some health emergency that was not quite as serious as it was being described. And even though these speculations turned out to be tragically, embarrassingly off base, I wasn't sure if they might not turn out to be accurate since it seemed somehow more in character, at least more in keeping with the never ending drama.

In the end death just seemed more out of character for Michael Jackson than for most people. Because through most of his life he and reality seemed at best on parallel but seldom overlapping courses. And death is reality, full stop.

(From Talking Points Memo.)

Thursday, June 25, 2009

University of Chicago / Your Mom

Andrea Walker, "Chicago, Where Fun Comes to Die":

The U. of C. is known for serious thinking combined with a sarcastic, self-deprecating sense of humor that always amused me when displayed on undergraduate T-shirts. These described the school as “The level of hell Dante forgot,” “The place where fun comes to die,” and “The University of Chicago: if it was easy it would be…your mom.” Though my new favorite has to be “The University of Chicago: where the only thing that goes down on you is your GPA.”

(From The Book Bench.)

Wednesday, June 24, 2009

The Boom and Bust of Asian Cinema in the U.S.

Andrew O'Hehir interviews Grady Hendrix at the New York Asian Film Festival:

"You have acquisitions people picking up movies that aren't very good," he says, "and releasing them to an audience that doesn't know anything about them or have any context in which to enjoy them. They're being written about by a press that knows less and less about more and more Asian films and directors as magazines and newspapers downsize, fire their older writers and pay for shorter articles that are generally just about that week's new releases."

The Kindle and the Jewish Question

Chava Willig Levy, "The Kindle and the Jewish Question":

Like my father and the Jewish doctoral student, a Chasidic master living at the turn of the 20th century looked at the world around him with an eye to Jewish life. One day, a disciple approached him and asked, "Rebbe, every time I turn around, I hear about new, modern devices in the world. Tell me, please, are they good or bad for us?"

"What kind of devices?" asked the Rebbe.

"Let me see. There's the telegraph, there's the telephone, and there's the locomotive."

The Rebbe replied, "All of them can be good if we learn the right lessons from them. From the telegraph, we learn to measure our words; if used indiscriminately, we will have to pay dearly. From the telephone, we learn that whatever you say here is heard there. From the locomotive, we learn that every second counts, and if we don’t use each one wisely, we may not reach our destination in life.”

So, what can we learn from the Kindle? Like the telegraph, telephone and locomotive, it offers us lessons - as I see it, at least three of them - for living life meaningfully...

Content: Imagine receiving a Kindle as a gift from your father. Now picture three separate scenarios:

Scenario #1: Several months later, he asks you if you like it. You hesitate to answer. How can you tell him that it's been sitting in its box, unused, devoid of content?

Scenario #2: Several months later, he sees you using it. You see him beaming with delight — until he notices that you're reading some insipid, platitude- or gossip-filled book.

Scenario #3: Several months later, you take him out to dinner for the express purpose of thanking him for his gift and the meaningful, scintillating material to which it has introduced you.

The spiritual parallel is obvious. Granted the gift of life, what do we fill it with? Nothing? Junk? Or purpose?

(Via LISNews.)

A Gesellschaft of Angestellten

The Importance of Order: German Researchers Tackle Untidy Desks, from Der Spiegel Online:

It's the same problem everywhere: Overloaded desks aren't just frustrating for their owners -- they also make employers unhappy. Academic researchers have long been studying the issue and have reached some surprising conclusions. According to a study on the "lean office" by the Stuttgart-based Fraunhofer Institute for Manufacturing Engineering and Automation, a good 10 percent of working time is wasted through "superfluous or missing work material" or "constant searching for the right document in chaotic file directories."

The researchers found that wasted time in poorly organized offices could eat up nearly one-third of annual working time. Over a year, that means there are 70 days in which employees are -- as Kurz puts it -- "engaging in pointless activity." It's a statistic which would shock any personnel manager...

According to figures from the German association of office furniture manufacturers BSO, more than 18 million German employees and freelancers -- out of a population of 82 million -- have their own desk at work. In addition, there are a further 2 million desks in private homes. That's a lot of potential clutter.

"The desk is kind of like an exterior version of our brain," says Küstenmacher. "Whatever you have in your head is reflected, almost magically, on your desk."

Tuesday, June 23, 2009

One Hundred and Forty-One Years of the Typewriter

[I especially like the shout-out to Linotype.] June 23, 1868: Tap, Tap, Tap, Tap, Tap … Ding! | This Day In Tech | Wired.com:

Christopher Latham Sholes’ machine was not the first typewriter. It wasn’t even the first typewriter to receive a patent. But it was the first typewriter to have actual practical value for the individual, so it became the first machine to be mass-produced.

With the help of two partners, Sholes, a printer-publisher from Milwaukee, Wisconsin, perfected his typewriter in 1867. After receiving his patent, Sholes licensed it to Remington & Sons, the famous gunmaker. The first commercial typewriter, the Remington Model 1, hit the shelves in 1873.

The idea was based on the principle of Gutenberg’s movable-type printing press, arguably the most important invention in the history of mass communications. As with the printing press, ink was applied to paper using pressure. While the typewriter couldn’t make multiple copies of an entire page, it simplified — and democratized — the typesetting process for a single copy with a system of reusable keys that inked the paper by striking a ribbon.

Within a couple of decades of the first Remington typewriter, big-press operations would begin using a modified, more sophisticated keyboard system, known as Linotype, for their typesetting needs. That little tweak helped make the mass production of newspapers possible.

Monday, June 22, 2009

Project Girl Wonder

Henry Jenkins interviews Mary Borsellino about Project Girl Wonder and her book Girl and Boy Wonders: Robin in Cultural Context:

The idea of Stephanie Brown as Robin was so fresh and strange as a direction, but was handled so clumsily and with such obvious institutionalised sexism that it was pretty vile to witness, both as a cultural observer and as a fan who's also a feminist.

Essentially, for those not familiar with the character or with Robin's larger back story: when the second Robin, a boy named Jason, died, Batman created a memorial out of his costume in the Batcave. Stephanie was the fourth Robin, and her costume was different to the three boys who'd had it before her in that she sewed a red skirt for herself. Just a few months after her first issue as Robin was released, Stephanie was tortured to death with a power drill by a villain, and then died with Batman at her bedside.

The sexualised violence alone was pretty vomitous, but what made it so, so much worse for me was that Batman promptly forgot her. DC's Editor in Chief had the gall to respond to questions of how her death would affect future stories by saying that her loss would continue to impact the stories of the heroes -- how sick is that? Not only is the statement clearly untrue, since the comics were chugging along their merry way with no mention of her or her death, but it was also an example of the ingrained sexism of so much of our culture. Stephanie herself was a hero, and had been a hero for more than a decade's worth of comics, but the Editor's statement made it clear that he only thought of male characters as heroes, and the females as catalysts for those stories. It was a very clear example of the Women in Refrigerators trope, which has been a problem with superhero comics for far, far too long.

Lots of good stuff here on Spock and Uhura, and on Carrie Kelley in Frank Miller's The Dark Knight Returns:

Robin crosses all sorts of imposed gender boundaries, both literal and figurative. Carrie Kelley, for example, the young girl who becomes Robin in Frank Miller's The Dark Knight Returns, is referred to by a news broadcaster as 'the Boy Wonder'; she looks completely androgynous in-costume, and so is assumed to be a boy. Dick Grayson and Tim Drake both assume female identities to go undercover in numerous stories -- Dick even played Bruce's wife on one occasion back in the forties -- and Stephanie Brown's superhero identity before she became a Robin, the Spoiler, is thought to be a boy even by her own father.

Never understood why Miller made Kelley take on a different identity in The Dark Knight Strikes Again.

Sunday, June 21, 2009

The Map Is Not The Territory

Tim O'Reilly gives an interview on the relevance of classical education to digital humanism:

The unconscious often knows more than the conscious mind. I believe this is behind what Socrates referred to as his inner "daimon" or guiding spirit. He had developed the skill of listening to that inner spirit. I have tried to develop that same skill. It often means not getting stuck in your fixed ideas, but recognizing when you need more information, and putting yourself into a receptive mode so that you can see the world afresh.

This skill has helped me to reframe big ideas in the computer industry, including creating the first advertising on the world wide web, bringing the group together that gave open source software its name, and framing the idea that "Web 2.0" or the "internet as platform" is really about building systems that harness collective intelligence, and get better the more people use them. Socrates is my constant companion (along with others, from Lao Tzu to Alfred Korzybski to George Simon, who taught me how to listen to my inner daimon).

I believe that I've consistently been able to spot emerging trends because I don't think with what psychologists call "received knowledge," but in a process that begins with a raw data stream that over time tells me its own story.

I wrote about this idea in my Classics honors thesis at Harvard. The ostensible subject was mysticism vs logic in the work of Plato, but the real subject was how we mistake the nature of thought. As Korzybski pointed out in the 1930s, "the map is not the territory," yet so many of us walk around with our eyes glued to the map, and never notice when the underlying territory doesn't match, or has changed. Socrates was one of my teachers in learning how not to get stuck following someone else's map.

Friday, June 19, 2009

Money, Philosophy, and Tragedy in Ancient Greece

The Greeks and money by Richard Seaford in the TLS:

This new and revolutionary phenomenon of money itself underpinned and stimulated two great inventions in the Greek polis of the sixth century, “philosophy” and tragedy. “Philosophy” (or rather the idea of the cosmos as an impersonal system) was first produced in the very first monetized society, early sixth-century Ionia, and – even more specifically – in its commercial centre Miletos. The tendency of pre-modern society to project social power onto cosmology (for example, “king Zeus rules the world”) applies to the new social power of money. And the following description applies equally to money and to much of the cosmology of the early philosophers: universal power resides not in a person but in an impersonal, all-underlying, semi-abstract substance.

Annie Clark Unplugged

I really like her acoustic take on "The Strangers."

Virginia Woolf's Collected Essays

Claire Harman in the TLS:

It is a surprise to discover, from Stuart Clarke’s excellent notes, how hard Woolf worked on these seemingly effortless pieces for the New York Herald Tribune, the Yale Review or the Nation, and how the “grind & the screw & the torture” of writing criticism neither decreased with time nor put her off. The sheer number of essays in this volume bears witness to the useful balance she found between different kinds of composition: “writing articles is like tying one’s brain up in neat brown paper parcels”, she wrote to Ethel Smyth. “O to fly free in fiction once more! – and then I shall cry, O to tie parcels once more!”

Wednesday, June 17, 2009

Green and Saffron

George Packer on why Iran's nascent revolution may be different from Burma's stillborn 2007 protests:

For a few days, Burmese citizens with cell-phones (rare and expensive in Burma), modems (agonizingly slow), and cameras were able to send reports, still pictures, and video to the exile media, such as Democratic Voice of Burma in Oslo, which in turn posted them on Web sites that people inside Burma could read. This was how the protesters got the word out to the world and in turn stayed informed of what was happening inside the country (in these situations people on the inside almost always have less information than those outside). It became a prototype of how new media could become a powerful tool in the hands of otherwise defenseless civilians. But far fewer Burmese than Iranians have access to these things, and after a few days the regime narrowed the Internet bandwidth so tightly that almost nothing could get in or out. Iran, a much more technologically developed country, can’t afford to shut down communications across the board. Information technology is too integrated into the life of the country and the government for a complete news blackout. So the demonstrators continue to figure out ways to organize themselves, and the whole world continues to watch.

(From Interesting Times.)

Foucault, Iran, 1978

Michel Foucault, "What Are The Iranians Dreaming Of?":

The situation in Iran can be understood as a great joust under traditional emblems, those of the king and the saint, the armed ruler and the destitute exile, the despot faced with the man who stands up bare-handed and is acclaimed by a people. This image has its own power, but it also speaks to a reality to which millions of dead have just subscribed.

The notion of a rapid liberalization without a rupture in the power structure presupposes that the movement from below is being integrated into the system, or that it is being neutralized. Here, one must first discern where and how far the movement intends to go. However, yesterday in Paris, where he had sought refuge, and in spite of many pressures, Ayatollah Khomeini "ruined it all."

He sent out an appeal to the students, but he was also addressing the Muslim community and the army, asking that they oppose in the name of the Quran and in the name of nationalism these compromises concerning elections, a constitution, and so forth.

Is a long-foreseen split taking place within the opposition to the shah? The "politicians" of the opposition try to be reassuring: "It is good," they say. "Khomeini, by raising the stakes, reinforces us in the face of the shah and the Americans. Anyway, his name is only a rallying cry, for he has no program. Do not forget that, since 1963, political parties have been muzzled. At the moment, we are rallying to Khomeini, but once the dictatorship is abolished, all this mist will dissipate. Authentic politics will take command, and we will soon forget the old preacher." But all the agitation this weekend around the hardly clandestine residence of the ayatollah in the suburbs of Paris, as well as the coming and going of "important" Iranians, all of this contradicted this somewhat hasty optimism. It all proved that people believed in the power of the mysterious current that flowed between an old man who had been exiled for fifteen years and his people, who invoke his name...

It is often said that the definitions of an Islamic government are imprecise. On the contrary, they seemed to me to have a familiar but, I must say, not too reassuring clarity. "These are basic formulas for democracy, whether bourgeois or revolutionary," I said. "Since the eighteenth century now, we have not ceased to repeat them, and you know where they have led." But I immediately received the following reply: "The Quran had enunciated them way before your philosophers, and if the Christian and industrialized West lost their meaning, Islam will know how to preserve their value and their efficacy."

When Iranians speak of Islamic government; when, under the threat of bullets, they transform it into a slogan of the streets; when they reject in its name, perhaps at the risk of a bloodbath, deals arranged by parties and politicians, they have other things on their minds than these formulas from everywhere and nowhere. They also have other things in their hearts. I believe that they are thinking about a reality that is very near to them, since they themselves are its active agents.

It is first and foremost about a movement that aims to give a permanent role in political life to the traditional structures of Islamic society. An Islamic government is what will allow the continuing activity of the thousands of political centers that have been spawned in mosques and religious communities in order to resist the shah's regime. I was given an example. Ten years ago, an earthquake hit Ferdows. The entire city had to be reconstructed, but since the plan that had been selected was not to the satisfaction of most of the peasants and the small artisans, they seceded. Under the guidance of a religious leader, they went on to found their city a little further away. They had collected funds in the entire region. They had collectively chosen places to settle, arranged a water supply, and organized cooperatives. They had called their city Islamiyeh. The earthquake had been an opportunity to use religious structures not only as centers of resistance, but also as sources for political creation. This is what one dreams about [songe] when one speaks of Islamic government.

Paper Without Books

From J.C.R. Licklider's Libraries of the Future (1965):

As a medium for the display of information, the printed page is superb. It affords enough resolution to meet the eye's demand. It presents enough information to occupy the reader for a convenient quantum of time. It offers great flexibility of font and format. It lets the reader control the mode and rate of inspection. It is small, light, movable, cuttable, clippable, pastable, replicable, disposable, and inexpensive. Those positive attributes all relate, as indicated, to the display function. The tallies that could be made for the storage, organization, and retrieval functions are less favorable.

When printed pages are bound together to make books or journals, many of the display features of the individual pages are diminished or destroyed. Books are bulky and heavy. They contain much more information than the reader can apprehend at any given moment, and the excess often hides the part he wants to see. Books are too expensive for universal private ownership, and they circulate too slowly to permit the development of an efficient public utility. Thus, except for use in consecutive reading — which is not the modal application in the domain of our study — books are not very good display devices. In fulfilling the storage function, they are only fair. With respect to retrievability they are poor. And when it comes to organizing the body of knowledge, or even to indexing and abstracting it, books by themselves make no active contribution at all.

Saturday, June 13, 2009

The Great Sewing Machine of Memory

Scott Horton, "Proust—Memory and the Foods of Childhood" (Harper's Magazine):

The image of Proust’s madeleine, a spongy almond-flavored cookie baked in a press to look like a scallop shell, a delight with an afternoon cup of tea or coffee, has become an icon for this reclusive writer. But what is Proust telling us in this passage? All memories are not created equal, he suggests, some are imprinted more strongly than others. One can have a very sharp recollection of a specific experience from one’s childhood, and still have forgotten entirely what one had for breakfast in the morning. Moreover, the long-past recollection need not even be associated with some objectively significant event, something traumatic, or happy, or historical. Second, he is pointing to the role that smell and taste play in memory, which may in fact be very intense but is not generally closely associated with memory. Third, he is noting that memory and its clarity and detail depend a lot on the mood of the individual, both at the time of the initial experience and at the time of occurrence.

One can struggle to recollection without success, and then the memory can come back suddenly, flooding the imagination of the rememberer, triggered by the strangest coincidence–the cup of linden-flower tea and the cookie, for instance. In our age, memory is facilitated greatly by artificial intelligence, by the Internet and computerized search programs. But the purely human memory has a very curious search program. The way we order and collect thoughts and memories is not entirely logical, and it links to all the senses–those of vision, touch, taste and sound. Our mind seems to act like a great sewing machine, stitching things together for reasons that may not immediately be present but which generally relate to the synchronization of the senses.

Proust called this kind of memory mémoire involontaire -- pretty much the opposite of the kind of thing you can Google search for.

Thursday, June 11, 2009

Our Minds Are Made Of Meat

Jonah Lehrer on "Emotional Perception":

From its inception in the mid-1950's, the cognitive revolution was guided by a single metaphor: the mind is like a computer. We are a set of software programs running on 3 pounds of neural hardware. (Cognitive psychologists were interested in the software.) While the computer metaphor helped stimulate some crucial scientific breakthroughs - it led, for instance, to the birth of artificial intelligence and to insightful models of visual processing, from people like David Marr - it was also misleading, at least in one crucial respect. Computers don't have feelings. Because our emotions weren't reducible to bits of information or logical structures, cognitive psychologists diminished their importance.

Now we know that the mind is an emotional machine. Our moods aren't simply an irrational distraction, a mental hiccup that messes up the programming code. As this latest study demonstrates, what you're feeling profoundly influences what you see. Such data builds on lots of other work showing that our affective state seems to directly modulate the nature of attention, both external and internal, and thus plays a big role in regulating things like decision-making and creativity. (In short, positive moods widen the spotlight, while negative, anxious moods increase the focus.) From the perspective of the brain, it's emotions all the way down.

(From The Frontal Cortex.)

Tuesday, June 09, 2009

Hard Ideas in Hardcover

Peter J. Dougherty, "A Manifesto for Scholarly Publishing" - from ChronicleReview.com:

Books — specifically scholarly titles published by university presses and other professional publishers — retain two distinct comparative advantages over other forms of communication in the idea bazaar:

First, books remain the most effective technology for organizing and presenting sustained arguments at a relatively general level of discourse and in familiar rhetorical forms — narrative, thematic, philosophical, and polemical — thereby helping to enrich and unify otherwise disparate intellectual conversations.

Second, university presses specialize in publishing books containing hard ideas. Hard ideas — whether cliometrics, hermeneutics, deconstruction, or symbolic interactionism — when they are also good ideas, carry powerful residual value in their originality and authority. Think of the University of Illinois Press and its Mathematical Theory of Communication, still in print today. Commercial publishers, except for those who produce scientific and technical books, generally don't traffic in hard ideas. They're too difficult to sell in scalable numbers and quickly. More free-form modes of communication (blogs, wikis, etc.) cannot do justice to hard ideas in their fullness. But we university presses luxuriate in hard ideas. We work the Hegel-Heidegger-Heisenberg circuit. As the Harvard University Press editor Lindsay Waters notes, even when university presses succeed in publishing so-called trade books (as in Charles Taylor's recent hit, A Secular Age), we do so because of the intellectual rigor contained in such books, not in spite of it.

Hard ideas define a culture — that of serious reading, an institution vital to democracy itself. In a recent article, Stephen L. Carter, Yale law professor and novelist, underscores "the importance of reading books that are difficult. Long books. Hard books. Books with which we have to struggle. The hard work of serious reading mirrors the hard work of serious governing — and, in a democracy, governing is a responsibility all citizens share." The challenge for university presses is to better turn our penchant for hard ideas to greater purpose.

(Via Brainiac.)

Sunday, June 07, 2009

Imaginationland

Andrew Sullivan in 2007:

The longer this war goes on and the more we find out, the following scenario seems to me to be the best provisional explanation for a lot of what our secret, unaccountable, extra-legal war-government has been doing - and the countless mistakes which have been laid bare. On 9/11, Cheney immediately thought of the worst possible scenario: What if this had been done with WMDs? It has haunted him ever since - for good and even noble reasons. This panic led him immediately to think of Saddam. But it also led him to realize that our intelligence was so crappy that we simply didn't know what might be coming. That's why the decision to use torture was the first - and most significant - decision this administration made. It is integral to the intelligence behind the war on terror. And Cheney's bizarre view of executive power made it easy in his mind simply to break the law and withdraw from Geneva because torture, in his mind, was the only weapon we had...

But torture gives false information. And the worst scenarios that tortured detainees coughed up - many of them completely innocent, remember - may well have come to fuel US national security policy. And of course they also fueled more torture. Because once you hear of the existential plots confessed by one tortured prisoner, you need to torture more prisoners to get at the real truth. We do not know what actual intelligence they were getting, and Cheney has ensured that we will never know. But it is perfectly conceivable that the torture regime - combined with panic and paranoia - created an imaginationland of untruth and half-truth that has guided US policy for this entire war. It may well have led to the president being informed of any number of plots that never existed, and any number of threats that are pure imagination. And once torture has entered the system, you can never find out the real truth. You are lost in a vortex of lies and fears. In this vortex, the actual threats that we face may well be overlooked or ignored, as we chase false leads and pursue non-existent WMDs.

(Via Jay Rosen.)

Saturday, June 06, 2009

The Internet Is Different Now

"Blogs Falling in an Empty Forest," Douglas Quenqua, NYTimes:

“Before you could be anonymous, and now you can’t,” said Nancy Sun, a 26-year-old New Yorker who abandoned her first blog after experiencing the dark side of minor Internet notoriety. She had started it in 1999, back when blogging was in its infancy and she did not have to worry too hard about posting her raw feelings for a guy she barely knew.

Ms. Sun’s posts to her blog — www.cromulent.org, named for a fake word from “The Simpsons” — were long and artful. She quickly attracted a large audience and, in 2001, was nominated for the “best online diary” award at the South by Southwest media powwow.

But then she began getting e-mail messages from strangers who had seen her at parties. A journalist from Philadelphia wanted to profile her. Her friends began reading her blog and drawing conclusions — wrong ones — about her feelings toward them. Ms. Sun found it all very unnerving, and by 2004 she stopped blogging altogether.

“The Internet is different now,” she said over a cup of tea in Midtown. “I was too Web 1.0. You want to be anonymous, you want to write, like, long entries, and no one wants to read that stuff.”

Um, Yeah; That's Not Cool

Publius at Obsidian Wings:

So there you have it – I’ve been officially outed by Ed Whelan. I would never have done that to my harshest critic in a million years, but oh well.

And to be clear – the proximate cause was that Whelan got mad that I criticized him in a blog post. More specifically, he’s mad that Eugene Volokh made him look rather silly – and he’s lashing out at me for pointing that out, and publishing my name...

As I told Ed (to no avail), I have blogged under a pseudonym largely for private and professional reasons. Professionally, I’ve heard that pre-tenure blogging (particularly on politics) can cause problems. And before that, I was a lawyer with real clients. I also believe that the classroom should be as nonpolitical as possible – and I don’t want conservative students to feel uncomfortable before they take a single class based on my posts. So I don’t tell them about this blog. Also, I write and research on telecom policy – and I consider blogging and academic research separate endeavors. This, frankly, is a hobby.

Privately, I don’t write under my own name for family reasons. I’m from a conservative Southern family – and there are certain family members who I’d prefer not to know about this blog (thanks Ed). Also, I have family members who are well known in my home state who have had political jobs with Republicans, and I don’t want my posts to jeopardize anything for them (thanks again).

All of these things I would have told Ed, if he had asked. Instead, I told him that I have family and professional reasons for not publishing under my own name, and he wrote back and called me an “idiot” and a “coward.” (I’ve posted the email exchange below).

Whelan's post is titled "Exposing an Irresponsible Anonymous Blogger":

In the course of a typically confused post yesterday, publius embraces the idiotic charge (made by “Anonymous Liberal”) that I’m “essentially a legal hitman” who “pores over [a nominee’s] record, finds some trivial fact that, when distorted and taken totally out of context, makes that person look like some sort of extremist.” In other of his posts (including two which I discussed here and here), publius demonstrated such a dismal understanding of the legal matters he opined on—including, for example, not understanding what common law is—that it was apparent to me that he had never studied law.

Well, I’m amused to learn that I was wrong about publius’s lack of legal education. I’ve been reliably informed that publius is in fact the pseudonym of law professor John F. Blevins of the South Texas College of Law. I e-mailed Blevins to ask him to confirm or deny that he is publius, and I copied the e-mail to the separate e-mail address, under the pseudonym “Edward Winkleman,” that publius used to respond to my initial private complaints about his reckless blogging. In response, I received from “Edward Winkleman” an e-mail stating that he is “not commenting on [his] identity” and that he writes under a pseudonym “[f]or a variety of private, family, and professional reasons.” I’m guessing that those reasons include that friends, family members, and his professional colleagues would be surprised by the poor quality and substance of his blogging.


(Edward Winkleman is actually a former member of Publius's group blog, Obsidian Wings.)

Scorsese and Kubrick Do This Really, Really Well

Matt Zoller Seitz, "On the Creepy Alluring Art of the Follow Shot," a video essay for The L Magazine:

"Following" is a montage of clips illustrating one of my favorite types of shots: one where the camera physically follows a character through his or her environment. I love this shot because it's neither first-person nor third; it makes you aware of a character's presence within the movie's physical world while also forcing identification with the character. I also love the sensation of momentum that following shots invariably summon. Because the camera is so close to the character(s) being followed, we feel that we're physically attached to those characters, as if by an invisible guide wire, being towed through their world, sometimes keeping pace, other times losing them as they weave through hallways, down staircases or through smoke or fog.

(Via Fimoculous.)

UrbanOutfitters is selling Olivettis

UrbanOutfitters.com > Olivetti Manual Typewriter - Black:
There's nothing like the resounding "click" of an old-fashioned typewriter. Live out all of your next-great-writer fantasies with this classic travel machine from Olivetti, the 100-year-old Italian manufacturer favored by authors from Sylvia Plath to Stephen King. Features include: 49 keys with 86 symbols; margin stop with 8 stop tab keys; Space Bar with Repeater key; variable line space; paper and carriage release lever; ribbon color selector switch; black plastic housing and carrying case for secure transporting.

(Via Tomorrow Museum, who calls it "nostalgia tech". The less-creative counterpart to retronovation.)

Just What Is "the Passive Voice"?

Geoffrey K. Pullum, "Drinking the Strunkian Kool-Aid: victims of page 18":

Why are college-educated Americans so prone to think that simple active-voice intransitives like "bus blows up" or "took on racial overtones" or "were leaving" or "there will be setbacks" or "this happened," or even transitive examples like "has instructed us to," are in the passive voice?

(From Language Log.)

Bursting the Higher Education Bubble

"Will Higher Education Be the Next Bubble to Burst?" - The Chronicle of Higher Education:

Consumers who have questioned whether it is worth spending $1,000 a square foot for a home are now asking whether it is worth spending $1,000 a week to send their kids to college. There is a growing sense among the public that higher education might be overpriced and under-delivering.

And here's a new idea -- year-round college:

Two former college presidents, Charles Karelis of Colgate University and Stephen J. Trachtenberg of George Washington University, recently argued for the year-round university, noting that the two-semester format now in vogue places students in classrooms barely 60 percent of the year, or 30 weeks out of 52. They propose a 15-percent increase in productivity without adding buildings if students agree to study one summer and spend one semester abroad or in another site, like Washington or New York. Such a model may command attention if more education is offered without more tuition.

Brigham Young University-Idaho charges only $3,000 in tuition a year, and $6,000 for room and board. Classes are held for three semesters, each 14 weeks, for 42 weeks a year. Faculty members teach three full semesters, which has helped to increase capacity from 25,000 students over two semesters to close to 38,000 over three, with everyone taking one month (August) off. The president, Kim B. Clark, is a former dean of the Harvard Business School and an authority on using technology to achieve efficiencies. By 2012 the university also plans to increase its online offerings to 20 percent of all courses, with 120 online courses that students can take to enrich or accelerate degree completion.

Sunday, May 31, 2009

Ira, Jad, and Robert

Must listen: Ira Glass, Jad Abumrad, and Robert Krulwich on the differences between radio and television. Includes such gems as how radio amplifies intimacy and television turns gesture into parody, Jad's observation that This American Life made real people's true stories sound like fairytales, and how Stephen Colbert is more like a radio personality (his show more like a radio show, his audience more like a radio audience) than a television one.

(My own thesis about Colbert: it's his perfect miming of big-personality talk show hosts like Limbaugh, O'Reilly, Scarborough, Hannity, Olbermann, usw., most of whom started on radio, continue to host radio shows, and whose TV shows and audiences are still a whole lot like radio.)

Dating the Past

Historiscientific nerd alert: There's a hot new method of dating historical artifacts, specifically ceramic artifacts, based on their moisture uptake. But there's at least one big problem -- it assumes that mean temperatures stay constant over an artifact's lifetime. HNN's Jonathan Jarrett has the goods, in a paragraph so well-linked that I've cut-and-pasted it whole. (I also changed some of the punctuation and split Jarrett's long paragraph into a few short ones.)

Now, you may have heard mention of a thing called "the medieval warm period." This is a historical amelioration of temperature in Europe between, roughly, the tenth and twelfth centuries. This probably decreased rainfall and other sorts of weather bad for crops, therefore boosted agricultural yield, pumped more surplus into the economy, fuelled demographic growth and arguably deliquesced most European societies to the point where they changed in considerable degree.

However, because of the current debate on climate change, it has become a ball to kick around for climate "scientists," those who wish to argue that we're not changing the climate pointing to it and ice coverage in Norse-period Greenland (which was less than there is currently despite less carbon dioxide in the atmosphere then), while those who wish to argue that we are changing the climate (and, almost always, that this relates to CO2 output, which does seem like a weak link in the argument) dismiss it as legend or scorn the very few and unscientific datapoints, not really caring that the historical development of European society in the ninth to eleventh centuries just doesn't make sense without this system change from the ground. None of these people are medievalists and they're not trying to prove anything about the Middle Ages, so it gets messy, but there is a case about this temperature change that has to be dealt with.

This obviously has an impact on this research. If the sample were old enough, the errors and change probably ought to balance out. But if it were, from, say, the eighth century, then the moisture uptake in the four or five subsequent centuries would be higher than expected from the constant that this research used and the figure would be out, by, well, how much? The team didn't know: "The choice of mean lifetime temperature provides the main other source of uncertainty, but we are unable to quantify the uncertainty in this temperature at present."

We, however, need to know how far that could knock out the figures. Twenty years? More? It begins to push the potential error from a single sample to something closer to a century than a year. That is, the margin of historical error (as opposed to mathematical error) on this method could be worse than that of carbon-dating, and we don't actually know what it is.

Lots of good stuff in the whole, long post, including an annotated run-down of ALL of the ways we know how to date old things.
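
Just to put rough numbers on Jarrett's worry, here's a toy calculation of my own -- a sketch, not the researchers' actual method. I'm assuming the published idea that mass gain grows as time to the 1/4 power, bolting on an Arrhenius-style temperature dependence, and every constant below (the activation energy especially) is invented purely for illustration:

import math

R = 8.314    # gas constant, J/(mol*K)
EA = 2.0e4   # activation energy, J/mol -- an invented, illustrative value
A = 1.0      # pre-exponential factor, arbitrary units

def rate(temp_k):
    """Assumed Arrhenius rate constant for the t**(1/4) mass-gain law."""
    return A * math.exp(-EA / (R * temp_k))

def inferred_age(mass_gain, assumed_temp_k):
    """Invert mass_gain = rate(T) * age**0.25 to get an age in years."""
    return (mass_gain / rate(assumed_temp_k)) ** 4

# A sample that is "really" 1,000 years old at a true mean temperature of 10 C:
TRUE_TEMP_K = 283.15
mass_gain = rate(TRUE_TEMP_K) * 1000.0 ** 0.25

# Now date the same sample while guessing the mean lifetime temperature wrong:
for error_c in (-2.0, -1.0, 0.0, 1.0, 2.0):
    age = inferred_age(mass_gain, TRUE_TEMP_K + error_c)
    print(f"mean temp off by {error_c:+.0f} C -> estimated age {age:6.0f} years")

On these made-up numbers, misjudging the mean temperature by a single degree moves a 1,000-year date by roughly a century in either direction, because the error gets raised to the fourth power on the way out -- which is Jarrett's "closer to a century than a year" worry in miniature.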

Finally, You Too Can Be Marcus Aurelius

I am a sucker for long histories, especially when they're summarized with simple schema. Philip Greenspun wrote this for a talk on how the internet has changed writing, under the subhead "Publishing from Gutenberg (1455) through 1990":

The pre-1990 commercial publishing world supported two lengths of manuscript:
  • the five-page magazine article, serving as filler among the ads

  • the book, with a minimum of 200 pages

Suppose that an idea merited 20 pages, no more and no less? A handful of long-copy magazines, such as the old New Yorker would print 20-page essays, but an author who wished his or her work to be distributed would generally be forced to cut it down to a meaningless 5-page magazine piece or add 180 pages of filler until it reached the minimum size to fit into the book distribution system.

In the same essaylet, Greenspun has a subhead, "Marcus Aurelius: The first blogger?":

Marcus Aurelius, Roman Emperor from 160 AD to 180 AD, kept a journal during a military campaign in central Europe (171-175). It was not available until after his death and not widely available until printed in 1558 as the Meditations...

This was preserved because the author had been Emperor. How much ancient wisdom was lost because the common Roman citizen lacked TCP/IP? [By 1700 BC, the Minoans were trading with Spain, had big cities with flush toilets, a written language, and moderately sophisticated metalworking technology. Had it not been for the eruption of Thera (on Santorini), it is quite possible that Romans would have watched the assassination of Julius Caesar on television.]

It's not all since-the-dawn-of-civilization stuff -- there are lots of examples of writing that really only works on the internet, and more pedestrian things like the virtues of blogs over GeoCities. "Webloggers generally use a standard style and don't play with colors and formatting the way that GeoCities authors used to." This shows how, in the weblog, content becomes more important than form. (Psst -- it also suggests that if Minoan civilization had survived and spread, Augustine's Confessions might have been excerpted on a lot of home pages with lots of crappy animated GIFs.)

Via Daring Fireball.

Friday, May 29, 2009

It Is Not Logical

Andrew Hungerford -- aka the smartest, funniest dramatist × astrophysicist = lighting director you should know -- has written the best post on the physical holes in the new Star Trek movie that I think can be written.

Basically, almost nothing in the movie makes sense, either according to the laws established in our physical universe or the facts established in the earlier TV shows and movies.

Wherever possible, Andy provides a valiant and charitable interpretation of what he sees, based (I think) on the theory that "what actually happened" is consistent with the laws of physics, but that these events are poorly explained, characters misspeak, or the editing of the film is misleading. (I love that we sometimes treat Star Trek, Star Wars, etc., like the "historical documents" in Galaxy Quest -- accounts of things that REALLY happened, but that are redramatized or recorded and edited for our benefit, as opposed to existing ONLY within a thinly fictional frame.)

If you haven't seen the movie yet, you probably shouldn't read the post. It will just bother you when you're watching it, like Andy was bothered. If you have, and you feel like being justifiably bothered (but at the same time profoundly enlightened), check it out right now. I mean, now.

In Praise of Post-

Music critic Simon Reynolds praises music's moments of in-between:

It rankles a bit that the late '80s are now treated as a mere prequel to grunge. The recently aired Seven Ages of Rock on VH1 Classic was a marked improvement on earlier TV histories of rock, which tended to jump straight from Sex Pistols to Nirvana. But its episode on U.S. alternative rock nonetheless presented groups like the Pixies, Dinosaur Jr., and Sonic Youth as preparing the ground for Nirvana. That's not how it felt at the time: Sonic Youth and the rest seemed fully formed significances in their own right, creative forces of monstrous power, time-defining in their own way (albeit through their refusal of the mainstream). My Melody Maker comrade David Stubbs wrote an end-of-year oration proclaiming 1988—annum of Surfer Rosa, Daydream Nation, My Bloody Valentine's Isn't Anything—to be the greatest year for rock music. Ever!

We actually believed this, and our fervor was infectious, striking an inspirational, Obama-like chord with young readers heartily sick of the idea that rock's capacity for renewal had been exhausted in the '60s or the punk mid-'70s. Yet that period will never truly be written into conventional history (despite efforts like Michael Azerrad's Our Band Could Be Your Life) because it doesn't have a name. It's too diverse, and it's not easily characterized. For instance, the groups were "underground," except that by 1988 most of them—Hüsker Dü, Throwing Muses, Sonic Youth, Butthole Surfers—had already signed, or soon were to sign, to majors. Finally, it'll never get fairly written into history because, damn it, grunge did happen.


As I've gotten older, I've come to like 80s alternative music better than the stuff I grew up with in the 90s -- although now, with almost two decades' distance, the 90s look better, and just plain different, from the radio I remember. (I didn't listen to Belle and Sebastian, Neutral Milk Hotel, or Smog in the 90s. I do now.)

The weird thing is that being a precursor is a recipe for big sales but also for diminished significance in your own right. The 80s are full of bands that influenced Nirvana but don't really sound like Nirvana, don't sound ANYTHING like the rest of what passed for grunge, and actually don't make a lot of sense in that context.

But to be post- is a kind of liberation -- one has a sense of being reflective, developing, moving beyond something else, continuous with that history but also a break from it. So the coolest thing to be is post-punk. It's so cool that the first half of this decade saw dozens of bands who were post-post-punk.

So Reynolds identifies two strains of in-between music to go along with 80s post-punk: post-disco and post-psychedelic. I'm convinced that these typologies totally work; I might be more invested in the post-psychedelia bands he lists than the post-disco ones, but it all sounds interesting. And in this case, naming is claiming: giving these bands and their sound a name actually gives you a context to talk about them, one that might be misleading (in which case, time to toss it out) but which might be a way to call more attention to things that would otherwise go unnoticed.

He also includes this nice postscript (har har) on post-rock and post-metal:

There are some other "post-" genres out there, but to my mind, they describe something quite different from the above. Take post-rock, a term that mysteriously emerged in the early '90s to describe experimental guitar bands that increasingly abandoned guitars altogether. (Oh, OK, it was me who came up with that one.)

What Kinds of Math Do We Need?

Biologists are debating how much quantitative analysis their field needs; at Language Log, Mark Liberman pivots to linguistics:

The role of mathematics in the language sciences is made more complex by the variety of different sorts of mathematics that are relevant. In particular, some areas of language-related mathematics are traditionally approached in ways that may make counting (and other sorts of quantification) seem at least superficially irrelevant — these include especially proof theory, model theory, and formal language theory.

On the other hand, there are topics where models of measurements of physical quantities, or of sample proportions of qualitative alternatives, are essential. This is certainly true in my own area of phonetics, in sociolinguistics and psycholinguistics, and so on. It's more controversial what sorts of mathematics, if any, ought to be involved in areas like historical linguistics, phonology, and syntax...

Unfortunately, the current mathematical curriculum (at least in American colleges and universities) is not very helpful in accomplishing this — and in this respect everyone else is just as badly served as linguists are — because it mostly teaches things that people don't really need to know, like calculus, while leaving out almost all of the things that they will really be able to use. (In this respect, the role of college calculus seems to me rather like the role of Latin and Greek in 19th-century education: it's almost entirely useless to most of the students who are forced to learn it, and its main function is as a social and intellectual gatekeeper, passing through just those students who are willing and able to learn to perform a prescribed set of complex and meaningless rituals.)


My thoughts are still inchoate on this, so I'll throw it open -- is calculus 1) a waste of time for 80-90% of the folks who learn it, 2) unfairly dominant over the rest of useful mathematics, 3) one of the great achievements of the modern mind that everyone should know about, or 4) all of the above?

More to the point -- what kinds of maths (as they say in the UK) have you found to be most valuable to your later life, work, thinking, discipline, whatever?

And looking to the future - I don't think we have a mathematics entry as such in the New Liberal Arts book-to-come; but if we did, what should it look like?

The Negative Dialectics of Whiteness

Ta-Nehisi Coates:

The idea is that Latinos have a dual experience that whites don't have and that, all things being equal, they'll be able to pull from that experience and see things that whites don't. The problem with this reasoning is it implicitly accepts the logic (made for years by white racists) that there is something essential and unifying running through all white people, everywhere. But White--as we know it--is a word so big that, as a descriptor of experience, it almost doesn't exist.

Indeed, its claims are preposterous. It seeks to lump the miner in Eastern Kentucky, the Upper West Side Jew, the yuppie in Seattle, the Irish Catholic in South Boston, the hipster in Brooklyn, the Cuban-American in Florida, or even the Mexican-American in California all together, and erase the richness of their experience, by marking the bag "White." This is a lie--and another example of how a frame invented (and for decades endorsed) by whites is, at the end of the day, bad for whites. White racism, in this country, was invented to erase the humanity and individuality of blacks. But for it to work it must, necessarily, erase the humanity of whites, too.


TNC of course makes the further (and necessary) point that these are all fictions that become socially real.

P.S.: I realize the "negative dialectics" reference is probably too insidery for 98% of readers. It's the term Theodor Adorno used as the title of one of his books. Hegel defined identity as "the identity of identity and nonidentity" - the idea being that any concept or act of identification glosses over differences and unifies things that are alike in some ways but unlike in others. For Adorno, negative dialectics explores "the nonidentity of identity and nonidentity," i.e., disintegrating all of that.

Cf. the kind of weird quasi-discourse on whether Judge Sotomayor will or will not be the first "Hispanic" justice on the Supreme Court - the idea being that Justice Cardozo (whose ancestors, Portuguese Jews, emigrated to New York state in the eighteenth century) would qualify. If you try to pursue a purist/universalist idea of racial identity to the end, you start to focus on definitional descriptors (biological and/or cultural ancestry on the Iberian peninsula) that just wipe out all differences. "Hispanic" in this context may be as much of a lie-word -- that is to say, as powerful a concept -- as "white."

Faking It In Translation

Suzanne Menghraj loved Pierre Bayard's How to Talk About Books You Haven’t Read so much that she read it twice. She wanted to read Bayard's 2000 book Comment améliorer les oeuvres ratées (How to Improve Failed Works), but it hadn't been translated, and her French is shaky at best. So she decided to bang out a translation herself anyway:

I came very close to failing French several times over the eight years I studied the language. This does not make me proud. But it does make me want to explore my persistent lack of facility with a language whose structure and habits I understand only well enough to catch a word here, a sense or mood there (let’s say I “skim” French). And so, a good French-English dictionary in hand, I read “Hélas!” (literally, “Alas!”), the introduction to Comment améliorer les oeuvres ratées and was as taken with the iconoclastic ambitions expressed in it as I am with those expressed in How to Talk About Books You Haven’t Read—so taken that I decided to give translation of “Hélas!” a shot.


My own speaking French is terrible, and my reading French is so slow that I've read more than a few books with the original in one hand and a translation in the other, jotting notes with a pen between my teeth when I can't be bothered to put either book down. (I'm telling you - this is the only way to read Proust.)

And my German's probably about the same as Menghraj's French. I was astonished when I switched from philosophy to comparative literature, because suddenly everyone around me was fluent as hell - they were born in Austria, they spent every summer in Paris, they didn't just like to dick around with Kant or Baudelaire.

But I still think that my ambient awareness of, my ability to skim four or five different languages, has really helped me do a lot of things I otherwise wouldn't be able to do. I say, let's have more people half-assing it in languages not their own.

Language is like cooking, or sex: if you get all hung up on being really, really good, not only won't it be fun, you're probably never going to get around to doing it at all.

Via Willing Davidson at The Book Bench.

Sonority in Translation

Marvelous profile of Svetlana Geier, translator of Dostoyevsky into German:

Svetlana Ivanov was 18 years old when the Germans marched into Kiev (she acquired the name Geier later from her husband, a violinist). Although these events were the prelude to great suffering for countless subjects of the Soviet Union, it was a time of great promise for the young woman. Like others willing to work for the Germans for a one-year period, she was eligible to receive a scholarship to go to Germany. Having received private lessons in French and German from childhood, she was able to work as an interpreter for a Dortmund construction firm that was erecting a bridge across the Dnieper River.

Svetlana and her mother – who came from a family of tsarist officers - were victims of Stalinism. Svetlana Geier still recalls watching as a small child while her grandmother cut up family photos into tiny pieces with manicuring scissors: under the Communist regime, their possession could have been dangerous. Her father, a plant breeding expert, was interned during the purges of 1938. He remained in prison for 18 months, was interrogated and abused, but nonetheless eventually released. The following year, he died from the after-effects of imprisonment. Still ostracized even after his release, he spent his final months in a dacha outside of town, cared for by his daughter.

In the eyes of the young interpreter’s countrymen, her work for the Germans had discredited her: "As far as they were concerned, I was a collaborator." After Stalingrad, she could easily imagine what awaited her under Soviet rule. She took advantage of an offer to enter the German Reich with her mother, somewhat starry-eyed, and still hoping to receive a scholarship. That she, a "worker from the east" (her automatic classification in Nazi Germany) actually received it - one of two Humboldt scholarships reserved for "talented foreigners" - borders on the miraculous. Playing benevolent roles in her lengthy and stirring account of these events are a generous entrepreneur, an alert secretary, and a pair of good-natured assistants at the Ministry for the Occupied Eastern Territories...

Now, a year before the end of World War II, Svetlana Ivanov began her literary studies. She recalls the very first lecture she heard, Walter Rehm's "The Essence of the Tragic," which she attended in the company of her fellow students, all of them men with war injuries. She still has her notes.


I'm reminded, more than a little ironically, of the line the rabbi speaks at the beginning of Tony Kushner's Angels in America: "You can never make that crossing that she made, for such Great Voyages in this world do not any more exist. But every day of your lives the miles that voyage between that place and this one you cross. Every day. You understand me? In you that journey is."

I really like this description of her translation method:

Svetlana Geier’s method, if one can call it that, is an acoustic one. She immerses herself in the text until she has absorbed it completely, is able to hear its unique tenor, or as she says, "its melody." Then she induces it to resound in German, and this again takes place acoustically, for Geier dictates her translations. They ring out aloud before ever becoming fixed on paper. Her Dostoevsky translations have received extraordinary praise for this "sonorous" character in particular. Finally, it is said, the divergent voices of Dostoevsky’s protagonists have become distinguishable.


Geier's last translation -- of a book by Dostoevsky that I haven't read, Podrostok (Geier's title, Ein grüner Junge, brings the German closer to Constance Garnett's A Raw Youth) -- also sounds fascinating. But I've already excerpted this short article to death, so you should click on it if you, you know, actually want to know something about her/FD's book.

The New Socialism is the New Humanism

We loooove Kevin Kelly around here at Snarkmarket. Robin tipped me off to his stuff and he's since joined Atul Gawande, Roger Ebert, Virginia Heffernan, Clay Shirky, Michael Pollan, Clive Thompson, Gina Trapani, Jason Kottke, Ben Vershbow, Hilzoy, Paul Krugman, Sy Hersh, and Scott Horton (among others) in the Gore-Gladwell Snarkfantastic Hall of Fame. Dude should have his own tag up in here.

But I think there's a rare misstep (or rather, misnaming) in his new Wired essay, "The New Socialism: Global Collectivist Society Is Coming Online." It's right there in the title. That S-word. Socialism.

Now, don't get me wrong. I like socialism where socialism makes sense. Almost everyone agrees that it makes sense to have a socialized police and military. I like socialized (or partially socialized) education, and I think it makes a lot of sense to have socialized health insurance, as part of a broad social safety net that helps keep people safe, capable, knowledgeable, working. Socialism gets no bad rap from me.

I know Kelly is using the word socialism as a provocation. And he takes pains to say that the new socialism, like the new snow, is neither cold nor wet:

We're not talking about your grandfather's socialism. In fact, there is a long list of past movements this new socialism is not. It is not class warfare. It is not anti-American; indeed, digital socialism may be the newest American innovation. While old-school socialism was an arm of the state, digital socialism is socialism without the state. This new brand of socialism currently operates in the realm of culture and economics, rather than government—for now...

Instead of gathering on collective farms, we gather in collective worlds. Instead of state factories, we have desktop factories connected to virtual co-ops. Instead of sharing drill bits, picks, and shovels, we share apps, scripts, and APIs. Instead of faceless politburos, we have faceless meritocracies, where the only thing that matters is getting things done. Instead of national production, we have peer production. Instead of government rations and subsidies, we have a bounty of free goods.


But I think of socialism as something very specific. It's something where a group of citizens pools their resources as part of a democratic (and at least partially technocratic) administering of benefits to everyone. This could be part of a nation-state or a co-op grocery store. And maybe this is too Hobbesian, but I think about it largely as motivated by a defense against something bad. Maybe there's some kind of general surplus-economy I'm missing where we can just socialize good things without risk. That'd be nice.

When masses of people who own the means of production work toward a common goal and share their products in common, when they contribute labor without wages and enjoy the fruits free of charge, it's not unreasonable to call that socialism.


But I'll put this out as an axiom: if there's no risk of something genuinely bad, no cost but opportunity cost, if all we're doing is passing good things around to each other, then that, my friend, is not socialism.

This is a weird paradox: what we're seeing emerge in the digital sphere is TOO altruistic to be socialism! There isn't enough material benefit back to the individual. It's not cynical enough! It solves no collective action problems! And again, it's totally individualistic (yet totally compatible with collectivities), voluntarist (yet totally compatible with owning one's own labor and being compensated for it), anti-statist (yet totally compatible with the state). It's too pure in its intentions and impure in its structure.

Kelly, though, says we've got no choice. We've got to call this collectivism, even if it's collective individualism, socialism:

I recognize that the word socialism is bound to make many readers twitch. It carries tremendous cultural baggage, as do the related terms communal, communitarian, and collective. I use socialism because technically it is the best word to indicate a range of technologies that rely for their power on social interactions. Broadly, collective action is what Web sites and Net-connected apps generate when they harness input from the global audience. Of course, there's rhetorical danger in lumping so many types of organization under such an inflammatory heading. But there are no unsoiled terms available, so we might as well redeem this one.


In fact, we have a word, a very old word, that precisely describes this impulse to band together into small groups, set collective criteria for excellence, and try to collect and disseminate the best, most useful, most edifying, most relevant bodies of knowledge as widely and as cheaply as possible, for the greatest possible benefit to the individual's self-cultivation and to the preservation and enrichment of the culture as a whole.

And that word is humanism.

The Soul of American Medicine

If I ever meet Atul Gawande, I'm giving him a high-five, a hug, and then I'm going to try to talk to him for about fifteen minutes about why I think he's special. From "The Cost Conundrum," in the new New Yorker:

No one teaches you how to think about money in medical school or residency. Yet, from the moment you start practicing, you must think about it. You must consider what is covered for a patient and what is not. You must pay attention to insurance rejections and government-reimbursement rules. You must think about having enough money for the secretary and the nurse and the rent and the malpractice insurance...

When you look across the spectrum from Grand Junction [Colorado] to McAllen [Texas]—and the almost threefold difference in the costs of care—you come to realize that we are witnessing a battle for the soul of American medicine. Somewhere in the United States at this moment, a patient with chest pain, or a tumor, or a cough is seeing a doctor. And the damning question we have to ask is whether the doctor is set up to meet the needs of the patient, first and foremost, or to maximize revenue.

There is no insurance system that will make the two aims match perfectly. But having a system that does so much to misalign them has proved disastrous. As economists have often pointed out, we pay doctors for quantity, not quality. As they point out less often, we also pay them as individuals, rather than as members of a team working together for their patients. Both practices have made for serious problems...

Activists and policymakers spend an inordinate amount of time arguing about whether the solution to high medical costs is to have government or private insurance companies write the checks. Here’s how this whole debate goes. Advocates of a public option say government financing would save the most money by having leaner administrative costs and forcing doctors and hospitals to take lower payments than they get from private insurance. Opponents say doctors would skimp, quit, or game the system, and make us wait in line for our care; they maintain that private insurers are better at policing doctors. No, the skeptics say: all insurance companies do is reject applicants who need health care and stall on paying their bills. Then we have the economists who say that the people who should pay the doctors are the ones who use them. Have consumers pay with their own dollars, make sure that they have some “skin in the game,” and then they’ll get the care they deserve. These arguments miss the main issue. When it comes to making care better and cheaper, changing who pays the doctor will make no more difference than changing who pays the electrician. The lesson of the high-quality, low-cost communities is that someone has to be accountable for the totality of care. Otherwise, you get a system that has no brakes.

Two Visions Of Our Asian Future

Looking to the east for clues to the future (or the past) of the west isn't the least bit new, but these two recent takes (both in the NYT, as it happens) offer some interesting contrasts.

First, Paul Krugman looks at Hong Kong:

Hong Kong, with its incredible cluster of tall buildings stacked up the slope of a mountain, is the way the future was supposed to look. The future — the way I learned it from science-fiction movies — was supposed to be Manhattan squared: vertical, modernistic, art decoish.

What the future mainly ended up looking like instead was Atlanta — sprawl, sprawl, and even more sprawl, a landscape of boxy malls and McMansions. Bo-ring.

So for a little while I get to visit the 1950s version of the 21st century. Yay!

But where are the flying cars?


And Choe Sang-Hun shows us South Korea:

In the subway, Ms. Kim breezes through the turnstile after tapping the phone on a box that deducts the fare from a chip that contains a cash balance. While riding to school, she uses her mobile to check if a book has arrived at the library, slays aliens in a role-playing game, updates her Internet blog or watches TV.

On campus, she and other students touch their mobiles to the electronic box by the door to mark their attendance. No need for roll call — the school’s server computer logs whether they are in or how late they are for the class.

“If I leave my wallet at home, I may not notice it for the whole day,” said Ms. Kim, 21. “But if I lose my cellphone, my life will start stumbling right there in the subway.”

It has been a while since the mobile phone became more than just a phone, serving as a texting device, a camera and a digital music player, among other things. But experts say South Korea, because of its high-speed wireless networks and top technology companies like Samsung and LG, is the test case for the mobile future.

“We want to bring complex bits of daily life — cash, credit card, membership card and student ID card, everything — into the mobile phone,” said Shim Gi-tae, a mobile financing official at SK Telecom, the country’s largest wireless carrier. “We want to make the cellphone the center of life.”


It was easier in the 1950s for Americans to imagine flying cars than it was to imagine cashless subways. Hell, it may still be easier.

Height or distance? The billboard ad or the cellphone ad? Physical mobility or mobility of information? The skyscraper or the network?

Virginia Woolf on the Future of the Book

From a BBC radio debate with her husband (and publisher) Leonard, titled "Are Too Many Books Written and Published?":

Books ought to be so cheap that we can throw them away if we do not like them, or give them away if we do. Moreover, it is absurd to print every book as if it were fated to last a hundred years. The life of the average book is perhaps three months. Why not face this fact? Why not print the first edition on some perishable material which would crumble to a little heap of perfectly clean dust in about six months time? If a second edition were needed, this could be printed on good paper and well bound. Thus by far the greater number of books would die a natural death in three months or so. No space would be wasted and no dirt would be collected.


Via the New Yorker's Book Bench.

Adventures in Paleoblogging

[Image: pennyblack.jpg]


Clusterflock's skeleton crew has some nice nineteenth-century stuff this weekend:

Papa's Got A Brand New Bag

File under: "Why didn't you just Twitter this, again?" I've been shopping for a laptop bag as we speak, so I am 100% primed for this, but I still love Lifehacker's "What's In Our Bags" series. Gina Trapani just posted her bag + contents, shouting-out a bagufacturer I'd never heard of, and an awesome idea I'd never thought of -- headphone splitters so two people can watch a movie on a plane or train!

Me, I keep insane junk in my bag (whatever the Bookstore was selling the day my old one up and quit on me) for way too long: receipts and airplane stubs, books and student papers (oops), pens in zippered compartments that don't even work (the pens, not the zippers). The only constant companion is laptop plus plug. Even then, sometimes I discover (as I did on a trip to central NY for a job talk) that there's a scone from Au Bon Pain where my plug should be.

But I wish, nay long for, a genuine system! And the Lifehacker folks actually seem to have one!

It's also positive proof that the dematerialization thesis (you know, the idea that objects themselves don't matter, everything is up in the cloud, etc.) is bunk at worst and in need of qualification at best. We just pretend that matter doesn't matter, until you can't get your Prezi on the screen 'cause you forgot your DVI-VGA thingy, if you ever even took it out of the box in the first place.

Here are people living the life digitale to the fullest, and what do they do? Schlep their stuff around in a bag, just like us jerks. And when they have a good idea, do they whip out their magic pen-with-a-microphone for instant digitalization? Only if they're jotting it down on a 99-cent spiral notebook. All this is very reassuring to me.

It Was Citizen Kane

This Kids in the Hall sketch has come up twice in conversation this week. I consider it, like the film that gives it its name, essential viewing. Enjoy.