Friday, June 25, 2004

Botany, history, and natural selection

I'm in the midst of Michael Pollan's The Botany of Desire and, for the most part, I'm enjoying it. The writing's a bit overblown and there's too much of his personal life included, but his main point is rather intriguing. The book would be considerably better if he more fully explored the ramifications of his argument and spent less time reminiscing about planting tulips in his parents' garden.

Here's his main point: humans like to think that they've domesticated a wide range of plants for their own benefit. But we don't need to view the human-controlled dissemination of plants as a process that solely benefits humans. Instead, (at the risk of anthropomorphizing them) plants have a real interest in making themselves attractive to humans. If, say, people find a particular flower enchanting and want fields of that flower surrounding them, that species is far more likely to survive than a brown, wilty thing that people would more likely crush under their heel than plant in their garden. In a very real sense, plants use people just as much as people use plants. Perhaps the relationship between humans and plants should be seen more as symbiosis than domestication.

One of Pollan's more intriguing observations is about Darwin's concept of artificial selection. Darwin pointed to the fact that humans selectively breed plants and animals to bring out certain characteristics to support his claim that species change over time. Darwin went on to argue that natural selection was a different process, one in which an organism's environment provided the pressures that selected particular characteristics over time.

Pollan rightly points out that, from the perspective of the animal or plant being bred, there's no difference between natural selection and Darwin's artificial selection. Flowers don't care whether it's bees that are instinctively drawn to bright colors or gardeners who consciously seek out a certain shade who do the work of pollination. All that matters is that there are external factors that affect what genetic material is passed on to future generations.

There's a parallel to draw between this conflation of artificial and natural selection and the production of history (in the sense of writing history, not in the sense of actually doing things later considered historic).

The birth of academic history is traced to Leopold von Ranke, a 19th-century German historian. Ranke stressed rigorous archival research which sought primary sources to get as close to the event in question as possible. This new "scientific" history is typically seen as a dramatic departure from earlier methods of producing history which simply took older, existing narratives as truthful.*

Scientific history can be seen as analogous to Darwin's artificial selection. The new historians went into the archives with an eye for sources with particular features: proximity to the event of interest, reliability, etc. Those features allowed the information present in those sources to be preserved for posterity in the articles and books written by historians. In a very real sense, this is a case of survival of the fittest, where the fittest are those sources most valued by historians with consciously constructed objectives.

If academic history is an instance of artificial selection, what, then, is history's parallel to natural selection? It seems to me that the myriad ways in which non-historians engage with the past constitute the natural selection of history.

Plenty of the past is forgotten.** But certain people and events are remembered, even without the help of professional historians. Take Independence Day celebrations. Americans started commemorating the Fourth of July in the earliest years of the republic. Parades, speeches, newspaper reprints of the Declaration of Independence, you name it. But not everything about Independence Day was remembered. The fact that Philadelphia was the location where independence was declared, for example, was almost completely ignored. People wanted to see the Fourth of July as a national holiday, not one linked to a particular place (for more on this speculative conclusion, see my paper The Forgotten Fourth).

Details aside, the point is that individuals and societies make decisions about what bits of the past are to be passed on to future generations.

As you may have guessed by now, my overarching argument is entirely parallel to Pollan's. The distinction between the artificial selection practiced by historians and the natural selection that everyone else engages in is spurious. Historians and the general public certainly look for different things when they seek to remember the past. Historians, by and large, want reliability. Most Americans want stories that they can personally relate to.

But what matters is that there's no inherent difference between the historical selections of historians and the historical selections of the general public. From the perspective of the historical information itself (I'm anthropomorphizing again... my apologies), it matters little whether it gets preserved by historians or marchers in a parade.

This doesn't mean we should treat the historical knowledge of the public as equally valuable as that of professional historians. Far from it. But historians should recognize that their knowledge of the past is ultimately selective and subjective, just like everyone else's.

*This is a drastic simplification. It's been shown rather definitively that Ranke was not the first to examine historical sources with a critical eye. See, for example, Francesco Guicciardini.

**I've got another post brewing on this. The gist: the past is lost. We can only recover a small bit of it. But it's still worth getting whatever we can.

Wednesday, June 23, 2004

Matthew Yglesias and quantities

Matthew Yglesias had two posts yesterday that dealt, directly or indirectly, with the quantities denoted by various determiners.

First, Yglesias's article in the American Prospect (linked from this post) took a look at the Bush administration's claims regarding connections between al-Qaeda and Iraq through the lens of Gricean implicature. It's nice to see the work of linguists applied in the real world. It would be even nicer if those considering the importance of language looked at how people use it in the real world.

Here's Yglesias's example of Gricean implicature:

If I tell you, "they're not all in the meeting yet" when, in fact, no one is in the meeting, I haven't lied to you about anything. If no one is there, then, indeed, they're not all there. Nevertheless, any reasonable listener will have understood me to mean that some, but not all, of the expected attendees are there. Again, if I say, "some people are in the room" when only one person is in the room, I'm not speaking falsely, I'm simply speaking uncooperatively. You'll infer that more than one person is in the room although, strictly speaking, I said no such thing.


I disagree. If you say, "Some people are in the room" and it turns out there's only one person, you're lying. Now, "Some person is in the room," is a different story. As I pointed out in the comments, "Morphologically plural NPs don't always refer to multiple individuals (e.g. 'every dog' when there's only one dog in the discourse), but 'some X' when X is plural requires at least two individuals fulfilling the predicate of the sentence for the statement to be true."

Another commenter pointed out that, in logic, "some" is taken to mean "at least one." I can live with that. But conversational English is not equivalent to logic. If you tell me that some people are in the room, there's gotta be at least two for you to be telling the truth.
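The disagreement above boils down to two different truth conditions for plural "some." A minimal sketch, in which the function names and the two thresholds are my own illustrative assumptions (the logician's reading requires at least one individual; the conversational reading I'm defending requires at least two):

```python
def some_logical(count):
    """Logic's 'some X': true iff at least one X satisfies the predicate."""
    return count >= 1

def some_conversational(count):
    """Plural 'some people': on the analysis above, true iff at least two do."""
    return count >= 2

# One person in the room:
print(some_logical(1))         # True: fine for the logician
print(some_conversational(1))  # False: a lie, on the conversational reading
print(some_conversational(2))  # True: two people makes it true on both readings
```

The two functions agree everywhere except at exactly one individual, which is precisely the case in dispute.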

Second, in his discussion of how "a majority of the US public wound up believing Iraq was behind the attacks," Yglesias elided a statement of Jeff Jarvis. "Few if any people in power said that Iraq was behind the attacks" (the original can be found here) became "Few ... people in power said that Iraq was behind the attacks." It's a small change, to be sure. But Yglesias's commenters raked him over the coals for it, claiming that the difference in meaning is substantial.

The commenters' criticisms are, in my mind, unfair. I think Yglesias is wrong, but for a different reason.

First, as to why the commenters are wrong. The difference between "few" and "few, if any" is not one that affects the truth conditions of the sentence in question. The commenters' point is that by getting rid of the "if any," Yglesias eliminated the possibility, hinted at by Jarvis, that no people in power connected Iraq and 9/11.

The problem is that "few" already includes the possibility that no people in power made that connection. Suppose I'm a teacher and someone in my class asks how many students passed a recent test. Suppose I respond by saying, "Few students passed the exam." Suppose no students, in fact, passed the exam. Did I lie? No, I didn't. I misled the class, suggesting that at least some students (at least two... remember what I said above?) passed. But it's a matter of Gricean implicature. We know because that implicature can be cancelled. "Few students passed the exam. In fact, none of you did," is not a contradiction. Less than felicitous communication, sure, but not a contradiction. Truth conditionally, "few" and "few, if any" are identical. It's fine for Yglesias to remove the "if any" to ease reading.

Yglesias errs in assuming that "few" and "a few" are equivalent. As we've just seen, sentences of the form "few X Y" (where X is a noun and Y is a verb) can be true without any Xs actually Y-ing. "A few" is different. If Jarvis had said, "A few people in power said that Iraq was behind the attacks," Yglesias would have been right in assuming that there existed some administration officials who connected Iraq to the 9/11 attacks. But Jarvis used "few," not "a few." It's a small difference, even smaller than the removal of "if any."
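The "few" versus "a few" contrast can also be put in truth-conditional terms. A toy sketch, where the cutoff FEW_MAX and the function names are hypothetical illustrations (what counts as "few" is context-dependent), but the key asymmetry matches the argument above: "few X Y" is compatible with zero Xs Y-ing, while "a few X Y" entails that some did:

```python
FEW_MAX = 3  # illustrative upper bound for "few" in a small class

def few_true(count):
    """'Few students passed': true whenever the count is small, including zero."""
    return count <= FEW_MAX

def a_few_true(count):
    """'A few students passed': requires both existence (at least two) and smallness."""
    return 2 <= count <= FEW_MAX

print(few_true(0))    # True: "Few passed. In fact, none did." is consistent
print(a_few_true(0))  # False: "a few" entails that some students passed
```

Whatever the exact cutoff, only "a few" carries the existential commitment that Yglesias read into Jarvis's "few."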

In short, Yglesias took the implicature of Jarvis's statement as truth. It's an entirely reasonable step to take, one that everyone does without thinking. But if you're concerned with logical truth, as Yglesias already showed himself to be, it's a step you just can't take.

For more on this, see my paper on the meaning of few and a few.

Tuesday, June 15, 2004

Billy Joel, Backstreet Guy

I'm hardly the first to point this out, and I'm at least twenty years too late, but Billy Joel clearly has no sense of who he is, what his music sounds like, or how others perceive him.

Take, for example, "Uptown Girl." Great song. I love it. Polished, with a slight doo wop feel to it. Cute video, too, with his future bride Christie Brinkley playing the uptown girl to his downtown man. Here's how Joel portrays himself in the song: "backstreet guy," "downtown man," and too poor "to buy her pearls."

We get a similar picture at the tail end of "The Ballad of Billy the Kid" when Joel sees himself as a latter-day Billy the Kid.

From a town known as Oyster Bay, Long Island
Rode a boy with a six-pack in his hand
And his daring life of crime
Made him a legend in his time
East and west of the Rio Grande


We get it, Billy. You're rough. You're gritty. You work hard. You're an alternative to the mainstream.

Except not.

Go back to "Uptown Girl." Listen to it. No "backstreet guy" would ever be caught dead listening to a song like that, to say nothing of singing it. Going falsetto? Come on. And then there's "Longest Time." Gorgeous harmonies, slickly produced. Pop at its best. If anything, Billy Joel's music is too smooth. If I had to describe Joel's sound in a word, it'd be "facile."

Then consider his audience. Billy Joel's music is about as mainstream as it gets. Show me 100 people who bought any albums from, say, 1973 to 1983 and I'll show you 95 people who bought Billy Joel albums.

People dance to Billy Joel at weddings. If that's not a signifier of being mainstream, I don't know what is.

How about his lifestyle? While Joel may have had it rough growing up (as rough as it gets in suburban Long Island, I suppose), by 1983 (when "Uptown Girl" came out) he was hardly scrounging around for rent money. Hell, he married Christie Brinkley. When you sell as many albums as Billy Joel did, you're living pretty comfortably.

Now, Billy Joel is hardly the first or last to have a musical self-image that differs so dramatically from reality (Sean Combs, anyone?). But it's hard to find anything about him that's similar to the Billy Joel described in Billy Joel songs. Much of his music falls squarely on the pop side of the pop/rock divide. His commercial success both signified widespread appreciation for his work and guaranteed a privileged lifestyle.

That's fine. But these aspects of Billy Joel's music and life just don't jibe with the gritty, working-class guy-on-the-corner he wanted us all to think he was.

Sunday, June 06, 2004

Smarty Jones!

Those words, with plenty of enthusiasm, have been all around Philadelphia the past few weeks, leading up to Smarty's run towards the Triple Crown.

Yesterday, at the Belmont Stakes, he lost.

Given that Smarty Jones, a horse, is what passes for a prominent Philadelphia sports figure these days, his defeat is not much of a surprise. Philadelphia, you see, has had more than its fair share of sports disappointments. There's no need to catalog them all, but as a taste, consider the fact that the Eagles have lost three consecutive NFC championship games.

That Philadelphians rallied around this horse is expected. In a town that craves championships as much as this one, people will embrace whatever likely candidate is around.

What's fascinating, however, is how Philadelphia took to Smarty Jones. Smarty's owner, Roy Chapman, captured Philadelphia's feelings towards Smarty perfectly when he described him as a "blue-collar horse" and "a horse of the people."

On the face of it, this is absurd. In what way is an animal bred and trained to run fast around a track similar to a member of the working class? In what way is Smarty Jones a horse of the people? There's no collective ownership arrangement, where all of Philadelphia stands to benefit financially from Smarty's successes. Smarty Jones has led a rather privileged and pampered life* with no connections to the masses of people that have come to adore him.

But the fact that Philadelphia has come to see Smarty Jones as a tough, working-class horse is rather telling about the self-image of Philadelphia. Philadelphians see themselves as tough and gritty. Philadelphia has a chip on its collective shoulder, especially when it comes to sports.** When Philadelphia sports teams succeed, that success is seen as the result of hard work and overcoming odds, not through the talent of the team or players involved (for the best example of this, look at the 1993 Phillies).

It's no surprise, then, that Smarty Jones took on these characteristics in the minds of Philadelphians as he rose to prominence. Stories about his small size came up again and again. His humble home, Philadelphia Park, contrasted sharply with the plusher environs of Saratoga and Kentucky. In short, Smarty Jones became the archetypal Philadelphia sports hero.

This isn't to say, of course, that these facets of the Smarty Jones story are false. My point is that these aspects were played up while others, like the fact that his physical ratings are nearly flawless, were virtually ignored. Philadelphians don't want to hear about a tremendously gifted horse that wins because he's flat-out a better horse than the rest. Philadelphians want a horse that works hard and succeeds under less than ideal conditions, just like them.

*I'm leaving aside my vague concerns about the inhumane aspects of horse racing. I don't know all that much about the sport itself, so I'm not in an educated position to judge.

**For more on this, see this paper on Philadelphia's memories of Veterans Stadium.

Friday, June 04, 2004

Swarthmore Commencement, 2004

As of somewhere around noon this past Sunday, I am a graduate of Swarthmore College. Unsurprisingly, I don't feel all that different. If anything, my feelings this past week have been strangely reminiscent of going off to Boy Scout camp when I was in middle school. I'm still trying to figure out what the connection is.

In any case, the ceremony was better than expected. The reading of names was, of course, a drag, but everyone deserves their moment in the sun. Speaking of which, the weather was perfect all weekend long.

The best part, however, was the speakers. The ones I liked most:

Patrick Awuah, class of 1989, who founded the first liberal arts college in Ghana.

Barry Schwartz, a professor of psychology, who basically told us to settle for good enough if we want to be happy with life.

The rest are certainly worth reading.

I'll have more to say about the speeches later...