Laurence Sterne's Tristram Shandy (full title: The Life and Opinions of Tristram Shandy, Gentleman) is one of the weirdest books in the Western canon. Published in nine volumes between 1759 and 1767, it tells the madcap life story of its title character in the most digressive way possible. Sterne, in the eighteenth century, wrote fiction that seems wildly 'postmodern' even by today's standards: the novel 'remixed,' for instance, the text of other books to the point of plagiarism (a prime source, hilariously, was Robert Burton's Anatomy of Melancholy).
Sterne used visual and typographic tricks, too. There are two very famous ones. The first is the Black Page: When Yorick, the town parson, dies, Sterne frames the words "Alas, Poor Yorick!" in a little box, and then fills the opposite page -- page 73 -- with black ink, laid on so thickly that it shows through on the other side. It provides a surprising, moving counterpoint to the book's wordy excesses, as though, when it comes to death, no words are necessary or will do. Tristram Shandy has always been an inspiration to other writers and artists (grad students everywhere are still waiting on the minimalist composer Michael Nyman, who has been working for decades to turn the book into an opera). In 2009, to mark the 250th anniversary of the Black Page, 73 artists made their own black pages in an exhibition organized by The Laurence Sterne Trust; the artworks were displayed in Sterne's home, Shandy Hall, in Yorkshire.
This year, meanwhile, is the 250th anniversary of the colorful Marbled Page, which Sterne called "the motley emblem of my work." It appears in Volume III, on page 169. On the page opposite the marbling, Sterne wrote:
I tell you before-hand, you had better throw down the book at once, for without much reading, by which your reverence knows, I mean much knowledge, you will no more be able to penetrate the moral of the next marbled page (motly emblem of my work!) than the world with all its sagacity has been able to unraval the many opinions, transactions and truths which still lie mystically hid under the dark veil of the black one.
Sterne's printers made sure that every edition of the book used a different pattern for the Marbled Page, allowing the "motly emblem" to change over time. (Unfortunately, modern editions of the book usually reprint the same marbling in black and white.) To mark the anniversary, a second exhibition is being mounted in Shandy Hall -- this time with 169 artists. You can watch online as the new marbled pages are posted one-by-one over the coming months. In the meantime, it's worth looking back at Tristram Shandy. The book, written at a time when novels were new and hadn't yet settled into a conventional shape, is continually challenging and surprising. And with his two famous pages, Sterne created a singular work of art: death and life, represented simply and vividly.
Laurence Sterne by Joshua Reynolds.
This video of Dunder the German Shepherd has been delighting people everywhere. One of the most striking things about it is how wolf-like he looks by the end. As the always fascinating Temple Grandin explains, domesticated dogs are essentially wolf puppies who never grow up into full-grown wolves. From her excellent Animals in Translation:
Humans have neotenized dogs: without realizing it, humans have bred dogs to stay immature for their entire lives. In the wild, baby wolves have floppy ears and blunt noses, and the grown-ups have upright ears and long noses. Adult dogs look more like wolf puppies than like wolf adults and act more like wolf puppies than wolf adults, too. That's because dogs are wolf puppies: genetically, dogs are juvenile wolves....
[D]ogs stop developing emotionally and behaviorally at the wolf puppy equivalent of thirty days. A grown-up German shepherd can do every aggressive behavior a thirty-day-old wolf can do, but nothing beyond that age.
The only domestic dog capable of more full-grown wolf behavior, Grandin explains, is the husky, "which looks a lot like a wolf." A Chihuahua, meanwhile, "never advances past the wolf puppy equivalent of twenty days of age." You can read more here.
Statue of John Henry in Talcott, WV.
This article in the Yale Alumni Magazine introduced me to an idea I'd never encountered before: John Henryism is what happens when you respond to social stresses by working extra hard. If, despite your hard work, those same stresses keep you from succeeding, then the effect can feed back into itself, pushing you to work even harder. Some epidemiologists suggest that John Henryism could have long-term physiological effects. Ron Howell, a journalism professor at Brooklyn College, explains:
It takes its name from the black folk hero who, big in size and grand in strength, banged steel spikes into place during the nineteenth-century railroad boom. To save his job and those of other black "steel drivers," John Henry offered to show that he [could] drive spikes better than the steam-powered hammers the bosses were introducing to save money.... "Before I let this steam drill beat me down," goes the song, "I'll die with my hammer in my hand."
According to the story, that's exactly what happened: The statue pictured above stands near the railroad tunnel where, after defeating the steam-driven hammering machine, John Henry is said to have died of exhaustion.
Howell, a Yale alumnus from the class of 1970, noticed that his African American classmates have been dying in unusually large numbers. (African Americans made up 3% of the class of 1970, but, today, account for 10% of its deaths.) Could John Henryism be to blame? His research uncovers some astonishing facts. It's well-known, for instance, that there's a health gap between blacks and whites: white men live, on average, 76.2 years, while black men live, on average, 70.9. Often that gap is attributed to socioeconomic differences -- but it turns out that, for several health indicators, the gap actually widens as African Americans climb the social ladder. Howell's theory: for African Americans in 1970, "an Ivy education opened doors of aspiration and ambition, but not necessarily corresponding doors of opportunity."
Howell finds, too, that some psychologists believe there are two kinds of stress: the bad kind, which shortens your life, and eustress, a good kind of stress that makes you feel accomplished and fulfilled, and may lengthen it. Being active, hard-working, busy, and successful is good for you. When the class of 1970 entered Yale, Howell notes, "the civil rights era was barely beginning to bloom" -- and being active, hard-working, busy, and unsuccessful, for reasons beyond your control, can lead, Howell proposes, to John Henryism, and perhaps to an early death.
Image via the University of Missouri.
Robert Benfer, an archeologist at the University of Missouri, has found "the oldest 3-D statue" in the Western hemisphere. It's 4,000 years old and part of a Peruvian temple complex devoted to agriculture and the Zodiac cycle:
On the 4,000-year-old statue, it appears that the horn player is announcing the priests when they enter the Temple of the Menacing Disk, a site first discovered by Benfer in 2004. On the left, the female foxes around the menacing disk mask-like central figure face the June solstice sunset with two eyes shaped like the moon, indicating gathering darkness.... On the right, the male fox has one eye shaped like the sun, looking to the rising sun of the December solstice.
Cowboys and Pit Crews -- Atul Gawande's commencement address at Harvard Medical School. "The revolution that remade how other fields handle complexity is coming to health care." (The New Yorker)
Combat in the First Person -- American soldiers evade a Taliban ambush, and it's captured by one soldier's helmet-mounted camera. (The New York Times)
How to Merge on the Interstate -- "The Illinois DOT has placed a message board on Interstate 74 in Moline, approaching the lane closure for work on the bridge. It states: 'Use both lanes. Take turns at merge.'" It's called the "Zipper system." (The Quad-City Times)
Area 51 Explained -- The Roswell flying saucer "was Russian-made and crewed by human children who were surgically altered to resemble aliens by Nazi death-camp doctor Joseph Mengele, acting at Joseph Stalin's behest." I'm not sure what's more bizarre -- this story, or the fact that Popular Mechanics published it. (Popular Mechanics)
Tyler Cowen, America's Hottest Economist -- "When Tyler Cowen was 15, he became the New Jersey Open Chess Champion, at the time the youngest ever. At around the same age, he began reading seriously in the social sciences." Cowen is the world's best blogger (in my opinion). (Bloomberg Businessweek)
[Image: Area 51.]
According to Amazon.com, which just released its list of "the most well-read cities in America," Cambridge, Massachusetts ordered more books per capita than any other city in the United States (excluding cities with fewer than 100,000 residents).
Quoth Amazon, in their press release:
Not only do they like to read, but they like to know the facts: Cambridge, Mass.--home to the prestigious Harvard University and Massachusetts Institute of Technology--also topped the list of cities that ordered the most nonfiction books.
Donna Seger, a professor of history at Salem State University, has assembled a wonderful collection of "maps in the form of plants, animals, and humans" on her blog. The earliest ones date from the 16th century, when printers, seeking to outdo one another, reached back to the whimsical "conceptual" maps of an earlier era. The most recent ones are from the First World War.
Here's a map from 1581; it appeared in Travels according to the Scriptures, by a German theology professor named Heinrich Bunting, and shows Jerusalem at the center of the world:
And here's one from 1882, L'Europe Animale, in which "Germany is a sly wolf waiting to pounce":
150 years ago this month, the Scottish physicist James Clerk Maxwell (who formulated the theory of electromagnetism) and Thomas Sutton, a photographer, took the world's first color photograph. The photo, of a tartan ribbon, was actually created from three black-and-white photographs: they were taken through red, green, and blue filters, and then thrown onto a screen by three projectors, each casting its light through a filter of the appropriate color (like so). Maxwell came up with the scheme when he discovered that the cells in our retinas are sensitive, separately, to red, green, and blue light. He presented the photo during a lecture at King's College London on May 17, 1861.
The first color photograph.
Phil Coomes, a photo editor for the BBC, explains that up until the 1980s color photos were still transmitted and broadcast this way:
As late as the 1980s wire photographs would be transmitted by news agencies such as the Associated Press and printed out by the client as three black and white pictures; these would then be photographed through the same filters and re-constituted as a colour print. As electronic delivery took over this method moved to the computer, but even then the pictures would arrive in three parts ready for the client to reassemble. It was sometime before full colour transmission was widespread.
When an image needed to be shown on color TV, producers would snap a Polaroid of the three filtered and projected images, and then put it in front of the camera.
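For the digitally inclined, here's a minimal sketch of the same additive idea in modern form (Python, using NumPy and Pillow): three grayscale exposures, one per filter, are stacked into the red, green, and blue channels of a single image. The filenames are hypothetical placeholders, and the exposures are assumed to be aligned and identically sized.

```python
import numpy as np
from PIL import Image

def combine_rgb(red_path, green_path, blue_path, out_path="tartan_color.png"):
    # Load each filtered exposure as a single-channel (grayscale) image;
    # all three are assumed to be aligned and the same size.
    channels = [np.array(Image.open(p).convert("L"))
                for p in (red_path, green_path, blue_path)]
    # Stack them along the last axis in red, green, blue order.
    rgb = np.stack(channels, axis=-1)
    Image.fromarray(rgb.astype(np.uint8), mode="RGB").save(out_path)

# Hypothetical filenames for the three filtered exposures.
combine_rgb("ribbon_red.png", "ribbon_green.png", "ribbon_blue.png")
```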
Harold Bloom, who will turn 81 this July, has been one of America's most fascinating literary critics for nearly half a century. In his newest book, The Anatomy of Influence: Literature as a Way of Life, Bloom revisits the ideas that made him a star -- and explains, in a straightforward way, why he's spent his career trying "to build a hedge around the secular Western canon." Bloom argues that it's simply impossible to understand how literature really gets made unless you recognize that some books are head-and-shoulders above the rest. It's the genius of those books, he contends, that powers the whole of literary creation.
Bloom first rose to prominence in 1973, with a book called The Anxiety of Influence. In technical prose, Bloom made a straightforward argument: great writers, he proposed, are influenced by the writers who came before them; their own writing, meanwhile, can be understood as a complicated, anxious response to those literary influences. The power of Bloom's argument depended, and still depends, on his capacious ideas about what 'influence' is. Influence, for Bloom, really means inspiration.
To get what Bloom's talking about, think back to your own early experiences with truly great writing. A few extraordinary lines in a single poem by Yeats, Whitman, or Shakespeare might strike you with unusual force, and make you feel as though you're seeing the world in a new way. If you're really under the spell, you might even try tapping into that current of inspiration yourself, by trying to write your own poems. Artists, Bloom contends, don't start making art out of the blue. They make art because they're awakened to it by other artists -- and not just by any other artists, but by the great geniuses. Under the influence of great art, they have the courage to think, "I can do this, too!"
Being inspired, Bloom argues, is a strange, anxious experience. You wonder whether your inspiration will last. (Bloom quotes Percy Shelley, who explained it this way: "The mind in creation is as a fading coal, which some invisible influence, like an inconstant wind, awakens to transitory brightness.") And you question whether you're too inspired -- whether you're just trying to recreate something great that someone else has already made. Influence, Bloom concludes, is something an artist cultivates, but also tries to escape; it's a source of power, but sometimes it's too powerful. Bloom calls it "literary love." And it's under the influence of literary love that creativity is enflamed and great literature is written.
In a way, Bloom's ideas about literary creativity seem uncontroversial, even intuitive; he describes, for many people, exactly what it's like to encounter something great, and then to try and create something great yourself. The controversy comes in because the general trend in literary studies has, over the last forty years or so, been to broaden the literary canon to include more popular kinds of literature. Bloom, thinking against that current, believes that there are very few sources of real literary greatness. Plenty of people read romance novels and feel inspired by them -- but they're inspired only to create more romance novels. Truly great artists are inspired from above, not below.
Ultimately, Bloom believes, all of the greatest literary art is networked together: “If you carry the major British and American poets around with you by internalization,” he explains, “after some years their complex relations to one another begin to form enigmatic patterns." Those patterns of influence keep looping back to the greatest writers, like Homer, Shakespeare, Dante, and Tolstoy. Literary influence, Bloom says, is like a "labyrinth" built up from moments of genuine inspiration, when great literary minds encounter one another. To map it, you have to notice how writers draw inspiration from one another -- but also how they seek to wriggle out of inspiration's grip. Tennyson was inspired by Keats, but fled that inspiration; Walt Whitman was influenced by Shakespeare, and his originality results, in part, from an attempt to evade that influence. To understand this process, Bloom argues, you have to start by admitting that "there is such a thing as great literature, and it is possible and important to name it."
With Oprah Winfrey's final show airing today, publishers are saying goodbye to Oprah's Book Club -- it had an extraordinary ability to transform modest best-sellers into mega-hits. Matthew Flamm of Crain's puts the Book Club in context:
Wait til you get a load of this!
Though her influence waned in the past few years, the host of Oprah’s Book Club chalked up a record for pushing titles into the sales stratosphere that no other media personality could match.... Still, some observers point out that Ms. Winfrey sometimes lavished her attentions on authors who were either dead, like John Steinbeck and William Faulkner, or already well-known, like Gabriel Garcia Marquez. And she was hardly the only book-loving host.
“She definitely leaves a hole in a cultural regard, but she might hold up a book on her show every month or two or three,” said Michael Norris, senior analyst with Simba Information, which tracks the book industry. “Stephen Colbert and Jon Stewart do that four times a week.”
The most interesting aspect of the Book Club, to my mind, was the way that it highlighted great books from the past -- Great Expectations, Light in August, The Sound and the Fury, As I Lay Dying, Anna Karenina -- and difficult books from the present, like The Road. It's a good thing she didn't propose a book four times a week -- it takes a while to read and digest three Faulkner novels! The complete list is here. And, if you missed it, here's one of the oddest literary moments of the past few years: Cormac McCarthy's first, and awesome, television appearance, on Oprah.
Writing at West 86th ("a journal of decorative arts, design, history, and material culture") Ben Kafka, a media history professor at N.Y.U., highlights a magnificent marketing film from 1967: I.B.M.'s "Paperwork Explosion." "Commissioned by IBM," Kafka explains, "the film was directed by a little-known experimental filmmaker named Jim Henson and scored by Raymond Scott, the composer and inventor who wrote most of the tunes behind Looney Tunes, introduced the first racially integrated network studio orchestra, and pioneered electronic music with such technologies as the Orchestra Machine, the Clavivox, and the Electronium."
Henson and Scott’s collaboration explains, no doubt, the film’s considerable formal intelligence and diegetic wit.... The film promotes the IBM MT/ST, a machine released in 1964 combining the company’s Selectric typewriter with a magnetic tape disk. Operators entered text and formatting codes onto magnetic tape; they could then make simple changes before printing a clean copy of the document.... Among historians of computing, the MT/ST is best known as the first machine to be marketed as a “word processor” (a term that, as Thomas Haigh has pointed out, emerged at the same moment as Cuisinart’s “food processor”).
"The 'paperwork explosion,'" Kafka concludes, "expresses both a threat and a wish. The threat, of course, is that we are being overwhelmed by paperwork’s proliferation, its explosion.... The wish is to convert all this cumbersome matter into liberating energy, which is exactly what explosions do." To which I'd only add: "The Paperwork Explosion" shows just how amazing Mad Men should have been!
Henson was, as Kafka writes, an "experimental filmmaker": before the Muppets, Henson made large numbers of avant-garde short films. They tend to be free-associative and madcap in tone, with lots of crazy music and quick, humorous juxtapositions. Some of them were broadcast on TV -- NBC actually commissioned Henson to make experimental films for a show called Experiment in Television, which ran from 1968 to 1971. Many of them are available on YouTube:
Excerpt from Time Piece (1966)
Excerpt from The Cube (1969)
You can see more of the winning slides here.
Over at Wired, Jonah Lehrer synthesizes some recent research on how, exactly, power corrupts:
The very traits that helped leaders accumulate control in the first place all but disappear once they rise to power. Instead of being polite, honest and outgoing, they become impulsive, reckless and rude. According to psychologists, one of the main problems with authority is that it makes us less sympathetic to the concerns and emotions of others. For instance, several studies have found that people in positions of authority are more likely to rely on stereotypes and generalizations when judging other people. They also spend much less time making eye contact, at least when a person without power is talking.
This time-lapse video of planes taking off from Logan is oddly mesmerizing. It's from BostonAirborne, whose YouTube channel is full of Logan videos.
If you're wondering: the video runs from 5 to 6:15 p.m. "Would of liked to record longer," Mr. Airborne writes, "but local law enforcement showed up." The lesson: your time at Logan would be a lot more interesting if you could speed it up.
"I don't want to achieve immortality through my work," Woody Allen once said, "I want to achieve immortality through not dying. I don't want to live on in the hearts of my countrymen; I want to live on in my apartment." Failing real immortality, though, most of us have to rely on wills, trusts, copyrights, and other legal instruments to influence the world after our deaths. In Immortality and the Law: The Rising Power of the American Dead, law professor Ray Madoff contemplates the contest between the rights of the living and the rights of the dead. "American law," she explains, "grants more power to the dead than any other country in the world."
Madoff's story starts in nineteenth-century England, when there were very few laws governing the dead. Your dead body, for instance, wasn't even considered your property: a legal principle, corpus nullius in bonis, held that "the body belongs to no one," and prevented living individuals from specifying exactly what should happen to their bodies after death. It was, therefore, perfectly legal for medical students to steal bodies from cemeteries. The robbing of your grave might offend your relatives, but... corpus nullius in bonis. Death was an ungovernable frontier.
As time passed, of course, it became increasingly clear that a more robust legal framework was needed. That framework was built through case law, as bizarre cases piled up one atop another. Here's an American court's account of one 1938 case in which a brother chose to dispose of his sister's body "at home":
In June 1938 Harriet was in failing health. She appears to have suffered some injury from a fall and during the night of June 9th she remained in a reclining chair in the front room of their home. About four o'clock in the morning of June 10th she died. The respondent [her brother] thereupon built a hot fire in the furnace in the basement of the house, tied a rope around the legs of his sister's body, dragged it down the cellar stairs, shoved it into the furnace and burned it. It was impossible to get it all into the fire box at once, but as the head and shoulders were consumed, he forced it in farther and farther until he was able to close the furnace door.
The case turned on the fact that Harriet had asked to be "buried" this way -- and it resulted in a stronger standard of "Christian burial." You've buried someone indecently, the court decided, if, were the facts about what you've done to become known, "the feelings and natural sentiments of the public would be outraged."
Nowadays, a vast legal edifice governs the way we treat the bodies, possessions, and even reputations of dead people. The law respects the copyrights of the dead, for seven decades after death; in some states, it allows the living to establish perpetual charitable trusts, and perpetual, tax-free private trusts, that can never be broken or modified. Some state laws even prevent celebrities from making reference, in their personal styles, to the personal styles of dead celebrities. Tennessee, Elvis's home state, recognizes this "right to publicity" in perpetuity.
Madoff argues that giving so many rights to the dead is a big mistake. Many of these laws are the results of lobbying from banks, media companies, or funeral directors; their cumulative effect, she writes, is to favor the past over the future, by committing tomorrow's resources to yesterday's problems. We're blind to the danger in part because we're such a young country. In Europe, while the dignity of the dead is legally protected, their other rights are more curtailed, so as to avoid "dead hand control." Early Americans were very aware of the danger of giving the dead too much legal power: Madoff cites Thomas Jefferson, who "warned against allowing the wishes of the dead to prevail over those of the living." "That our Creator made the earth for the use of the living and not of [the] dead, that those who exist not can have no use nor right in it, no authority or power over it," Jefferson wrote, is "self-evident... [H]e is not to be reasoned with who says that non-existence can control existence or that nothing can move something."
The count, who had been mowing, appeared at dinner in a grayish blouse and trousers and a soft white linen cap. He looked even more weather-beaten in complexion than he had in Moscow during the winter, if that were possible. His broad shoulders seemed to preserve in their enhanced stoop a memory of recent toil. His manner, a combination of gentle simplicity, awkward half-conquered consciousness, and half-discarded polish, was as cordial as ever. His piercing gray-green-blue eyes had lost none of their almost saturnine and withal melancholy expression. His sons were clad in the pretty blouse suits of coarse gray linen which are so common in Russia in the summer, and white linen caps.
The writer is Isabel Hapgood, a journalist who translated Russian literature and traveled throughout Russia; her Service Book of the Holy Orthodox-Catholic Church, an English translation of the Russian Orthodox liturgy, is known to many American Orthodox Christians simply as "the Hapgood." Marina Lidowsky, a professor of Slavic languages at Barnard, writes:
She was a formidable lady of many talents and vocations: a polyglot-translator of works by great literary masters, a prolific journalist and writer, a successful lecturer and administrator, a moral crusader, an organizer of charitable work, a liturgical scholar and a prospective musicologist as she harbored a project of a History of Russian Orthodox Church Music.
She was also a Boston native: she grew up in Worcester and graduated from Miss Porter's School in 1868. Tolstoy's theory of how to live a good life, according to Hapgood:
Men should divide their time each day between (1) hard labor unto perspiration and callosities; (2) the exercise of some useful handicraft; (3) exercise of the brain in writing and reading; (4) social intercourse, sixteen hours in all.
Over at The European, which has my favorite tagline ever ("Honest. Online. Neu."), Rolf-Dieter Heuer, the Director General of the CERN laboratories in Geneva, thinks out loud about the strangeness of modern physics and its relationship with theological and philosophical questions:
Particle physics is asking the question of how did things develop? Religion or philosophy ask about why things develop. But the boundary between the two is very interesting. I call it the interface of knowledge. People start asking questions like “if there was a Big Bang, why was it there?” For us physicists, time begins with the Big Bang. But the question remains whether anything existed before that moment. And was there something even before the thing that was before the Big Bang? Those are questions where knowledge becomes exhausted and belief starts to become important.
It's the nature of physics, he explains, to highlight "the boundary between knowledge and belief."
In the last decade, the biggest development in photography has to be the rise of the smartphone camera. Astronomical numbers of pictures are now taken with photography apps that let you apply cool effects right on your phone, before posting them directly to Facebook, Flickr, or Instagram. Nathan Jurgensen, writing in The Society Pages, asks: Why are so many of those effects vintage-themed? We want our photos to look pre-aged, he suggests, because "we want to endow the powerful feelings associated with nostalgia to our lives in the present":
The momentary popularity of the Hipstamatic-style photo serves to highlight the larger trend of our viewing the present as increasingly a potentially documented past. In fact, the phrase “nostalgia for the present” is borrowed from the great philosopher of postmodernism, Fredric Jameson, who states that “we draw back from our immersion in the here and now [...] and grasp it as a kind of thing....” We come to see what we do as always a potential document, imploding the present with the past, and ultimately making us nostalgic for the here and now.
The biggest problem with vintage photos? Eventually, Jurgensen argues, they'll stop looking vintage, and just look high-tech -- they'll just be what smartphone photos look like.
Writing at The Millions, Mark O'Connell proposes that we enjoy long novels in part because they're punishingly boring: like hostages suffering from Stockholm syndrome, we come to love our authors for the small doses of literary kindness they dole out amidst all the tedium:
You finish the last page of a book like Gravity’s Rainbow and -- even if you’ve spent much of it in a state of bewilderment or frustration or irritation -- you think to yourself, “that was monumental.” But it strikes me that this sense of monumentality, this gratified speechlessness that we tend to feel at such moments of closure and valediction, has at least as much to do with our own sense of achievement in having read the thing as it does with a sense of the author’s achievement in having written it.
Whether this is true or not depends, I suspect, on your personality. It's also worth noting that not all long novels have boring stretches. Nowadays tedium is almost a literary value in itself, in a postmodern sort of way. War and Peace and Our Mutual Friend, though, are page-turners!
If you live in Somerville, you may have noticed that someone's been posting helpful infographic maps, displaying data about crime or income, on telephone poles in your neighborhood. Good unmasks the info-bandit: his name is Tim Devin, a local artist and publisher.
One of Devin's maps.
Devin has posted two maps so far, one about crime and one about money, as well as a few surveys and poems. He calls his flyers "BBC Broadsides." If you enjoy this kind of thing, try exploring the WolframAlpha page for Boston, MA -- it's a gateway into the statistical universe of Boston.
Over at Rationally Speaking -- the blog of CUNY philosophy professor Massimo Pigliucci -- there's a fascinating debate unfolding about the meaning of Occam's razor. Occam's razor is an oft-cited principle which holds that simpler explanations, all things being equal, are usually better than more complex ones. Nowadays it seems that everyone is wielding the razor, from physicists and biologists to detectives and policy-makers -- and yet, Pigliucci argues, Occam's razor is actually a very strange idea, and probably a less useful one than many people think.
The Razor-wielding William of Occam.
Pigliucci draws on the philosopher Elliot Sober, who in a 1994 essay ("Let's Razor Occam's Razor") asked, in Pigliucci's words, "Why? On what basis are we justified to think that, as a matter of general practice, the simplest hypothesis is the most likely one to be true?" One possibility, of course, is that the simplest explanations tend to be true because the universe is, in some sense, simple. That's an ancient idea: Thomas Aquinas, for example, argued that "if a thing can be done adequately by means of one, it is superfluous to do it by means of several, for we observe that nature does not employ two instruments where one suffices"; William of Occam himself didn't formulate the razor until the 14th century. In fact, though, the universe isn't really simple in quite this way -- except, perhaps, in some deep mathematical sense. "The history of science," Pigliucci writes, "is replete with examples of simpler ('more elegant,' if you are aesthetically inclined) hypotheses that had to yield to more clumsy and complicated ones."
If Occam's razor isn't ontologically justified, Pigliucci continues, then it must be epistemically useful: favoring simple answers must make it easier, over time, to find the right answer. This is probably true, he concludes, in a variety of fields -- but not because of some mysterious, underlying fact about knowledge that cuts across disciplines. The reasons why Occam's razor is useful in physics are probably different from the reasons it's useful in biology. In most cases, moreover, you can get away with favoring simpler explanations only because there's a huge quantity of (quite complicated) background knowledge hiding inside the simplicity. It's actually because of all that background knowledge that we can present ideas simply in the first place! "Occam's razor," Pigliucci concludes, "is a sharp but not universal tool, and needs to be wielded with proper care." It's limited: unfortunately, for instance, "one cannot eliminate flying saucers a priori just because they are an explanation less likely to be the correct one than, say, a meteor passing by."
That's not to say, obviously, that Occam's razor isn't tremendously useful. One commenter points to this fascinating paper in the Handbook of the Philosophy of Science, written by philosopher Kevin Kelly. The point of the razor, Kelly argues, isn't that it helps you find out the truth: it's that, as a tool for thinking, it gets you to the truth faster. Kelly writes:
Occam's razor does not point at the truth, even with high probability, but it does help one arrive at the truth with uniquely optimal efficiency, where efficiency is measured in terms of such epistemically pertinent considerations as the total number of errors and retractions of prior opinions incurred before converging to the truth and the elapsed times by which the retractions occur. Thus, in a definite sense, Ockham’s razor is demonstrably the uniquely most truth-conducive method for inferring general theories from particular facts -- even though no possible method can be guaranteed to point toward the truth with high probability in the short run.
The moral of the story: you cannot cite "Occam's razor" non-problematically in arguments with your friends. It's really useful only within lengthy, sustained processes of inquiry.
Vivian Maier is the Emily Dickinson of street photography. Maier worked as a nanny in New York and Chicago for more than a half-century, and took thousands of brilliant photographs, never sharing them with anyone except, occasionally, the children in her charge. Some of her negatives were found by chance, when a local real estate agent purchased them at an auction house; now an exhibition of her photography is traveling the world (this summer it's in the U.K.).
One of Maier's self-portraits.
The real estate agent, John Maloof, immediately realized the value of the photographs. He did some sleuthing, figured out who Maier was (her name was scrawled on a slip of paper in one box of negatives), then found the rest and created a consolidated archive. Maloof posted the photos to Flickr and started a blog. It turned out that Maier, who had died only a few months earlier, had been supported in her old age by a few of the children she'd nannied. They'd never known about the photographs, and had allowed the negatives, which had been kept in a storage unit, to be auctioned off when the payments became overdue.
Maier's photographs are extraordinary -- the equal, at first glance, of the great street photographs by Henri Cartier-Bresson or Diane Arbus. She had an eye for expressive faces, and for points of contact between young and old, as in the beautiful photograph of a boy riding a horse beneath New York's elevated train. Read more about Maier at Mother Jones or at the Vivian Maier website. (All photos are © Vivian Maier / Maloof collection.)
According to Benjamin Franklin, the secret to success is, essentially, hard work: "Plough deep," he suggested, "while sluggards sleep." That's a helpful axiom for simple endeavors, but it's near-useless when we undertake complex ones. It's not laziness that prevents us from crash-proofing our financial markets or stopping global climate change. It's the sheer, mind-numbing complexity of these problems that prevents us from solving them. In Adapt: Why Success Always Starts with Failure, Tim Harford, a British economist, aims to provide a road map to success in a complex world. We normally think of success and failure as opposites -- but, Harford argues, successfully engaging with complexity requires embracing, even encouraging failure, over and over and over again.
Harford's book starts from a simple premise: the world, he argues, is more complicated now than ever before. It's faster, more connected, and more elaborately recursive. Modern societies depend on a network of globe-spanning imports, industries, and information; modern decision-makers oversee huge hierarchies. Organizations and governments are continually growing in complexity -- in fact, they seem on the verge of unmanageability. Harford's goal is to figure out how difficult problems actually get solved amidst such vast complexity. To that end, and in a Gladwellian spirit, he undertakes a variety of case studies, looking at everything from the U.S. military's counterinsurgency strategy in Iraq to the evaluation of international aid programs.
What Harford finds is that complex success depends upon repeated failure. No one person, no single plan can possibly get things right from the beginning, when there are so many moving parts. Instead, you have to embrace the unknowable, by developing as many plans as possible, and by trying them all out in a rigorous way. You need to encourage trial and error, and to make failure, when it inevitably happens, survivable. Only after you've figured out what doesn't work can you isolate what will -- and the only way to figure out what doesn't work is to actually try and see. Real-world failures, in short, are the only route to real-world success.
Harford argues, in essence, that failure is an untapped resource: Since it's inevitable, it needs to be harnessed. Many international aid programs, for example, turn out not to work, and money is spent until donors become disillusioned and move on to other programs. We know that many sensible-sounding programs will fail -- and therefore, Harford argues, we should embrace and learn from that failure in an active way, by enacting many programs simultaneously in randomized trials. Accepting that many of the programs will fail right from the beginning will help us move more quickly towards the solutions that work. But that requires a difficult psychological adjustment: you have to be comfortable with failure, rather than ashamed of it.
Harford, like Nassim Taleb, contends that we ought to be very skeptical of anyone who claims to know how to solve a complex problem; like Taleb, he has an essentially Tolstoyan sensibility. The world is bigger than the mind, and so real problem-solving requires huge teams of humble people willing to try things out and, if need be, to fail. Failure, therefore, isn't something to be ashamed of -- it's a necessary, unavoidable step in the making of progress. As Tolstoy put it: "Truth, like gold, is to be obtained not by its growth, but by washing away from it all that is not gold."
Synesthesia is a neurological condition in which your sensory circuits are cross-wired: numbers might have colors associated with them, tastes might have words, sounds might have textures. Experiences in one sense modality borrow the qualities of other modalities. This wacky little film gives you an (exaggerated, Japanese) idea of what it's like. Famous synesthetes include Vladimir Nabokov (colored letters), Olivier Messiaen (colored chords), Richard Feynman (colored equations), and David Hockney (colored, geometric musical notes).
Nabokov, from his memoir, Speak, Memory:
The long ‘a’ of the English alphabet... has for me the tint of weathered wood, but a French 'a' evokes polished ebony. This black group also includes hard ‘g’ (vulcanized rubber) and ‘r’ (a sooty rag being ripped). Oatmeal ‘n’, noodle-limp ‘l’, and the ivory-backed hand-mirror of ‘o’ take care of the white.... Passing on to the blue group, there is steely ‘x’, thundercloud ‘z’ and huckleberry ‘h’. Since a subtle interaction exists between sound and shape, I see ‘q’ as browner than ‘k’, while ‘s’ is not the light blue of ‘c’, but a curious mixture of azure and mother-of-pearl.
By taking 37,740 individual photographs with a special camera, Nick Risinger, a Seattle photographer, has managed to get the entire Milky Way into one huge, zoomable photograph. You can pan and zoom around the galaxy on his website.
Writing in The Chronicle of Higher Education, four media scholars say that college students, far from being slavishly in love with their cell phones, laptops, and Facebook feeds, are concerned about their addiction to technology, and want to talk about it:
[T]oday's students (age 18 and up) have significant concerns about the role of the new technologies in their lives. To be sure, most really do appreciate the power and convenience of the tools they use for social networking, entertainment, and learning; and many are serious multitaskers. But at the same time, when asked about those technologies, many appear to be more self-aware, reflective, and articulate about their concerns and confusions than they are generally given credit for being.
After surveying more than 300 students at six colleges, they say that professors are missing a big opportunity: college students are eager to think, in a sophisticated and reflective way, about their digital lives. Professors should help them.
Annette Kellerman, a professional swimmer and mermaid, invented the women's swimsuit right here in Boston in 1908:
Annette Kellerman, known in her time as “the Australian Mermaid,” was a competitive swimmer, diver, model, actress, stuntwoman, fitness guru, and, yes, professional mermaid in vaudeville and movies, who originally sewed stockings onto a man’s racing suit for less drag in the water. Then, one day, circa 1908, she forewent those old leg coverings and appeared on a Boston beach in a skin-tight onesie with the legs cut off mid-thigh. She was arrested.
Confessions of a Lexicographer -- "There are university courses in dictionary-making, but I am not sure how many of their graduates are now employed as professionals." (The Dabbler)
Michael Stipe -- On discovering he could write songs: "I watch people. I'm a voyeur. God knows you've seen me in action. I sit and watch." (Interview)
James Franco -- Before he was famous: "My dad introduced me to Tolkien and after that I was sort of off and running." (The Days of Yore)
The Inverted Index -- "I've made my email searchable back to 1995. I can type mailsearch f:ford procrastination and in response two commands run (mairix and mutt), and up pops a terminal window filled with 15 years of emails, from me to other people, about putting things off." (Ftrain)
One in Three Africans Now 'Middle Class' -- "Sales of refrigerators, television sets, mobile phones, motors and automobiles have surged in virtually every country in recent years." (Reuters Africa)
War Dog -- "The question of how the dog got into bin Laden's compound is no puzzle -- the same way the special ops team did, by being lowered from an MH-60s helicopter." (Foreign Policy)
[Image: A military dog handler carries his dog as part of training to build trust. U.S. Air Force photo/Airman 1st Class Allen Stokes, via Foreign Policy.]
A fierce debate is raging in the U.K. about a new proposal to let wealthy students pay for places at top universities -- even if they've been rejected through the regular admissions process. As it stands now, British universities have firm quotas for the number of students they can admit, and those places are filled through meritocratic competition. Once you get in, you pay a low, flat fee to attend (about $6,000 a year to attend Oxford). But David Willets, the education minister, is proposing to create new, "off-quota" places, open to students who haven't made the cut, as long as they can afford to pay substantially higher fees. Rage and confusion have been the immediate results of his proposal.
All this can be yours... for a price.
Proponents of the plan say that it will actually encourage social mobility: after all, these are "extra" places, so it's not like spots are being taken away from the less affluent. The extra revenue will benefit everyone, and help stabilize a shaky educational system. Moreover, it's not as though a horde of zombie underachievers will descend on the best schools. The students most likely to take advantage of the program would be those who almost made it in -- the kids on the margin between the 3,000 who were accepted at Oxford, for example, and the 17,000 who applied. Finally, they argue, funding could be reduced at lesser schools once wealthy students can pay to go elsewhere -- an added bonus! (As the U.K.'s business secretary puts it: "Institutions could very well find themselves in trouble if students can't see value. In circumstances where places are unfilled, we might withdraw those places, and institutions should not assume they will easily get them back.")
Opponents of the plan -- and there are lots of them -- declare it profoundly regressive. John Denham, a Labour MP, says that the government "intends to create a two-tier system -- one method of entry for the most able, another for those with access to private funds from one source or another." If you want to fund the universities, they say, then fund them. If you want to make extra places, then make them open to the most accomplished. Willets, meanwhile, has begun to backtrack, saying that the extra places could be paid for by charities, or by corporations looking to sponsor students.
These corporate sponsorships are already happening throughout the U.K. The accounting firm KPMG has already launched one such program -- it pays college tuitions for students at Durham University. They work for the firm while getting an accounting degree. The pharmaceutical firm GlaxoSmithKline has a similar system in place at the University of Nottingham: Their sponsored chemistry module, they explain, "introduces chemistry students to the medicinal chemistry skills the pharmaceutical industry requires while also enhancing knowledge transfer between industry and academia." In the GSK program, 12 students are selected by the university during their third year of school to pursue "an active research programme on a molecular target that the pharmaceutical company is developing." These are exciting, useful programs -- and they're not quite the same thing as the selling of "off-quota" places to wealthy students.
Who knows what will become of Willets' plan -- perhaps nothing. In any case, it highlights the surprising moral power of the university admissions system. Admissions officers, in their own ways, stand as guardians over the principles of fairness and openness, and the university derives prestige from its store of moral credibility. Making deals to sell that credibility can be dangerous. In the Middle Ages, Catholic clergy conceived of themselves as selling beneficence from an infinite "treasury" of holiness; they used the money from the indulgences they sold to build hospitals, churches, and leper colonies. But the system got out of control (as one sixteenth-century preacher famously put it, "As soon as money in the coffer rings, the soul from purgatory's fire springs"), and had to be Reformed. There are lots of good, practical, even intellectually substantive reasons for bringing private money into the university. But the treasury of holiness isn't, in fact, infinite.
If you live in a university town, you see them everywhere: ads soliciting your participation in clinical trials for new medications. You might assume that they're targeted at students. In The Professional Guinea Pig: Big Pharma and the Risky World of Human Subjects, Roberto Abadie, an anthropologist at the City University of New York, shows that that assumption would be wrong. In fact, many test subjects are "full-time volunteers [who] might enroll in five to eight trials a year, deriving a total estimated income of $15,000 to $20,000 in exceptionally good years." They are test subjects for a living.
Abadie spent eighteen months living in hostels and group homes in Philadelphia, getting to know the men and women who work as professional participants in Phase 1 clinical trials. (Phase 1 trials mark the first time that a new medication is used in human beings; Phase 2 and 3 trials are much larger, and take several more years to complete.) They're not an easy population to characterize. Many participants work low-paying jobs and join trials to supplement their income. Others participate full-time, and say they're "addicted" to the easy money. Some participants are chronically ill (with HIV, for example), and think of themselves as joining in the fight against their disease. In Philadelphia, almost all of the participants are African-American or Latino -- but there are also a few "white anarchists" in the mix.
What's it like to be a professional guinea pig? You might travel from one city to another to participate, getting paid anywhere from $1,200 for three or four days' work to $5,000 for a month-long trial. The more painful or invasive the trial, the higher the pay. (Trials of psychiatric drugs pay especially well.) Abadie says that participants worry about risks, but not too much -- they're used to them. Pharmaceutical firms find it easier to work with experienced participants, and participants might know one another from previous trials. One former full-time guinea pig, a man in his early thirties, describes the experience this way:
Manufacturing has been taken off, outside the country, so you are not allowed to do things anymore. They call it the new economy, the informational economy. And the other side of this informational economy is the mild torture economy, you are not asked to produce or to do something anymore, you are being asked to endure something. So, if you are a guinea pig, you are enduring something, people are doing things to you and you are just enduring it, you are not actually producing something. I feel that I am a worker, but it is not work ... it's about how much you can deal with being bored, that's the real hard part of it, the time and discomfort of being there.... I am letting people pay me in exchange for the control they have over me.
Maybe David Foster Wallace should have set The Pale King in the world of professional trial participants. (He set it among I.R.S. auditors who work in an office in Peoria, IL.)
The existence of professional guinea pigs, Abadie argues, is something new: until the mid-1970s, the vast majority of new drugs were tested on prisoners. "The fluidity and instability of the guinea pig workplace," he writes, "brings to mind the world of migrant agricultural workers." Perhaps, Abadie suggests, there ought to be a central registry for trial volunteers, or a union. But many of the participants he talks to aren't interested in organizing. "They point out that both the pharmaceutical industry and they themselves intend to make money performing clinical trials research," he writes. As far as they're concerned, being exploited is just what work is. "Working for the pharmaceutical industry may be exploitative," they tell him -- "but the same was true of their jobs driving trucks, making fast food, or sorting packages on a conveyor belt."
No one knows exactly what the death of Osama bin Laden will mean for the future of global terrorism. One thing, however, is certain: there will be tons of conspiracy theories about it. Matthew Gray of Foreign Affairs rounds up some of the best conspiracy theories and asks: Why is the Middle East so open to them in the first place?
Only a few days after bin Laden's death, some extraordinarily ornate conspiracy theories have already appeared, not just on the internet but in newspapers. One theory, Gray writes, has
proposed that bin Laden had been collaborating with Washington all along. Another one had it that bin Laden died years ago but that his body had been frozen and retained for later use by the United States; still others suggested that he remained alive.... Some have even suggested that the world’s most wanted terrorist was not real but an American invention.
There are conspiracy theories everywhere, of course -- but why are they so quick to sprout in the Middle East? In part, Gray points out, it's simply because the region "has been subject to an unusually high number of actual conspiracies in the past": "The overthrow of Iranian President Mohammed Mossadeq in 1953 was driven by a secret U.S. and British plot to remove him, and the 1956 Suez War was the result of a covert British-French-Israeli agreement struck in France." Conspiracy theories are also a natural response when you live in an authoritarian state: you're powerless, and in many ways the government really is conspiring against you.
Unfortunately, if you've been cast in a conspiracy theory, there's not much you can do about it. Counterargument is exactly what the conspiracy theorists expect from you -- in fact, it may make the theorizing more intense. The best thing to do, Gray concludes, is ignore the conspiracy theorists: "Most anti-Catholic and anti-Mason conspiracies in the United States have atrophied this way."
Writing in The Nation, English-professor-turned-journalist William Deresiewicz provides one of the best overviews I've read of the slow-motion train-wreck that is the downsizing of the academy. As he sees it, the problem is corporatization. Non-faculty administrators, obsessed with efficiency, are cutting out departments and faculty positions. At the same time, their own ranks are swelling: "From 1976 to 2001, the number of nonfaculty professionals ballooned nearly 240 percent, growing more than three times as fast as the faculty." Now universities, having bureaucratized themselves, find that there isn't enough money to fund the teaching and research that's their raison d'etre.
The problem isn't that administrators are spending all the money on themselves; it's that they have essentially non-academic priorities. They want to increase enrollment, cut costs, expand their schools, and move up in the U.S. News rankings. Academic excellence is only one priority among many. Over the long term, that has had appalling effects on the quality of the education students receive. The evidence is in: American colleges aren't making their students smarter, essentially because colleges are encouraging students to major in fluffy, non-academic subjects. Students are getting good grades, but in meaningless majors. And courses in those majors are, happily, cheaper to staff: instead of a tenured history professor, you can hire an adjunct professor of marketing.
At Ivy League universities, professors have been talking for years about a "crisis in the humanities." Deresiewicz puts that conversation in its larger, more important context. The fight to get students to major in English instead of neuroscience at Harvard is really at the margins of a much larger higher-ed crisis. It's the rigorous teaching of the liberal arts, rather than the humanities in themselves, that needs defending. It's time, Deresiewicz argues, for some good old-fashioned outrage, both from faculty and from the public. Our colleges and universities are, after all, part of our education system as a whole. The same principles we insist upon at our elementary and high schools hold for colleges, too. That means rigorous curricula, small classes, and respected, experienced, and well-paid faculty:
There is a large, public debate right now about primary and secondary education. There is a smaller, less public debate about higher education. What I fail to understand is why they aren’t the same debate. We all know that students in elementary and high school learn best in small classrooms with the individualized attention of motivated teachers. It is the same in college.
Deresiewicz makes his case in the starkest possible terms, and leaves out all sorts of subtleties. In this case, that's a good thing. He synthesizes dozens of books on the crisis in higher education and gets to the heart of the matter. More students are going to college, but they're not getting smarter. Tuitions are going up, but so are class sizes. More students are going into debt, but serious departments are being closed around the country. Tuitions are being paid, but education is not happening. The question we need to be asking is, Where is all the money going? "There’s plenty of money," Deresiewicz writes -- but only "if we spend it on the right things."
Feel the full force of Deresiewicz's anger over at The Nation: "Faulty Towers: The Crisis in Higher Education."
Today we tend to see the world's societies in terms of "development." Yet only a half-century ago we might've thought of them differently -- in terms of "civilization." In the interim, that value-laden word has been largely set aside, and for good reason. It's a self-congratulatory, one-way word, which implicitly denigrates everyone else as uncivilized, even as barbaric or savage. Throughout history, conquering armies have marched under the banner of "civilization," unleashing the most appalling violence in its name.
That's not to say, of course, that "civilization" isn't a useful and important idea. In fact, it might be hard to be civilized without it. John Armstrong, a British philosopher, has gamely shouldered all of that rhetorical baggage to crusade in its defense. In his new book, In Search of Civilization: Remaking a Tarnished Idea, Armstrong argues that we need to reclaim the idea of civilization from history. Instead of burying it, we need to ask, "What should our idea of civilization be?"
Ingres' Portrait of Mme. Duvaucey: According to Armstrong, she has "the face of civilization itself... uniting inner poise and alert attention to the world."
Armstrong is famous for writing on philosophical subjects in an accessible, even personal way. At one moment, he's thinking through Plato, or Matthew Arnold; the next, he's telling a story about a teenage encounter with a Parisian prostitute. (Everything always circles back around to the central theme: moments after stumbling out of the "hot little attic room" to the sidewalk, the seventeen-year-old Armstrong finds himself outside a church, watching as a very civilized choir rehearses "a limpid, simple -- and astonishingly beautiful -- piece by Fauré.") Armstrong's loose style is occasionally goofy, but it has one great strength: it lets him think about civilization in everyday terms, with all its contradictions and compromises intact.
We have to think about civilization that way, Armstrong argues, because "civilization," even though it's in some sense an abstract idea, is also an everyday balancing act. When we use the word "civilization" in the best way, we're usually thinking of four related but contradictory things. Sometimes a civilization is "a collective scheme of values," which is what we usually mean when we talk about "Western civilization." At other times, though, civilization has to do with material progress ("we were miles from civilization"), or with the everyday arts of living ("he's very civilized"). Finally, it can refer to the exalted, sacred, and spiritual parts of life -- to religion, art, and other "life-giving ideas."
These aspects of civilization often work against one another -- but, unfortunately, you can't be civilized in only one dimension. A civilized society is materially advanced without being spiritually vacuous; it's rooted in shared values, but also honors the pursuit of individual, idiosyncratic pleasures. Being a civilized person is challenging, because it requires integrating these different aspects of civilization in your own, practical life. It's easy and natural, Armstrong argues, for individuals and societies to get out of balance. That's why we need to be able to talk about "civilization": it's the "sovereign conception" we can use to keep track of our successes and failures.
At just under two hundred pages, In Search of Civilization isn't going to rescue the word "civilization" from its recent history. It does offer, however, a persuasive and coherent vision of the way that civilization can be a useful idea in one's own life. Civilization, Armstrong writes, is "a project originating in the inner needs of individuals." It works its way out from there, expressing itself in "the way you live, where you live, what you do, how you act out friendships and cope with responsibilities and difficulties." History makes it hard to write about civilization, but we all do it anyway: "We are all writing" about civilization, Armstrong concludes, in "the book of life."
If the Cold War witnessed Space and Arms Races between the United States and the Soviet Union, our era is seeing a Growth Race between China and India. In The New York Review of Books, Amartya Sen asks the obvious question: In which country is life better?
The answer, of course, depends on what you value. India has built a democratic society, and its citizens enjoy tremendous civic freedoms; at the same time, the country is still struggling with tremendous economic inequality. In China, prosperity has been more widely shared, but political freedoms have been slow in coming. On the whole, though, life in China is better: life expectancy is longer, child mortality is lower, and the literacy rate is higher (94% in China, 74% in India). 97% of Chinese children have received immunizations against diphtheria, pertussis, and tetanus; in India, only 66% have received them. Democracy counts, of course, in a qualitative sense. But, Sen writes, "When we consider the impact of economic growth on people’s lives, comparisons favor China over India."
Moving forward, the two countries face very different challenges. In democratic India, the challenge is one of attention: Indians have to mobilize, and keep the political discussion focused on issues of inequality. In China, by contrast, the challenge has to do with accountability. Decisions are made from the top down, and "there is little recourse or remedy when the government leaders alter their goals or suppress their failures." In both cases, it's important to look beyond broad measurements like Gross National Product. Growth has been important. In the coming decades, though, politics might be even more so.
Over at political discussion website Democratic Underground, user NNN0LHI has descended into the depths of the internet, unearthing a gem of an article from the on-the-whole-incomprehensible website of Edward Jay Epstein ("To enhance its labyrinthical concept," the home page explains, "it contains no site map"). Epstein's piece on "The Lair of Bin Laden" is a real blast from the past. Check out, for instance, this amazing graphic from the Times of London, published in 2001 and depicting the Qaeda leader's presumed "mountain fortress":
The story [about an underground lair] probably reached its high point on NBC's Meet The Press on December 2nd when Tim Russert, the host of the program, provided Secretary of Defense Donald Rumsfeld with the artist's rendering of bin Laden's fortress. The interview proceeded:
Russert: The Times of London did a graphic, which I want to put on the screen for you and our viewers. This is it. This is a fortress. This is very much a complex, multi-tiered, bedrooms and offices on the top, as you can see, secret exits on the side and on the bottom, cut deep to avoid thermal detection so when our planes fly to try to determine if any human beings are in there, it's built so deeply down and embedded in the mountain and the rock it's hard to detect. And over here, valleys guarded, as you can see, by some Taliban soldiers. A ventilation system to allow people to breathe and to carry on. An arms and ammunition depot. And you can see here the exits leading into it and the entrances large enough to drive trucks and cars and even tanks. And its own hydroelectric power to help keep lights on, even computer systems and telephone systems. It's a very sophisticated operation.
Rumsfeld: Oh, you bet. This is serious business. And there's not one of those. There are many of those. And they have been used very effectively. And I might add, Afghanistan is not the only country that has gone underground. Any number of countries have gone underground. The tunneling equipment that exists today is very powerful. It's dual use. It's available across the globe. And people have recognized the advantages of using underground protection for themselves.
When Hamlet exclaimed, "What a piece of work is a man," he cited, among other things, the wonderful fact that people are "noble in reason." As a species, we're justifiably proud of our ability to reason our way toward the truth.
Evolutionary theorists, of course, aren't content to admire -- they want to know why people are such good reasoners. In their new paper, "Why Do Humans Reason? Arguments for an Argumentative Theory" (just published in the journal Behavioral and Brain Sciences), cognitive scientists Dan Sperber and Hugo Mercier propose a new account of the origins of reasoning. Reasoning, they argue, actually didn't evolve to help us find the truth; it evolved to help us make, win, and evaluate arguments.
Sir Joshua Reynolds, Self-Portrait.
Sperber and Mercier start from the fact that we do plenty of thinking without articulating explicit reasons. We make choices, decisions, and inferences all the time, in a Gladwell-esque blink, without assembling a chain of reasons to back up our decisions. So the question is: What does reasoning add to our less elaborate systems for thinking and deciding? Some psychologists have argued that it helps us correct our intuitive mistakes; others, that it helps us project our thinking into the future.
Sperber and Mercier have a different proposal: The truth, they write, is simply that reasoning "enables people to exchange arguments." "The emergence of reasoning," they argue, "is best understood within the framework of the evolution of human communication." We don't need reasons in order to think; we need them to explain our thoughts to other people, especially to people who have no particular reason to trust us. In fact, even when we reason quietly, in our own heads, we do so "anticipating a dialogic context."
This view of reasoning suggests all sorts of things, some good, some bad. On the downside, it seems that "reasoning pushes people not towards the best decisions but towards decisions that are easier to justify." This is especially true in situations where there's no obvious, intuitive answer: If you can't decide between two equally palatable options, you're likely to choose the option for which you can generate the most arguments. On the other hand, the social nature of the reasoning process can push back against its rhetorical tendencies. Especially in groups, we become super-vigilant not only about others' reasoning, but about our own. If we reason in order to communicate, the authors argue, then we will reason better when we are communicating.
This, of course, wouldn't be news to Socrates -- it's why, in classrooms and courtrooms, we use the Socratic method. The point, though, isn't practical, but historical. Why are human beings today so noble in reason? Because of eons' worth of social life. According to Sperber and Mercier, social life -- conversations, arguments, debates -- is the environment within which our reasoning skills have evolved.
If alien anthropologists were to visit our world, one of the weird things they'd promptly investigate would be the (apparent) connection between art and homosexuality. From Oscar Wilde to Robert Mapplethorpe, it seems that being artistic and being gay are, somehow, connected -- even if only in the collective imagination. As Christopher Reed, an English professor and art historian at Penn State, points out in his forthcoming Art and Homosexuality: A History of Ideas, "Calling someone 'arty' or 'artistic' has often been a euphemism for homosexuality." Reed's question is: why?
To understand the connection, Reed argues, you have to look at the histories of both art and homosexuality -- and you have to look very carefully, in a nuanced way. Reed starts with Ancient Greece and moves, painstakingly, into the present. What he shows is that "homosexuality" is a very modern term, which doesn't really apply to sexuality in past eras. The word "homosexuality" has a very definitive ring to it: it means that you belong firmly to one sex, and are attracted to people of that same sex. But societies throughout history have had more fluid ideas about sex and gender. In many societies -- like Navajo society, or the Tahiti that so intrigued Gauguin -- some people have lived as androgynous members of a third gender. Ancient Greek life was similarly fluid: The ancient Greeks, Reed writes, "sanctioned a role for homosexual behavior between males of different ages," while prohibiting most homosexual acts between adults. It's hard to figure out what to call that behavior; using the word "homosexual" just has the effect of obscuring its strangeness. (This was a problem for Victorian homosexuals, Reed points out, who tried to gain acceptance by using terms like "Greek love" when their lives were, in many ways, very un-Greek.)
The truth, it seems, is that people in the past weren't quite as invested in identity as we are today. The idea that homosexuality was an identity -- a whole way of being that saturated every aspect of your character, from your dress to your friends to your taste in books and music -- arose only in the mid-nineteenth century. In part, that rising sense of homosexual identity was driven by homophobic doctors, who worked to medicalize homosexual behavior. But that development was, in turn, part of a larger trend towards thinking of people in terms of clearly defined "types." In fact, it was during the same period that the type of "the artist" emerged. It used to be that a painter was just a painter, but now he was an Artist -- a person whose aesthetic sensibility shaped every part of his identity.
Art and homosexuality feel connected to us, Reed concludes, because artists and homosexuals asserted their identities around the same time, and in ways that emphasized their independence from an increasingly airtight society. (Think of Oscar Wilde, who declared independence from every kind of convention simultaneously.) There's nothing intrinsic about art that connects it with homosexuality, or vice versa. The connection, really, comes from the fact that "the artist" and "the homosexual" were two of the most challenging identities to emerge during the golden age of identity. That story is particularly, and peculiarly, modern. As for the future -- Reed doubts that the connection has staying power. History, he writes, shows that "the definitions of art and homosexuality are multiple and constantly evolving." And identities that once seemed vivid and threatening come, inevitably, to seem conventional and familiar. The special magnetism that brought these two identities together will only lessen with time, until it seems inexplicable -- something for art historians of the future to puzzle over.
As you may have heard, Osama bin Laden was killed on Sunday night in a daring commando raid on his compound in Abbottabad, a city about thirty miles from Islamabad, in Pakistan. Abbottabad, as Christopher Hitchens has pointed out, is named after Major General James Abbott, a British military officer who ruled over it as a player in the "Great Game" in the 1850s -- in fact, the entire surrounding Abbottabad District is named after him, too.
James Abbott was rocking that look way before Osama.
Upon his departure from Abbottabad, Abbott wrote a short poem about it, which is inscribed on a plaque in the city's Lady Garden Square. Here's the poem, "Abbottabad":
I remember the day when I first came here
And smelt the sweet Abbottabad air
The trees and ground covered with snow
Gave us indeed a brilliant show
To me the place seemed like a dream
And far ran a lonesome stream
The wind hissed as if welcoming us
The pine swayed creating a lot of fuss
And the tiny cuckoo sang it away
A song very melodious and gay
I adored the place from the first sight
And was happy that my coming here was right
And eight good years here passed very soon
And we leave you perhaps on a sunny noon
Oh, Abbottabad, we are leaving you now
To your natural beauty do I bow
Perhaps your wind's sound will never reach my ear
My gift for you is a few sad tears
I bid you farewell with a heavy heart
Never from my mind will your memories thwart
You can see the plaque itself in this Major-Abbott-themed photo tour of Abbottabad. It looks like a pretty nice place to hide from the largest manhunt in the history of the world.
Are all votes created equal? Not according to Jason Brennan, a political philosopher at Brown. In The Ethics of Voting, he asks the obvious-yet-unutterable question at the heart of American politics: what are all those uninformed, indifferent, lazy, and stupid people doing in the voting booth?
Voting, Brennan affirms, is a fundamental American right -- but that doesn't mean that voting is, in itself, a good deed. Like any complicated skill, voting can be done well or badly. To vote well, Brennan argues, you actually need to be thinking at a very high level. It's not enough to know which policies different candidates support. You also need to have "epistemically justified" opinions about those policies -- which, in many cases, means drawing on "social-scientific background knowledge." That knowledge is hard to acquire, which is why reasonable people can disagree about their votes while also voting well; the point is that they've done their due diligence and taken voting seriously.
Many voters, as Brennan sees it, have little interest in doing the hard work of voting. They vote instinctively, irrationally, or for narrowly imagined, purely self-interested reasons. These voters, Brennan says, are actually doing something ethically wrong when they vote this way. It's obvious, of course, that some voters vote badly: in the last Presidential election, eight percent of New Jersey voters claimed that Barack Obama was the anti-Christ. Clearly -- unlike more-informed Republicans who voted against Obama -- they went about voting in the wrong way. It's not just crazy people who vote badly, though. "Many politically active citizens," Brennan points out, "try to make the world better and vote with the best of intentions" -- but, "although they are politically engaged, they are nonetheless often ignorant of or misinformed about the relevant facts, or, worse, are simply irrational." These voters, Brennan writes, also "pollute democracy with their votes." They would be doing more good if they didn't vote at all.
Even informed people, in short, can misunderstand what voting is all about. It's not about fulfilling a duty, but about taking on an extra, entirely optional responsibility. Voting, in this sense, is like many other undertakings. "We are not obligated to become parents," Brennan points out, "but if we are to become parents, we ought to be responsible, good parents." The same goes for being a surgeon: if you're going to be one, you have to be a good one. You're not obligated to vote -- and so, if you do choose to do it, you must meet a very high standard of conduct to avoid an ethical misstep.
In a sense, Brennan is stating the obvious -- and, except for a few asides, he resolutely avoids the kind of practical discussions that would render the obvious unsayable. His relentless focus on the problem of "wrongful voting" pays off. Taken as a whole, Brennan's argument lodges a serious objection to research in political science and behavioral economics suggesting that even lazy voters can use shortcuts to vote well. These voters, Brennan insists, are voting badly. Ultimately his book suggests that we need to be more nuanced in our approach to voting. As a nation, we're always checking in on voter turnout -- but shouldn't we also be monitoring, and taking seriously, the quality of our votes?
Finally, Brennan notes, it's important to remember that voting isn't the only way to make a contribution to civil society. "[M]any activities stereotypically considered private," he writes, "such as being a conscientious employee, making art, running a for-profit business, or pursuing scientific discoveries, can also be exercises of civic virtue. For many people, in fact, these are better ways to exercise civic virtue" than voting. Voting is the last step in a long process of civic engagement -- not the first.
The French version of Star Wars is really different from the American version -- who knew! (Pro tip: if you can speak French, try to pretend that you can't.)
[Via Marginal Revolution.]
Julia Moos at The Poynter Institute selects some of the most interesting front pages responding to the end of the hunt for Osama bin Laden. "Many of the front pages," she notes, "use the same photo of bin Laden, but treat it differently with the use of color, bold headlines, type size and placement."
Stanley Kubrick spent four years of his life, from 1967 to 1971, planning an epic film about Napoleon. He worked with an Oxford history professor and dozens of assistants to compile one of the world's largest Napoleonic archives -- including 17,000 photographs and drawings of Napoleon's world -- and sent teams out into the field to take 15,000 location photographs. He designed costumes, contacted actors, and reached out to the armies of Romania and Lithuania, planning to hire 30,000 troops to serve as extras during the battle scenes. In a note to himself, Kubrick wrote: "I expect to make the greatest film ever made."
It was never meant to be; the film was too expensive for the cash-starved studios of the late 1960s. In 2009, the art-book publisher Taschen released Stanley Kubrick's Napoleon: The Greatest Movie Never Made, a $1,500 collection of ten books about the film, nestled inside one hollowed-out "Napoleonic history" book. This year, they've released a cheaper "facsimile edition" -- it's only $45 at Amazon. It tells the same story of ambition, obsession, and ultimate defeat.
A costume made by Kubrick's designers for the Napoleon film.
The book, in a deliberate echo of the film, is rough around the edges. Rather than providing a seamless, synthesized account of Kubrick's vision, the editor, Alison Castle, has focused on the raw materials: the photographs, clippings, letters, and notes that Kubrick kept in binders and a huge, library-style card catalog. There are interviews with Kubrick, and a complete draft of the screenplay, with many marked-up pages from earlier drafts. Here and there you'll find introductory essays by Kubrick experts, or a historian's response to Kubrick's screenplay -- but the emphasis is on the small gestures, as in the collection of underlined passages and marginal notes that Castle compiles from Kubrick's personal library of books about the emperor. A special 'key card' included with the book gives you access to a huge online library of images.
If you're a Kubrick maniac, this is an amazing deal for $45. Even if you're not, though, it's a unique record of what the creative process looks like when it's arrested in full flight. It gives you an unusual opportunity to measure your own obsessive nature against the towering, extremely productive obsession of a genius. As you flip through the images and read Kubrick's notes, you realize the grand scale of his imaginative life. These 1,000 pages are only the froth on the wave of Kubrick's vision. The archive tells you a lot about Kubrick's unfinished film, but it also tells you a lot about how much a single person can accomplish using his own imagination. We never saw Napoleon -- but Kubrick certainly did.