The Boston Globe


Once derided as servants of British tabloids and dog-food marketers, Internet pollsters say the 2004 election may belong to them.

There must have been something wrong with the poll. Last September, in the first stages of California's recall election, most established pollsters were predicting a close vote on whether to oust Governor Gray Davis. But Knowledge Networks, a startup in Silicon Valley that conducts polls mostly for academic and business clients, was getting numbers that suggested that the race wasn't even close: Voters were going to resoundingly dump Davis and pick Arnold Schwarzenegger to replace him.

The firm, one of a small but growing number of online pollsters, released its numbers to a ferocious response from Davis backers. "The only polling I've seen by the Internet," said a California Democratic party spokesman on Fox News, "is dog food companies and tabloids in Britain."

But the results in California -- where the recall passed by 11 points and Schwarzenegger was elected by 17 points -- vindicated Knowledge Networks' methods, and raised a fresh round of questions about the accuracy of traditional telephone polls in an era of rising cell phone use and growing distrust of pollsters. None of the telephone pollsters were predicting a Schwarzenegger victory that early in the campaign. Mike Dennis, Knowledge Networks' vice president for government polling, thinks he knows why. Sheltered by the anonymity of cyberspace, he says, "people were more willing to admit they were voting for an actor."

Most political pollsters regard online polling as an inherently unreliable way to measure public opinion. For one thing, they say, only between two-thirds and three-quarters of Americans have Internet access. Internet polling "starts out ignoring one of the fundamentals of scientific survey research, which is that everybody in the population under study needs to have a chance to fall under the sample," says Nancy Belden, president of the National Association for Public Opinion Research. Says Frank Newport, editor-in-chief of Gallup, "We at Gallup do not believe you can generalize to the general population using Internet samplings."

But results, say the believers, speak for themselves. Three years before the California poll, a Harris online poll outperformed most of its telephone rivals in predicting almost exactly the outcome of the 2000 presidential election. And in Britain, online polling outfit YouGov has in four years gone from startup to one of the country's most prominent polling organizations. (The firm's first US poll, which began running in The Economist in July, currently shows George W. Bush and John Kerry in a dead heat.)

As Bush awaits the news on his post-convention bounce, 2004 is shaping up to be a pivotal year for the online polling industry. In the United States several major publications, including the Wall Street Journal, are experimenting with online polls. If Internet-based pollsters match their earlier success, or if beleaguered telephone pollsters misjudge the closely fought presidential race, some say, this year could be the beginning of the end for traditional polling.

Opinion polls in a democracy influence press coverage, how the public perceives candidates, and even, to a certain extent, how politicians decide to govern. In theory, polls are a great way for the public to communicate with politicians. But in practice, figuring out what the public is really thinking has never been easy.

American political polling has come a long way since its origins in the early 19th century, when newspapers would collect ballots from willing participants and simply add up the results. These straw polls lasted until 1936, when Literary Digest famously predicted that Republican Alf Landon would defeat Franklin Roosevelt. The problem was that although Literary Digest surveyed literally millions of Americans, it sent ballots mostly to those who subscribed to magazines or had telephone service -- people less likely to vote Democratic. After doing a good job in previous elections, Literary Digest had been blindsided by social change that led to a Republican bias in its sample.

The 1936 debacle set the stage for the emergence of scientific polling, pioneered by Gallup. Founded in 1935, Gallup had the audacity to predict how millions of Americans would vote in the 1936 election based on a survey of only a few thousand. The theory held that if the company ensured every American voter from all walks of life had an equal chance to be interviewed, the poll would be more accurate. It was -- unlike Literary Digest, Gallup correctly forecast FDR's reelection.

Today, most polls work by selecting a random sample of telephone numbers, dialing them repeatedly in an attempt to reach every name on the list, and then weighting the results to make sure no part of the population -- blacks, conservatives, women -- is overrepresented. In theory, everyone in America with a telephone has an equal chance to be called.
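The weighting step described above can be sketched in a few lines. This is an illustrative example of standard post-stratification weighting, not any particular firm's method, and all the numbers are hypothetical: a group's weight is its share of the population divided by its share of the sample, so overrepresented groups count for less.

```python
# Sketch of post-stratification weighting. All figures are hypothetical.

# Population shares for one demographic variable (e.g., from census data)
population_share = {"women": 0.52, "men": 0.48}

# Shares actually reached in a hypothetical phone sample
sample_share = {"women": 0.60, "men": 0.40}

# Each respondent's weight = population share / sample share for their group
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical support for a candidate within each group
support = {"women": 0.55, "men": 0.45}

# Unweighted estimate reflects the lopsided sample; the weighted
# estimate reflects the population mix instead.
raw_estimate = sum(sample_share[g] * support[g] for g in support)
weighted_estimate = sum(sample_share[g] * weights[g] * support[g] for g in support)

print(round(raw_estimate, 3))       # 0.51  (women overrepresented)
print(round(weighted_estimate, 3))  # 0.502 (matches population shares)
```

In practice pollsters weight on several variables at once (race, age, education, region), but the principle is the same: multiplying each group's sample share by its weight recovers its population share.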

But the theory doesn't always work. Internet political polling got its first foothold in Britain, where the polling industry had suffered a string of debacles, most famously in 1992, when established pollsters like MORI and Gallup predicted a tie or a narrow Labour Party victory, only to be proven disastrously wrong a few days later when Conservative prime minister John Major was reelected by a comfortable eight percentage points.

In 2000, the fledgling firm YouGov started from scratch using the Internet. Founded in London by former Conservative Party operative Stephan Shakespeare, YouGov recruited a panel of British Internet users to answer poll questions, offering a small fee to those who filled out questionnaires on the firm's secure website. For each poll, the firm selected from among its volunteers what it deemed a sample that represented Britain. Encouraged by the success of Harris's 2000 presidential poll, Shakespeare thought he could deliver a more accurate prediction in the 2001 British general election -- a claim that was vindicated when the firm called Tony Blair's 10-point victory within one point.

YouGov's Internet polling technique challenged one of the fundamentals of modern polling: random sampling. At first glance, the YouGov methodology, which essentially allows people to opt into the poll, "would seem to be a regression, a return to the straw polls of the pre-1930s," wrote Morris Fiorina and Jon Krosnick, Stanford political scientists who have worked with YouGov.

But that thinking, according to Fiorina, who says he became a believer in Internet polling after the California recall election, rests on a fallacy. Just because calling random numbers is one valid way to come up with a representative group doesn't mean it's the only way.

This idea of opt-in panels remains very controversial, even among other Internet-based pollsters. Knowledge Networks, the California online survey company, takes a more traditional approach to assembling a good sample. Instead of sending out scatter-shot e-mail solicitations, the company calls random numbers and asks respondents if they would be willing to be on the company's Internet panel. If respondents don't already have Internet access, the company buys it for them.

Dennis maintains this method combines the advantages of Internet polling with the statistical rigor of random-digit dialing. He is contemptuous of self-selecting panels. "The opt-in panel is just one or two massive blunders away from causing a lot of harm to the entire industry. . . ," says Dennis. "It could reduce public confidence in survey taking."

In theory, Internet polls like YouGov's offer big advantages over telephone interviews. They are cheaper and much faster. And, as Knowledge Networks discovered in California, respondents may be more likely to be truthful. YouGov's Shakespeare likes to cite a 2002 study by Novatris, a French market-research firm, in which people were asked whether they recognized different brands of bottled water. One of the brands was fictional. Still, 22 percent of the participants in the interviews claimed to have heard of it, compared to only 2 percent of the respondents in the self-completed sample. "In a face-to-face interaction," Shakespeare says, "people are likely to give the view the interviewer expects or wants."

In political polls, that tendency translates into a reluctance to confess to unfashionable views to a stranger over the phone. For instance, black candidates routinely poll higher than they perform on election day. Adam Berinsky, a political scientist at MIT, documents the phenomenon in his recent book "Silent Voices: Public Opinion and Political Participation in America" (Princeton), which focused on the 1989 New York mayoral election. Polls showed David Dinkins defeating Rudolph Giuliani handily, but on election day the race was far closer than predicted. All the "don't knows," Berinsky says, broke for Giuliani. "It's older Jewish Democrats, and people who didn't like talking about themselves. These are the kinds of people who are less likely to give a response, but were more likely to vote for Giuliani," he says.

The online approach also allows for vastly more in-depth questioning. "If I send out an invitation to 100,000 e-mails to go to a secure website to go to a survey where you can only vote once . . . within a limited time frame, I'll get 30,000 responses," says pollster John Zogby, whose firm is running an Internet-based presidential tracking poll for the Wall Street Journal this year. "With 30,000 responses, I can do really detailed analysis." The result is, potentially, a far more nuanced picture of public opinion than ever before.
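The statistical payoff of a large sample can be illustrated with the textbook margin-of-error formula for a simple random sample (a general result from sampling theory, not a description of Zogby's methodology; opt-in panels don't strictly obey it, which is part of the controversy):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95 percent margin of error, in proportion terms, for a simple
    random sample of size n when the true proportion is p."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical 1,000-person phone poll vs. a 30,000-response online poll
print(round(100 * margin_of_error(1000), 1))   # ~3.1 percentage points
print(round(100 * margin_of_error(30000), 2))  # ~0.57 percentage points
```

Because error shrinks with the square root of sample size, 30,000 responses also leave subgroups -- say, union households in one state -- large enough to analyze on their own, which a 1,000-person poll cannot do.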

. . .

But perhaps Internet polling's greatest selling point is the growing uncertainty about telephone polls. The big problem, say researchers, is that response rates have plummeted. This spring, the Pew Center for the People and the Press in Washington found that only 27 percent of the people it called actually participated in one of its surveys, down from 36 percent in a similar poll in 1997.

Part of the problem is technology -- more people are using cell phones and caller ID to screen calls. But the real culprit seems to be a change in attitudes toward pollsters. "Surprisingly, the results we have here suggest it has more to do with cooperation than contact," says Mike Dimock, Pew's research director. "We were still reaching people at roughly the same rate as 1997, but we really had more trouble with people hanging up on us."

So far, there's no evidence that people who hang up vote any differently than those who don't. But telephone pollsters live in mortal fear that a correlation between voting intention and willingness to participate in surveys may emerge, leading to another system failure. Something is causing more and more Americans to hang up on pollsters. "If that something is correlated to political variables," says MIT's Berinsky, "we're in trouble."

For the moment, traditional pollsters say they aren't losing sleep. "In this period, I have no worries," says Belden of the National Association for Public Opinion Research. "Which is not to say that we're not all concerned with the new concerns [about telephone polling] that constantly arrive. So far, we're still able to measure attitudes and behaviors quite accurately."

For an industry where the fear of getting a big election wrong runs deep, Fiorina says, the objective should be to find the least bad way of measuring public opinion in a diverse country of 300 million people. "Nobody's arguing that Internet polling is this perfect solution," he says. "I'd much rather have face-to-face polls with 80 percent response rates, but you just don't get that any more."

Both Internet and telephone pollsters hint darkly that the other side will fumble an election and harm the whole industry. But, as Literary Digest learned in 1936, the problem might not become apparent until it's too late. Says Berinsky: "We won't know that something is wrong until it goes wrong."

Alan Wirzbicki is a writer living in Washington.
