(Jude Buffum for the Boston Globe)

Stopping Google

With one company now the world's chief gateway to information, some critics are hatching ways to fight its influence

By Drake Bennett
June 22, 2008

GOOGLE MAY BE widely admired for its technical wizardry and its quick, accurate search engine, but one of the company's most impressive accomplishments has been its ability to grow as powerful as it is while still remaining, in the minds of most Americans, fundamentally likable.

The company today is a behemoth, with more than 15,000 employees and a market value as big as Coca-Cola and Boeing combined. Its search engine is the tool of first resort for expert researchers and schoolkids alike; for suspicious employers, first-daters, long-lost friends, blackmailers, reporters, and police investigators - in short, for seekers of any and all sorts of information. In April, the most recent month for which it compiled statistics, Nielsen Online found that 62 percent of all US Internet searches were done using Google. Yahoo, the next largest player, had only 17.5 percent of the market.

Despite its size and dominance, Google has avoided the public suspicion and vilification that have plagued powerful companies from Standard Oil to Microsoft. Instead, protected by its reputation for innovation, its famed "Don't Be Evil" mantra, and the ever-improving precision of its search engine, Google has remained for the most part a trusted, even a beloved, brand.

But as Google's influence grows, a number of scholars and programmers have begun to argue that the company is acquiring too much power over our lives - invading our privacy, shaping our preferences, and controlling how we learn about and understand the world around us. To counter its pervasive effects, they are developing strategies to push back against Google, dilute its growing dominance of the information sphere, and make it more publicly accountable. The solutions range from programs one can install on one's computer to proposed laws forcing Google to reveal parts of its proprietary search algorithm. A few experts and privacy activists are pushing for public funding for alternative search technologies, and one legal scholar wants to give individuals and companies a "right of reply" when searches bring up sites that slander them or appropriate their intellectual property.

"Google knows more and more about us, but right now there's almost nothing we can do to find out exactly what it does with that information," says Frank Pasquale, an associate professor of law at Seton Hall and one of the leading proponents of reining in Google. "We want to make powerful entities on the Internet accountable."

Some of the suggestions for fighting back are more practical than others, but taken together they represent an argument that "searching" is no longer a neutral tool, but has become a social force in itself - Google's hidden algorithms have the power to make or break reputations and fortunes, to shape public debates, and to change our view of the world.

The challenge is how to rein in that power without undermining an online application that, even its critics concede, is one of the greatest learning and labor-saving devices of our time.


The most commonly voiced fear about Google is its unique capacity to track what we're thinking based on what we're looking for. Like many websites, Google leaves identifying "cookies" on users' computers - but unlike, say, a shopping site, what Google can track is every name, place, and topic we search. The company can learn even more about people who use Gmail, the social networking site Orkut, or another of Google's popular personalized services.

"What worries me about Google is that they have access to an incredibly sensitive range of personal data, the depth and breadth of which is unlike anything we've ever seen before," says Kevin Bankston, a lawyer with the Electronic Frontier Foundation, an advocacy group. "A log of your search history is as close to a printout of your brain as we've ever had."

Concern about Web search records has already led to pressure from regulators in Europe, where privacy protections are generally stronger than in the United States. As a result, Google agreed last year to limit the amount of time it keeps personalized user information to 18 months and to cut the life span of its cookies from 30 years to two. Other major search engines have made similar concessions. This spring a major EU Internet privacy working group advocated reducing the personal data expiration period further, to six months, a recommendation Google has declined to follow.

For privacy advocates, however, the problem isn't simply how long information is kept but what it's used for, and several worry about how Google uses the personal information it collects.

Google's privacy policy, which is available on its website, promises that the company will ask for permission from users before using personal information for any purpose other than that for which it was collected - which, in most cases, is to improve the tailoring of search results, advertising, and the company's other personalized applications.

According to Mike Yang, a senior product counsel at Google, that privacy policy is legally binding, and any change to it would have to be announced beforehand. The company, he argues, would be loath to make changes that might offend users.

"Maintaining user trust is very important to us. If we lose our users' trust, we would lose those users very, very quickly," he said in a telephone interview.

But some experts worry that this promise provides only limited protection. They worry that even if Google has no plans to use the personal information it keeps, the government might compel it to turn over search information, as it tried to do in 2005 as part of an investigation into online pornography - though in that case Google, unlike the other major Internet companies subpoenaed by the Justice Department, fought the request in federal court and eventually won.

Privacy advocates worry, too, that Google might go ahead and amend its privacy policy. They point to Amazon, which in 2000 changed its policy from one that prohibited the selling or renting of customers' personal information to one that classified customer information as an asset that could be bought or sold in the event of a company takeover.

"What I want in the privacy policy is something that says we will use your information for x, y, and z and we will not use it for anything else, and we will never change this policy," says Helen Nissenbaum, a professor in the department of media, culture, and communication at New York University.

In the meantime, Nissenbaum and others are working on tools that help individual users protect their privacy while using Google. Nissenbaum, with Daniel Howe, a computer science graduate student at NYU, designed TrackMeNot, a program that runs with the Firefox Web browser. When the user does a Web search, the program also sends out randomly generated dummy queries, so that someone looking at a user's search records would be unable to tell which was the real search query. "It's like white noise," says Nissenbaum.
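The obfuscation idea behind TrackMeNot can be sketched in a few lines. This is a simplified illustration, not the extension's actual code: the decoy vocabulary, function names, and mixing logic here are assumptions made for the example.

```python
import random

# A tiny sketch of search-query obfuscation in the spirit of
# TrackMeNot. The decoy terms below are illustrative only.
DECOY_TERMS = [
    "weather forecast", "pasta recipes", "baseball scores",
    "movie times", "history of rome", "garden tips",
    "stock quotes", "jazz standards",
]

def obfuscated_queries(real_query, n_decoys=5):
    """Mix one real query into a batch of randomly chosen decoys,
    so someone reading the search log cannot tell which was real."""
    decoys = random.sample(DECOY_TERMS, n_decoys)
    batch = decoys + [real_query]
    random.shuffle(batch)
    return batch

batch = obfuscated_queries("symptoms of insomnia")
# The real query is hidden somewhere among the decoys - "white
# noise," as Nissenbaum puts it.
```

The real extension goes further, spacing dummy queries out over time and drawing them from evolving word lists so they resemble genuine searches; this sketch only shows the core idea of burying one signal in noise.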

To a similar end, the online privacy activist and longtime Google critic Daniel Brandt set up an online service called Scroogle, a website that allows users to submit Google searches without leaving footprints with the company. Scroogle fields queries and then relays them, using its own servers, to Google, thereby screening users' IP addresses and intercepting any cookies. According to Brandt, his site now processes about 140,000 searches a day.
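The relay idea behind a service like Scroogle can be sketched as a sanitizing step: the proxy forwards only the query to the search engine, discarding the user's IP address and any identifying headers. The function and field names below are hypothetical, chosen for illustration.

```python
# A minimal sketch of an anonymizing search relay. The query
# survives; the user's identity does not. The relay hostname and
# the set of stripped headers are illustrative assumptions.
STRIPPED_HEADERS = {"cookie", "x-forwarded-for", "referer"}

def sanitize_request(client_ip, headers, query):
    """Build the outbound request the relay would send onward,
    with identifying headers removed and the client IP replaced
    by the relay's own address."""
    forwarded = {
        name: value for name, value in headers.items()
        if name.lower() not in STRIPPED_HEADERS
    }
    return {
        "source_ip": "relay.example.net",  # search engine sees the relay
        "headers": forwarded,
        "query": query,
    }

out = sanitize_request(
    "203.0.113.7",
    {"Cookie": "PREF=id=abc123", "User-Agent": "Firefox/3.0"},
    "boston globe ideas",
)
```

In this scheme the search engine can still log the query itself, but it can no longer tie queries to a particular user's address or cookie, which is the protection Brandt's service aims to provide.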

Alongside these privacy concerns, which have grown hand-in-hand with the Web itself, a new worry is arising: What does it mean when a single company becomes our main doorway to the entire content of the Web? Internet search is now by far the most important public tool for finding information, and Google controls the largest share of the search market. As a result, the first few results that come up in a Google search carry outsized importance: People are much more likely to click on the first or second result than the 11th, and unlikely even to glance at the 34th. So the seemingly simple question of how Google decides to rank its findings has assumed immense importance, effectively deciding which sites get visited and which are passed over, what information gets read and what goes unnoticed.

As Greg Lastowka, an associate professor of law at Rutgers, wrote in a paper published last fall, Google "tells us what words mean, what things look like, where to buy things, and who or what is most important to us. Google's control over 'results' constitutes an awesome ability to set the course of human knowledge."

Seen this way, the concern is not with Google's access to our personal information, but with Google's power to order all information. Critics worry about the implications of a single company shaping public opinion, especially since - unlike the phone book's alphabetical order, or the library's Dewey Decimal system - there is little transparency in how Google orders the world for us. In the long run, scholars like Lastowka and Frank Pasquale argue, search engine algorithms could end up privileging sites full of erroneous, slanderous, or heavily biased information, marginalizing opposing viewpoints. Search engine companies could manipulate rankings to maximize advertising revenue, targeting particular sites for favor or disfavor. Pasquale worries that, as Google makes deals with everyone from the Associated Press to Warner Music for content, the company has an extra incentive to favor those partners over their competitors.

There is no evidence that Google systematically distorts its results. According to a Google spokesman, "It's in our best interest to act responsibly and be as transparent as possible." The problem, critics argue, is that the workings of Google's search algorithm are a closely guarded secret, so we have to take the company at its word.

In the United States, two court cases have dealt with this issue: lawsuits brought against Google by online companies whose rankings, and with them their earnings, suddenly and precipitously dropped, and which accused Google of having intentionally targeted them. (One was a company that offered strategies to improve Google rankings, a practice Google has publicly condemned.) In both cases, the courts ruled for Google, holding that whether or not it had manipulated its rankings, those rankings were "evaluative opinions" and therefore protected by the First Amendment.

One response, in light of the legal protection that Google enjoys, is to craft new laws around the use of search engines themselves. In Finland, for example, it is now illegal for companies to do Web searches on prospective hires, in much the same way it is illegal in America to use an employee's age or sexual preference in a hiring decision.

Another is an idea put forward by Pasquale of Seton Hall. In a few recent papers, he has proposed what he calls a "right of reply" to search results. If, for example, the top results to a query about an individual are slanderous or otherwise damaging to his reputation, that person, Pasquale argues, should have the right to put an asterisk by the findings that links to a rebuttal.

Pasquale and others have also argued that it may be time to rethink the legal protection Google's rankings now enjoy. The company's secret page-ranking algorithm is at the heart of Google's success: It was the founding technology of the company, and has been modified over the years to produce more useful results and foil companies that try to manipulate it. But critics now suggest that Google's technology is too influential to remain one company's black box.

Google and its defenders argue that making the search algorithm public would be a disaster, not only for the company, which would lose much of its competitive advantage, but for Web searching itself, since everyone who wanted to game the rankings would have a road map for how to do it. In response, Oren Bracha, an assistant professor of law at the University of Texas, suggests that cases of potential search engine bias could be treated the way terrorism trials with classified information now are: in a sealed proceeding that prevented evidence from leaking out into the wider world.

Still, to other Google watchers, such measures would ultimately end up backfiring. A "right of reply" would be difficult to put into practice, and could end up being used by companies to ensure that their links show up on all Web searches that highlighted their competitors. And even privacy protections, points out John Palfrey, executive director of Harvard's Berkman Center for Internet & Society, can have their costs, making search engines themselves less efficient and making it harder to gather information about criminals and terrorists.

Even Google critics admit that the current set of responses is in many ways imperfect. But they are the start, they argue, of a broader discussion about how we fit Internet search into our current notions about freedom of speech, fairness, and access. In various ways, search engines fill the role of the newspaper, the phone book, the encyclopedia, and the public library, but they are different from each, and we're still figuring out how - and whether - to regulate them.

To Pasquale and others, search engines, like the railroads and the telephone, are technologies that, because of their great importance, demand a level of public control and accountability - and none more so than Google. Pasquale has gone so far as to advocate a Federal Search Commission along the lines of today's Federal Communications Commission.

In a sense, Google is now grappling with the consequences of its runaway success. It has been so good at making so much information so readily available that its own search function has come to seem less like a private service and more like a right. In theory, of course, it is easy for a Google user to defect to another search engine. But there is a reason "Google" has become a verb: Google has so outpaced its rivals that it has begun to look like a monopoly, a necessity where users have only one real option. And the more we come to rely on Google, the more Google may have to listen to the rest of us.

Drake Bennett is the staff writer for Ideas.

Correction: Because of a reporting error, a story in the Ideas section on June 22 about critics of Google incorrectly described the country's sexual-orientation discrimination laws. In most states it is not illegal for private-sector employers to use sexual orientation as the basis for employment decisions.
