Google has reacted to how racist searches can bring up the White House and other locations by promising to extend its “Googlebomb” protection to Google Maps.
Google has apologized for the situation and promised a fix is on the way. From its blog post today:
This week, we had some problems with Google Maps, which was displaying results for certain offensive search queries. Like many of you, we were deeply upset by this issue, and we are fixing it now. We apologize this has taken some time to resolve, and want to share more about what we are doing to correct the problem.
The Racist Listings
The problem attracted attention this week after it was discovered that searching for “n–ga house” would bring up the White House on Google Maps:
A similar problem hit Howard University, a historic and predominantly Black college. However, the problem wasn’t just restricted to prominent places in the Washington DC area nor slurs against Blacks.
Racist Listings In Google Maps That Will Shock You & Why They May Be Happening is our story from yesterday showing that the problem involved slurs against other races, affected places like small record stores, and also happened for profanity like “bulls**t.”
Crowdsourcing The Web Goes Bad
As we explained yesterday, our assumption was that these strange and offensive results were happening because Google was making use of content from across the web in an attempt to better understand what terms places were relevant for. This is something it began to do as part of what was called its Pigeon Update last year.
Google’s post today has confirmed this, saying:
At Google, we work hard to bring people the information they are looking for, including information about the physical world through Google Maps. Our ranking systems are designed to return results that match a person’s query. For Maps, this means using content about businesses and other public places from across the web. But this week, we heard about a failure in our system—loud and clear. Certain offensive search terms were triggering unexpected maps results, typically because people had used the offensive term in online discussions of the place. This surfaced inappropriate results that users likely weren’t looking for.
To understand more, say Google knows about a local sporting goods store. The owner of that store might explain, in the description they provide to Google Maps, that it sells baseball, football and hockey equipment. The store may also sell other sporting equipment, but if those items aren’t listed in its description or on its associated web site, the store might not be deemed relevant for them.
With the Pigeon Update, Google sought to correct this. Imagine that some customer of the store wrote a blog post saying it was a great place to get skiing equipment. Google, seeing the business named in that post, might effectively add this information to the business listing, making it relevant for skiing equipment. To our understanding, there doesn’t even have to be a link to the business site or its listing in Google Maps. Using the business name alone might be enough to create the connection.
That’s a simplified explanation, of course. But it helps explain how places then ended up showing up for racist terms. If people mention places alongside racial slurs or derogatory language, Google’s Pigeon technology, despite its good intentions, makes those places relevant for those terms. It’s also a problem that has probably been happening for weeks or months but was only noticed now.
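To make the mechanism above concrete, here is a deliberately naive sketch of how mentions on the web might broaden the terms a business listing is relevant for. Everything here is invented for illustration (the function, the store name, the data); it is not Google’s actual system, only a toy model of the behavior described.

```python
# Toy model: any word appearing in web content that mentions a business
# by name gets added to that business's set of relevant terms. This is
# a hypothetical sketch, not Google's real pipeline.

def extend_listing_terms(listing_terms, web_mentions, business_name):
    """Add words that appear in web texts mentioning the business."""
    extended = set(listing_terms)
    for text in web_mentions:
        if business_name.lower() in text.lower():
            # Naively treat every word in the mention as a
            # relevance signal for the listing.
            extended.update(text.lower().split())
    return extended

listing = {"baseball", "football", "hockey"}
mentions = ["Acme Sports is a great place to get skiing equipment"]
terms = extend_listing_terms(listing, mentions, "Acme Sports")
print("skiing" in terms)  # True: the listing is now relevant for "skiing"
```

The same naive expansion is what goes wrong with slurs: if a derogatory word appears near a place’s name in online discussions, this kind of model would add it to the listing’s terms just as readily as “skiing.”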
The Googlebomb Fix
It’s important to understand that this has happened not because of some overt hacking attempt, as with the Android peeing on the Apple logo in an area of Google Maps last month. Rather, this seems almost certainly to be an unexpected side effect of using the entire web to determine what places are relevant for, without any attempt to filter out sensitive terms.
That leads to the whole Googlebomb fix (or Google Bomb, if you prefer). Googlebombing refers to the practice of linking to a page using embarrassing words as the link text, in order to make the page rank for those words. Google, in general, considers links to be “votes” in favor of the pages that get them. The words in links act as votes that a page should be relevant for those terms.
This is why, for a period of time, a search for “miserable failure” caused the official page for former US president George W. Bush to rank at the top of Google for that phrase. There was a campaign to rank the page that way, with a call for people to link to it using those words. It worked.
In January 2007, Google finally put a Googlebomb fix in place. In short, the fix checks whether the words in a link pointing at a page actually appear on the page itself. If not, the page is far less likely to rank for those words. Since the Bush page didn’t have the words “miserable failure” on it, it no longer ranked for that phrase. When the page later made use of the word “failure,” it briefly ranked for that word until the word was removed.
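The core check described above can be sketched in a few lines. This is a simplified, hypothetical model of the behavior (real ranking involves far more signals): anchor-text words only count toward a page’s relevance if they also appear in the page’s own text.

```python
# Hypothetical sketch of the Googlebomb-style check: words from inbound
# link text are discounted unless the target page itself uses them.

def countable_anchor_terms(anchor_text, page_text):
    """Return anchor-text words that also occur in the page's own text."""
    page_words = set(page_text.lower().split())
    return {w for w in anchor_text.lower().split() if w in page_words}

page = "Biography of the former president"
print(countable_anchor_terms("miserable failure", page))  # set(): discounted
print(countable_anchor_terms("former president", page))   # both words count
```

Under this model, the Bush page stops ranking for “miserable failure” because neither word appears on the page, while link text that matches the page’s own content still contributes.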
Now Google says it’s going to use the same technology to solve its problem with Google Maps:
Building upon a key algorithmic change we developed for Google Search, we’ve started to update our ranking system to address the majority of these searches—this will gradually roll out globally and we’ll continue to refine our systems over time. Simply put, you shouldn’t see these kinds of results in Google Maps, and we’re taking steps to make sure you don’t.
With Google Maps, there really isn’t any evidence of an orchestrated campaign to rank any of these places for any of these terms, as had been the situation with Googlebombs. Rather, it was an unfortunate side-effect of using the entire web to help determine the context of local places.
Still, the Googlebomb fix will likely work the same way. As long as these places don’t use any of these slurs or derogatory terms on their own sites or in their own business listings, they probably won’t be deemed relevant for them. Google will also likely create a filter of certain words that no listing is allowed to be relevant for.
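Combining the two safeguards the article describes, a minimal sketch might look like the following. Both the check and the word list are assumptions for illustration; the placeholder terms stand in for an actual blocklist of slurs.

```python
# Hypothetical combination of the two safeguards: a term only makes a
# listing relevant if the listing's own text uses it, and certain terms
# are blocked outright no matter what the web says. Placeholder words
# stand in for a real blocklist.

BLOCKED_TERMS = {"offensive_term_1", "offensive_term_2"}  # placeholder

def allowed_relevance_terms(candidate_terms, listing_text):
    """Keep terms the listing itself uses, minus any blocked terms."""
    own_words = set(listing_text.lower().split())
    return {t for t in candidate_terms
            if t in own_words and t not in BLOCKED_TERMS}

print(allowed_relevance_terms({"skiing", "offensive_term_1"},
                              "we sell skiing gear"))  # {'skiing'}
```

In this toy model, a slur scraped from a web discussion is rejected twice over: it never appears in the listing’s own text, and it sits on the blocklist regardless.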