“We want Google to provide a ‘one-stop’ site where users can opt out of tracking across the company’s platforms.”
2018 STUDY REVIVES DEBATE ABOUT GOOGLE’S ROLE IN FILTER BUBBLES
Finding #1: Most people saw results unique to them, even when logged out and in private browsing mode.
Finding #2: Google included links for some participants that it did not include for others.
Finding #3: We saw significant variation within the News and Videos infoboxes.
Finding #4: Private browsing mode and being logged out of Google offered almost zero filter bubble protection.
A new study from DuckDuckGo, a Google rival, found that users saw very different results when searching for terms such as “gun control,” “immigration,” and “vaccinations,” even after controlling for time and location. One participant saw a National Rifle Association video at the top of the results page for “gun control,” another saw Wikipedia at the top, while a third got the NRA video but no Wikipedia result anywhere in the first 10 links.
Over the years, there has been considerable discussion of Google’s “filter bubble” problem. Put simply, it’s the manipulation of your search results based on your personal data. In practice this means links are moved up or down or added to your Google search results, while other results are filtered out altogether. These editorialized results are informed by the personal information Google has on you (like your search, browsing, and purchase history), and they put you in a bubble based on what Google’s algorithms think you’re most likely to click on. The filter bubble is particularly pernicious when searching for political topics. That’s because undecided and inquisitive voters turn to search engines to conduct basic research on candidates and issues during the critical period when they are forming their opinions.