Your Identity Shapes Your Media: Search Algorithm Politics
Art by Helen Cui
As technology develops and AI becomes increasingly prevalent in daily life, those working behind the screens have never held more power to sway public opinion. With new and emerging Information and Communications Technology (ICT) such as social networking, social media platforms, and search engines, the number of people who rely on these technologies for political information has grown dramatically. AI’s entrance into ICT has made the production and distribution of information cheaper and more efficient; however, these advanced algorithms have also contributed greatly to the pressing problem of political polarization through a new issue: “filter bubbles.”
The term “filter bubble,” coined by internet activist Eli Pariser a decade ago, describes the situation in which algorithms filter content to please the user, showing only information the algorithm believes the user favors. As a result, users are less exposed to divergent viewpoints and become isolated in their own informational bubbles. These filter bubbles, created by the advanced algorithms that power many AI technologies, are not random; they operate through a three-step process. The algorithm first builds a general profile of the user, inferring their background and interests. It then feeds the user the content and services it believes fit them best. Finally, with each additional search and piece of input, the algorithm skews itself to yield ever more targeted information.
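The three-step process above can be sketched as a simple feedback loop. This is only an illustrative toy, not any platform's actual system: the interest profile, topic tags, and scoring function are all invented for the example.

```python
from collections import Counter

def recommend(profile: Counter, items: list[tuple[str, set[str]]], k: int = 2):
    """Step 2: rank items by overlap with the user's inferred interests."""
    def score(item):
        _, topics = item
        return sum(profile[t] for t in topics)
    return sorted(items, key=score, reverse=True)[:k]

def update_profile(profile: Counter, clicked_topics: set[str]):
    """Step 3: each interaction reinforces the inferred interests,
    so future results skew further toward what the user already clicks."""
    for t in clicked_topics:
        profile[t] += 1

# Step 1: the algorithm starts from a rough sketch of the user.
profile = Counter({"business": 1})

items = [
    ("BP stock rises on quarterly earnings", {"business", "energy"}),
    ("Deepwater Horizon spill: the environmental toll", {"environment", "energy"}),
]

# Step 2: the business-leaning user sees business-flavored results first...
top = recommend(profile, items)
# Step 3: ...clicks one, and the bubble tightens for the next search.
update_profile(profile, top[0][1])
```

Each pass through the loop narrows the results further, which is exactly the self-reinforcing quality that makes filter bubbles hard to escape.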
Filter bubbles often center on politics and tend to serve members of different parties different content. A common example contrasts a right-leaning, Republican, pro-business filter bubble with a left-leaning, Democratic, pro-environment one. When two users, one from each bubble, Google searched “BP,” the algorithm returned different results even though the inputs were identical: the pro-business user received investment news about British Petroleum, while the pro-environment user received information on the Deepwater Horizon oil spill.
Thus, the output of today’s advanced search algorithms no longer depends on just the search query, but on a user’s overall input. A Wall Street Journal study found that “the top fifty internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each.” These cookies and tracking beacons let your favorite sites and platforms collect data on everything from your daily routine to your interests and political beliefs, serving you personalized ads to meet revenue targets. As Pariser aptly put it, “your identity shapes your media.” Where you live, who your friends are, which platforms you favor: all of it is data collected and used to shape what you see online, which in turn shapes the decisions you make, where you live, whom you befriend, and the media you consume.
The convenience of filter bubbles that generate personalized search results may appeal to many users, but the harms outweigh the benefits. With the surge in social media usage, whose revenue model leans heavily on filter bubbles, already severe political polarization grows worse. As the recent presidential election demonstrated, America is becoming increasingly hyperpartisan and more politically polarized than ever, just as Pariser predicted ten years ago. Political differences are magnified online as social media allows fake news and conspiracy theories to spread like wildfire, often faster than the facts themselves. These theories turn citizens against one another and breed distrust – further encouraged by the president – of journalism and the other news organizations necessary to protect democracy. And with every citizen sealed in a personal filter bubble, even identical news triggers wildly divergent interpretations, each reader constantly surrounded by their own beliefs.
The rise in distrust of the media has unquestionably steered citizens away from certain extremist sources, and it is one of the few things Republicans and Democrats enthusiastically agree on today. Nonetheless, the number of people who rely on social media for news continues to grow. According to the Pew Research Center, “the share of Americans who often get their news from social media grew 10 percentage points to 28 percent last year.” Yet those who get most of their news from social media also admit to being “less informed about current events and more likely to have been exposed to conspiracy theories,” often a consequence of heavy exposure to a single set of viewpoints. Many platforms have tried to combat the problem, but these efforts have backfired and only amplified it.
Pariser’s original solution to filter bubbles was increased exposure to different viewpoints. Research shows, however, that people do see opposing views on social media. Media platforms, especially those offered free of charge, rely heavily on posts that elicit strong reactions to maintain engagement. This means users are often “seeing the most acerbic of opposing views, which can lead people to be even more repelled by them,” further fueling polarization. Another of the most profound and concerning issues with filter bubbles is the public’s blind spot toward them: many people are unaware that filter bubbles exist, and “more than 60% of Facebook users are entirely unaware of any curation on Facebook at all.” Solving polarization is impossible without education on the matter.
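The dynamic described above, where a platform surfaces the most provocative opposing posts rather than the most representative ones, can be sketched as engagement-weighted ranking. The posts and weights here are invented for illustration; no real platform's formula is being claimed.

```python
# Toy model: if strong reactions (e.g. anger) drive more engagement than
# mild approval, a ranker that optimizes raw engagement will surface the
# most inflammatory content first. The anger weight is an assumption made
# purely for this sketch.
posts = [
    {"text": "Measured critique of the policy", "likes": 40, "angry": 5},
    {"text": "Inflammatory rant about the policy", "likes": 10, "angry": 90},
]

def engagement(post, anger_weight: float = 2.0) -> float:
    # Assumed scoring: each angry reaction counts double toward engagement.
    return post["likes"] + anger_weight * post["angry"]

ranked = sorted(posts, key=engagement, reverse=True)
```

Under these assumed weights, the rant outranks the measured critique despite having far fewer likes, which mirrors why cross-bubble exposure often shows users the harshest version of the other side.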
Phenomena such as filter bubbles have existed for a long time and are only growing more profound and more threatening to democracy. In his farewell address, Barack Obama warned of echo chambers, a concept closely related to filter bubbles, as a threat to American democracy, noting that we “retreat into our own bubbles…especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions…we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there.” Our society is far from what it was decades ago, technology is developing faster than ever, and it is crucial that citizens, especially users of these platforms, understand what goes on behind the screen. With our nation’s democracy on the line, this is no longer a simple trade-off between convenience and cybersecurity.