Do algorithms and social media platforms control our perception of the world?

CONTRIBUTED BY NIV SINGER VIA UNSPLASH

 

PERSONALIZATION AND recommender systems (RS) are widely used on social media platforms such as Facebook, where ever-increasing volumes of information are generated daily. An RS predicts which pieces of content a user is likely to want on their feed and which they are not. Thanks to such features, users no longer have to spend their precious time searching for online content, as algorithms direct them to content catered specifically to their tastes. Despite these benefits, personalized content raises concerns about its influence on news reception and on political processes in our society.

 

Time is money: the drive to collect user attention

   Among the many types of RS, collaborative filtering (CF) is the most commonly used on Facebook. CF evaluates a user’s preference for certain items from past ratings and uses them to make predictions. This type of RS works in two stages. In the first stage, the system finds other users whose content preferences resemble those of the target user; on Facebook, this similarity is often measured by common items such as groups, pages, and events that the users have liked in the past [1]. In the second stage, the RS suggests content to the target user by drawing on the ratings of those similar users [2]. In other words, CF uses the previous behavior of like-minded users to predict how a user would rate an item. For social media users, RS has proven to be a helpful tool for navigating the overwhelming amount of information on these platforms, making content consumption more time-efficient and attention-grabbing.
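   To make the two-stage idea concrete, below is a minimal sketch of user-based collaborative filtering on a toy "like" matrix. The matrix, the cosine similarity measure, and the neighborhood size are illustrative assumptions for this sketch, not details of Facebook's actual system.

```python
# Minimal sketch of user-based collaborative filtering (illustrative only).
import numpy as np

# Rows: users, columns: items (e.g., pages or groups); 1 = the user liked it.
likes = np.array([
    [1, 1, 0, 1, 0],   # target user
    [1, 1, 1, 1, 0],   # a similar user
    [0, 0, 1, 0, 1],   # a dissimilar user
    [1, 0, 0, 1, 1],
], dtype=float)

def cosine_similarity(a, b):
    """Stage 1: similarity between two users' like vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def recommend(target, likes, k=2, n=2):
    """Stage 2: score unseen items for `target` using the k most similar users."""
    sims = np.array([
        cosine_similarity(likes[target], likes[u]) if u != target else -1.0
        for u in range(likes.shape[0])
    ])
    neighbours = sims.argsort()[::-1][:k]          # k most similar users
    scores = sims[neighbours] @ likes[neighbours]  # similarity-weighted vote per item
    scores[likes[target] > 0] = -np.inf            # hide items already liked
    return scores.argsort()[::-1][:n]              # indices of the top-n suggestions

print(recommend(target=0, likes=likes))  # suggests items 2 and 4 for this toy data
```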

   As Michael Goldhaber wrote in his 1997 Wired article, “the currency of the New Economy won’t be money, but attention.” Because there is an excess of information available online, Goldhaber believed that attention would become a scarce, and therefore valuable, commodity. He used the term “attention economy” to describe this phenomenon [3].

   Goldhaber’s future has arrived. Competing with one another, social media companies engineer their platforms to maximize user engagement. Under the attention economy, personalization via RS has become essential for social networking sites to survive: these sites capture their users’ attention by serving algorithmically hand-picked information [4]. Combined with advertising, all of that captured attention translates directly into profit. Facebook alone made 17 billion dollars in advertising in the third quarter of 2019 by using algorithms to hold its users’ attention [5].

 

RS and news reception

   There is, however, a flip side to the coin. The need to grab internet users’ attention has been shown to affect how we receive news [6], with negative consequences for the health of our democracy. Social media sites are becoming the primary source of news for many people today, but because RS spoon-feeds users content they are predicted to like, it has been criticized for polarizing people and creating “echo chambers” and “filter bubbles.” Several studies also point out that CF-based RS reinforces political homogeneity among social media users; a study by Noordeh et al. showed that recommendations produced by CF reduce the diversity of content a user is exposed to [9]. Many scholars have expressed concern that RS can worsen the quality of public opinion, as political selective exposure through RS carries harmful political and societal consequences.
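   As an illustration of what “reduced diversity” can mean in practice, the sketch below computes a simple intra-list diversity score: the average pairwise dissimilarity between recommended items. The topic vectors and the metric itself are hypothetical examples chosen for clarity, not the measure used by Noordeh et al.

```python
# Illustrative intra-list diversity of a recommendation feed (hypothetical data).
import itertools
import numpy as np

def intra_list_diversity(item_vectors):
    """Mean pairwise dissimilarity (1 - cosine similarity) of items in one feed."""
    dissims = []
    for a, b in itertools.combinations(item_vectors, 2):
        cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        dissims.append(1.0 - cos)
    return float(np.mean(dissims))

# Hypothetical topic vectors (politics, sports, culture) for recommended articles.
narrow_feed = np.array([[0.9, 0.1, 0.0], [0.8, 0.2, 0.0], [0.95, 0.05, 0.0]])
mixed_feed  = np.array([[0.9, 0.1, 0.0], [0.1, 0.8, 0.1], [0.0, 0.2, 0.8]])

print(intra_list_diversity(narrow_feed))  # near 0: a homogeneous, "echo chamber"-like feed
print(intra_list_diversity(mixed_feed))   # larger: a topically diverse feed
```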

   In an interview with The Yonsei Annals, Song Hyun-jin (Asst. Prof., Dept. of Communication) explains why this is an important issue. “There is a real worry [...] whether people would get the same side of information [...] without knowing then that they are missing some important stuff from the other side.” Filtering information based on user preferences allows readers to ignore stories that run against their own opinions, so people are not exposed to the wide range of information needed to make well-informed political decisions. Professor Song also points out that personalized content affects not only the diversity of content but also which political opinions are consistently reflected in what CF recommends: “Even if you are getting some political content, now the question is what side of contents that you are getting and what drives that kind of selection behavior.”

   When it comes to political exposure on social media, there is a close relationship between RS, news reception, and political selective exposure. Political selective exposure is the tendency to create a news environment that reflects one’s own political ideology [10]. With the advancement of algorithms, internet users either selectively choose their information environment themselves, called user-driven tailoring, or the systems run by big tech companies do it for them, which is referred to as system-driven tailoring [11]. But why would readers expose themselves to politically selected news in the first place, if their ultimate goal is to gain a comprehensive overview of the political scene? Psychological factors point to people’s inherent tendency to avoid information that does not fit their viewpoints: it is, by default, cognitively easier to process information that aligns with one’s own beliefs, and information congruent with existing belief systems is generally perceived as more convincing.

   However, a 2014 study concluded that user-driven customization could reduce the political selective exposure created by system-driven customization. The study showed that when users explicitly chose their news recommendations, they were exposed to more counter-attitudinal information than users who had no direct influence on the selection process [12]. Still, concerns remain about the negative impact system-driven customization has on our democracy.

CONTRIBUTED BY JOSHUA HOEHNE VIA UNSPLASH

 

A threat to democracy

   For fruitful deliberation to take place, participants must have a general awareness of the entire political scene. However, because system-driven recommendations exacerbate political selective exposure, leading to polarization, echo chambers, and filter bubbles, many argue that they do more harm than good to our democratic society, especially during pre-election periods. By hiding counter-arguments, RS undermines the fundamental purpose of discussion and keeps participants out of the collective process of deliberation.

   In the digital age of the attention economy and personalized content, free will comes at a price. James Williams, a former Google strategist and Oxford-trained philosopher, argues in his book “Stand Out of Our Light” that social media’s algorithmic pursuit of attention risks harming not only our freedom and sense of autonomy but also our collective ability to hold any political opinion at all [13]. The efficiency offered by RS and personalization may help us navigate the infinite stream of information; however, by nudging individuals toward one-sided stances through system-driven customization, it poses a serious risk when applied to the distribution of political news.

 

[1] Facebook Engineering

[2] MDPI

[3] Wired

[4] Science Direct

[5] Facebook Investor Relations

[6] Sage Journals

[7] Proceedings of the National Academy of Sciences of the United States of America

[8] Association for Computing Machinery

[9] ResearchGate

[10] Wiley Online Library

[11] Science Direct

[12] Sage Journals

[13] Cambridge Core

 

Copyright © The Yonsei Annals. Unauthorized reproduction and redistribution prohibited.