Internal documents confirm Facebook’s shortcomings

CONTRIBUTED BY SOKOLOWSKI DAWID VIA UNSPLASH

IN SEPTEMBER of 2021, Frances Haugen, a former product manager at Facebook, released thousands of documents that implicate Facebook in spreading hate, inciting violence, and contributing to negative body images and suicidal thoughts in teenagers. These documents reveal that Facebook was aware of its platform’s issues and their detrimental effect on society but refrained from rectifying the situation. While the documents may not be shocking to most, given Facebook’s history of controversies, this is the first time that Facebook’s own internal records have acknowledged the company’s shortcomings. Many of Facebook’s issues stem from its practice of basing decisions on the company’s interests and profits rather than the well-being of the public. For instance, Facebook failed to moderate hate speech in a myriad of foreign countries because the company lacked expert knowledge of those nations. Haugen’s allegations have ignited a fierce debate over Facebook’s unchecked power on social media and its independence in deciding which hate content to define and moderate.

 

The whistleblower complaint

   Since its founding, Facebook has shown a predictable response to criticism in all of the controversies that it has faced[1]. Zuckerberg and his executives issue formal apologies, repeat vague talking points, weather attacks from politicians, and pledge to rectify the company’s practices. All the while, Facebook’s long-term profits remain unscathed. However, the most recent controversy has proved especially damaging. In October of 2021, Frances Haugen testified before the U.S. Senate and U.K. Parliament that Facebook knowingly uses its platform to perpetuate hate and incendiary speech[2]. Because such allegations have plagued Facebook for over a decade now, some believe that these claims are conventionally accepted but ultimately baseless. However, Haugen has tangible proof to back up her claims of Facebook’s wrongdoings: tens of thousands of Facebook’s internal documents reveal the company’s ill-equipped attempts to combat hate speech, particularly its lack of foreign linguistic knowledge, and Zuckerberg’s personal interference in decision-making.

   Facebook’s Civic Integrity team, a unique division dedicated to putting public safety over profit, was often overruled by Zuckerberg personally when it came to the moderation of hate speech abroad. Zuckerberg’s personal involvement in specific cases does not surprise experts familiar with Facebook’s internal management[3]. Experts have repeatedly expressed concern over Mark Zuckerberg’s influence over Facebook compared to the influence other CEOs exert over their own companies. Zuckerberg controls more than 60% of the voting shares; this means that Facebook’s board is nothing more than a glorified advisory committee and lacks any checks-and-balances mechanism. Zuckerberg ultimately makes all of the major decisions regarding Facebook, from the privacy options in settings to decisions on how algorithms work. Furthermore, he takes into consideration a given situation’s potential for profit or political leaning when standardizing rules for hate speech moderation, resulting in a lack of consistency in his decisions.

   For example, Zuckerberg directed the team to adhere to a restrictive censorship law in Vietnam by monitoring any content against the state, even if it meant potentially infringing on the public’s right to free speech. In this instance, Zuckerberg classified posts against the Vietnamese state as hate speech rather than free speech, arguing that such content violates Facebook’s policy. In another instance, Zuckerberg chose to ignore state law and side with the vast majority of the population[4]; in July of 2020, China imposed a rigid security law on Hong Kong, aimed at cracking down on public dissent and political opposition towards China and the Hong Kong government. Hong Kong authorities made a myriad of requests that Facebook turn over protestors’ user data for criminal prosecution cases. In most of the cases, Facebook did not turn over personal data of private citizens in Hong Kong; the company viewed the law as a violation of the public’s right to free speech and did not classify the anti-Chinese protests as hate speech.

   The leaked documents also revealed that Facebook removes only 3-5% of hate speech from its platform, despite earlier claims of removing more than 90%. Facebook’s efforts to police hate and incendiary speech were futile because the company lacked employees with adept cultural and linguistic knowledge of the Middle East, Asia, and Africa. Despite being well aware of its ineptitude in non-Western regions for years, Facebook did not develop any artificial-intelligence (AI) solutions to solve this issue[5]. Due to a distinct lack of widespread AI technology and algorithmic solutions, Facebook has been selective in curbing the anti-Muslim hate rhetoric seen in India from far-right nationalists and Prime Minister Narendra Modi’s ruling party, the Bharatiya Janata Party (BJP). In a video posted on Facebook, Narsinghanand Saraswati, a Hindu priest, stated, “People need to learn that this is not the time to protest, but the time to go to war [against Muslims].” He continued, “It’s time for every Hindu to invoke the warrior in them. . . The day Hindus take weapons and start killing these Love Jihadis[6], this Love Jihad will come to an end.” By October of 2021, Saraswati’s video had been viewed 1.4 million times[7]. In late October of 2021, Facebook removed the inflammatory video in question only after TIME Magazine raised awareness about it.

   Documents suggest that Facebook was aware of anti-Muslim rhetoric being shared on its platform, but the algorithms did not flag it as hate speech due to a political bias towards right-wing ideologies. As mentioned, the Indian right-wing, including the Rashtriya Swayamsevak Sangh (RSS)[8], champions the “Love Jihad” theory. Though not formally part of the Modi government, the RSS maintains close relations with it. Facebook’s internal documents acknowledge that the RSS engaged in “fear-mongering, anti-Muslim narratives [targeting] pro-Hindu populations with V&I [violence and incitement] content,” which violates Facebook’s policies against hate speech. Regardless of such evidence, Facebook chose not to designate the RSS as a dangerous organization, which would have banned the group from Facebook, due to “political sensitivities” towards the Modi government. Facebook has denied accusations of favoring a certain foreign political party or harboring Islamophobic sentiment and has reiterated its pledge to expand its team to include experts in the Hindi language and to reduce hate speech in India. However, domestic Indian news coverage and Facebook’s own internal documents dispute Facebook’s defense.

   According to Professor Daryl Bockett (Prof., Int. Relations, UIC), “It is important to remember that Facebook has not created a single one of these social problems; it is merely a device for amplifying and disseminating unsavory ideas.” Facebook’s failure to moderate hate speech in non-Western countries can be construed as “racism playing a role, leading Facebook to de-prioritize ‘other’ groups.” However, he added that “it also reflects the reality that most of Facebook’s key stakeholders, including shareholders, regulators, and customers in their key markets, do not have a clue about the happenings in other countries.” In this case, there is no meaningful difference between bias and lack of interest, and the accountability structure pushes Facebook to focus more on the United States and Europe rather than the “Global South.”

 

Change in PR strategy

   In light of Haugen’s testimony, Facebook has officially changed its defense strategy[9]. Instead of apologizing, Facebook has vehemently denied all of Haugen’s claims and is defending itself, emphasizing that it moderates all hate speech unequivocally. One of Haugen’s most damaging allegations was that Facebook intentionally promotes hate speech. She explained, “Facebook makes more money when [people] consume more content. People enjoy engaging with [comments] that elicit an emotional reaction, and the more anger they get exposed to, the more they interact and the more they consume.” In response to allegations about the company pushing and not moderating hate content, Zuckerberg argued that they were “illogical,” and that Facebook does not prioritize its profits over the public’s safety. He rationalized, “[Facebook makes] money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.” In reference to the internal documents that prove Facebook’s infractions and Haugen’s allegations, Zuckerberg responded, “If [Facebook] wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?” He also disputed Facebook’s lack of knowledgeable employees in foreign cultures and languages. He defended his company's track record, “[Facebook employs] so many more people dedicated to [combating hate content] than any other company in our [industry] – even ones larger than us.” 

   Haugen herself has also come under attack from Facebook executives[10]. Lena Pietsch, Facebook’s director of policy communications, attacked Haugen’s credibility as a witness. She described Haugen as an employee “who worked for the company for less than two years, had no direct reports, [and] never attended a decision-point meeting with C-level executives.” Haugen concedes that this characterization is not entirely untrue. In her testimony before the U.S. Senate, Haugen admitted more than six times that she had not worked on many of the projects she was questioned about or in which she alleged Facebook had failed.

 

Proposed changes and opposition

   Whenever Facebook faces a major PR crisis and experiences public backlash, debates resurface around how to limit Facebook’s influence without hindering free speech[11]. In the European Union (EU), lawmakers have proposed a new law, the Digital Services Act (DSA), the vote on which was postponed in light of Haugen’s testimony. The passing of the DSA would be a landmark decision, exerting unprecedented oversight over big tech companies. The law would classify misinformation as illegal content and force Facebook to make its algorithms transparent to the public, specifically those that collect people’s personal data and use it to target content for users. This data restriction would greatly affect Facebook’s advertising business and independence. While Facebook has expressed its displeasure at the proposed law, Haugen testified that the DSA would force Facebook to change its business model and create products that align with users’ well-being, not profits.

   Some lawmakers and politicians are hesitant to support the proposed law because it infringes on the principles of free business practices, a hallmark of democratic nations[12]. Julia Reda, a former German member of the European Parliament and an expert in copyright law, expressed concerns over the EU’s debate regarding the DSA. She alleges that if the law is passed, it will result in “a complete departure from the tried and tested system of limited liability for Internet services and threaten our freedom of communication on the Internet.” Similarly, civil liberties campaigners in the United Kingdom have voiced concerns over the Online Safety Bill, a proposed law that grants the Culture Secretary sweeping power in order to protect Facebook users from harmful content. Critics worry that this is a power grab by the ruling U.K. government and will politicize the regulation of online content; Carnegie Trust, a reputable think tank that influenced earlier versions of the Online Safety Bill, disagrees with the government’s final version of the bill as well. The think tank stated, “To meet the [United Kingdom]’s international commitments on free speech, there should be a separation of powers between the executive and a communications regulator.”

 

*                 *                 *

 

   Frances Haugen’s testimony is unlikely to be the last public relations crisis Facebook faces. The way the company will deal with future crises and whether it will change its business practices of its own accord, by law, or at all remains to be seen. For now, users must remain vigilant; being more critical consumers of information is the best defense against harmful business practices. The public also has a responsibility to vet information for themselves and create a healthier forum for public discourse.

 

[1] The Washington Post

[2] The Guardian

[3] The Opinion Pages

[4] RFA

[5] Al Jazeera English

[6] Love Jihad: An anti-Muslim conspiracy theory that perpetuates the notion that Muslim men marry and convert Hindu women to Islam in an attempt to start a holy war, a jihad, against Hinduism

[7] TIME

[8] Rashtriya Swayamsevak Sangh (RSS): The largest Hindu nationalist group in India

[9] The Guardian

[10] CNBC News

[11] Silicon Republic

[12] Politico

Copyright © The Yonsei Annals. Unauthorized reproduction and redistribution prohibited.