Facebook and Whistleblowers
- According to the Wall Street Journal, “Time and again, the documents show, in the U.S. and overseas, Facebook’s own researchers have identified the platform’s ill effects, in areas including teen mental health, political discourse and human trafficking. Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn’t fix them. Sometimes the company held back for fear of hurting its business. In other cases, Facebook made changes that backfired. Even Mr. Zuckerberg’s pet initiatives have been thwarted by his own systems and algorithms.”[1]
- Facebook let high-profile accounts harass and incite violence without consequences
- Wall Street Journal Headline: “Facebook Says Its Rules Apply To All. Company Documents Reveal A Secret Elite That’s Exempt.”
- Facebook applied different content moderation standards to high-profile accounts, such as politicians, actors, and sports stars. These accounts were protected by the company’s “Cross Check” program, which at minimum allowed them to post rule-violating content while a Facebook employee manually reviewed it. Other accounts were whitelisted, rendering them immune from enforcement actions. In 2020, Facebook had at least 5.8 million Cross Check accounts.
- Facebook claimed it was neutral in content moderation decisions, but it knowingly let high-profile accounts violate company standards. An internal Facebook review found that Facebook’s favoritism of high-profile accounts was “not publicly defensible,” adding “we are not actually doing what we say publicly.”
- A “revenge porn” image posted by soccer star Neymar stayed up for more than a day because of his Cross Check status.
- Political incumbents were often enrolled in Cross Check while their challengers were not, effectively giving incumbents significant leeway in what they could post on Facebook.
- Facebook misled its Oversight Board about Cross Check accounts. Facebook said its enforcement system for Cross Check accounts was used in “a small number of decisions,” but the Wall Street Journal found that “In practice, most of the content flagged by the XCheck system faced no subsequent review.”
- Facebook had no plans to hold high-profile users to the same content moderation standards as everyone else.
- Facebook knew Instagram was toxic for teen girls, but downplayed concerns in public
- Wall Street Journal Headline: “Facebook Knows Instagram Is Toxic For Teen Girls, Company Documents Show.”[2]
- Privately, Facebook knew Instagram worsened body image for teenage girls and boys. Facebook’s internal research found that 32 percent of teenage girls said that “when they felt bad about their bodies, Instagram made them feel worse.” Among teens who reported suicidal thoughts, 13 percent of British users and 6 percent of American users traced the desire to kill themselves to Instagram. Fourteen percent of teen boys in the U.S. said Instagram made them feel worse about themselves.
- One college student said Instagram’s selfie beauty filter minimized her Nicaraguan features and made her look more European.
- One teenage girl said, “Every time I feel good about myself, I go over to Instagram, and then it all goes away.”
- According to the Wall Street Journal, “The features that Instagram identifies as most harmful to teens appear to be at the platform’s core. The tendency to share only the best moments, a pressure to look perfect and an addictive product can send teens spiraling toward eating disorders, an unhealthy sense of their own bodies and depression, March 2020 internal research states.”
- Facebook’s research concluded that some of the problems Instagram created with teen mental health were specific to Instagram and not found in social media more broadly.
- Publicly, Facebook said its products could improve mental health. Instead of referencing its own research showing Instagram’s negative effects, Facebook cited outside research that found little correlation between social media use and depression.
- Facebook still sought to expand its base of younger users. After the release of the Facebook Papers, Facebook CEO Mark Zuckerberg publicly claimed he was refocusing the company on young adults.
- Facebook planned to introduce products for young children, even toddlers
- Wall Street Journal Headline: “Facebook’s Effort To Attract Preteens Goes Beyond Instagram Kids, Documents Show.”[3]
- A Facebook presentation proposed introducing products for six age brackets. One bracket was adults; the other five covered different ages of children, with the youngest running from birth to age four.
- A Facebook presentation asked whether there might be a Facebook product to engage children during play dates.
- Facebook manipulated its algorithms to promote angrier, more sensationalist content
- Wall Street Journal Headline: “Facebook Tried To Make Its Platform A Healthier Place. It Got Angrier Instead.”
- Washington Post Headline: “Five Points For Anger, One For A ‘Like’: How Facebook’s Formula Fostered Rage And Misinformation.”
- In 2018, Facebook changed its news feed algorithm in response to declining user engagement. The new algorithm sought to increase engagement between users and weighed emotional reactions such as the “angry” or “sad” emoji five times more heavily than “likes” (a brief illustrative weighting sketch follows this group of bullets).
- Facebook whistleblower Frances Haugen said, “Anger and hate is the easiest way to grow on Facebook.”
- Facebook claimed its changes were to strengthen bonds between users, but in fact the changes produced an increase in sensationalist and angry content on the platform, as well as an increase in misinformation.
- BuzzFeed CEO Jonah Peretti wrote that BuzzFeed felt obligated to produce racially charged or junk-science content to perform well under Facebook’s algorithm.
- In Poland, political parties pushed more negative content to succeed under Facebook’s algorithm.
- Facebook resisted changing harmful aspects of its news feed algorithm, concerned that further changes risked reducing user engagement.
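To make the reported weighting concrete, below is a minimal, hypothetical Python sketch. It is not Facebook’s actual ranking code; the only grounded detail is the reported weighting of roughly five points for an emotional reaction such as “angry” or “sad” versus one point for a “like.” The dictionary, function name, and example posts are invented purely for illustration.

```python
# Hypothetical sketch of the reported engagement weighting; NOT Facebook's code.
# Grounded assumption: emotional reactions counted about five times a "like".
REACTION_WEIGHTS = {
    "like": 1,   # baseline weight
    "angry": 5,  # emotional reactions reportedly weighed five times more
    "sad": 5,
}

def engagement_score(reaction_counts):
    """Sum weighted reactions for a single post (illustrative only)."""
    return sum(REACTION_WEIGHTS.get(name, 0) * count
               for name, count in reaction_counts.items())

# A post drawing fewer but angrier reactions can outrank a better-liked post.
calm_post = {"like": 100}
outrage_post = {"like": 10, "angry": 40}
print(engagement_score(calm_post))     # 100 * 1          = 100
print(engagement_score(outrage_post))  # 10 * 1 + 40 * 5  = 210
```

Under this kind of weighting, content that provokes strong emotional reactions scores higher than content that merely earns likes, which is consistent with the reporting that angrier, more sensationalist posts were amplified.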
- Facebook ran experiments on its users, including turning off safety features
- According to the Washington Post, “The culture of experimentation ran deep at Facebook, as engineers pulled levers and measured the results. An experiment in 2012 that was published in 2014 sought to manipulate the emotional valence of posts shown in users’ feeds to be more positive or more negative, and then observed whether their own posts changed to match those moods, raising ethical concerns, The Post reported at the time. Another, reported by Haugen to Congress this month, involved turning off safety measures for a subset of users as a comparison to see if the measures worked at all.”
- Facebook took only limited action to stop human trafficking on Facebook
- Wall Street Journal Headline: “Facebook Employees Flag Drug Cartels And Human Traffickers. The Company’s Response Is Weak, Documents Show.”[4]
- CNN Headline: “Facebook Has Known It Has A Human Trafficking Problem For Years. It Still Hasn't Fully Fixed It.”
- USA Today Headline: “Facebook Papers Reveal Company Knew It Profited From Sex Trafficking But Took Limited Action To Stop It.”
- Facebook learned that a Mexican drug cartel used Facebook to “recruit, train and pay hit men,” but “the company didn’t stop the cartel from posting on Facebook or Instagram.”
- Facebook left up posts recruiting for domestic servitude in Saudi Arabia because they did not violate the platform’s terms of service.
- According to the Wall Street Journal, Facebook prioritized retaining users and “at times placating authoritarian governments” over content moderation in developing countries. For example, Facebook took only limited action to stop human trafficking after Apple threatened to delist Facebook’s apps from its App Store.
- Facebook was unable to control anti-vaccine sentiment on Facebook
- According to the Wall Street Journal, “false and misleading coronavirus information was rampant on” Facebook.
- CNN Headline: “Facebook Is Having A Tougher Time Managing Vaccine Misinformation Than It Is Letting On, Leaks Suggest.”
- Associated Press Headline: “Facebook Froze As Anti-Vaccine Comments Swarmed Users.”
- Facebook “shelved” and ignored suggestions from its employees on how to combat COVID vaccine conspiracy content on its platform. Critics said Facebook was slow to act because it worried about the impact on its profits.
- Facebook refused to publicly share how COVID and vaccine misinformation moved through its platform.
- Facebook relied on artificial intelligence for content moderation, but it was buggy and was projected to remove only two percent of hate speech
- Wall Street Journal Headline: “Facebook Says AI Will Clean Up The Platform. Its Own Engineers Have Doubts.”
- Facebook relied on AI enforcement for content moderation, but its AI systems were unable to distinguish between cockfighting and car crashes. The company favored AI because it was less expensive than investing sufficient resources in human content reviewers, a more effective approach.
- Privately, Facebook engineers predicted its AI caught only two percent of hate speech on the platform, but publicly Facebook said its AI would remove “the vast majority” of all problematic content. To this day, Facebook refuses to be transparent about how effective its systems are at removing hate speech.
- Facebook made “ad hoc” decisions to suppress political content
- According to the Wall Street Journal, Facebook sought to keep “harmful communities” from growing on Facebook, but “The reality is that Facebook is making decisions on an ad hoc basis, in essence playing whack-a-mole with movements it deems dangerous. By taking on the role of refereeing public discourse, Facebook has strayed from the public commitment to neutrality long espoused by Chief Executive Mark Zuckerberg.”
- Facebook made only a belated effort to stop election misinformation after the January 6th riots
- After the January 6th riots, Facebook executive Sheryl Sandberg downplayed the company’s role in fomenting the violence. Facebook took only belated steps to halt the “Stop the Steal” movement, swinging into action only after the riots.
- According to Politico, “Both the inability to firm up policies for borderline content and the lack of plans around coordinated but authentic misinformation campaigns reflect Facebook’s reluctance to work through issues until they are already major problems.”
- Facebook’s D.C. lobbying shop was controlled by Republicans and struggled to employ Democrats
- Nearly twice as many Republicans as Democrats lobbied for Facebook. In 2021, Facebook struggled to recruit a high-profile Democrat to run its lobbying shop under the Biden administration.
- After the release of the Facebook Papers, a growing push to regulate Facebook
- The Federal Trade Commission was reviewing disclosures from the Facebook Papers.
- The SEC was in communication with Facebook whistleblower Frances Haugen’s attorneys.
- Sen. Richard Blumenthal (D-CT) wrote, “I think the FTC should be really angry if Facebook concealed this material from them as it did from us in the Congress and the public.”
- Miscellaneous
- Forbes Headline: “Facebook Can Be Toxic For Female Politicians, Company Documents Show.”
- Facebook internal research found that minority users felt Facebook was “censoring” minority groups and described being banned “for speaking out to their communities about their lived experiences.”
- 1. https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353
- 2. https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739?mod=hp_lead_pos7&mod=article_inline
- 3. https://www.wsj.com/articles/facebook-instagram-kids-tweens-attract-11632849667?mod=article_inline
- 4. https://www.wsj.com/articles/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953?mod=article_inline