Facebook and Disinformation
- New York University researcher Laura Edelson said she and her team used their research to study Facebook’s “apparent amplification of partisan misinformation”[1] before Facebook abruptly shut down her account and the accounts of two of her colleagues at the NYU Ad Observatory.[2]
- Edelson and her team found that Facebook users engaged with misinformation more than with other kinds of content on the platform. The findings led Edelson to call understanding how misinformation spreads on social media a “right now” problem, saying researchers were “racing against the clock.”
- Facebook’s XCheck program allowed whitelisted users to post inflammatory claims even after Facebook’s fact checkers had deemed those claims false. Misleading posts by whitelisted users said that vaccines were deadly, that Hillary Clinton had covered up pedophile rings, and that Trump called asylum seekers “animals.”[3]
- In 2020, posts by whitelisted users that contained misinformation had been viewed at least 16.4 billion times.
- Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice concluded “we are not actually doing what we say we do publicly.” Facebook even misled its own Oversight Board about XCheck and the whitelisting of users, telling the board the system was used in only “a small number of decisions.”
- Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation, citing business concerns and fears that the change would hurt efforts to increase user engagement.
- Chinese state-owned outlets used Facebook to spread disinformation and propaganda supporting Russian claims about the invasion of Ukraine, even after Facebook had reportedly taken steps to tamp down on such content.[4]
1. https://twitter.com/lauraedelson2/status/1422736707485634563
2. https://www.vox.com/recode/22612151/laura-edelson-facebook-nyu-ad-observatory-social-media-researcher
3. https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353
4. https://www.computerweekly.com/news/252514981/Chinese-state-media-use-Facebook-to-push-pro-Russia-disinformation-on-Ukraine-war