<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.bigtechwiki.com/index.php?action=history&amp;feed=atom&amp;title=Facebook_and_Disinformation</id>
	<title>Facebook and Disinformation - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://www.bigtechwiki.com/index.php?action=history&amp;feed=atom&amp;title=Facebook_and_Disinformation"/>
	<link rel="alternate" type="text/html" href="https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;action=history"/>
	<updated>2026-04-10T13:08:34Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.37.1</generator>
	<entry>
		<id>https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=398&amp;oldid=prev</id>
		<title>Btw admin at 18:55, 23 March 2022</title>
		<link rel="alternate" type="text/html" href="https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=398&amp;oldid=prev"/>
		<updated>2022-03-23T18:55:35Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 18:55, 23 March 2022&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l5&quot;&gt;Line 5:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 5:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*Chinese state-owned outlets used Facebook to spread disinformation and propaganda supporting Russian claims over the invasion of Ukraine, even after Facebook allegedly took efforts to tamp down.&amp;lt;ref&gt;https://www.computerweekly.com/news/252514981/Chinese-state-media-use-Facebook-to-push-pro-Russia-disinformation-on-Ukraine-war&amp;lt;/ref&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Btw admin</name></author>
	</entry>
	<entry>
		<id>https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=397&amp;oldid=prev</id>
		<title>Btw admin at 18:48, 23 March 2022</title>
		<link rel="alternate" type="text/html" href="https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=397&amp;oldid=prev"/>
		<updated>2022-03-23T18:48:32Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 18:48, 23 March 2022&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Laura Edelson said she and her team used their research to study Facebook’s “apparent amplification of partisan &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;misinformation&lt;/del&gt;.&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;” &lt;/del&gt;Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social media a “right now” problem.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;New York University researcher &lt;/ins&gt;Laura Edelson said she and her team used their research to study Facebook’s “apparent amplification of partisan &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;misinformation”&amp;lt;ref&amp;gt;https://twitter.com/lauraedelson2/status/1422736707485634563&amp;lt;/ref&amp;gt; before Facebook abruptly shut down her account and the accounts of two of her colleagues at the NYU Ad Observatory.&amp;lt;ref&amp;gt;https://www.vox&lt;/ins&gt;.&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;com/recode/22612151/laura-edelson-facebook-nyu-ad-observatory-social-media-researcher&amp;lt;/ref&amp;gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/del&gt;* Facebook’s XCheck allowed whitelisted users to post inflammatory claims even when they had been deemed false by Facebook’s fact checkers. Misleading posts by whitelisted users said that vaccines were deadly, that Hillary Clinton had covered up pedophile rings and that Trump called asylum seekers “animals.”&amp;lt;ref&amp;gt;https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/ins&gt;Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social media a “right now” problem.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/del&gt;* In 2020, posts by whitelisted users that contained misinformation had been viewed at least 16.4 billion times.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Facebook’s XCheck allowed whitelisted users to post inflammatory claims even when they had been deemed false by Facebook’s fact checkers. Misleading posts by whitelisted users said that vaccines were deadly, that Hillary Clinton had covered up pedophile rings and that Trump called asylum seekers “animals.”&amp;lt;ref&amp;gt;https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/del&gt;* Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* In 2020, posts by whitelisted users that contained misinformation had been viewed at least 16.4 billion times.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;*&lt;/del&gt;* Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Btw admin</name></author>
	</entry>
	<entry>
		<id>https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=396&amp;oldid=prev</id>
		<title>Btw admin at 18:44, 23 March 2022</title>
		<link rel="alternate" type="text/html" href="https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=396&amp;oldid=prev"/>
		<updated>2022-03-23T18:44:26Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 18:44, 23 March 2022&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Laura Edelson &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Said Her And Her Team Used Their Research To &lt;/del&gt;study Facebook’s “apparent amplification of partisan misinformation.” Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social media a “right now” problem.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Laura Edelson &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;said she and her team used their research to &lt;/ins&gt;study Facebook’s “apparent amplification of partisan misinformation.” Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social media a “right now” problem.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Facebook’s XCheck allowed whitelisted users to post inflammatory claims even when they had been deemed false by Facebook’s fact checkers. Misleading posts by whitelisted users said that vaccines were deadly, that Hillary Clinton had covered up pedophile rings and that Trump called asylum seekers “animals.”&amp;lt;ref&amp;gt;https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Facebook’s XCheck allowed whitelisted users to post inflammatory claims even when they had been deemed false by Facebook’s fact checkers. Misleading posts by whitelisted users said that vaccines were deadly, that Hillary Clinton had covered up pedophile rings and that Trump called asylum seekers “animals.”&amp;lt;ref&amp;gt;https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353&amp;lt;/ref&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** In 2020, posts by whitelisted users that contained misinformation had been viewed at least 16.4 billion times.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** In 2020, posts by whitelisted users that contained misinformation had been viewed at least 16.4 billion times.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Btw admin</name></author>
	</entry>
	<entry>
		<id>https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=259&amp;oldid=prev</id>
		<title>Btw admin at 17:13, 23 February 2022</title>
		<link rel="alternate" type="text/html" href="https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=259&amp;oldid=prev"/>
		<updated>2022-02-23T17:13:43Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 17:13, 23 February 2022&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Laura Edelson Said Her And Her Team Used Their Research To study Facebook’s “apparent amplification of partisan misinformation.” Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social media a “right now” problem.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Laura Edelson Said Her And Her Team Used Their Research To study Facebook’s “apparent amplification of partisan misinformation.” Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social media a “right now” problem.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Facebook’s XCheck allowed whitelisted users to post inflammatory claims even when they had been deemed false by Facebook’s fact checkers. Misleading posts by whitelisted users said that vaccines were deadly, that Hillary Clinton had covered up pedophile rings and that Trump called asylum seekers &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;“animals”&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Facebook’s XCheck allowed whitelisted users to post inflammatory claims even when they had been deemed false by Facebook’s fact checkers. Misleading posts by whitelisted users said that vaccines were deadly, that Hillary Clinton had covered up pedophile rings and that Trump called asylum seekers &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;“animals.”&amp;lt;ref&amp;gt;https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353&amp;lt;/ref&amp;gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** In 2020, posts by whitelisted users that contained misinformation had been viewed at least 16.4 billion times.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** In 2020, posts by whitelisted users that contained misinformation had been viewed at least 16.4 billion times.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;** Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Btw admin</name></author>
	</entry>
	<entry>
		<id>https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=97&amp;oldid=prev</id>
		<title>Btw admin: Created page with &quot;* Laura Edelson Said Her And Her Team Used Their Research To study Facebook’s “apparent amplification of partisan misinformation.” Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social med...&quot;</title>
		<link rel="alternate" type="text/html" href="https://www.bigtechwiki.com/index.php?title=Facebook_and_Disinformation&amp;diff=97&amp;oldid=prev"/>
		<updated>2022-02-11T20:13:16Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;* Laura Edelson Said Her And Her Team Used Their Research To study Facebook’s “apparent amplification of partisan misinformation.” Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social med...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;* Laura Edelson Said Her And Her Team Used Their Research To study Facebook’s “apparent amplification of partisan misinformation.” Edelson and her team found that Facebook users engaged with misinformation more than other kinds of information on the platform. The team’s findings led Edelson to believe we were “racing against the clock” to understand how disinformation spread on social media. Edelson called understanding how misinformation spread on social media a “right now” problem.&lt;br /&gt;
** Facebook’s XCheck allowed whitelisted users to post inflammatory claims even when they had been deemed false by Facebook’s fact checkers. Misleading posts by whitelisted users said that vaccines were deadly, that Hillary Clinton had covered up pedophile rings and that Trump called asylum seekers “animals”&lt;br /&gt;
** In 2020, posts by whitelisted users that contained misinformation had been viewed at least 16.4 billion times.&lt;br /&gt;
** Facebook itself was a proliferator of misinformation. An internal review of Facebook’s whitelisting practice said “we are not actually doing what we say we do publicly.” Facebook even lied to its own oversight board about XCheck and the whitelisting of users, saying the system was used in “a small number of decisions.”&lt;br /&gt;
** Zuckerberg opposed reforming Facebook’s algorithm to stop it from rewarding misinformation over business concerns and fears it would hurt efforts to increase engagement by users.&lt;/div&gt;</summary>
		<author><name>Btw admin</name></author>
	</entry>
</feed>