NYU Study on Misinformation
- In August 2021, Facebook blocked a team of researchers from NYU’s Ad Observatory from accessing its site. The researchers had launched a tool that enabled users to share data about which political ads they were shown and why those ads were targeted at them.<ref>https://www.npr.org/2021/08/04/1024791053/facebook-boots-nyu-disinformation-researchers-off-its-platform-and-critics-cry-f</ref>
- The researchers said they were working to “uncover systemic flaws in the Facebook ad library, to identify misinformation in political ads [...] and to study Facebook’s apparent amplification of partisan misinformation.” They said they had just begun “trying to figure out what role the platform may have played leading up to the capitol assault on Jan. 6” when Facebook shut down their accounts. The researchers were also looking into whether Facebook was contributing to vaccine hesitancy and sowing distrust in elections.<ref>https://apnews.com/article/technology-business-5d3021ed9f193bf249c3af158b128d18</ref>
- NYU’s researchers believed their research was “responsible and in the public interest.” The researchers believed they were being blocked by Facebook because the platform was “protecting itself from scrutiny and accountability.”<ref>https://www.nytimes.com/2021/08/10/opinion/facebook-misinformation.html</ref>
- The NYU researchers had been studying Facebook’s platform for three years before the site blocked them. The research team developed and deployed a browser extension called Ad Observer that allowed users to voluntarily share with the researchers information about ads that Facebook showed them. The team used their research to “study Facebook’s apparent amplification of partisan misinformation.” NYU’s researchers said their work had been “able to demonstrate that extreme, unreliable news sources get more engagement [...] on Facebook, at the expense of accurate posts and reporting.” NYU’s research found that Facebook users engaged with misinformation more than with other kinds of information on the platform. Their work showed that Facebook had failed to disclose who paid for some political ads, and the researchers found misleading ads thriving on Facebook in November 2020 despite the platform’s policies banning misinformation in ads.
- In October 2020, Facebook sent a cease-and-desist letter to the NYU Ad Observatory project demanding that it stop collecting data on the platform’s political ad targeting. Facebook said the Ad Observatory’s work violated provisions in its terms of service that prohibited bulk data collection from its site, threatened “additional enforcement action” if the project did not stop, and said that if the researchers did not stop collecting user data it could make technical changes to its code that would block them from conducting further research. When Facebook sent the letter, Laura Edelson, who helped oversee the Ad Observatory project, said “the only thing that would prompt us to stop doing this would be if Facebook would do it themselves.”
- In August 2021, Facebook shut down the NYU researchers’ access to the platform hours after they informed it that they were studying the spread of disinformation related to Jan. 6 across the site. Facebook disabled the researchers’ personal accounts, cut off their access to Facebook’s APIs, the technology used to share data from Facebook with other apps or services, and disabled other apps and Pages associated with the research project, according to Mike Clark, a director of product management on Facebook’s privacy team. By shutting down the researchers’ Facebook accounts, the platform made it impossible for them to continue their research.
- NYU’s researchers said Facebook had “denied us important access to continue to do much of our work.” The extension gave researchers insight into which entities were trying to influence the public and how they were doing it, and Facebook had cut off access the researchers needed for that work to continue. The researchers said “there [was] still a lot of important research we want[ed] to do.” Edelson said Facebook didn’t like the findings of their work and was “taking measures to silence us.” The researchers felt Facebook was trying to intimidate them and send a message to other independent researchers attempting to study the platform. They learned they were blocked from Facebook through an automated email from the platform.
- In 2020, NYU’s research team created a browser extension called Ad Observer, which allowed individual users to participate in their research by automatically copying any ads the users encountered on Facebook. Users voluntarily downloaded Ad Observer, which automatically sent the ads they saw on Facebook, along with the information on why those ads were targeted at them, to NYU’s researchers; 16,000 people had downloaded it since its launch. From that data, NYU’s researchers inferred which political ads were targeting which groups of users, information Facebook didn’t publicize.
- The NYU researchers said they were privacy and cybersecurity researchers whose careers were “built on protecting users,” so they were careful to ensure the Ad Observer tool collected only limited, anonymous information from the users who agreed to participate in their research. The researchers made the extension’s source code public so that people could review it and see that it did not collect user information, and the code had been reviewed by outside experts. They said they didn’t collect any information that wasn’t about an ad or already public, that they only collected identifying information on Facebook’s advertisers, and that Ad Observer only looked at the information inside the frame of the ad, not the comments or reactions below it. Edelson rejected the suggestion that Ad Observer was an automated scraper: scraping, she explained, was when a program was written to automatically scroll through a website, with a computer driving how the browser worked and what was downloaded, whereas Ad Observer was user-driven (see the illustrative sketch after this list).
- The code for Ad Observer was publicly available and had been reviewed by outside experts. Edelson said she and her team made their data and code public because she realized it wasn’t “a fair thing to ask” people “to just trust me.” She believed that, by blocking the tool, Facebook was the one saying “don’t look behind this curtain.” Mozilla, known for its focus on privacy, reviewed Ad Observer’s code twice before recommending it to users. Mozilla’s Chief Security Officer said the review “assured us that it respect[ed] user privacy and support[ed] transparency.”
- NYU’s researchers wanted to make it possible for journalists, researchers, policymakers and others to search political ads by state and contest to see what messages were directed to specific audiences and who funded them. The researchers said the public “deserve[d] more transparency about the systems [Facebook] use[d] to sell the public’s attention to advertisers” and “deserve[d] more transparency” about Facebook’s algorithm for promoting content. Edelson defended her team’s data collection, saying the data she and her team collected was “otherwise unavailable to the public.” While Facebook published its own library of political ads with information on who paid for each ad and when it ran, the library did not include details on how the ad was targeted. NYU’s researchers said Ad Observer “provide[d] a way to see which entities [were] trying to influence the public and how they’re doing it.”
- Edelson believed we were “racing against the clock” to understand how the spread of disinformation on social media happened and believed that understanding how misinformation spread on social media was a “right now” problem. Edelson said she and her team were working to “uncover systemic flaws in the Facebook ad library, to identify misinformation in political ads [...] and to study Facebook’s apparent amplification of partisan misinformation.” Edelson said we “really need[ed] to put the pieces together” on how ads with a certain message were publicized and targeted.
- The Verge reported that “no one I have spoken to at Facebook believe[d] that NYU’s work [was] not fundamentally in the public interest.” The Verge also noted that while Facebook already made similar data publicly available through its online ad archive, the NYU researchers said the archive was incomplete and sometimes inaccurate, and that, among other things, many political ads were never labeled as such.
- Facebook and the NYU researchers had been in a long-running standoff over the Ad Observer and Ad Observatory tools. Facebook claimed NYU’s researchers were scraping data with Ad Observer in violation of the platform’s terms of service, saying they “knowingly violated our terms against scraping.” Facebook claimed the browser extension violated its privacy rules because it collected information about advertisers, such as their names, Facebook IDs and photos; Edelson noted that Facebook’s argument that her work tracked users without their consent implied “those advertiser names [...] [were] private user information.” Facebook claimed the extension could have been used to identify information about users who interacted with the ads but did not consent to share their information, that Ad Observer collected data about Facebook users who did not install it or consent to the collection, and that Ad Observer “was programmed to evade our detection systems.” Facebook said it had made “repeated attempts to bring their research into compliance with our terms” and that NYU’s research about the platform “may be well-intentioned” but “the ongoing and continued violations of protections against scraping [could not] be ignored and should be remediated.”
- Facebook said its enforcement actions against the NYU researchers were consistent with its normal practices: “We enforce neutrally across the board, regardless of the publicly-expressed intentions of those in violation. The enforcement actions we took against these researchers were consistent with our normal enforcement practices in these kinds of circumstances.” By only suspending the researchers’ Facebook accounts, Facebook did not actually shut down the Ad Observer project itself. However, by suspending the accounts of Edelson and her colleagues for repeat violations of its terms of service, Facebook made it impossible for them to continue a different project, the Ad Observatory (not Ad Observer), which helped journalists and academics analyze political ad data the platform shared directly.
- Facebook initially said the ban on the NYU researchers’ access to the site was necessary in part because of a deal the platform reached with the FTC in 2019 after the Cambridge Analytica scandal, and because the researchers were scraping users’ personal information, which it said violated the company’s terms of service. A spokesman for Facebook said the FTC’s consent decree had required the site to implement a “comprehensive privacy program” that “protect[ed] the privacy, confidentiality and integrity” of user data.
- However, the FTC’s consent decree did not prohibit what NYU’s researchers had been doing; it was Facebook’s own privacy program, not the consent decree, that prohibited it.
- The FTC’s consent decree required Facebook to get a user’s consent before sharing that user’s data with someone else. Because Ad Observer relied on users themselves agreeing to share data, not on Facebook sharing it, the consent decree was not relevant. The FTC said it “hope[d]” Facebook was “not invoking privacy – much less the FTC consent order – as a pretext to advance other aims.”
- The FTC criticized Facebook’s decision to bar the NYU researchers from its platform. The FTC said Facebook’s rationale for shutting down the researchers’ accounts was “inaccurate” and “misleading,” noting that the consent decree did not bar Facebook from creating exceptions for good-faith research in the public interest.
- The Washington Post described the FTC’s rebuttal as a “rare” occurrence for the commission.
- Facebook said NYU could study its platform with tools the company provided and didn’t need Ad Observer. Facebook said they had offered NYU’s researchers “ways to obtain data that did not violate our terms” and “a number of privacy protected methods to collect and analyze data.” Facebook claimed they “went above and beyond to explain” NYU’s violations “and offered them an additional privacy-safe dataset containing targeting information for 1.65 million political ads.” NYU’s researchers called the data Facebook made available “woefully inadequate” and “complicated to use.” NYU’s researchers found that the archive of political ads Facebook made available to researchers was missing more than 100,000 ads. NYU researchers said if they used the data Facebook made public for research, “We simply could not study the spread of misinformation on such topics as elections, the capitol riot and covid-19 vaccines.”
- Facebook’s analytics tool for researchers only showed how many likes and shares a particular post received, but did not disclose how many people saw the post. Facebook did not make information about non-political ads available to researchers, nor did it make ad targeting data available through its ad library (see the Ad Library sketch after this list).
- Facebook required researchers studying the Facebook Open Research and Transparency platform to access the data on a laptop furnished by Facebook. By requiring researchers to use laptops it furnished, Facebook prevented them from using their own machine-learning classifiers and other tools on the available data.
- Lawmakers pressed Facebook on why it disabled the NYU researchers’ accounts. Senators Amy Klobuchar, Mark Warner and Chris Coons pressed Facebook on how many researchers or journalists had had their accounts disabled that year, why, and how Facebook was working to better accommodate research. Senator Mark Warner called Facebook’s move to block NYU researchers “deeply concerning.” Senator Amy Klobuchar responded to Facebook blocking researchers by saying it was “vital that social media companies both protect[ed] user data and improve[d] transparency.”
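The bullets above describe Ad Observer as a voluntary, user-driven browser extension that copied only what appeared inside an ad’s frame, plus the platform’s targeting explanation, and shared it without identifying the participant. The sketch below illustrates what that kind of collection flow might look like in a content script; the DOM selectors, field names, and submission endpoint are hypothetical placeholders, not Ad Observer’s actual implementation.

<syntaxhighlight lang="typescript">
// Illustrative sketch only: the selectors, field names, and endpoint below are
// hypothetical placeholders, not Ad Observer's actual implementation.

// What a participating user voluntarily shares: the contents of the ad frame and
// the platform's targeting explanation, never anything identifying the participant.
interface SharedAd {
  advertiserName: string;        // public advertiser identity shown on the ad
  adText: string;                // creative text inside the ad frame
  targetingExplanation: string;  // e.g. the "Why am I seeing this ad?" text, if captured
  observedAt: string;            // timestamp of the observation
}

// Hypothetical research endpoint that receives voluntarily shared ads.
const RESEARCH_ENDPOINT = "https://research.example.org/ad-observer/submit";

// Read only ad-frame content from a sponsored post; comments, reactions, and
// anything about the participant are deliberately not read.
function extractAd(adFrame: Element): SharedAd | null {
  const advertiser = adFrame.querySelector("[data-ad-advertiser]"); // hypothetical selector
  const body = adFrame.querySelector("[data-ad-body]");             // hypothetical selector
  const why = adFrame.querySelector("[data-ad-targeting]");         // hypothetical selector
  if (!advertiser || !body) return null;
  return {
    advertiserName: advertiser.textContent?.trim() ?? "",
    adText: body.textContent?.trim() ?? "",
    targetingExplanation: why?.textContent?.trim() ?? "",
    observedAt: new Date().toISOString(),
  };
}

// Send the observation; the payload contains no user identifiers.
async function shareAd(ad: SharedAd): Promise<void> {
  await fetch(RESEARCH_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(ad),
  });
}

// User-driven collection: react to ads that appear in the feed the participant is
// already browsing, rather than scrolling or driving the browser automatically.
const adWatcher = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of Array.from(mutation.addedNodes)) {
      if (!(node instanceof Element)) continue;
      const adFrame = node.querySelector("[data-ad-frame]"); // hypothetical selector
      if (adFrame) {
        const ad = extractAd(adFrame);
        if (ad) void shareAd(ad);
      }
    }
  }
});
adWatcher.observe(document.body, { childList: true, subtree: true });
</syntaxhighlight>

Because the observer only reacts to ads that appear in a feed the participant is already browsing, rather than driving the browser or scrolling the page itself, the collection is user-driven in the sense Edelson described, and the payload deliberately excludes comments, reactions, and anything about the participant.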
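Separately, the bullets above note that Facebook’s public ad library disclosed who paid for a political ad and when it ran, but not how it was targeted. The minimal sketch below shows what querying that library might look like; the endpoint, parameter, and field names reflect one reading of the Ad Library API as it stood around 2021 and may differ by version, and the access token is a placeholder.

<syntaxhighlight lang="typescript">
// Illustrative sketch only: parameter and field names reflect one reading of the
// Ad Library API circa 2021 and may differ by version; the token is a placeholder.
const ACCESS_TOKEN = "YOUR_ACCESS_TOKEN";

// Ask the ad archive for political ads matching a search term. The fields that can
// be requested cover funding, creative text, timing, and spend; none of them
// describes how the ad was targeted.
async function searchPoliticalAds(searchTerms: string): Promise<unknown> {
  const params = new URLSearchParams({
    access_token: ACCESS_TOKEN,
    ad_type: "POLITICAL_AND_ISSUE_ADS",
    ad_reached_countries: "US",
    search_terms: searchTerms,
    fields: "page_name,funding_entity,ad_creative_body,ad_delivery_start_time,spend",
  });
  const response = await fetch(`https://graph.facebook.com/v12.0/ads_archive?${params}`);
  if (!response.ok) {
    throw new Error(`Ad Library request failed: ${response.status}`);
  }
  return response.json();
}

// Example usage:
// searchPoliticalAds("election").then((page) => console.log(page));
</syntaxhighlight>

Nothing in the requested fields describes targeting criteria, which is the gap the targeting explanations collected by Ad Observer were meant to fill.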