Facebook has moved to block third-party ad transparency tools used by nonprofits and news organizations, although the company claims its efforts are aimed at preventing malicious actors from scraping user data and not intended to punish critical reporting on its platform. The company has done so by inserting code that prevents the use of web plugins that automate the collection of information from the social network, according to ProPublica, which has been leading the investigative reporting around political ad buying and ad-based discrimination on Facebook for more than two years.
ProPublica operates a searchable database, called the Facebook Political Ad Collector, that displays information about more than 120,000 political ads, gathered by the roughly 22,000 users who installed its custom plugin. The organization says its plugin no longer works, but its database remains online and searchable.
Rob Leathern, Facebook’s vice president of product focused on ad transparency, says the change was not intended to lock out journalists, nonprofits, and advocacy groups. Instead, Leathern wrote on Twitter, the company is trying to prevent “people’s data from being misused — our top priority. Plugins that scrape ads can expose people’s info if misused.” Considering the Cambridge Analytica data privacy scandal that rocked the company last year, this sounds like a reasonable measure, although shutting nonprofits out of the process of holding political ad buyers accountable does not help Facebook’s public image.
Alex Stamos, Facebook’s former chief security officer and an outspoken voice on data privacy and the social network in general, made a similar argument on Twitter, writing, “These changes are about ad blockers, not people researching ad abuse,” and following it up with, “FB needs to build out the ad archive API ASAP, including targeting and reach data.”
I tried to have this discussion at the time, which was like throwing myself in front of the moral panic freight train.
There is a balancing act that the tech platforms have to walk between data protection (data monopoly, if you prefer) and creating some risk by opening APIs.
— Alex Stamos (@alexstamos) January 29, 2019
To be fair, Facebook has made a significant number of other data and API restrictions designed specifically to prevent another Cambridge Analytica. And Stamos says the fear is not restricted to private industry, but extends to government overreach. “For example, DHS was trying to get bids for companies to build databases of immigrant social media history,” he wrote. “If you don’t want Palantir scraping every Muslim’s stream on FB, there needs to be anti-scraping protection.”
Additionally, Leathern points to a blog post Facebook published earlier this month that outlines new ad transparency and election interference measures planned for later this year, including a new set of transparency tools for advertisers set to release at the end of June. When reached for comment, a Facebook spokesperson pointed to Leathern’s comments.
We know we have more to do on the transparency front – but we also want to make sure that providing more transparency doesn’t come at the cost of exposing people’s private information.
— Rob Leathern (@robleathern) January 28, 2019
According to ProPublica, Facebook urged the organization to shut down its project and transition over to the company’s own public archive for political ads, which launched in May of last year when Facebook first put into effect its new ad transparency rules in the US. Yet ProPublica says Facebook’s tool is incomplete and the organization has routinely publicized ads run by organizations that were not recorded in the archive, including political ads run by the National Rifle Association and a pro-Bernie Sanders electoral reform advocacy group.
ProPublica says Facebook’s archive is also available in only three countries and, most importantly, doesn’t include “affinity” information. So-called “affinity” targeting is how Facebook advertisers have in the past targeted groups by protected classes like race, gender, and religion, and it’s based on information Facebook says must be self-classified by the user.
Facebook tells ProPublica that including these microtargeting labels in its archive — labels it has severely restricted for housing and employment ads thanks to an entirely separate ProPublica investigation — would somehow pose a privacy risk to users, although it is not clear how. Facebook is reportedly developing an API so organizations can more easily access political ad data on the platform, but it’s currently in beta and restricted to keyword searches only, ProPublica reports.
Until that API matures, or until Facebook launches its new tools in June, it sounds like data-collection software developed outside Facebook and without explicit platform access will no longer be able to function.