Facebook Rolls Out News Feed Change That Blocks Watchdogs from Gathering Data
The tweak, which targets code used in accessibility features for visually impaired users, drew ire from researchers and others who monitor the platform
By: Corin Faife
Facebook has begun rolling out an update that is interfering with watchdogs monitoring the platform.
The Markup has found evidence that Facebook is making changes to its website code that foil automated data collection of news feed posts, a technique that groups like NYU’s Ad Observatory, The Markup, and other researchers and journalists use to audit what’s happening on the platform on a large scale.
The changes, which attach junk code to HTML features meant to improve accessibility for visually impaired users, also impact browser-based ad blocking services on the platform. The new code risks damaging the user experience for people who are visually impaired, a group that has struggled to use the platform in the past.
The updates add superfluous text to news feed posts in the form of ARIA tags, HTML attributes that are not rendered visually by a standard web browser but are used by screen reader software to map the structure of a page and read its contents aloud. Such code is also used by organizations like NYU’s Ad Observatory to identify sponsored posts on the platform and single them out for further scrutiny.
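To make the mechanism concrete, here is a minimal, hypothetical sketch of how a collector might rely on ARIA attributes to flag sponsored posts, and why padding those attributes with junk text breaks the approach. The markup, labels, and Python/BeautifulSoup code below are illustrative assumptions, not Facebook’s actual code or the selectors any research group actually uses.

    # Illustrative only: hypothetical markup and selectors, not Facebook's
    # real code or any research group's actual collector.
    from bs4 import BeautifulSoup

    html = """
    <div role="article" aria-label="Sponsored post from Example Brand">...</div>
    <div role="article" aria-label="Post from Example Friend">...</div>
    """

    soup = BeautifulSoup(html, "html.parser")

    # A naive collector: treat any article whose ARIA label mentions
    # "sponsored" as an ad worth auditing.
    ads = [
        node for node in soup.find_all(attrs={"role": "article"})
        if "sponsored" in (node.get("aria-label") or "").lower()
    ]
    print(len(ads))  # 1 with the clean markup above

    # If the same label were padded with junk ARIA text, e.g.
    # aria-label="S p o nXso redQz", the string match above would silently
    # find nothing, while a screen reader may read the padded label
    # aloud as gibberish.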
“We constantly make code changes across our services, but we did not make any code changes to block these research projects,” Lindy Wagner, communications manager at Facebook, said in an email to The Markup.
The Markup’s Citizen Browser project experienced a drop in data collection rates beginning in early September, prompting the investigation that uncovered the new code. At around the same time, users of certain ad blockers noticed a drop in their effectiveness.
Laura Edelson, a Ph.D. candidate in computer science at NYU’s Tandon School of Engineering and founder of the Ad Observatory project, expressed dismay at Facebook’s latest change affecting data collection. The website update at first caused a sharp drop in the amount of data the Ad Observatory collected, she said, but the team found a workaround that let it resume collecting data at normal levels.
“I think it’s unfortunate that Facebook is continuing to fight with researchers rather than work with them,” she said.
Facebook has made similar tweaks to frustrate researchers and ad blockers in the past, often making the platform less accessible to visually impaired users in the process.
In 2019, the company made changes that obfuscated its code in a way that blocked ad collection efforts by ProPublica, Mozilla, and the British ad transparency group WhoTargetsMe. And in 2020, Quartz reported that for the previous two years visually impaired users had been unable to hear an intelligible label distinguishing sponsored from nonsponsored posts because the platform had stuffed the label text with junk characters to blunt ad blocking software.
In its latest update, Facebook seems to have implemented the code in a way that prevents screen readers from reading the new tags. Because the update has not yet been rolled out to all users, it’s unclear what impact, if any, the change may have on visually impaired users. In at least one instance, a developer at The Markup testing the new code found that Microsoft’s Narrator screen reader read a string of junk characters aloud as an unintelligible word when the site was accessed through the Google Chrome browser.
“Our accessibility features largely appear to be working as normal, however we are investigating the claim,” Facebook’s Wagner said.
Jared Smith, associate director of the accessibility research and training nonprofit WebAIM, expressed concerns about the code in Facebook’s web update after reviewing it for The Markup.
According to Smith, the new updates break many basic rules of accessibility design. Rather than presenting a clear and simplified structure, he said, the accessibility code is hugely complex, which could herald problems down the road.
“When you see thousands and thousands of patterns of ARIA attributes—code that could be used for accessibility but doesn’t seem to support accessibility—it poses a scenario where things could jump the rails and really negatively impact accessibility,” said Smith.
“We’ve seen misuse of technologies like this for things like search engine optimization, but this is on an entirely different scale,” he added.
In the past, Facebook users have complained about new features being rolled out without screen reader compatibility. More recently, though, the company has received plaudits for using AI-powered image recognition to generate alt text for images, which allows visually impaired users to access more content in the news feed.
In July 2020, a blog post from the Facebook engineering team trumpeted an extensive rebuild of the site, apparently made with accessibility in mind. The rebuild included requirements for Facebook developers to use a code linting plugin (similar to a spell-checker) that would flag violations of ARIA standards.
“I suspect that the Facebook team implementing these apparent anti-transparency mechanisms does not realize that there are potential accessibility consequences to what they’re doing,” said Blake E. Reid, a professor at the University of Colorado Law School who focuses on accessibility and technology policy.
Sen. Ron Wyden, who has been critical of the company in the past, told The Markup in an emailed statement that Facebook’s latest move showed a disregard for visually impaired users.
“It is contemptible that Facebook would misuse accessibility features for users with disabilities just to foil legitimate research and journalism,” he said.
Facebook has long claimed that it wants to share data with researchers, Edelson said, but in practice numerous social scientists have faced obstacles when trying to work with the platform.
In August of this year, Facebook disabled the accounts of NYU Ad Observatory researchers, alleging that the researchers’ own ad collection tool violated its terms of service. (At the time, The Markup’s senior executives published a press release critical of Facebook’s actions.)
And reporting by The New York Times revealed that Facebook had given incomplete data to misinformation researchers from the high-profile Social Science One research group, potentially undermining the findings of years of academic studies. The error was first uncovered by a university professor who found discrepancies between numbers in the Social Science One data and Facebook’s recently published Widely Viewed Content Report.
“At what point does the research community stop thinking of Facebook as a positive actor in this space?” Edelson said.
This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.