WASHINGTON (NYTIMES) - Facebook said on Tuesday (July 31) that it had identified a political influence campaign that was potentially built to disrupt the US mid-term elections, with the company detecting and removing 32 pages and fake accounts that had engaged in activity around divisive social issues.
The jolting disclosure, delivered to lawmakers in private briefings on Capitol Hill this week and in a public Facebook post on Tuesday, underscored how behind-the-scenes interference in the November elections had begun.
Here's what you need to know about the plot:
1. WHICH ACCOUNTS WERE REMOVED?
Facebook said the recently purged accounts - eight Facebook pages, 17 Facebook profiles and seven Instagram accounts - were created between March 2017 and May 2018 and were first discovered two weeks ago.
More than 290,000 accounts followed at least one of the suspect pages, which had names like Aztlan Warriors, Black Elevation, Mindful Being and Resisters, the company said.
It said that there were more than 9,500 Facebook posts created by the accounts and one piece of content on Instagram.
Facebook said the bad actors had created 30 Facebook events since May 2017, most of which had been scheduled over the past year. The company did not know whether people had shown up to those events.
2. WHO WAS BEHIND THE ACCOUNTS?
The company did not definitively link the campaign to Russia. But Facebook officials said some of the tools and techniques used by the accounts were similar to those used by the Internet Research Agency, the Kremlin-linked group that was at the centre of an indictment this year alleging interference in the 2016 presidential election.
"At this point in our investigation, we do not have enough technical evidence to state definitively who is behind it," said Mr Nathaniel Gleicher, Facebook's head of cyber security policy. "But we can say that these accounts engaged in some similar activity and have connected with known IRA accounts," he said, referring to the Internet Research Agency.
Mr Gleicher said an account known to be associated with the agency had been listed as an administrator of one of the pages for seven minutes.
Mr Alex Stamos, the company's chief security officer, said intelligence agencies were in a better position to make an attribution by combining the technical details from Facebook with their own knowledge of the political motivations and goals of countries and other threat actors.
3. WHAT WERE THE ACTIVITIES BEING PERPETRATED BY THE ACCOUNTS?
Like the 2016 Russian interference campaign, the recently detected campaign sought to amplify divisive social issues, including through organising real-world events.
Facebook said it had discovered coordinated activity around issues like a sequel to last year's deadly "Unite the Right" white supremacist rally in Charlottesville, Virginia. Activity was also detected around #AbolishICE, a left-wing campaign on social media that seeks to end the Immigration and Customs Enforcement agency.
Among the campaign's efforts was organising support for a counter-protest to the conservative rally. Specifically, the account called Resisters, which interacted with one Internet Research Agency account in 2017, created an Aug 10 event, "No Unite the Right 2 - DC", to counter a planned white supremacist rally in Washington on Aug 11 and 12 by the same group that organised the racist march in Charlottesville one year earlier.
Although other Facebook pages are promoting the counter-protest, the social network said that the Resisters page was the first, and that it had coordinated with the administrators of five other apparently authentic pages to co-host the event - publicising details about transportation and other logistics. A person familiar with the matter said the page was created on June 24.
That event page has been taken down, and Facebook has notified roughly 2,600 users who had indicated interest in the event, and 600 more who had said they planned to attend, about the suspicious activity behind it.
4. HOW MUCH MONEY WAS INVOLVED?
Between April 2017 and June 2018, the accounts ran 150 ads costing US$11,000 (S$14,978.26). They were paid for in American and Canadian dollars.
The pages created roughly 30 events over a similar period, the largest of which attracted interest from 4,700 accounts.
Finding suspicious activity was harder this time around, Facebook said. Unlike many of the alleged Russian trolls in 2016, who paid for Facebook ads in roubles and occasionally used Russian Internet protocol addresses, these accounts used advanced security techniques to avoid detection. For instance, they disguised their Internet traffic using virtual private networks and Internet phone services, and they used third parties to buy ads for them.
5. HOW DID FACEBOOK DETECT THE ACCOUNTS?
Facebook executives characterised the battle with foreign campaigns as a cat-and-mouse game, but said they were making progress to detect suspicious activity more quickly. After being caught flat-footed by the Internet Research Agency's efforts ahead of the 2016 presidential election, Facebook has expanded its security team, hired counter-terrorism experts and recruited workers with government security clearances.
The company is using artificial intelligence and teams of human reviewers to detect automated accounts and suspicious election-related activity.
It has also tried to make it harder for Russian-style influence campaigns to use covert Facebook ads to sway public opinion, by requiring political advertisers in the United States to register with a domestic mailing address and by making all political ads visible in a public database.
Despite Facebook's efforts, stopping coordinated influence campaigns has proved difficult. False news flourished before the Mexican elections in July, and the company has been cracking down on misinformation ahead of Brazil's national elections in October.
The US mid-terms, though, are a major test for the company, which is trying to show that it can handle its role as a global arbiter of conversation and commerce - even in the face of interference by others.