CHARLOTTESVILLE (Virginia) • Facebook has revealed that hundreds of Russia-based accounts had run anti-Hillary Clinton ads precisely aimed at the social network's users whose demographic profiles implied a vulnerability to political propaganda.
It will take time to prove whether the account owners had any relationship with the Russian government, but one thing is clear: Facebook has contributed to, and profited from, the erosion of democratic norms in the United States and elsewhere.
The audacity of a hostile foreign power trying to influence US voters rightly troubles us. But it should trouble us more that Facebook makes such manipulation so easy, and renders political ads exempt from the basic accountability and transparency that healthy democracy demands.
The majority of the Facebook ads did not directly mention a presidential candidate, according to Mr Alex Stamos, head of security at Facebook, but "appeared to focus on amplifying divisive social and political messages across the ideological spectrum - touching on topics from LGBT matters to race issues to immigration to gun rights".
The ads - about 3,000 placed by 470 accounts and pages spending about US$100,000 (S$136,000) - were what the advertising industry calls "dark posts", seen only by a very specific audience, obscured by the flow of posts within a Facebook News Feed and ephemeral.
Facebook calls its "dark post" service "unpublished page post ads". This should not surprise us. Anyone can deploy Facebook ads. They are affordable and easy. That's one reason that Facebook has grown so quickly, taking in US$27.6 billion in revenue in 2016, virtually all of it from advertisers, by serving up the attention of two billion Facebook users across the globe.
The service is popular among advertisers for its efficiency, effectiveness and responsiveness. Facebook gives rich and instant feedback to advertisers, allowing them to quickly tailor ads to improve outcomes or customise messages even more.
There is nothing mysterious or untoward about the system itself, as long as it's being used for commerce instead of politics. What's alarming is that Facebook executives don't seem to grasp, or appreciate, the difference.
A core principle in political advertising is transparency - political ads are supposed to be easily visible to everyone, and everyone is supposed to understand they are political ads, and where they come from.
And it's expensive to run even one version of an ad in traditional outlets, let alone a dozen different versions.
Moreover, in the case of federal campaigns in the US, the 2002 McCain-Feingold campaign finance law requires candidates to state that they approve of an ad and so take responsibility for its content.
None of that transparency matters to Facebook. Ads on the site meant for, say, 20- to 30-year-old home-owning Latino men in Northern Virginia would not be viewed by anyone else, and would run only briefly before vanishing.
The potential for abuse is vast. An ad could falsely accuse a candidate of the worst malfeasance a day before election day, and the victim would have no way of even knowing it happened.
Ads could stoke ethnic hatred and no one could prepare or respond before serious harm occurs.
Unfortunately, the range of potential responses to this problem is limited. The First Amendment grants broad protections to publishers like Facebook. Diplomacy, even the harsh kind, has failed to dissuade Russia from meddling. And it's even less likely to under the current administration.
Mr Daniel Kreiss, a communication scholar at the University of North Carolina, proposes that sites such as Facebook, Twitter and YouTube keep a repository of campaign ads so regulators, scholars, the media and the public can examine and expose them.
But the firms have no incentive to agree and coordinate. And Congress is unlikely to reform a system that campaigns are just learning to master.
Facebook has no incentive to change its ways. The money is too great, the issue too nebulous to alienate more than a few of its users. The more that Facebook saturates our lives and communities, the harder it is to live without it.
Facebook has pledged to install better filtering systems using artificial intelligence and machine-learning to flag accounts that are run by automated "bots" or violate the site's terms of service.
But these are just new versions of the technologies that caused the problem in the first place. And there would be no accountability beyond Facebook's word. The fact remains that in the arms race to keep propaganda flowing, human beings review troublesome accounts only long after the damage has been done.
Our best hopes sit in Brussels and London.
European regulators have been watching Facebook and Google for years. They have taken strong actions against both companies for violating European consumer data protection standards and business competition laws.
The British government is probing the role Facebook and its use of citizens' data played in the 2016 Brexit referendum and this year's elections.
We are in the midst of a global Internet-based assault on democracy. Scholars at the Oxford Internet Institute have tracked armies of volunteers and bots as they move propaganda across Facebook and Twitter in efforts to undermine trust in democracy or to elect their preferred candidates in Britain, France, India, the Philippines and elsewhere. We now know agents in Russia are exploiting the powerful Facebook advertising system directly.
In the 21st-century social media information war, faith in democracy is the first casualty.
•Siva Vaidhyanathan, a professor of media studies at the University of Virginia, is writing a book about Facebook.