I spoke with a Bloomberg reporter on Tuesday about this issue.
Russian trolls are learning from their past mistakes and hiding the origin of the pages and domains they use to spread fake news, propaganda, and disinformation. They list US physical addresses while spreading anti-American messages, adopt Anglicized names, cultivate fake American friends and followers, and work through VPNs with American exit points.
This week someone senior actually told me that it is useless to try to educate the public to inoculate them against the spread of fake news and propaganda. I disagree.
Just yesterday I had a series of “discussions” on Facebook, and I upset some Americans when I said that the typical American is ignorant when it comes to propaganda and fake news. When they felt insulted, I shared:
This simple, objective observation is true, but most people deny it. We can only change that with education. We must try; if we don’t, our way of life will cease to exist.
It shows how hard it has become to differentiate between legitimate speech and foreign influence operations.

By Leonid Bershidsky
Facebook’s widely publicized discovery of a possible influence operation through “inauthentic” accounts warrants some scrutiny — and some reflection about the difference between the genuine political debate on the social networks and its simulated version.
Facebook said on Tuesday that it shut down eight pages, 17 profiles and seven Instagram accounts that violated its “ban on coordinated inauthentic behavior.” That’s a euphemism for setting up entities to amplify politically charged messages — the core activity for which Special Counsel Robert Mueller indicted the alleged owner and employees of the Internet Research Agency (IRA), a troll factory based in St. Petersburg, Russia. The Mueller indictment described this activity as a “conspiracy to defraud the United States” through deceiving government agencies about foreign participation in domestic activities.
If the accounts Facebook has recently discovered were foreign-operated, the same crime has been committed again. Facebook, however, said it couldn’t yet determine who operated the accounts because “these bad actors have been more careful to cover their tracks, in part due to the actions we’ve taken to prevent abuse over the past year.” According to Facebook, they used virtual private networks to obscure their identities and paid third parties to run ads on their behalf. All the transactions, $11,000 worth, were in U.S. and Canadian dollars. Facebook couldn’t catch the “bad actors” through rubles and Russian IP addresses, as it did with the IRA in 2016; instead, it used information from law enforcement to provide leads, and it traced some of the accounts through tenuous links with the now-disabled IRA ones.
The problem with this Facebook narrative is that its stated determination to make abuse harder predictably led to more diligent obfuscation, not less abuse. One doesn’t need huge resources to spoof an IP address or route small payments through the U.S. “We may never be able to identify the source with the same level of confidence we had in naming the IRA last year,” Facebook’s Nathaniel Gleicher wrote. And Facebook Chief Security Officer Alex Stamos admitted that “technical forensics are insufficient to provide high confidence attribution at this time.”
So has the company’s alleged stance against abuse made the situation better or worse? It has definitely complicated the detection of troll farm activities for both Facebook and U.S. law enforcement. But rather than make a real effort to identify its users, which would have made it easy to check for authenticity, and, crucially for the legal issue at stake, to ascertain the citizenship of the person behind the account, Facebook prefers to toss some crumbs to law enforcement from time to time to demonstrate vigilance. That’s exactly what it’s done now.
Of the 33 entities Facebook has disabled, only four had more than 10 followers. The social network’s claim that “more than 290,000 accounts followed at least one of these pages” really concerns those four accounts. Facebook shared the data on eight of the disabled entities with the Atlantic Council’s Digital Forensic Research Lab, which has closely studied Russian trolling techniques, and the information it has released so far shows that even the relative popularity of these pages was likely accidental. One of the pages, ReSisterz, purportedly a feminist and anti-fascist one, achieved its highest engagement by far with a post about an anti-rape device invented in South Africa. The other disabled entities targeted the radical fringes of various minorities. All were either anti-Trump or politically neutral.
One can see why a troll factory like the IRA might want to set up such social network entities: first it builds an audience based on a certain confirmation bias, then, come election time, it starts carrying targeted messages to that audience. If Russia wants to further confuse certain groups of people in the U.S. that are already prone to confusion, it needs to mix propaganda into their habitual information diet. “The Russian operation in 2014 through 2017 showed how easily disinformation actors could seed their falsehoods into genuine American communities on the right and the left; Americans thus became the unwitting amplifiers of Russian information operations,” the Atlantic Council’s Ben Nimmo and Graham Brookie wrote.
Here’s the problem, though. Only the ReSisterz page contained telltale errors that point to its administrators’ Russian origin. The others mainly plagiarized material found elsewhere on the social networks and the web, making it hard even for the Atlantic Council’s lab to come to any conclusions about their provenance. How long before politicians and law enforcement agencies start putting pressure on Facebook to take down pages advocating radical causes simply because they look “inauthentic”? I found other pages on Facebook using the same and similar names and memes as the disabled accounts — it could easily be their turn tomorrow.
One needn’t go far to see how this could work. As Facebook disabled the 33 suspect entities, it also shut down an event: a counter-demonstration against the Unite the Right march planned for August 10 in Washington, D.C. The ReSisterz page was one of the organizers, along with five other Facebook pages that the company deemed legitimate. But for these five pages, the outcome is the same as for the allegedly troll-created one: they’re essentially told they can’t organize their demonstration via Facebook. The 2,600 users who expressed interest in the event and the 600 who indicated they’d attend will hear from Facebook; they’ll be told someone may have tried to cheat them.
This treatment erases the line between legitimate speech and the kind created in a troll factory test tube. The line wasn’t particularly bold in the first place: Russian trolls, or any other kind, don’t create social divisions or even most of the content that springs from them; they just amplify existing voices, often radical ones.
“It would be dangerous to fall into the disinformation trap, but ruinous to believe or claim that every user who holds opposing views is part of a Russian information operation,” Nimmo and Brookie cautioned.
But what Facebook does by refusing to embrace proper identification — that is, by allowing duplicate accounts and making it easy to assume an identity — is create an enormous gray area where the authenticity of speech, protest, patriotism and any other kind of belief and intent is a matter of opinion. In this gray area, those whose opinions matter for political or business reasons will end up as enforcers.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
To contact the editor responsible for this story:
Therese Raphael at email@example.com