Russia’s recent cyber adventures have the hallmarks of an old Soviet specialty: disinformation.
While many Americans first awoke to the world of disinformation (sometimes known as “fake news”) during the recent presidential election, Moscow’s efforts date back decades and have become increasingly prominent over the past ten years as its techniques have been updated for the digital age.
The spread of disinformation through active measures was a central tactic of Soviet information operations as a way to influence foreign governments and their populations, undermine relations between nations, and weaken those who opposed communism.
Dezinformatsiya, as Russians call it, is meant to instill fear and confuse audiences, blurring the lines between truth, falsehood and reality.
Disinformation can spread conspiracy theories and reinforce “filter bubbles” that isolate readers and viewers from alternate viewpoints and can create a cloud of confusion and paranoia.
I recently led an independent study on how Russia uses disinformation to influence ethnic Russians who live in the former Soviet states.
In our report, we highlight how disinformation is effective because it is quick, cheap and yields high rewards. We assessed that Russia has a number of tools in its disinformation toolbox, including RT, its state-run television network that broadcasts in Russian and English; platforms such as the Sputnik news service, which audiences outside Russia can access for free; and trolls deployed in the blogosphere and on social media.
In some instances, Russia’s troll armies manage multiple fake accounts, with each account posting articles on social media 50 to 100 times a day.
Even before the explosion of social media, the effects of Russia’s modern disinformation efforts could be seen during the 2008 Russo-Georgia war, when Russia defaced websites and spread fake images and news stories. Russia also hacked into Georgian infrastructure and the official website of then-Georgian President Mikheil Saakashvili.
More recently, in Ukraine, Russia spread many conspiracy theories and fake stories in 2014, during the Crimea crisis and the downing of Malaysia Airlines Flight 17. In the latter case, Western sources have pointed to a Russian-made missile as having downed the plane, and believe it was fired either by the Russians themselves or by Ukrainian separatists allied with Moscow.
If unaddressed, Russia’s disinformation tactics will have at least three significant consequences for global stability.
First, Russia’s disinformation machinery could inspire other countries, terrorists, transnational criminal organizations and individuals to emulate this behavior. An immediate effect could be attempts to disrupt the coming elections in Germany, France, Serbia, and the Netherlands.
Different groups act for different reasons. Some purveyors of disinformation, like the Islamic State, have ideological motives. Others look at it as an easy way to make an extra dollar.
In Macedonia, for example, young people found it lucrative to set up websites to share fake news that interfered with the U.S. election. Their motives were more economic than political.
Second, disinformation can bleed into conspiracy theories that can instantly spread on social media and may have major national security implications.
In December, Pakistani Defense Minister Khawaja Muhammad Asif tweeted incorrectly that Israel was threatening Pakistan with nuclear weapons after this disinformation appeared on AWD News, a fake news website.
While the Israeli Defense Ministry questioned this claim, the defense minister’s tweet was reposted hundreds of times, spreading fake content within a matter of seconds.
Earlier in December, stolen emails containing a brief exchange between John Podesta and the owner of a pizzeria, Comet Ping Pong, in Washington, D.C., led to a fake news story that a child sex ring was being run out of the restaurant. A North Carolina man then drove to Washington and fired a gun inside the restaurant, saying he wanted to rescue sex slaves he was convinced were harbored there.
Third, Russia’s disinformation campaigns help highlight one of the biggest and most dangerous challenges in Western society: an inability to think critically about information.
In a 24/7 news cycle and social-media universe, media consumers have developed an unprecedented need to access, process and spread information, be it true or not.
But young people in particular have a limited ability to identify fake news content. If information seems newsworthy and provides high entertainment value, few people will think hard before tweeting it out or posting it on Facebook.
The challenge is further amplified when notable figures spread disinformation via social media, reinforcing the virtual conspiracy bubble. And when disinformation comes from those believed to be credible — such as friends, family and colleagues — it is even more likely to be treated uncritically, as factual information.
Countering Russia’s disinformation will require a set of actions: debunking fake news; messaging and creating counter-narratives for targeted populations sympathetic to Russian President Vladimir Putin’s policies, both in the U.S. and globally; and strengthening individual and media literacy programs around the world.
The muddled mix of rumors, lies and news has come to dominate our conversation, to the point that we are no longer certain what is true. Citizens, news organizations, and social media outlets and their users must not fall victim to disinformation. We all have a responsibility to think critically about the information and stories that are disseminated. If we fail to do so, we will continue to empower Russia and others to spread disinformation and propaganda, put our security at risk and sow distrust in our society.
Vera Zakem is a research scientist who leads initiatives on European stability, media and information influence at CNA, a non-profit research and analysis organization in Arlington, Virginia. The views expressed here are those of the author alone and do not represent the views of CNA or any of its sponsors.