Disinformation · Fake News · Information operations · Information Warfare

Cognitive Mindhacks: How Attackers Spread Disinformation Campaigns


At last, researchers are starting to investigate how disinformation and fake news work, rather than simply throwing up their hands and saying "they're doing it."

"Enraptured Minds: Strategic Gaming of Cognitive Mindhacks" looks to be an interesting and likely educational presentation.

Unfortunately, I cannot attend the presentation, so if someone can contact Fyodor Yarochkin, Lion Gu, and Vladimir Kropotov, I'd like to learn more about their findings.

</end editorial>



11/6/2017

By Kelly Sheridan

Researchers investigate the tools and techniques behind cyber propaganda and fake news, and how they change public opinion.

Disinformation campaigns, otherwise known as cyber propaganda, cognitive hacking, information warfare, and the more common “fake news,” have roots in history but are increasingly relevant, and dangerous, as actors manipulate Web tools to sway public opinion.

Content promotion services have been in the “gray market” for a while, but fake news didn’t start to gain widespread attention until the 2016 US Presidential election, explains Vladimir Kropotov, senior researcher for Trend Micro’s Forward-Looking Threat Research (FTR).

In a few weeks, Kropotov will join fellow FTR senior researchers Fyodor Yarochkin and Lion Gu to present tools and techniques used among cyber propaganda perpetrators around the world in a Black Hat Europe presentation titled “Enraptured Minds: Strategic Gaming of Cognitive Mindhacks.”

“Information distributes too fast, and people can make wrong decisions based on information from unreliable sources,” Kropotov explains.

Researchers believe the success of fake news campaigns relies on three distinct components: social networks, motivation, and tools and services. The absence of any one of these factors makes the spread of disinformation more difficult, if not impossible, they say.

The Dark Web is full of tools to spread fake news campaigns. Some of these can be used for legitimate purposes, such as content marketing, but their power can also be leveraged to disseminate propaganda and influence public opinion. A few are available on the gray and legitimate markets, but these don’t have the anonymity of the underground.

Today, most fake news campaigns are considered to be politically motivated. However, researchers say, other motives exist and the same tools can be used to achieve them.

“I think people in big companies and enterprises should be aware of the availability of services like this,” says Yarochkin. For example, he explains, these services can be used to promote content intended to make particular companies hot on the stock market. PR agencies can use the same tools as threat actors to spread information in the wake of a crisis.

“It’s not purely underground services,” says Kropotov of the tools used to spread fake news. “The same technologies have been used widely by media agencies and in advertising.”

The researchers learned that Chinese, Russian, Middle Eastern, and English-based underground marketplaces all offer services for anyone who wants to launch disinformation campaigns.

“One of our ideas was to watch how [fake news] looks to the United States, and Russia, and Arabic-speaking countries,” explains Yarochkin. Tools used to spread fake news vary from place to place, and each reflects the social and online culture of its respective region.

Example: The Chinese marketplace

As an example of geography-specific tools, consider the Chinese underground, which the researchers also analyzed. Given the difficulty of accessing certain social media platforms from outside China, most of these tools are, unsurprisingly, limited to the Chinese market.

One such service, called Xiezuobang, charges money to create and distribute content. Pricing varies depending on the platform where the article will be published. While it could be used for content marketing, the service could easily be abused to spread propaganda, researchers say.

Several Chinese websites advertise a “public opinion monitoring system” that can allegedly survey and influence opinions in popular social media networks and forums, depending on the customer’s specific area of interest. One of these, the Boryou Public Opinion Influencing System, says it can monitor 3,000 websites and forums, and add automatic posts and replies at a reported rate of 100 posts per minute.

“An administrator could gather feedback from websites and forums, if they want to know what people are saying and thinking,” says Gu.

On the Chinese underground, researchers also found services leveraging social media to sway public opinion. Brokers offer paid posts and reposts to distribute content on Chinese social media networks. Clients pay to have their content posted by influential users; the more popular the user, the more expensive it is to have that user repost your content.

“Buyers will bank on a celebrity’s visibility as a potent means to deliver their desired content to an expansive pool of audience,” researchers say. A famous Weibo user with 78.25 million followers, for example, costs $180,000 for their visibility.

Kelly Sheridan is Associate Editor at Dark Reading. She started her career in business tech journalism at Insurance & Technology and most recently reported for InformationWeek, where she covered Microsoft and business IT. Sheridan earned her BA at Villanova University.

Source: https://www.darkreading.com/cloud/cognitive-mindhacks-how-attackers-spread-disinformation-campaigns/d/d-id/1330334?piddl_msgid=329957
