Information operations · Information Warfare · Russia

High-Tech Proposed Social Media Data Analysis Center Misses Most Foreign Information Warfare Efforts


This is a good initiative, but it lacks some basic functions that should be incorporated.

  1. There does not appear to be any group of legal advisors empowered and motivated to make quick decisions. Time is of the essence, and a legal determination should be almost immediate. Slow legal review and heavy “bureaucracy” have been impediments to the Global Engagement Center in the past.
  2. There does not appear to be any liaison or shared functionality with other members of the community, such as DOJ and DOS, to seize domains that violate the Foreign Agents Registration Act or to identify individuals who act as foreign influence agents through their online and social media activity.
  3. As written, much of this effort depends on high-technology, code-driven approaches to detect massive data surges, “disinformation,” or deliberate attempts to influence a mass audience, an event, or a group of events. Using such tools, the proposed center should be able to automatically detect developing themes and memes, and to automatically generate draft responses for human editing and approval (a minimal illustration of this kind of surge detection appears in the sketch after this list).
  4. Until Artificial Intelligence is as capable as experienced human analysts (that is, until it reaches singularity), a pool of experienced propaganda, disinformation, misinformation, and fake news experts must be on hand.
  5. The center must be granted immunity from prosecution as situations develop in new fields. 
  6. International legal experts and liaisons from international bodies of all types, including the military, must be involved to offer new legal solutions to the ongoing immoral, unethical, and often illegal activities that have not yet been stifled, such as the tactics, techniques, and procedures used by the Russian information warfare program.
  7. The center does not seem to have any relationship with the Deputy National Security Advisor for Strategic Communications, the Under Secretary of State for Public Diplomacy and Public Affairs, or the Global Engagement Center. 
  8. The center appears poised only to remove offending data on social media sites; it will not generate fair and objective facts that would offer a truthful counter to fake news, disinformation, propaganda, and misinformation. This should be a mandatory part of any whole-of-nation effort. A credible source of information, containing the truth as best it can be established, should be created immediately. As Edward R. Murrow, Director of the US Information Agency, testified to Congress, “Truth is the best propaganda and lies are the worst. To be persuasive we must be believable; to be believable we must be credible; to be credible we must be truthful. It is as simple as that.”
  9. A network of experts must be established to share “lessons learned” to counter disinformation, propaganda, and fake news.
  10. This entire network must be funded on a long-term, ongoing basis to preclude information disasters such as the one surrounding the US 2016 election.
  11. To help remove the zombie and botnet computers used by foreign information warfare programs, a robust civilian cybersecurity program must be established to secure home and small-business computers, which are prime targets for zombie and bot programs.
  12. An IC and Cyber Command program should be established to support this center’s efforts with classified programs, coordinated and synchronized to achieve greater effects. 
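As a minimal sketch of the automated theme and surge detection called for in item 3: the function name, the hourly bucketing, the z-score threshold, and the toy scoring scheme below are all illustrative assumptions of mine, not anything specified in the bill or the proposed center's design. Real multi-platform detection would rely on far richer signals (account clustering, coordinated posting times, shared URLs, image hashes) than raw term frequencies, but the surge-scoring pattern would be similar.

```python
# Illustrative sketch only: flag terms whose frequency in the latest hour
# spikes well above their rolling baseline. Thresholds and data layout are
# hypothetical, not part of the proposed center's actual design.
from collections import Counter, deque


def detect_surging_terms(hourly_posts, window=24, z_threshold=3.0, min_count=10):
    """Scan hourly batches of post texts and flag surging terms.

    hourly_posts: list of lists of post texts, one inner list per hour,
                  oldest first.
    Returns a list of (hour_index, term, latest_count, z_score) tuples.
    """
    history = deque(maxlen=window)  # rolling per-hour term counts
    flagged = []

    for hour_idx, posts in enumerate(hourly_posts):
        counts = Counter()
        for text in posts:
            counts.update(t.lower().strip(".,!?") for t in text.split())

        if len(history) == window:  # only score once the baseline is full
            for term, latest in counts.items():
                if latest < min_count:
                    continue
                past = [h.get(term, 0) for h in history]
                mean = sum(past) / window
                var = sum((x - mean) ** 2 for x in past) / window
                std = var ** 0.5 or 1.0  # avoid division by zero
                z = (latest - mean) / std
                if z >= z_threshold:
                    flagged.append((hour_idx, term, latest, round(z, 1)))

        history.append(counts)

    return flagged
```

Flagged terms would then feed the human-in-the-loop step described in item 3: analysts review the surge, confirm whether it reflects a coordinated foreign campaign, and edit and approve any automatically drafted response before release.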

If this effort uses only high-tech approaches, 90 percent of foreign information warfare programs will continue unabated.

</end editorial>



Can the IC police foreign disinformation on social media?

May 28, 2019

An altered video of House Speaker Nancy Pelosi appearing to stutter and slur her words generated millions of views on Facebook and other social networks late last week and was retweeted by President Donald Trump.

The social network appended an alert telling users looking to share the altered video that the material was the subject of additional reporting, but it declined to remove it.

Monika Bickert, Facebook’s vice president for global policy management, said in a May 24 interview on CNN that the company draws a distinction between misinformation that is a threat to public safety and misinformation that is merely political.

“If there were misinformation that was, let’s say, tied to an ongoing riot or the threat of some physical violence somewhere in the world, we would work with safety organizations on the ground to confirm falsity and the link to violence, and then we actually would remove that misinformation,” Bickert said. “But when we’re talking about political discourse and the misinformation around that, we think the right approach is to let people make an informed choice,” she added later in the interview.

Some lawmakers are looking for the federal government to get into the “informed choice” business, when it comes to malicious online disinformation generated abroad.

The Senate Intelligence Committee passed an authorization bill last week that includes a provision offered by Sen. Mark Warner (D-Va.) giving the Director of National Intelligence and Secretary of Defense authority to establish a new, $30 million Social Media Data Analysis Center to analyze and publicize data around ongoing foreign influence operations online.

The center envisioned by the bill would be run by a non-profit but funded by the government. It would pull in representatives from social media companies, non-governmental organizations, data journalists, research centers and academics to sift through and analyze data across multiple social media platforms to detect and expose the clandestine foreign propaganda campaigns that U.S. intelligence agencies and disinformation experts say are becoming increasingly commonplace in the digital ecosystem.

Harvey Rishikof, former senior policy advisor to the director of national intelligence, said the government can have more success defusing the impact of disinformation campaigns not by pushing for laws to authorize the removal of content, but by clearly and publicly mapping out the origins of that content. In demonstrating how a propaganda campaign was created and distributed through the information stream, it can drain that campaign of its power to organically influence a debate.

“It’s going to be a requirement for [the U.S. government] to be quite clean and clear, and then the question is whether or not you believe it, because the credibility of governments is under attack,” Rishikof said at a May 21 cybersecurity event hosted by the American Bar Association. “You’re looking for independent authenticators that a rational third party would say doesn’t have any prejudice to be involved in the reveal of what is taking place.”

The Department of Justice put out a policy last year detailing how it would respond to ongoing foreign influence campaigns, but such operations often intentionally recruit Americans to co-sponsor online groups and pages that blur the lines between foreign interference and protected free speech.

Deputy Assistant Attorney General Adam Hickey told the House Oversight and Government Reform Committee May 22 that among the strategy’s first principles is to avoid partisan politics.

“Victim notifications, defensive counterintelligence briefings and public safety announcements are traditional department activities, but they must be conducted with particular sensitivity in the context of foreign influence and elections,” said Hickey. “In some circumstances, exposure can be counterproductive or otherwise imprudent.”

The perception that the government might be placing its thumb on the scales of American political debate has spooked DOJ and other agencies from taking a harder line. While intelligence and law enforcement agencies worked to uncover and expose foreign online campaigns leading up to the 2018 mid-term elections, they preferred to pass that information along to private social media companies to remove and publicize those actions.

To establish the proposed Social Media Data Analysis Center, U.S. officials would need to work out information sharing protocols among the government, social media companies and the public and develop rules around which groups are eligible to participate in the center. They would also need to negotiate with social media companies and researchers over privacy protections and what data and metadata would be shared for analysis.

If funded, the center wouldn’t go live until at least 2021. Under the terms of the bill, the Director of National Intelligence would need to submit a report to Congress by next March, laying out funding needs, liability protections for all parties involved in the center, proposed penalties for misusing the data and any changes to the center’s mission needed to “fully capture broader unlawful activities that intersect with, complement or support information warfare tactics.”

Source: https://fcw.com/articles/2019/05/28/warner-intel-misinformation-center.aspx?m=1
