Fake News · Information operations · Information Warfare

Facebook adding 3,000 staff to review violent content

May 4, 2017

The announcement comes after criticism aimed at Facebook for a series of live broadcast murders and rapes. The company is also working to tackle “fake news” and disinformation.

Facebook announced Wednesday that it plans to hire an extra 3,000 people to review videos and other posts after a series of murders, suicides and rapes were broadcast live on the social media platform. In some cases, it took Facebook hours to remove the content.

“We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down,” Facebook founder and CEO Mark Zuckerberg wrote in a post.

The new hires will be added over the next year, joining the 4,500 staff already working in the company’s community operations division.

The additional staff will also “help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” Zuckerberg said.

“And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it – either because they’re about to harm themselves, or because they’re in danger from someone else,” he added.

Zuckerberg said the Facebook Live video platform had also been a force for good, citing a recent case in which the company intervened after a user signaled he intended to take his own life.

“We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate,” he said.

Monitoring content is a difficult balancing act for Facebook, which bans the promotion of violence and hate speech. Journalists or activists may post pictures or videos of violence, such as police shootings or scenes from war, that serve a legitimate public purpose. Others may use Facebook Live to promote a charity or other beneficial cause.

However, the company also doesn’t want to be viewed as censoring content or policing the opinions of its nearly 2 billion users, who use the platform not just to follow friends and family but also to access and share news.

Amid a flurry of talk about “fake news” influencing the US election, Facebook in December took a number of steps it said would help counter the spread of disinformation.

Last month, it acknowledged in a white paper that governments and organized non-state actors are conducting sophisticated disinformation and misinformation campaigns on Facebook to achieve domestic and international political objectives.

“While information operations have a long history, social media platforms can serve as a new tool of collection and dissemination for these activities. Through the adept use of social media, information operators may attempt to distort public discourse, recruit supporters and financiers, or affect political or military outcomes,” the white paper said.

Source: http://www.marketexpress.in/2017/05/facebook-adding-3000-staff-to-review-violent-content.html
