Facebook allows violent content… with warnings

Oct 23, 2013 | Facebook marketing, Regulation, Social media


Facebook is adding a new safety feature that it says will keep users from stumbling across violent or potentially offensive content on its website, following an outcry over the discovery of beheading videos on the site.


In a new move, Facebook has changed its rules to allow the videos, arguing that users should be free to view them and then condemn the content.
A temporary ban was imposed in May as the site evaluated its policy after complaints about the video of a woman being beheaded by a Mexican drug cartel.
But the block has now been removed on the grounds that Facebook is used to share information about world events, such as acts of terrorism. The social network said in a statement that it is working on ways to warn people about the content they might see.
The company insists its approach would be different if the actions in the footage were “encouraged” or “celebrated”.
Its new rules also specify that videos and photos which “glorify violence” will be removed.
A Facebook spokesman said: “Facebook has long been a place where people turn to share their experiences, particularly when they’re connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events. People are sharing this video on Facebook to condemn it. If the video were being celebrated, or the actions in it encouraged, our approach would be different.
“However, since some people object to graphic video of this nature, we are working to give people additional control over the content they see. This may include warning them in advance that the image they are about to see contains graphic content.”
Under the latest rules, violent content will only be removed where there is a “genuine risk of physical harm”.
“You may not credibly threaten others, or organise acts of real-world violence,” the company said. “Organisations with a record of terrorist or violent criminal activity are not allowed to maintain a presence on our site. We also prohibit promoting, planning or celebrating any of your actions if they have, or could, result in financial harm to others, including theft and vandalism”.
Commenting on the move, Dr John Baptista, Associate Professor of Information Systems at Warwick Business School, who researches social media, said: “It is impossible to create rules for everything on the web so I mostly believe in self-regulation based on common sense and guiding principles.
“This relies on companies like Facebook being sensible and responsible. However, in this case I sadly think Facebook is late in responding to feedback, and it is very disappointing to see them not being proactive in addressing a serious issue such as this. I hope they take the feedback seriously and respond positively before society responds with more regulation and uses this as a stick to beat open governance models.”