Facebook to rethink their image policy
There is to be a call to introduce safeguards to prevent users from accidentally being exposed to horrific images, according to one of Facebook’s safety advisers. The move follows complaints about photos of severed heads taken in a region of Syria controlled by the jihadist group Islamic State (IS). The company initially refused to remove the images, saying they did not breach its guidelines, but blocked the material after being contacted by the BBC. Stephen Balkam, chief executive of the US’s Family Online Safety Institute (Fosi), said he planned to raise the issue next month at a meeting of the Facebook Safety Advisory Board.
“There may be instances in which graphic photos and videos, like the beheadings in Syria, can be justified as being in the public interest. However, if they are hosted on Facebook or other social media platforms, there should be two barriers put in place. First, an interstitial, or cover page, over the graphic images. With an interstitial in place, a user, particularly a child, will not have the image appear in their timeline or be easily seen if they are sent a link to the images. Secondly, there should be an age gate, saying that you must verify that you are 18 years of age. While this is easily circumvented, it does at least warn the user and may well deter both kids and adults alike.”
Mr Balkam previously criticised the social network in 2013 after it rejected calls to remove a video clip showing a woman being beheaded in Mexico. At the time, the website briefly placed a warning over the clip before removing it altogether on the grounds that it ‘glorified’ violence. Facebook’s policy forbids the sharing of graphic content for sadistic pleasure, but permits the use of gruesome images designed to discourage violence or to highlight an important issue.
“We do sometimes see people come to Facebook to condemn or report on violence, to do so in a responsible manner, which may include warning people about the nature of content in the videos and imagery they’re sharing and carefully selecting the audience for the content. Our goal is to strike a balance between allowing people to comment on the often brutal world around them, whilst protecting people from the most graphic of content.”
The recent controversy centred on photos of severed heads posted on a Facebook page operated by a group called the Raqqa Media Center (RMC), based in an Islamic State-controlled city in Syria. Earlier this week, an Al Jazeera broadcast producer reported the material to the website. She received a response saying: “It doesn’t violate our community standards.” When the BBC subsequently contacted Facebook, a spokesman at first defended the decision, saying that “the page is run by a Syrian opposition group, not IS”.
He pointed to a 2013 blog post in which RMC complained that its members had been harassed by Isis, the name formerly used by IS. Facebook’s rules ban IS and other “terrorist groups” from using the site. However, one image, posted on 24th August 2014, depicted a man’s foot resting on a severed head, captioned with Arabic text that appeared to glorify the violence depicted.
It said: “Our people in Tabqa suffered, now we are stepping on the heads of people working in the airports”. The caption may have been overlooked because the Microsoft-powered translation service that Facebook provides produced a garbled interpretation. Even so, Facebook’s review team does include Arabic-speaking employees.
An expert has said that Facebook must do more to protect its 1.3 billion users from such horrific images and videos.
A social networks researcher at the University of Oxford said: “Other sites have long worked this out. Reddit, for instance, now uses the ‘not safe for work’ tag to restrict images that are violent in nature and clearly reprehensible to people. Facebook should follow in those footsteps and have, if not a zero-tolerance policy, at least some way for the community to very easily tag something as vulgar or violent.”