Facebook moves to address extremist content
Facebook has launched a raft of changes designed to head off future criticism of the company for failing to act on issues relating to extremist right-wing content.
The social media giant has announced it will create a 'supreme court' style oversight committee which will determine whether content should be removed or remain on Facebook's pages. This body, which will eventually consist of around 40 individuals, is expected to make final rulings on dozens of cases a year and will only be activated once a user has worked through all of Facebook's existing appeal processes around content.
In addition, Facebook will begin using real-life body camera footage from the US and UK to train its automated systems to recognise first-person footage of shootings as it is streamed live, in a bid to avoid a repeat of the Christchurch gunman's use of Facebook Live to broadcast his killing spree to the world.
"With this initiative, we aim to improve our detection of real-world, first-person footage of violent events and avoid incorrectly detecting other types of footage such as fictional content from movies or video games," Facebook said in its blog post announcing the new initiatives.
Facebook is also increasing the resources it devotes to tracking down extremist content and removing it from its pages.
"To date, we have identified a wide range of groups as terrorist organisations based on their behaviour, not their ideologies, and we do not allow them to have a presence on our services. While our intent was always to use these techniques across different dangerous organisations, we initially focused on global terrorist groups like ISIS and al-Qaeda. This has led to the removal of more than 26 million pieces of content related to global terrorist groups like ISIS and al-Qaeda in the last two years, 99% of which we proactively identified and removed before anyone reported it to us."
That programme will now be expanded to include right-wing extremists, and the company says it has already banned 200 white supremacist groups.
Finally, the company will redirect users who search for extremist content to sites set up to help people move away from hate speech and extremist groups.
Already in operation in the US, the process will be deployed in Australia and Indonesia, with New Zealand following later once Facebook has identified an appropriate local partner to work with.
Despite all of this, the company has stopped short of changing the core recommendation algorithms that can steer users towards ever more extreme content, and of addressing the use of Facebook Live as an unmoderated platform for sharing objectionable material.
In its blog post the company signs off by saying, "We know that bad actors will continue to attempt to skirt our detection with more sophisticated efforts and we are committed to advancing our work and sharing our progress."