Facebook Inc. said on Thursday that it is starting to alert some users that they may have seen “extremist content” on the social media network.
“Are you concerned that someone you know is becoming an extremist?” a notification said, according to screenshots published on Twitter. Another alert said, “you may have been exposed to harmful extremist content recently.” Both included links to “get support.”
The world’s largest social media network has long been pressed by lawmakers and civil rights groups to combat extremism on its platforms, including the U.S. domestic movements that participated in the January 6 riot, when Donald Trump’s supporters tried to stop the U.S. Congress from certifying Joe Biden’s victory in the November election.
Facebook said the small test, which runs only on its main platform, was being conducted in the U.S. as a pilot for a global approach to preventing radicalization on the site.
“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” said a Facebook spokesperson in an emailed statement. “We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”
The test is part of a broader campaign, involving major technology platforms, to counter violent extremist content online; that effort began after the 2019 attack in New Zealand that was live-streamed on Facebook.
Facebook said the test was aimed at identifying users who may have been exposed to rule-breaking extremist content, as well as those who have previously been subject to its enforcement actions.
The company, which has recently tightened its restrictions on violent and hateful organizations, said it removes some content and accounts that break its rules before users see them, but that other content may be viewed before enforcement takes place.