Facebook is asking some U.S. users whether they may have been exposed to extremist content, or if they are worried that someone they know might be becoming an extremist.
“We are partnering with NGOs (non-governmental organizations) and academic experts in this space and hope to have more to share in the future,” a Facebook spokesperson said in an emailed statement.
The test comes as the world’s largest social media network continues to face intense scrutiny from critics, experts and politicians to curb extremism across its platforms. A closer eye has been placed on Facebook in the U.S. after the deadly insurrection at the Capitol on Jan. 6 when supporters of then-President Donald Trump tried to prevent Congress from certifying Joe Biden’s presidency.
Facebook banned Trump soon after the events and, in May, Facebook’s independent oversight board decided to uphold Trump’s ban on the platform and urged the platform to investigate what role it might have played in the Capitol riot.
One of the queries captured in a screenshot asks users, “Are you concerned that someone you know is becoming an extremist?”
The same query adds, “We care about preventing extremism on Facebook. Others in your situation have received confidential support.”
Another query informs users that they may have been recently exposed to extremist content.
“Violent groups try to manipulate your anger and disappointment,” the prompt reads. “You can take action now to protect yourself and others.”
Both queries direct users to a support page.
While commending Facebook’s efforts, Jake Hyman, a spokesman for the Anti-Defamation League (ADL), said Friday that he wants the social media company to be more transparent about the process going forward.
“It’s somewhat encouraging that Facebook is beginning to explore ways to provide individuals who may have been exposed to extremist content more personalized support — something they should have started doing years ago,” Hyman said. “That being said, the questions raised by experts related to Facebook’s similar work around individual support for suicide prevention are concerning.
“If Facebook moves forward with this program, they must do so in a much more transparent and ethical manner. The seriousness of the problem requires serious resourcing from Facebook,” Hyman added.
Oren Segal, the vice president of the ADL’s Center on Extremism, agrees.
“I don’t think it’s unreasonable to ask for more transparency to assure this is happening in an ethical manner, from companies that have not done a great job in preventing the exploitation of their platforms by extremists,” Segal said. “We want to make sure these companies do a better job with the solution than they did with the problem.”
Facebook has repeatedly vowed to enforce its policies and remove content and accounts that violate its rules. “We are providing these additional resources to give people exposed to this content more information and help others,” the company said.