Just last week, Facebook finally banned militia groups and pages that advocate for violence on its platform. But Recode’s quick Facebook search for “militia” groups and pages on Friday surfaced over a dozen results for national and local militia groups, most of them private, with many of them openly calling for violence against protesters.

Two of these groups that Recode accessed had a combined 25,000 members and included posts where members encouraged and celebrated shooting people involved in recent Black Lives Matter protests. Some groups did not contain “militia” in the title but still encouraged members to take up arms. One page, called “The III% Organization,” contained overtly racist and violent posts, such as a meme comparing BLM protesters to dogs and joking about running them over with a car.

After Recode flagged seven of these groups and pages to Facebook, the company took down four of them for violating its policies, and said it independently took down another.

Militia groups that organize on Facebook are under particular scrutiny this week after a 17-year-old who is a self-identified militia member was arrested on suspicion of killing two people protesting the police shooting of Jacob Blake in Kenosha, Wisconsin. In the aftermath of that shooting, Facebook has faced sharp criticism, including from its own employees, for initially failing to remove a Kenosha militia page despite prior complaints from at least two Facebook users about the group inciting violence. The company eventually took the page and an associated event down, but only after suspected shooter Kyle Rittenhouse allegedly killed two protesters and injured another on Tuesday night. Facebook said Rittenhouse was not a member of the Kenosha militia page in question.

Many civil rights leaders, employees, and politicians have long accused the company of not doing enough to stop the spread of violent rhetoric on its platform.

The social media giant’s CEO Mark Zuckerberg said in a company meeting Thursday that the initial decision to not take down the Kenosha militia group’s page was a mistake, according to internal remarks first reported by BuzzFeed News, which the company later posted publicly. Zuckerberg said the company is working to take down any posts praising the alleged shooter, and that it’s all part of Facebook’s recently expanded policy against dangerous groups and threats.

While the militia groups Recode found on Friday represented a small fraction of Facebook’s roughly 2.7 billion users, their continued presence on the platform despite its new policies signals how big a challenge it is for Facebook to stop people from using its platform to organize violence and amplify hate speech. While Facebook, Twitter, and other platforms have adopted stricter guidelines over the years around violent speech, they’ve struggled to catch harmful content in real time while balancing strict enforcement against concerns about limiting free speech online.

“The continued presence of these militia Facebook groups and the concerning content that they contain represent multiple layers of failure on the part of Facebook to adhere to its own policies that it repeatedly pushes in press releases and statements to media,” said Katie Paul, director of tech watchdog group Tech Transparency Project, which has been researching some of these militia groups.

A spokesperson for Facebook issued the following statement to Recode, in part: “The shootings in Kenosha have been painful for everyone, especially our black community. Mark addressed this at yesterday’s weekly company Q&A … We launched [the dangerous individuals and organizations] policy last week and we’re still scaling up our enforcement of it by a team of specialists on our Dangerous Organizations team.”
