Backlash to the nationwide Black Lives Matter protests has exploded on the internet. A public Facebook group called FY BLM, which stood for “fuck you Black Lives Matter” and launched on July 6, attracted more than 22,000 followers.
This afternoon, a company spokesperson said Facebook was looking into whether the FY BLM page violated its standards. Minutes later, the page had been removed from the platform.
A Facebook spokesperson told the Daily Dot, “We removed this group for violating our policies on hate speech.”
The page repeatedly likened Black Lives Matter, a movement that began in reaction to police brutality against Black people and the relatively meager attention given to their deaths, to a terrorist organization and hate group.
Claiming that the movement is the equivalent of the Ku Klux Klan, or “the Klan with a tan,” was a common, and appalling, refrain on the page. It also likened Black Lives Matter to the Nazi party.
The page’s avatar claimed that it existed to “[call] out the woke hypocritical race baiting terrorist BLM group for what they are.”
Its “about” section falsely stated that Black Lives Matter is a hate group, and said that Black lives only matter to the group when someone is killed by a white police officer.
Some posts from the page were wildly popular, popping up in other groups frequented by racists and far-right militias.
A recent video of someone burning a Black Lives Matter flag while Toby Keith’s “Courtesy of the Red, White and Blue (The Angry American)” plays was shared 60,000 times.
Posts such as these often solicited racist responses. One comment referred to a founder of Black Lives Matter as “George Soros’ paid slave;” another suggested free shooting lessons for gang members so they kill each other off; others used derogatory, racist language for Black people. Many called for violence.
Contempt for Black people was rampant on the page.
Unsurprisingly, FY BLM was targeted by people who took issue with its content and message.
Asked by someone why the group was public, a page admin responded that it is “way more fun to use the ban hammer,” referring to banning users who objected to it.
Content on the page was clearly in violation of Facebook’s standards.
Several people claimed that they reported it to Facebook—yet it persisted until a reporter contacted the company.
Facebook’s community guidelines regarding hate speech prohibit attacking people based on race, ethnicity, sexual orientation, gender identity, religion, etc.
Policing hate speech on Facebook has proved difficult, however, in part because people are continuously looking for new ways to escape detection—such as by shortening this group’s name to an acronym—and in part because hate proliferates on social media.
According to its most recent community standards enforcement report, Facebook took action on nearly four million more pieces of content for violating its hate speech prohibition in the first quarter of 2020 than in the fourth quarter of 2019—a rise from 5.7 million to 9.6 million.
The company has invested significant resources in the fight against hate, yet some believe it hasn’t acted strongly or decisively enough, as hate continues to fester on the platform.