Much of the empire built by Alex Jones, the Infowars founder and social media shock jock, vanished this summer when Facebook suspended Jones for 30 days and took down four of his pages for repeatedly violating its rules against bullying and hate speech. YouTube, Apple and other companies also took action against Jones. But a private Infowars Facebook group with more than 110,000 members, which had survived the crackdown, remained a hive of activity.
In Jones’ absence, the group continued to fill with news stories, Infowars videos and rants about social media censorship. Users also posted the sort of content — hateful attacks against Muslims, transgender people and other vulnerable groups — that got Jones suspended. And last week, when Jones’ suspension expired, he returned to the group triumphantly.
“My 30-day Facebook ban is up!” Jones announced.
Jones built his Facebook audience on pages — the big public megaphones he used to blast links, memes and videos to millions of his followers. In recent months, though, he and other large-scale purveyors of inflammatory speech have found refuge in private groups, where they can speak more openly with less fear of being punished for incendiary posts.
Several private Facebook groups devoted to QAnon, a sprawling pro-Trump conspiracy theory, have thousands of members. Regional chapters of the Proud Boys, a right-wing nationalist group that Twitter suspended last month for its “violent extremist” nature, maintain private Facebook groups, which they use to vet new members. And anti-vaccination groups have thrived on Facebook, in part because they are sometimes recommended to users by the site’s search results and “suggested groups” feature.
Facebook’s fight against disinformation and hate speech will be a topic of discussion on Capitol Hill on Wednesday, when Sheryl Sandberg, the company’s chief operating officer, will join Jack Dorsey, Twitter’s chief executive, to testify in front of the Senate Intelligence Committee.
When it comes to public-facing pages, Sandberg will have plenty of company actions to cite. Facebook has taken many steps to clean up its platform, including hiring thousands of additional moderators, developing new artificial-intelligence tools and breaking up coordinated influence operations before the midterm elections.
But when it comes to more private forms of communication through the company’s services — like Facebook groups, or the messaging apps WhatsApp and Facebook Messenger — the social network’s progress is less clear. Some experts worry that Facebook’s public cleanup may be pushing more toxic content into these private channels, where it is harder to monitor and moderate.
Misinformation is not against Facebook’s policies unless it leads to violence. But many of the private groups reviewed by The New York Times contained content and behavior that appeared to violate other Facebook rules, such as rules against targeted harassment and hate speech. In one large QAnon group, members planned a coordinated harassment campaign, known as Operation Mayflower, against public figures such as actor Michael Ian Black, late-night host Stephen Colbert and CNN journalist Jim Acosta. In the Infowars group, posts about Muslims and immigrants have drawn threatening comments, including calls to deport, castrate and kill people.
“They’ve essentially empowered very large groups that can operate secretly without much governance and oversight,” said Jennifer Grygiel, an assistant professor at Syracuse University’s S.I. Newhouse School of Public Communications. “There may be harms and abuses that are taking place, and they can’t see.”
After The Times sent screenshots to Facebook of activity taking place inside these groups, Facebook removed several comments, saying they violated the company’s policies on hate speech. The groups themselves, however, remain active.
A Facebook spokeswoman said the company used automated tools, including machine learning algorithms, to detect potentially harmful content inside private groups and flag it for human reviewers, who make the final decisions about whether to take it down. The company is developing additional ways, she said, to determine whether an entire group — rather than just its individual posts or members — violates the company’s policies and should be taken down.
Harmful activity does not appear to be more prevalent in secret groups, the Facebook spokeswoman said.
Private groups have been a core feature of Facebook for years. But they received new focus last year when executives changed the company’s mission to emphasize close-knit connections, rather than filling users’ feeds with news stories and viral videos. In a long memo called “Building Global Community,” Mark Zuckerberg, Facebook’s chief executive, set out a goal of bringing 1 billion Facebook users into meaningful groups.
Soon after, the company’s algorithms began giving posts from groups higher visibility in users’ news feeds. The largest Facebook groups — such as Pantsuit Nation, a progressive political group formed to rally Hillary Clinton supporters in the 2016 election — gained millions of members.
Facebook’s promotion of private groups encouraged like-minded people to cluster together. But it also meant that some users were seeing more posts from people whose opinions and interests they already shared, and it may have created echo chambers where polarizing behavior could flourish.
“It’s one of the really thorny challenges Facebook faces,” said Eli Pariser, a co-founder of the website Upworthy and author of “The Filter Bubble.” “An easy way to bond is over a common enemy. How do you bond people together without actually fanning the flames of division?”
Facebook groups are self-regulated by members who act as administrators and moderators, with the authority to remove posts and oust unruly members. The company has also rolled out new features that give group leaders more control and help them resolve conflicts within their groups.
One type of private Facebook group, known as a “closed” group, can be found through searches. Another type, known as a “secret” group, is invisible to all but those who receive an invitation from a current member to join. In both cases, only members can see posts made inside the group.
The private Infowars group is closed, was formed before Facebook took action against Jones and is billed as an “unofficial” fan group. New users must be approved by moderators and answer several screening questions, including “What is the answer to 1984?” (The correct answer, according to the title of one of Jones’ books, is “1776.”)
Since the public Infowars page was taken down, the private group has functioned as a makeshift home for fans of the site. Two remaining Infowars pages, along with profiles apparently belonging to Jones and others affiliated with Infowars, are listed as administrators, giving them a higher level of control over the group than moderators have.
Jones and several other administrators of the Infowars group did not respond to a request for comment.
It is impossible for Facebook to prevent all bad behavior on its platforms. But its choices can make it easier or harder for violent and extreme movements to gather critical mass. This year, after viral hoaxes on the app were implicated in a spate of mob violence in India, the Facebook-owned messaging app WhatsApp limited a feature that allowed users to forward messages easily to large numbers of groups.
Because groups provide more privacy than public pages, they can also be magnets for trolls, abusers and spammers. Last year, right-wing activists gathered in a private Facebook group, Republic of Kekistan, to organize targeted harassment campaigns, including threatening a transgender cartoonist and trolling a fundraising page for Heather Heyer, the woman killed in the violent “Unite the Right” rally in Charlottesville, Virginia. And a private Facebook group for Marines became embroiled in scandal last year when members of the group were discovered to have shared nude photos of female service members, along with lewd comments.
Foreign organizations have also found private groups useful. Last month, when Facebook took down hundreds of public pages that it said were connected to a coordinated influence operation with hints of Russian and Iranian involvement, it also removed three private groups. The company did not name the groups or say how they were used, but it revealed that 2,300 users joined at least one of them.
“The vast majority of groups on Facebook are probably the run-of-the-mill groups,” said Renée DiResta, a researcher with Data for Democracy who studies online extremism. “The challenge is, how does the groups feature interact with the other features on Facebook that we know are causing radicalization, hate speech and genocide in certain places? Who is taking responsibility for looking at the negative externalities of this push to create communities?”
Sheera Frenkel contributed reporting.
© New York Times 2018