Facebook is becoming a bit more like Nextdoor in an effort to boost its groups feature. The only problem is that Facebook appears to be borrowing one of Nextdoor’s more controversial concepts: giving more power to community moderators.
On Wednesday, the company announced it was significantly expanding the powers of its groups’ community moderators. Administrators can now do a number of new things, like automatically block certain people from commenting in conversations based on factors such as how long they’ve been a member of the group. Facebook says the new tools are meant to help “admins play a key role in helping maintain a safe and healthy culture.” The changes are part of Facebook’s broader shift toward relying more on unpaid community admins, who get special privileges in exchange for managing the conversation in individual groups.
There are other new powers now at admins’ disposal, like an AI-powered alert that flags “contentious and unhealthy” conversations, and new summaries that moderators can use to review any member’s activity in a particular group. When asked whether the new features were inspired by Nextdoor’s moderation system, Facebook spokesperson Leonard Lam said, “Our product team regularly talks to our admin community to better understand their needs, and the features we announced today reflect direct feedback that we’ve gotten from them.”
The approach largely resembles the way Nextdoor, the neighborhood-focused social network, has handled moderation for years. For example, Facebook’s new AI-powered Conflict Alert system is meant to “slow down” uncivil conversation by sending a notification to group moderators. In 2019, Nextdoor released a “Kindness” reminder powered by machine learning that similarly tried to slow down conversation before a user posts something potentially harmful. The problem is that Nextdoor’s model hasn’t really worked. Its communities are plagued by a haphazard approach to misinformation and complaints of toxic fights between group members, along with accusations of biased and inconsistent community moderators.
Maybe things will work out differently for Facebook. But the new approach to moderation isn’t the only example of Facebook trying to be more like Nextdoor. Facebook is also preparing to launch a Nextdoor-style group feature in the US called Neighborhoods (the feature is already available in Canada) that will allow users to create and join groups limited to geographic areas, which is Nextdoor’s core premise. And as Nextdoor does, Facebook will rely on unpaid community moderators to enforce its guidelines for the Neighborhoods feature, which are meant to keep content “relevant and kind.”
The launch of Neighborhoods comes as Facebook has been accused of cloning apps or features made famous by its competitors, including TikTok, Snapchat, and Zoom. For instance, Facebook introduced Reels on Instagram last year, which mimics TikTok’s emphasis on short video clips set to music. Following directly in Snapchat’s footsteps, Facebook launched a Stories feature on Instagram in 2016 and in its namesake app the following year. Then, as video chat went more mainstream during the pandemic, Facebook released Messenger Rooms, a videoconferencing feature that competes with Zoom.
Enlisting users to serve as community moderators has its problems, something Nextdoor knows all too well. In recent years, Nextdoor has encountered many of the same moderation issues as Facebook, including the distribution of hate speech, conspiracy theories, and political misinformation. Nextdoor faced criticism last year when unpaid community moderators censored and removed posts in support of Black Lives Matter protests following the murder of George Floyd. The company later emphasized that these posts were permitted speech and, earlier this year, released an anti-racism notification system that’s supposed to prompt users who are about to post potentially racist content. Medical misinformation about Covid-19 is also a problem, users told Recode in February. They also complained that the platform’s community-based moderation system had allowed conspiracy theories to flourish.
Nextdoor has also struggled to handle conversations about politics. As Recode reported last year, Nextdoor groups can be overrun with tense political arguments that its unpaid moderators are either unequipped or unmotivated to resolve. The platform’s issues with political speech were on display following the Capitol insurrection on January 6, when Nextdoor quietly stopped recommending political groups (Facebook decided to do this as well at about the same time).
Nextdoor’s moderation model is far from perfect, but Facebook is betting that by making itself more like Nextdoor, which has become increasingly popular during the pandemic, it might find success. Ultimately, the two platforms seem to be converging on the same formula: groups-based interactions and AI-assisted community moderation, even as both Facebook and Nextdoor continue to struggle with misinformation, racism, and toxic discourse.
Today’s news is just another sign that Nextdoor and Facebook are getting more and more alike, which is probably bad news if you went to Nextdoor to avoid Facebook, or vice versa.
Update, June 21, 1:45 pm ET: This piece was updated with information about Facebook’s Conflict Alert and some of the company’s history of adding features.