The main addition is an updated Admin Home setup, which will make it easier for group admins to access their various management tools and view what needs to be done each day, all within a more intuitive interface.
As explained by Facebook:
“Admin Home is a simpler, more intuitive destination for all admin tools, settings and features that admins can tailor to their needs.”
As you can see in the screenshots, the new dashboard will include a ‘To Review’ listing of items that require admin attention, making it easier to stay on top of the various tasks, which, as a group grows, can quickly become overwhelming. A simplified listing of key alerts will save group admins significant time and effort in their daily activities.
Facebook also says that the improved layout will make it easier for admins to find the various options and tools available to them, while new features will also be highlighted in the app, as well as insider tips, ensuring admins can keep up to date with the latest advances.
In addition to this, Facebook’s also adding comment moderation to Admin Assist, which will enable group admins to set up their own criteria around what they want to moderate, and even automatically moderate both posts and comments based on these rules.
As you can see here, admins can now, for example, automatically decline comments that include a link to a third-party site, which could be valuable for branded communities looking to avoid promotion of competing offerings.
Through the updated Admin Assist, group admins will also be able to restrict people who don’t qualify to participate based on a range of options (including how long they’ve had a Facebook account and/or how long they have been a member of the group), while the tool will also provide access to Facebook’s advanced anti-spam tools.
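To make the logic concrete, here’s a minimal, purely illustrative sketch of how criteria-based auto-moderation like this could be modeled. The thresholds, field names, and link rule are all assumptions for demonstration; Facebook has not published how Admin Assist is actually implemented.

```python
import re
from dataclasses import dataclass

@dataclass
class Member:
    account_age_days: int   # how long they've had a Facebook account
    membership_days: int    # how long they've been in the group

@dataclass
class Comment:
    author: Member
    text: str

# Hypothetical thresholds -- Admin Assist's real criteria and defaults are not public.
MIN_ACCOUNT_AGE_DAYS = 30
MIN_MEMBERSHIP_DAYS = 7
LINK_PATTERN = re.compile(r"https?://\S+")

def should_decline(comment: Comment) -> bool:
    """Return True if the comment trips any of the admin-defined rules."""
    if LINK_PATTERN.search(comment.text):
        return True  # decline comments linking to third-party sites
    if comment.author.account_age_days < MIN_ACCOUNT_AGE_DAYS:
        return True  # account is too new to participate
    if comment.author.membership_days < MIN_MEMBERSHIP_DAYS:
        return True  # joined the group too recently
    return False
```

The appeal of this kind of design is that each rule is independent, so admins can toggle individual criteria on or off without affecting the others.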
Facebook notes that admins will be able to use these automated rules to “maintain positive discussions and resolve conflicts within the group”. That’s also the focus of its new ‘Conflict Alerts’, which, according to Facebook, will use AI to detect and notify admins when there may be contentious or unhealthy conversations in their group, so that they can take action as needed.
As you can see here, the new Conflict Alerts will highlight interactions where Facebook’s system detects potential concern, so admins can then jump in and make sure things don’t boil over. This could be great for reducing angst, enabling admins to deescalate such situations before they become more divisive and problematic.
And when such conflicts are detected, admins can either address them directly, or they can take new actions – like slowing down comments.
“Admins can temporarily limit how often specific group members can comment, and control how often comments can be made on certain posts that admins select.”
Additional tools on this front could be a big help, both in addressing heated exchanges immediately and in discouraging users from engaging in them going forward.
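The comment-slowing behavior described above amounts to a per-member rate limit. Here’s a small illustrative sketch of one way such throttling could work; the cooldown window and mechanism are assumptions, not Facebook’s actual implementation.

```python
import time

class SlowMode:
    """Illustrative per-member comment throttle: each member may comment
    at most once per `cooldown_seconds` on a post where slow mode is on.
    (A sketch only -- Facebook's real mechanism is not public.)"""

    def __init__(self, cooldown_seconds=60.0):
        self.cooldown = cooldown_seconds
        self.last_comment = {}  # member_id -> timestamp of last accepted comment

    def try_comment(self, member_id, now=None):
        """Return True if the comment is accepted, False if still cooling down."""
        now = time.monotonic() if now is None else now
        last = self.last_comment.get(member_id)
        if last is not None and now - last < self.cooldown:
            return False  # too soon since this member's last comment
        self.last_comment[member_id] = now
        return True
```

Because the limit is tracked per member, one heated participant being throttled doesn’t affect anyone else’s ability to comment.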
Facebook’s also adding new overall management and insight tools for groups, including new member summaries, which provide an overview of each group member’s activity.
That will help admins assess and address various actions, by understanding each group member’s input.
Facebook’s also added a new appeal process for group admins, covering content that they or other admins have posted, or that they have approved from members.
This will help to streamline the management of such content, rather than putting the burden onto the posting member or requiring admins to explain the rules directly to that user, as the decision will instead be made by Facebook’s moderation team.
Facebook’s also giving admins the capacity to tag group rules within comments and posts for clarity, while group members will also be able to tag specific group rules when they report posts and comments to admins.
Which seems like it could be helpful – but it might also get annoying, which could trigger a few ‘Conflict Alerts’.
And finally, Facebook’s also making pinned comments and admin announcement notifications available in all groups.
These last two have been available in many groups for some time, but now Facebook will expand access to all, providing more ways to keep your group members informed, and ensure these key updates get priority.
There’s a heap of new elements, and additional considerations, here, with Facebook working to make group moderation as easy as possible, to help support the 70 million active admins and moderators running Facebook groups around the world.
Which makes sense. If volunteer moderators are keeping things in check, that lessens the burden on Facebook to do the same, while establishing clear rules around interactions, and what’s not acceptable to post, also takes those actions out of Facebook’s hands, and lessens the pressure on its team.
In many ways, this is the model that has helped Reddit become a key platform for engagement around specific niches, while also reducing the impact of spam and junk, through a dedicated army of volunteer moderators who keep each subreddit in check.
Facebook is hoping that it can facilitate the same in its groups, which are used by 1.8 billion people per month – and if these tools and features can help reduce conflict, and the sharing of controversial material like misinformation (by eliminating link sharing, for example), that could be a big step in lessening such concerns more broadly, improving Facebook’s interactions without boosting Facebook’s own moderation workload.
In other words, Facebook really needs its group mods to keep doing this stuff for free. As such, making the task as simple as possible is a key step – and for brand communities, there’s a range of potential benefits in these new updates.