First off, for individual users, Facebook is making its existing News Feed control options easier to find, while also giving people the ability to reduce certain types of content in their feeds.
As explained by Facebook:
“As part of this, people can now increase or reduce the amount of content they see from the friends, family, Groups and Pages they’re connected to and the topics they care about in their News Feed Preferences.”
Facebook’s News Feed preferences provide more control over what you’re shown in your feed by enabling you to select favorite profiles that will then get higher priority when they post, to unfollow Pages, people and topics, and to snooze certain users/Pages, all from the unified listings.
Soon, there’ll be even more control options in this list, with the ability to increase or decrease the content you’re shown from each of these elements – though exactly how that will work is not yet clear.
It could be a good way to give people more control over their feed – though, of course, that depends on how many people actually use it. Previous data shows that many users never change their Facebook settings, even when there’s clear reason to do so.
Which is why updates like this tend to be a win-win for The Social Network: they put the onus on users by giving them more control, while Facebook knows that many won’t bother, ensuring it largely maintains the status quo in usage. There’s not a lot more it can do in this respect, but hopefully, with this new push, Facebook will make a greater effort to encourage people to use these controls, and to maximize awareness and adoption of such tools.
Algorithmic amplification was one of the key concerns highlighted by Facebook whistleblower Frances Haugen in her testimony about the negative impacts of the platform, with Haugen telling the US Senate that social networks should be forced to stop using engagement-based algorithms altogether, via reforms to Section 230.
As explained by Haugen:
“Facebook [has] admitted in public that engagement-based ranking is dangerous without integrity and security systems, but then not rolled out those integrity and security systems in most of the languages in the world. It is pulling families apart. And in places like Ethiopia it is literally fanning ethnic violence.”
Haugen’s view is that these algorithms incentivize negative behaviors in order to drive more engagement, which, as she notes, is causing significant harm in various regions, including the US.
It’s difficult to quantify the true impact of this, but it seems fairly clear that Facebook’s algorithms have changed public discourse, with news publishers, in particular, working to maximize interaction with their Facebook posts to boost overall reach – which often means sharing more partisan, divisive, and argumentative content. That, in turn, leads to more angst and dispute.
Providing user controls to limit these impacts could be a good step, but we’ll have to wait and see the specifics of how Facebook looks to roll this out. Facebook says that it’ll begin testing the new control options with ‘a small percentage of people, gradually expanding in the coming weeks’.
In addition to this, Facebook’s also expanding its Topic Exclusion controls for News Feed to a limited number of advertisers that run ads in English.
“The advertiser topic exclusion control allows an advertiser to select a topic to help define how we’ll show the ad on Facebook, including News Feed. Advertisers can select three topics – News and Politics, Social Issues, and Crime & Tragedy. When an advertiser selects one or more topics, their ad will not be delivered to people recently engaging with those topics in their News Feed.”
That essentially enables advertisers to avoid unwanted associations with these topics and their related discussions, which could be a good way for Facebook to assure brands that they won’t suffer negative impacts from such placements.
Facebook says that, in early testing, the exclusions have been highly successful in ensuring ads are not shown alongside such discussions in the app.
Again, amid broader debate around the impacts of negative interactions on the platform, it makes sense for Facebook to provide more controls, which will help users improve their experience, in line with their own expectations and interests, while also providing more assurance for brands.
Of course, if the research shows a positive overall impact from such changes, you would ideally hope that Facebook would look to reduce these negative elements more broadly – but that’s another aspect it will need to look into, and may even be forced to explore further if Frances Haugen’s recommendations are adopted by regulators.