Facebook has announced it is testing out changes it hopes will give users more control over their news feeds on the platform.
It comes amid ongoing criticism of the company, which recently rebranded as Meta, after whistleblower Frances Haugen spoke out about the negative impacts of using the platform.
Haugen testified to a US Senate Commerce Committee subcommittee in October, telling senators the company's products "harm children, stoke division and weaken our democracy", and accusing the company of putting profit over moral responsibility.
"We're testing new ways to make it easier to find and use News Feed controls to adjust people's ranking preferences and customise their News Feed," Facebook said.
"As part of this, people can now increase or reduce the amount of content they see from the friends, family, groups and pages they're connected to and the topics they care about in their news feed preferences."
Testing begins this week with a "small percentage of people" around the world, the company said, with that number expanding in the coming weeks.
"We're also making existing controls easier to access, including favourites, snooze, unfollow and reconnect," it said.
"This is part of our ongoing work to give people more control over News Feed, so they see more of what they want and less of what they don't."
Analysis reported by the Washington Post showed that, as well as surfacing misinformation, the algorithm that decides what people see had deliberately pushed emotional and provocative content into news feeds.
It did this by using emoji reactions on posts to prioritise content, with each emoji reaction deemed five times more valuable than a simple like.
"The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook's business," the Washington Post said.
In an internal document, Facebook's own researchers identified that this could lead to problems, warning that favouring "controversial" posts could mean "more spam/abuse/clickbait inadvertently".
According to the Washington Post, this proved to be the case, with the company's data scientists finding that "posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news".
CEO Mark Zuckerberg has defended the company against accusations that it causes harm, saying it cares "deeply about issues like safety, well-being and mental health".
"At the heart of these accusations is this idea that we prioritise profit over safety and well-being. That's just not true."
The company is also testing giving advertisers more control over who sees their adverts.
It comes after analysis showed businesses wanted to avoid their ads appearing near posts referring to tragedy and conflict 99 percent of the time, while hotly debated social issues were avoided 95 percent of the time.
"When an advertiser selects one or more topics, their ad will not be delivered to people recently engaging with those topics in their news feed," the company said.