New research suggests the proposed Christchurch Call falls short of dealing with the scale of issues social media can create for democracies.
Last month, Prime Minister Jacinda Ardern announced she and French President Emmanuel Macron would co-chair a meeting in Paris aimed at stopping social media from being used to spread extremist content.
Attending leaders and technology company chief executives will agree to take action with a pledge called the Christchurch Call - which comes after the March 15 Christchurch attacks were livestreamed on Facebook.
"We all need to act, and that includes social media providers taking more responsibility for the content that is on their platforms, and taking action so that violent extremist content cannot be published and shared," Ardern said.
But the report, funded by The Law Foundation, found that while the Christchurch Call was a "positive initiative", it "falls short of dealing with the scale of challenge".
"It is critical that the Prime Minister and her advisors look beyond immediate concerns about violent extremism and content moderation, to consider the wider context in which digital media is having a growing, and increasingly negative, impact on our democracy," said lead researcher Marianne Elliot.
The report said social media had positive features, such as allowing direct access to people around the globe and giving a voice to the voiceless, but also allowed the spread of fake news, populism and hate speech.
"Opaque" algorithm engines which control what content we see and the business incentives to get content to the masses were also deemed concerning.
The researchers proposed that technology workers and digital media users use "their leverage to demand ethical product design", and that fake news be countered by "investing more in public interest media and alternative platforms, leading to a more democratic internet".
The report recommended six areas of focus, which could be considered in talks initiated by Ardern:
- Restore a genuine, multi-stakeholder approach to internet governance, including meaningful mechanisms for collective engagement by citizens/users
- Refresh antitrust and competition regulation, taxation regimes and related enforcement mechanisms to align them across like-minded liberal democracies and restore competitive fairness
- Recommit to publicly funded democratic infrastructure including public interest media and the online platforms that afford citizen participation and deliberation
- Regulate for greater transparency and accountability from the platforms, including algorithmic transparency and accountability for verifying the sources of political advertising
- Revisit regulation of privacy and data protection to better protect indigenous rights to data sovereignty and redress the failures of a consent-based approach to data management
- Recalibrate policies and protections to address not only individual rights and privacy but also collective impact and wellbeing.
The report received funding from the New Zealand Law Foundation and The Luminate Group, a global philanthropic organisation working towards "building just and fair societies".
The Paris meeting will take place on May 15, but which world leaders and tech company bosses will attend is not yet known.
A livestream of the Christchurch attacks spread across the internet after the shootings. While Facebook said it removed more than 1.5 million copies of the video within 24 hours, there were reports of it still being available online weeks after the event.
Days after the attacks, Australian Prime Minister Scott Morrison wrote to the G20 chairman, Japanese Prime Minister Shinzo Abe, asking for social media reform to be a top priority at the next annual meeting of world leaders.
"It is imperative that the global community works together to ensure that technology firms meet their moral obligation to protect the communities which they serve and from which they profit," he wrote, in a letter shared to Twitter.
Australia also pushed ahead with legislation to introduce jail terms and fines for social media providers that fail to quickly remove violent material.
Under Australia's proposed laws, offences would be punishable by three years' jail for executives of social media companies, or fines that could reach up to 10 percent of the platform's global annual turnover.
The United Kingdom also proposed introducing an independent watchdog to write a code of practice for technology companies.
Newshub.