A Kiwi tech expert says Facebook's call for global internet rules is weak and won't work.
In his first comments since the Christchurch mosque attacks, the social media site's founder said "a more active role for governments and regulators" is needed, and that laws covering all social media are required.
"Regulation could set baselines for what's prohibited and require companies to build systems for keeping harmful content to a bare minimum," Mark Zuckerberg wrote in the Washington Post.
But InternetNZ chief executive Jordan Carter says a set of global rules would fail - they'd only cover the lowest common denominator, and would take too long to implement.
"It's better for us as a community to decide what we're looking for, than for them to just be making their own decisions," he told Newshub.
A set of rules that would apply worldwide wouldn't work because different countries have different values, he said.
"There's the case of Germany, which has got very strong laws against Nazi [material] on social media platforms in a way we don't have here."
Carter said it was deeply disappointing that Zuckerberg's editorial made no mention of the Christchurch shootings at all. The alleged gunman livestreamed the attack on Facebook, and the platform didn't remove the clip until 12 minutes after it was over.
Carter said there's a lot more Facebook could do without waiting for regulation.
"It sounded a little bit like Facebook saying, 'We want the governments of the world to come and help us do this tough job we've taken on ourselves,' I think there's more they can and should do."
In a blog post at the weekend, Carter said New Zealand's "ability to respond in a legislative or regulatory way is limited because of our small scale and size - and yet, we have a unique responsibility to do what we can because of what has happened".
"This act of terrorism used social media platforms - a core part of most people's world online - to magnify its impact. Others, in different ways, have faced this impact before - but now it is close to home for us. Exploring what needs to change on these platforms, and in the legal environment that applies to them, is going to be a debate that runs for some time."
In the wake of the attacks, Facebook has removed millions of attempted re-uploads of the massacre video and banned white nationalist and separatist content. It's also considering restricting who is allowed to use its Facebook Live feature.
Newshub.