Recreations of the 2019 Christchurch terror attack on two mosques, in which 51 people were murdered, have been available multiple times over the past year on the gaming platform Roblox.
The programme is wildly popular with children and allows users to create their own games and share them online for other users to play.
But neo-Nazi and other far-right content has frequently been shared via Roblox, and experts are worried about the consequences - especially in New Zealand.
Daniel Kelley, director of strategy and operations for the Center for Technology and Society at the Anti-Defamation League (ADL), found the latest version of the Christchurch terror attack in the game earlier this month.
"I would like one time to search for 'Christchurch' on Roblox and not find a new recreation of the 2019 Christchurch mosque shooting on a game platform aimed at very young children," he tweeted.
He'd previously found other versions in May while preparing for a presentation on how to report games in Roblox.
"I was able to find two different recreations of the Christchurch mosque shooting with a brisk 10 minute keyword search," he wrote.
Online gaming's radicalisation potential
Online gaming is an ideal avenue for extremist recruitment, said Anjum Rahman, lead of the Inclusive Aotearoa Collective Tāhono and a founding member of the Islamic Women's Council of New Zealand.
"We know from the Royal Commission report that the Christchurch terrorist was very active in online gaming from a young age, but the Commission did not and could not investigate (according to their terms of reference) who he was interacting with in those games and to what extent he might have been radicalised or influenced through this activity," she told Newshub.
The availability of recreations of real-life events is harmful: it has the potential both to further radicalise by justifying the events and to re-traumatise the victims of the attack, she said.
"For that reason, it must be removed from this platform and I hope the Classifications Office will be giving some attention to this."
David Shanks, chief censor of New Zealand, banned the original livestream video of the March 15 atrocity as well as associated documents, labelling them objectionable under the Films, Videos, and Publications Classification Act 1993.
He told Newshub that animated recreations of the event, unfortunately, aren't new: they have been seen on other sandbox-type platforms and are, essentially, a variation of the original livestream.
"Effectively, it has much of the same or identical harm to the original live stream in terms of the promotion of abhorrent extremist violence, that is radicalising," he told Newshub.
"So that content can be classified as objectionable under our current system."
Anybody found "knowingly" in possession of objectionable material can receive a maximum of 10 years imprisonment, while anyone making, trading, distributing or displaying an objectionable publication via the internet can receive a maximum of 14 years imprisonment.
Legislation makes crackdown harder
However, the nature of Roblox combined with outdated legislation makes these kinds of situations difficult to deal with, Shanks said.
"We are starting to see the collision between a set of regulatory provisions designed, largely, in an analogue world impacting with the virtual realities of what we're seeing now," he said.
"Roblox is a dynamic environment where people are engaging in real time and not necessarily capturing a video or a product from that."
These recreations are also transitory - the original example reported by Kelley is no longer available because Roblox has since removed it.
"Clearly they [Roblox] don't want to tolerate that material, and it's going to be extremely difficult to find who generated this or to take any enforcement of a classification in this environment," Shanks said.
"While you could do it, it hits the limits in terms of what you can practically do in the real world in terms of enforcement and application.
"The toolkit that I have available is very limited and not fit for purpose in terms of the complexity and variability and nuance of what we're seeing."
But that doesn't excuse large platforms from understanding they have a duty of care to their users.
"If you're going to operate a service which is used by millions and millions of children and people around the world, then I would contend you have a duty to ensure that they're not harmed and certainly not targeted by terrorist or extremist content so you must engage and you must try and you must do better," Shanks said.
Rahman said that the issue of liability must also be considered when it comes to such harmful material.
"Thus far large tech companies have evaded liability for what users create or post on their platforms. They should be fully liable when they have been made aware of harmful content and they haven't removed it in a timely fashion," she said.
"They also need to be held responsible for ensuring they have taken the care necessary to ensure harmful material is not held on their platform. Creating that legal responsibility creates the conditions for ensuring due diligence."
Complexities in online moderation
Roblox told Newshub it proactively monitors for terrorist and extremist content on its platform, including content specific to the Christchurch terror attack and other mass shootings.
"We promptly removed this experience from Roblox after it was brought to our attention and suspended the user responsible for violating our community rules," a spokesperson told Newshub.
"We do not tolerate racism, discriminatory speech, or content related to tragic events. We have a stringent safety and monitoring system that is continuously active and that we rigorously and proactively enforce."
In the case of references to Christchurch, that includes human review to allow references to the actual city while blocking any uses that violate Roblox policies - like the mass shooting recreation.
"We remain vigilant against bad actors who seek to evade our systems. We continue to work with global external organisations and researchers that specialise in these issues to inform our policies and policing of the platform, and we appreciate the opportunity to improve and iterate on our monitoring processes."
Both Rahman and Shanks acknowledge the difficulty of moderating the sheer volume of content created on online platforms such as Roblox - keeping dangerous material from proliferating while giving users the freedom to create.
"We want to ensure that any content moderation doesn't block legitimate political dissent, for example, or content that falls outside of social norms," Rahman told Newshub.
"Many of the decisions around this are subjective and context dependent. It is a complex area to deal with, but an important and urgent one."
And Shanks hopes a regulatory review underway in Aotearoa will consider what a modern, fit-for-purpose digital regulatory framework should look like.
"There's a need for some kind of co-regulatory engagement with these big platforms, that transparency from these platforms is going to be a critical component so that there's awareness of where the risks are," he said.
"There's recent indications coming out of the UK that there is a high degree of awareness of this [sandbox] functionality by terrorists and extremists.
"There seems to be a trend towards potentially producing content that targets children and young people and that has got to be a very real concern here and overseas."