New Zealand lacks a systemic, child-centred approach to online harm.
For over a decade, we have largely taken a "watch and wait" approach, observing global action without leading our own. The Safer Online Services and Media Platforms (SOSMP) review aimed to change this, but was scrapped by the 6th National-led Coalition Government in 2024 under the Minister of Internal Affairs, Brooke van Velden.
The review ran from 2021 to 2024 and produced a draft framework that was opened for public submissions.
It aimed to update New Zealand's outdated, pre-internet regulations.
Over 50% of submissions were broadly supportive, including those from Meta, Google, Microsoft, X Corp, Reddit, and TikTok.
Key organisations such as the Classification Office and Netsafe acknowledged the need for regulatory change.
Around 40% of submitters did not take a clear position.
Only 10% were opposed.
Despite this, the current National-led Coalition Government dropped the entire reform effort, citing concerns about freedom of expression.
A DIA spokesperson told media, “Content regulatory reform of the scale proposed by the Safer Online Services and Media Platforms work is not a ministerial priority for the Minister of Internal Affairs, Brooke van Velden.”
New Zealand now has no clear regulatory framework to address the rising scale of digital harm to children.
Urgent action is needed.
Drawing on international best practice, we recommend:
A National Strategy for Online Child Safety
A whole-of-government plan to prevent harm, coordinate action, and promote child rights online.
A Safer Internet Agency and Online Children's Commissioner
An independent agency empowered to lead, set standards, and drive action, with strong accountability mechanisms.
Clear Policy Mandates
Giving the agency authority to investigate, enforce safety standards, require transparency from tech platforms, and handle complaints.
A Future-Focused Vision
Moving beyond reactive measures towards a long-term roadmap for safer digital environments.
Narrow, Child-Specific Scope
Targeted regulation to protect children and tackle illegal content without undermining free speech. Free speech groups have indicated support for focused protections on child safety, provided overreach is avoided.