TRIGGER WARNING This website contains references to sexual violence.

While you read this, kids around New Zealand are accidentally stumbling across violent sexual content online.

This content is unregulated, and easily accessed. And it’s having
a significant impact on our rangatahi and communities.

We are advocating for 3 immediate actions.

  • IMPROVE THE FILTER

    We’re calling on the Government to improve and update its Digital Child Exploitation Filtering System (DCEFS), which filters known child sexual abuse material (CSAM) online.

    The Government’s existing filter for child sexual abuse material was established in 2010 by the Department of Internal Affairs (DIA). The DCEFS works from a block list of known CSAM URLs collated through investigations and INTERPOL. The block list contains 527 CSAM URLs and is updated quarterly.

    By comparison, the Internet Watch Foundation (IWF) provides a list that is updated daily and contains approximately 5,000 URLs on any given day.

    However, the DIA’s DCEFS blocks just 9.5% of this known and verified CSAM content. Over 2021 the IWF blocked more than 255,000 URLs in total. The DIA and internet service providers could use this list, but choose not to.

    The technology behind the filter is dated and resourcing at the DIA is limited. The result is that illegal sexual content online is increasing rapidly, with little to no protection in place for our children as they browse the web.

    PLEASE SIGN OUR PETITION to ensure this content, and other illegal genres, are filtered by ISPs. Share it with your friends, whānau and wider community.

  • EXPAND THE BLOCK LIST

    The DCEFS focusses only on blocking CSAM content. Internet service providers are not mandated to use the filter; uptake is voluntary.

    There are currently no requirements on ISPs to filter illegal content beyond CSAM. Rape and bestiality content is readily available online for children to see. This doesn’t make sense.

    We want additional block lists compiled to cover bestiality and rape content. Creating these lists would make NZ a pioneer in digital child protection. The technology is capable of filtering more genres, genres that normalise and eroticise sexual violence. We need to do more.

    This is not general porn censorship. This campaign is focussed on sexual content that is already illegal under section 3 of the Films, Videos, and Publications Classification Act 1993.

  • BE PREVENTATIVE

    It’s no surprise that current and potential sex offenders seek out illegal material like CSAM online. We believe that by reaching this demographic through targeted online marketing tools, we can direct them toward help. Providing a pathway that offers help for offenders could protect children and victims in the future. It’s simple, and it makes sense.

    Automated search result banners (information pop-ups) already exist for CSAM searches; however, they simply state what punitive action will be taken against those seeking the content and do NOT point them to services that can help them manage their behaviour. We want these banners expanded, and we want the other illegal search terms mentioned above to trigger automated banners too. We believe this is one simple way to redirect those looking for this content to services and resources where they can get help for this problem.

    Search terms that could be targeted include “How do I rape someone?”, “I want to rape someone” and “kids sex”. There is a significant gap in this area. Redirection to help is a simple sexual violence prevention strategy that protects those vulnerable to being harmed. Prevention makes sense.

For too long, extreme sexual content has been unregulated and easily accessible.
Child protection makes sense.

The digital landscape has evolved rapidly in the past 20 years with free access to websites, videos, images and more.

Some of this free online content is beneficial. And some of it is illegal and harmful.

As the internet has grown, a plethora of content has become available. Unfortunately, this growth has also brought sites that depict illegal sexual behaviours, for example rape, abuse and bestiality.

Many assume this illegal sexual content is being filtered out by the Government or internet service providers, or that it would at least be difficult for children to find. Unfortunately, this is not the case.

From a simple online search we have found the following:

  • “Rape sex porn” gives 14.6 million results

  • “Bestiality porn” gives 143 million results (as of November, up from 39 million in April 2023)

  • There has also been a 1058% increase in known sexual images of children aged 7-10 online since 2019.

Of the young people in NZ who will see explicit sexual content, 25% will see it at age 12 or younger. And 73% of all young people exposed state they have seen content that makes them feel “uncomfortable”.
In fact, latest research shows that “more than half of teen respondents said they had seen violent and/or aggressive pornography, including media that depicts what appears to be rape, choking, or someone in pain”.

It is clear that this is affecting our most vulnerable community members. We propose treating children’s exposure to sexual harm online as seriously as we would treat sexual harm they might see on the street. It just makes sense.

We know many parents care deeply about this issue. We hope to raise awareness about the gaps in New Zealand’s digital media regulation. If we don’t act now, another generation of our children and young people will be impacted by a media environment that normalises and eroticises sexual violence.

Protecting our kids online, just makes sense.

FAQs

  • This campaign isn’t about general porn censorship.

    We want ISPs to filter illegal sexual behaviour, like child sexual abuse, rape and bestiality. This behaviour is illegal in real life, and should not be distributed online for mass consumption. It has become normalised online, and is completely unregulated. The DIA has a filter for CSAM which ISPs can use voluntarily, but don’t have to. There is no existing filter for the other genres.

  • This content is very common.

    The internet provides free access to illegal and harmful content. A simple Google search for ‘rape porn’ gives 532 million results, ‘bestiality’ 40 million, and ‘slavery porn’ 50 million results.

    These numbers change daily as content is removed or added over time. This content is easy to access, with 72% of young people in NZ stating they had seen content that made them feel uncomfortable.

    Our young people are increasingly learning about sex online, and it’s not healthy, or safe.

  • We recommend using effective filtering software for both your home Wi-Fi and individual devices.

    The Good Source offers a family-focussed Wi-Fi service that filters unsafe content. Safe Surfer is an NZ-based filtering service that regularly improves its technology, stays up to date with changes in sexual content, and offers caregivers reports on what young people may have been exposed to.

    However, we believe the best tool you can give your kids is a strong internal filter. This starts with open communication: talking to them about sexual content so they are aware of what it is and how to deal with it if they do come across it. For evidence-based and professional help with this, go to www.thelightproject.co.nz

  • We understand concerns about Government control, and that the measures we have recommended can be perceived as a “slippery slope” into online censorship.


    However, some measures aren’t about censorship; they are about protecting vulnerable people from harm. When content showcases illegal sexual behaviour, like rape and bestiality, as normalised sexual behaviour, that isn’t OK.

  • Quotes from the DIA regarding their existing protective measures:

    “The Department of Internal Affairs (DIA) uses a Digital Child Exploitation Filtering System which has a very narrow purpose. It blocks access to known websites that contain child sexual abuse material”.

    “It is one of the Department’s measured responses to community expectations that the government and internet service providers (ISPs) should do more to provide a safe internet environment. It is designed to assist in combating the trade in child sexual abuse material by making it more difficult for persons with a sexual interest in children to access that material. The filtering system complements the information, education and enforcement activity undertaken by the Digital Child Exploitation Team of the Department of Internal Affairs”.

    “The Department is working in partnership with New Zealand ISPs and offering them a choice to protect their customers from accessing these illegal websites inadvertently or otherwise”.

    “It is not a magic bullet that will prevent everyone from accessing any sites that might contain images of child sexual abuse”. Read more on the DIA website.

  • “Child sexual abuse material or CSAM is the permanent recording of the sexual abuse or exploitation of a person under the age of 18. This can include images, video or live streamed content.

    Real children are abused and often their suffering is not shown in the content.

    The term “child pornography/porn” is sometimes used to refer to CSAM. Netsafe and many other agencies use the term Child Sexual Abuse Material (CSAM) because it better reflects what this content represents and the seriousness with which this content should be considered.”

    https://netsafe.org.nz/csam-law/