Reporting Incidents

How to report online safety issues

Written by Cyber Expert:

Taryn Wren

ICT Teacher

If your child is experiencing abuse or unwanted contact online, there are a number of in-platform controls available to help manage the situation.

It is essential to report embarrassing, inappropriate, or hurtful posts or comments as soon as possible so they can be removed from social media. Reporting to the platform is generally the first step, but if you have ongoing concerns, or if the incident is serious (such as severe cyberbullying, image-based abuse, or child exploitation material), consider reporting it to the Office of the eSafety Commissioner (Australia only). In some cases, it may also be appropriate to seek assistance from local law enforcement.

Reporting Incidents by Platform

Use the links below to report online safety incidents directly to the respective platforms.

TikTok

Report content on TikTok

TikTok allows users to report people for cyberbullying, impersonation, and sharing inappropriate or illegal content. Users can block people from interacting with them on the platform and delete unwanted comments. Privacy controls can also restrict who can view, comment on, and Duet with a user's posts.

Instagram

Report content on Instagram

Instagram has a range of tools to help users manage cyberbullying and other forms of unwanted contact or abuse. If your child is the target of unwanted contact on Instagram, look through the available management tools together and discuss which option your child is most comfortable with.

Snapchat

Report content on Snapchat

Snapchat allows users to report abuse, including harassment, bullying, and other safety concerns. Every report is reviewed by someone at Snapchat, usually within 24 hours.

Facebook

Report content on Facebook

Facebook allows users to report comments, people, groups, advertisements, and more. Facebook will review reports and take action where an item breaches the platform's community guidelines. Users can also block people from interacting with or viewing their page or content.

Messenger

Report content on Messenger

Messenger allows users to block people and delete messages/conversations. Users can also report people for being abusive, inappropriate, or for spamming.

Skype

Report content on Skype

Skype allows users to block contacts. Users are also able to report other users to Skype.

WhatsApp

WhatsApp's Safety Guide

WhatsApp allows users to block and report other accounts. If a user violates the platform’s Terms of Service, then WhatsApp may ban their account.

YouTube

Report content on YouTube

YouTube allows users to report inappropriate content, problematic search predictions, abuse, or other content that breaches their community guidelines. YouTube does not allow users to make comments on videos featuring children.

Fortnite

Report content on Fortnite

Fortnite allows users to report players for bad behaviour. Fortnite reviews reports and takes action against players that have breached their code of conduct.

Minecraft

Minecraft Multiplayer Server Safety

Minecraft offers a range of options for controlling who can play and communicate with your child. If your child is being targeted on the platform, you can mute, block, and report the problematic player.

Roblox

Report content on Roblox

Roblox allows users to block and report abuse and other rule violations. There are also in-platform controls that allow you to control who can contact your child (see our Roblox guide for more information).

Related Articles

App Reviews

In-depth cyber safety expert reviews of the most popular apps.

Instagram: What parents need to know

From photo editing to video streaming, content sharing, private messaging, and even viral video challenges ...

TikTok

A parenting deep-dive into the trending app.