Reporting To Platforms

Reporting is a simple but powerful way to keep online spaces safe. This article explains what reporting means, what happens after you report, and why it matters for protecting yourself and others.

22 Oct 2025
IMDA Singapore


What Is Reporting?

Reporting is a feature found on most social media platforms, messaging apps, games, and websites. It lets you alert the platform when you see something harmful or violent, or something that violates the community guidelines. Community guidelines are the platform's rules that tell users which behaviours are acceptable and which are not.

Here are some examples of report buttons in the apps you use:

[Image: examples of report buttons in popular apps]

When you report a post, message, or user, your report goes to the platform’s moderation team. The team reviews it and may take action, such as removing the content or warning the user. You can report all sorts of harmful or unsafe content.

Some common examples include:

  • Nudity or sexual content
  • Child abuse or exploitation
  • Hate speech or threats
  • False or misleading information
  • Bullying or harassment
  • Scams or fraud
  • Posts about self-harm or suicide
  • Spam or fake promotions

You can also report a user if they are:

  • Posting harmful or inappropriate content
  • Pretending to be someone else (impersonation)
  • Too young to be using the platform

Note: If something involves illegal activity or feels dangerous in real life, do not just report it on the app. Reach out to a trusted adult and report it to the police.

Why Is It Important To Report?

Even with artificial intelligence (AI) filters and moderators, some harmful or inappropriate content slips through. You can play a part in protecting your space and your community by reporting it.

Reporting is confidential: the person you report will not know who reported them.

Reporting a post or content helps to:

  • Protect others from seeing harmful content like bullying, harassment, scams and explicit materials.
  • Prompt the platform to review the content and stop it from spreading further.
  • Create a safer space for yourself and your community.

Reporting a user helps to:

  • Prevent that person from targeting or hurting others.
  • Stop people who are pretending to be someone else or doing illegal things online.

What Happens When I Report?

Most platforms let you report a specific post, a message, or a user. When you do, this is what usually happens:

[Image: the reporting flow]

1. Say why you’re reporting

  • Most platforms give you a list of categories to choose from, such as hate speech, nudity, or bullying.
  • Be as accurate as you can and select the option that best fits.

2. Platform reviews the content

  • Your report goes to the platform’s moderation team, which may be made up of human reviewers, AI systems, or both.
  • If the content violates the community guidelines, it may be removed, and the account could get a warning or ban. This can take time, so don’t worry if you do not see action right away.

3. Take further action to stay safe

If the moderators decide the content does not violate the community guidelines, it may stay up. However, you still have options to protect yourself:

  • Tap “Not interested” (or a similar option) on the post to see less of that kind of content in the future.
  • Block the user so they cannot see your profile or message you.
  • Encourage others who saw the harmful content to report it too, as more reports help bring attention to the issue.

Want more tips and tools to build healthier digital habits? Come along and Scroll Smart, Surf Safe with us.
