Child Safety Standards (CSAE/CSAM) Policy

Aftrs maintains a strict, zero-tolerance policy toward Child Sexual Abuse and Exploitation (CSAE) and Child Sexual Abuse Material (CSAM). We remove, report, and permanently ban accounts involved in any such content or behaviour.

1) Scope

This policy applies to all Aftrs products and services, including our mobile applications, websites, APIs, and any user-generated content or interactions on our platform. It applies to every user and visitor, regardless of age or location.

2) Definitions

2.1 Child

Any person under the age of 18.

2.2 CSAE

Child Sexual Abuse and Exploitation (CSAE) means any content, conduct, or activity that sexually exploits, abuses, or endangers a child. Examples include (without limitation): grooming, solicitation, sextortion, trafficking for sexual purposes, or facilitation of sexual contact with a child.

2.3 CSAM

Child Sexual Abuse Material (CSAM) means any content (image, video, text, audio, or other media) that depicts, promotes, or documents the sexual abuse or exploitation of a child, or sexualised depictions of a child — whether real or computer-generated, where prohibited by applicable law.

3) Prohibited content and behaviour

  • Any CSAM, including the creation, possession, distribution, or promotion of such material

  • Sexualised comments about a child, sexual solicitation of a child, or grooming of a child

  • Arranging or facilitating sexual contact with a child, including via links or third-party services

  • Sexualised, exploitative, or age-misrepresentative profiles or content involving children

  • Attempts to obtain CSAM or to direct others to locations containing CSAM

4) Detection, moderation, and enforcement

We employ a combination of automated signals and human review to detect prohibited content and behaviour. When CSAE/CSAM is suspected:

  • Immediate action: We restrict or disable relevant content, features, and/or accounts while we investigate.

  • Permanent bans: Confirmed violators are permanently banned. Associated devices and accounts may be blocked.

  • Evidence handling: We securely preserve the minimum logs and evidence necessary to support reporting to competent authorities, and retain or delete that material in line with our legal obligations.

5) Reporting & escalation

If you believe CSAE/CSAM is present on Aftrs or that a child is at risk:

  1. Use the in-app Report flow (Profile → Contact us) or email us at support@aftrs.app