Taggo Child Safety

Updated as of: January 4, 2026

Contact Email: [email protected]

We are committed to protecting the safety, dignity, and well-being of children on our platform. This Child Safety Policy outlines the key principles, prohibited behaviors, and enforcement actions we take to create a safe environment for minors.

1. Commitment to Child Safety

  • We maintain strict standards to prevent child abuse, exploitation, and any harmful behavior toward minors.
  • We apply age-appropriate protections, including content filtering, restricted access to certain features, and behavioral safeguards.
2. Age Verification and Access Control

  • Users must provide accurate age information during account creation.
  • Users under the age of 13 (or a higher age where required by local law) may not access our services without verified parental consent.
  • Certain features (e.g., live streaming, direct messaging, location sharing) may be restricted or disabled for underage users to reduce potential risks.
3. Prohibited Content and Behaviors

We enforce zero tolerance for the following:

  • Sexual exploitation, abuse, or grooming of children
  • Sharing or creating child sexual abuse material (CSAM)
  • Manipulative or targeted advertising toward minors
  • Violence, harassment, or bullying against underage users
  • Any content that sexualizes or objectifies minors

Violations may result in immediate account termination and reporting to relevant authorities.

4. Moderation and Content Control

  • We use a combination of automated detection systems and trained human moderators to review user-generated content.
  • Harmful content is reviewed and removed promptly—typically within 24 hours of being reported.
  • Our safety systems are regularly updated to address new and evolving threats (e.g., grooming tactics, hidden CSAM).
5. Reporting and Enforcement

  • In-app "Report" and "Feedback" functions are available to flag harmful behavior or content.
  • Reports involving minors receive the highest priority and are typically reviewed within 24 hours.
  • Actions taken may include:

  • Content removal
  • Account suspension or permanent ban
  • Referral to law enforcement when legally required
6. Cooperation with Authorities

We actively collaborate with:

  • Law enforcement agencies
  • Child protection organizations
  • Regulatory authorities (when required)

In cases involving suspected abuse or exploitation, we fully cooperate with lawful investigations and provide necessary information in accordance with applicable laws.

7. Policy Review and Continuous Improvement

  • This Child Safety Policy is regularly reviewed and updated to reflect emerging threats, industry best practices, and legal requirements.
  • We remain committed to building a platform that is safe, age‑appropriate, and respectful for all users—especially children.
8. Contact Us

If you encounter content or behavior involving potential harm to minors, please report it through:

  • Email: [email protected]
  • In-app: Use the "Report" or "Feedback" functions

We treat all reports with urgency and confidentiality, and handle them in full compliance with applicable laws.