Child Safety Standards
At MobAppAI, we maintain a zero-tolerance policy toward any form of child exploitation. We are committed to a safe digital environment and strictly prohibit the use of our platform for Child Sexual Abuse and Exploitation (CSAE).
1. Prohibition of CSAE
MobAppAI strictly prohibits the creation, sharing, promotion, or solicitation of Child Sexual Abuse Material (CSAM) or any content that exploits or endangers minors. This includes:
- Visual depictions of the sexual abuse of minors.
- Grooming behaviors or the solicitation of minors for sexual purposes.
- The use of AI-generated imagery depicting minors in sexualized contexts.
2. Enforcement & Rapid Removal
We combine automated AI screening with professional human moderation to protect our platform:
- Immediate Removal: Any confirmed CSAM is removed instantly from our infrastructure.
- Permanent Ban: Users attempting to distribute or solicit CSAE content are immediately and permanently banned at the account and device level, without the possibility of appeal.
3. Mandatory Reporting
In accordance with applicable laws and safety regulations, MobAppAI reports all confirmed instances of Child Sexual Abuse Material to the National Center for Missing & Exploited Children (NCMEC) and relevant local law enforcement agencies.
4. Reporting Safety Concerns
We empower our community to act immediately if they encounter violations:
- In-App Reporting: Use the dedicated "Report" tool available on every user profile and during every live call.
- Direct Safety Line: Send urgent reports directly to our Child Safety Compliance Officer at info@mobappai.com.
5. Safety Point of Contact
For inquiries regarding our safety enforcement practices, please contact:
Child Safety Compliance Officer
MobAppAI
Email: info@mobappai.com