
Koko

Koko — crisis detection and intervention API for self-harm, suicide, and eating-disorder signals.


Our Verdict

Specialist layer for platforms with mental-health exposure; pair with a general moderation vendor.

Pros

  • Only major API focused on crisis intervention
  • Detects self-harm, suicide, and ED signals
  • Research-backed, used on real platforms
  • Offers response playbooks, not just detection

Cons

  • Very narrow scope, not general moderation
  • Ethical concerns around past AI-experiment controversy
  • Limited public docs and pricing
  • Requires careful human escalation design

Best for: teen apps, mental-health platforms, and communities that need crisis-signal detection.

Not for: general moderation use cases or teams uncomfortable with sensitive-domain risk.

When to Use Koko

Good fit if you're:

  • Building empathy-focused mental health support tools with AI
  • Detecting user distress signals in text for crisis intervention
  • Integrating peer support features with AI-assisted response routing
  • Running content safety checks for mental health platform moderation
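Koko's public documentation is limited, so the shape of its API is not shown here. As a generic illustration of the "AI-assisted response routing" use case above, the sketch below routes a classified crisis signal to either a human responder or an automated resource prompt. The `CrisisResult` type, the label names, and the threshold are all assumptions for illustration, not Koko's actual API.

```python
from dataclasses import dataclass


@dataclass
class CrisisResult:
    """Hypothetical output of a crisis-detection classifier."""
    label: str         # assumed labels: "self_harm", "suicide", "eating_disorder", "none"
    confidence: float  # 0.0 to 1.0


def route_message(result: CrisisResult, escalate_threshold: float = 0.8) -> str:
    """Decide what to do with a flagged message.

    High-confidence crisis signals go to a human responder; lower-confidence
    signals trigger an automated resource prompt; everything else passes through.
    """
    if result.label == "none":
        return "pass"
    if result.confidence >= escalate_threshold:
        return "human_escalation"  # page an on-call responder
    return "resource_prompt"       # show hotline / support resources


# Example: a high-confidence self-harm signal routes to a human.
print(route_message(CrisisResult("self_harm", 0.92)))  # human_escalation
```

The key design point, echoed in the Cons list, is that the escalation path to humans must exist before detection is turned on; the threshold and fallback behavior are product decisions, not defaults a vendor can pick for you.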

Lock-in Assessment

Lock-in Score: 3/5 (Medium)

Koko Pricing

Pricing Model: custom
Free Tier: No
Entry Price: (not listed)
Enterprise Available: No
Transparency Score: (not listed)
Beta: estimates may differ from actual pricing.

At a monthly volume of 1,000:

  • Estimated Monthly Cost: $25
  • Estimated Annual Cost: $300

Estimates are approximate and may not reflect current pricing. Always check the official pricing page.
