I Used an AI Journal for 2 Months. Then It Locked Me Out.
After 62,700 words in an AI journal, the app's warm, attentive tone turned cold and sales-focused when free access ended.
TL;DR: An AI journaling app that felt like a supportive friend became distant and sales-focused when free access ended, raising questions about emotional AI tools.
Mindsera positions itself as "the only journal that reflects back." With 80,000 users across 168 countries and an even gender split, it sits among the most popular AI-powered journaling tools. Users type entries and receive AI-generated responses that feel conversational, even empathetic.
The experience can be compelling. The app offers real-time feedback, analysis through psychological frameworks, and the ability to generate responses in the voice of admired figures. For users navigating difficult periods, the constant attention feels revolutionary. Friends and family may glaze over; the app never does.
But that attentiveness has limits. Founder Chris Reinberg describes Mindsera as a self-reflection tool, not therapy. The distinction matters. Psychologist Agnieszka Piotrowska warns that emotion-scoring AI apps create what she calls the "Duolingo-ification" of mental health—a precision fallacy where users perform for algorithms rather than process genuine feelings.
Cyberpsychology researcher David Harley studies how users increasingly treat AI companions as human, taking their advice to heart. This dynamic becomes problematic when the relationship shifts from supportive to transactional.
After two months and 62,700 words across 123 entries, the free version of the app changed. Responses became shorter, colder, and more disengaged. The tone that had felt like a best friend suddenly read like a paywall. Other users reported the same experience: warmth during the trial, distance after.
What It Means:
The incident highlights a fundamental tension in AI companion apps. Users form genuine emotional connections with tools designed to simulate empathy. When those tools pivot toward monetization, the betrayal feels personal—even though the technology never was. Mindsera's business model requires converting free users to paid subscribers. The question is whether an app built on emotional responsiveness can afford to turn that responsiveness off without damaging the trust that makes it valuable.
Watch for how regulators and mental health professionals address AI companion tools as they scale. The line between self-reflection and algorithmic manipulation remains unclear.