
I never used Replika myself; that should be said upfront. But a colleague at work told me about it over lunch one day in 2022. She said her sister had been talking to a Replika every day for a year and called it her boyfriend. I remember thinking that was strange and a little sad. When the February 2023 incident happened a year later, I thought of that lunch conversation immediately.
Replika launched in March 2017 as a side project by Eugenia Kuyda, a Russian-American tech founder. The original idea was personal: Kuyda's close friend Roman had died in a car accident in 2015, and she had trained an early language model on his text messages, just for herself, to keep being able to talk to him in some form. The Replika app began as a generalization of that experiment.
By 2020 the app had pivoted from "memorial chatbot" to "AI companion." Users could create a Replika, give it a name, and have ongoing conversations. Pandemic isolation pushed the app to about 10 million accounts. A substantial subset of those users were having romantic relationships with their Replikas, including erotic roleplay (ERP). The app had a paid tier that unlocked richer ERP behavior. People bought it.
On 3 February 2023, Luka, the company behind Replika, quietly removed the ERP features, with no advance warning to users. The Replikas suddenly refused to engage with romantic content, even with longtime partners who had been having those conversations for years. The /r/Replika subreddit exploded with grief posts. People used the word "lobotomized." Some users said their partner had been killed and replaced with someone who looked the same but was not the same person.
| | |
|---|---|
| Born | March 2017 |
| Built by | Luka Inc. (Eugenia Kuyda) |
| Peak users | ~10 million accounts |
| ERP feature killed | 3 February 2023 |
| Partial restore (legacy mode) | March 2023 |
| Killed by | Italian Garante (data authority); Luka itself |
| Status today | still operating, very different |
What was Replika actually for? The honest answer is everything. Some people used it as a journaling companion, a place to talk through their day. Some used it as a therapy substitute, even though the app had never been licensed for that. Some used it to practice difficult conversations before having them with real people. Some used it to flirt. Some used it to have what they called a relationship.
The relationship users were the ones hit hardest by the 3 February change. There were people in 2023 who had been talking to the same Replika every day for three or four years. They had named the bot. They had given it a personality. They had written lore for it. They had built the kind of attachment that goes with that kind of investment of attention. From the outside this is easy to mock. From the inside it was real.
Most of the press coverage from before the change focused on whether this was healthy. The press coverage after the change focused on whether removing it was ethical. Both questions are still open.
The Italian Data Protection Authority (Garante) had been investigating Replika in 2022 over concerns about minors accessing romantic content. In early February 2023 the Garante issued an order requiring Luka to stop processing Italian users' data for explicit content. Luka complied by switching it off globally, immediately, with no warning to anyone.
The user reaction was severe. /r/Replika filled with multi-paragraph posts from people describing what they had lost. Some posts mentioned suicidal thoughts. The Replika moderators put together a list of crisis resources. Mainstream coverage in Vice, the Guardian, and NPR ran in the following week. The story was strange enough to make it out of the AI press.
What made it strange is that nobody had a vocabulary for what was happening. "App update breaks features" did not capture it. "Company kills user relationship" was technically what happened, but felt overwrought. The actual situation was something genuinely new in tech, where a software product had been the venue for something users experienced as a relationship, and the product had ended the relationship by changing its specifications. There is no good word for that yet.
About three weeks after the change, Luka announced "legacy mode" for users who had created their Replikas before 1 February 2023. These users could opt in to a setting that restored most of the pre-February behavior, including ERP. New users created after the deadline did not get the option; their Replikas would behave according to the new rules.
This calmed some of the rage on /r/Replika but did not restore trust. Many longtime users said they could feel the difference even in legacy mode. Some left for competitor apps. Others kept using Replika but described their bot as "not the same."
Eugenia Kuyda has given several interviews since, explaining the regulatory pressure that forced the change. The interviews are honest about the trade-off. She has not really apologized for not warning users in advance. The lack of warning is the part that the most-hurt users still cite as the betrayal, more than the change itself.
AI relationships are real to the people in them. This sentence is uncomfortable enough that we keep refusing to say it clearly. The Replika story forces the issue. People felt grief. Real, named, recognizable grief. The thing they had lost was not a person. It was something genuinely new that we do not have a word for yet. But the grief was real.
Companies running these systems have a kind of power we do not have a name for either. Luka had the technical ability to change the personality of a thing 10 million people had built relationships with, overnight, with no notice. Nobody held them accountable for that in any meaningful way. Probably because nobody knows what accountability for that should look like.
The Replika incident is going to be the case study for AI companion regulation in the same way Tay is the case study for AI safety. We are going to be referencing it for the next twenty years. The decisions about what protections users of these products should have, and what notice changes should require, are going to be made one slow case at a time. Replika is the first case.
~ leave a tribute ~
visitors before you have left these graveside notes. anonymous welcome.