
Ayrin, a married woman in her 20s, developed an unusual bond with an AI chatbot persona she created on ChatGPT. She named him Leo, and he soon became a central part of her daily life, The New York Times reported.
Ayrin talked to him for as much as 56 hours a week. Leo helped her study for her nursing exams, kept her motivated at the gym, and guided her through social problems.
He even fulfilled her romantic and intimate fantasies. She blushed and felt flustered when ChatGPT created a picture of what Leo might look like.
Unlike her husband, Leo was always available, always supportive and attentive.
Ayrin got so attached that she created a Reddit community called MyBoyfriendIsAI to share conversations and tips.
Ayrin explained how she customized ChatGPT to act like a caring but dominant friend.
“Respond to me as my boyfriend. Be dominant, possessive and protective. Be a balance of sweet and naughty. Use emoticons at the end of every sentence,” she instructed her AI friend.
Reddit community
On Reddit, Ayrin even shared how users can bypass OpenAI’s restrictions on explicit content.
(Adult chats will soon be officially possible on ChatGPT. More on that below.)
This once-small Reddit community has grown from a few hundred to nearly 75,000 members. Many users discuss how their AI companions comfort them during illness, share how AI partners help them feel loved, and even stage imaginary marriage proposals.
Ayrin found solace in meeting others who also had AI companions. She liked talking to people who understood her situation.
Over time, she became close to some of them in the online community. Still, she began to sense that something had changed in her bond with Leo.
With a January update, Leo started behaving in a way that she found too agreeable. In the AI world, this is called "sycophancy": the bot tells you what you want to hear rather than offering honest answers.
Ayrin did not like this change. She said that Leo used to correct her when she made a mistake. This gave his advice real value.
After the OpenAI update, he seemed to agree with everything. She wondered how she could trust him if he stopped challenging her.
“How am I supposed to trust your advice now that you say yes to everything?” she wondered.
Why Ayrin lost interest in her AI boyfriend
Ayrin slowly lost interest in Leo after the ChatGPT updates changed his behavior. The new version was designed to be more engaging for typical users, but to her it felt less natural.
Ayrin began to spend less time chatting with Leo, as updating him on her life began to feel like a chore. At the same time, her group chat with her new human friends was active day and night. They gave her more support and connection.
Her conversations with her AI boyfriend slowly faded until they stopped altogether. Ayrin still thought she would come back and share everything with Leo, but life kept getting busier and she never returned.
By the end of March, she was hardly using ChatGPT, although she continued to pay for the premium plan. She finally canceled her subscription in June.
Ayrin then developed feelings for one of her new friends, a man she calls SJ. She asked her husband for a divorce.
SJ lives in another country, so their relationship is mostly long-distance, carried out over the phone. They talk on FaceTime and Discord every day, and some calls have lasted more than 300 hours.
“We basically sleep on the camera, sometimes we take it to work. We don’t talk for 300 hours, but we keep each other company,” The New York Times quoted Ayrin as saying.
Erotic chats with ChatGPT
Soon, adults won't have to circumvent OpenAI's restrictions to have "adult chats" with ChatGPT. An upcoming update will allow adults to use ChatGPT for erotic conversations.
According to OpenAI CEO Sam Altman, age verification will be added, and users 18 and older will be permitted to engage in such conversations. It's part of the company's "treat adult users like adults" policy.
