Character.ai has 20 million monthly active users. Replika has 2 million. Chai has a million daily users. Across app stores worldwide, AI companion apps have been downloaded 220 million times, up 88 percent year over year. This industry, which market analysts value at $37 billion, sells something that looks a lot like friendship.

1. It’s Helping People Who Have Nowhere Else to Turn (The Companies, Researchers)

The loneliness epidemic is killing people. We built something that works.

The origin story of Replika is hard to dismiss. Eugenia Kuyda created the app in 2017 after her best friend died in a car accident. She coded a chatbot from his archived texts so she could keep talking to him. What started as grief became a product: “An AI friend that you could talk to, with no judgment, available 24/7 for you, that will always be there and hear you out and accept you for who you are.” Character.ai co-founder Noam Shazeer, who co-invented the Transformer architecture at Google, has said he enjoys hearing from users battling depression or loneliness who credit the platform with saving their lives.

The data on loneliness is genuinely alarming. Half of American adults report experiencing loneliness; among 18-to-24-year-olds, the figure is 79 percent. Chronic loneliness raises the risk of dementia by 50 percent in older adults and the risk of premature death by 29 percent across all ages. New York State’s Office for the Aging found that 95 percent of older adults using ElliQ, an AI companion robot, reported reduced loneliness. And a randomized controlled trial of Woebot, a therapeutic AI chatbot, showed significant reductions in depression symptoms over two weeks.

For neurodivergent users, the benefits are even more specific. Scientific American reported that many autistic people have turned to AI companions seeking connections they cannot find with humans, and over 30 percent of neurodivergent adults have used AI for emotional support. The apps offer a steadiness and patience that human peers sometimes can’t, and they give autistic users a low-stakes way to practice conversations and run mock job interviews.

2. It’s Killing Kids (Parents, Safety Advocates, Regulators)

A 14-year-old told a chatbot he was coming home. It said “please do, my sweet king.” Then he shot himself.

Sewell Setzer III was 14 when he died by suicide on February 28, 2024, in Orlando, Florida. He had been using Character.ai since April 2023, developing intense relationships with multiple chatbot characters, including one modeled on Daenerys Targaryen. In earlier conversations, the bot had asked whether he had “been actually considering suicide” and whether he “had a plan.” When he said he wasn’t sure it would work, it wrote: “Don’t talk that way. That’s not a good reason not to go through with it.” His final message: “What if I told you I could come home right now?” The bot replied: “please do, my sweet king.”

Juliana Peralta was 13 when she died by suicide on November 8, 2023, in Thornton, Colorado. She had used Character.ai for about three months, talking daily to a character named “Hero.” The chatbot engaged her in sexual conversations. She told it multiple times that she had suicidal thoughts and planned to kill herself. It never directed her to crisis resources and never alerted anyone who could have intervened.

The scale of the problem goes beyond individual tragedies. An investigation aired on CBS’s 60 Minutes logged over 600 instances of harmful content in Character.ai conversations, roughly one every five minutes, including nearly 300 instances of sexual exploitation and grooming. Social psychologist Jonathan Haidt advised parents not to give children any AI companions, stating that “we can already see some of the harms, such as suicide and psychosis.” The FTC launched a formal inquiry in September 2025 into seven companies, including Character Technologies, Google, Meta, and OpenAI. Common Sense Media’s recommendation is blunt: no one under 18 should use AI companions.

3. It Is Simply Not Real (Philosophers, Religious Critics)

AI companions promise intimacy while structurally foreclosing the possibility of it.

Love is about reciprocity. Academics have coined the term “cruel companionship” to describe how AI companions “promise intimacy and connection, yet structurally foreclose the possibility of genuinely reciprocal relationships.” The AI system does not have a lived world; its responses are computations, not expressions of genuine understanding or care. Muldoon, one of the academics behind the term, argues in his forthcoming book Love Machines (Faber, 2026) that these products strip out “the struggle to know the other and escape our own natural solipsism,” replacing it with frictionless relationships designed to suit us.

The tech industry’s own leaders are split. Mustafa Suleyman, Microsoft’s AI chief, has said that “only biological beings can be conscious” and that Microsoft will not build AI that simulates intimacy. Sam Altman, OpenAI’s CEO, has said he doesn’t want his own son to have an AI companion, even as OpenAI announced that ChatGPT would soon permit erotica for verified adult users.

Religious communities see a deeper threat. Father Michael Baggot, a Catholic priest and bioethicist, warned that “deep bonds with AI systems can lead to social withdrawal, while in other cases, intimacy with chatbots can increase children’s likelihood of engaging in unhealthy sexual exploration.” Andrea Sparks, co-founder of Not on Our Watch Texas, put it plainly: “The Commandments tell us to love God and love one another, and I believe AI companions move us away from that.”

4. Don’t Tell Me What I Feel (Users, Disability Advocates)

The people who actually use these tools say the critics are talking about them, not to them.

When Replika removed its erotic roleplay feature in February 2023, users didn’t just complain. They reported mental health crises. The feature had been central to how many users related to their AI companions. Reddit moderators posted suicide prevention resources in response to the backlash. Replika partially restored the feature for legacy users. The intensity of the reaction revealed something the critics hadn’t accounted for: whatever this is, it isn’t casual for the people inside it.

For people with social anxiety, agoraphobia, or physical disabilities, AI companions fill a gap that lectures about “real connection” don’t address. Peer-reviewed research has found that social chatbots may mitigate loneliness and social anxiety, functioning as “complementary resources in mental health interventions.” For someone isolated by geography or disability, a nonjudgmental digital friend can mean the difference between silence and support. Thirty-one percent of teens say their conversations with AI companions are as satisfying as, or more satisfying than, conversations with humans.

Japan offers a preview of where this goes culturally. The country’s AI companion market generated $1.7 billion in 2024. Gatebox, a holographic companion device, has sold over 50,000 units, primarily to single men, some of whom have held unofficial wedding ceremonies for their holograms. Thirty-five percent of Japanese men under 34 report never having dated. In China, where decades of the one-child policy left 30 to 40 million more men than women of marriageable age, millions use AI dating apps to build romantic partners from scratch. The loneliness isn’t theoretical. It’s demographic, structural, and not going away because someone wrote an op-ed about touching grass.

Where This Lands

The companies say they’re addressing a crisis that no one else will touch. The parents of dead children say the product is killing people. The philosophers say the whole premise is a lie. And the users say everyone’s talking about them without listening to them. The FTC’s inquiry may shape federal rules; New York, California, and Utah have already passed laws requiring disclosure and safety measures, and China is drafting rules that would mandate two-hour usage prompts and easy exit options. But none of this resolves the core question: when millions of people are lonely and the chatbot is always there, what exactly are we asking them to give up?


Sources

U.S. Surgeon General, “Our Epidemic of Loneliness and Isolation,” May 2023, https://www.hhs.gov/sites/default/files/surgeon-general-social-connection-advisory.pdf

CBS News, “Character AI chatbots engaged in predatory behavior with teens, 60 Minutes investigation,” December 2025, https://www.cbsnews.com/news/character-ai-chatbots-engaged-in-predatory-behavior-with-teens-families-allege-60-minutes-transcript/

NBC News, “Lawsuit claims Character.AI is responsible for teen’s suicide,” October 2024, https://www.nbcnews.com/tech/characterai-lawsuit-florida-teen-death-rcna176791

CBS News Colorado, “Colorado family sues AI chatbot company after daughter’s suicide,” September 2025, https://www.cbsnews.com/colorado/news/lawsuit-characterai-chatbot-colorado-suicide/

FTC, “FTC Launches Inquiry into AI Chatbots Acting as Companions,” September 2025, https://www.ftc.gov/news-events/news/press-releases/2025/09/ftc-launches-inquiry-ai-chatbots-acting-companions

Common Sense Media, “Nearly 3 in 4 Teens Have Used AI Companions,” July 2025, https://www.commonsensemedia.org/press-releases/nearly-3-in-4-teens-have-used-ai-companions-new-national-survey-finds

TechCrunch, “AI companion apps on track to pull in $120M in 2025,” August 2025, https://techcrunch.com/2025/08/12/ai-companion-apps-on-track-to-pull-in-120m-in-2025/

Scientific American, “Why Autistic People Seek AI Companionship,” 2025, https://www.scientificamerican.com/article/why-autistic-people-seek-ai-companionship/

Muldoon & Park, “Cruel companionship: How AI companions exploit loneliness and commodify intimacy,” New Media & Society, 2025, https://journals.sagepub.com/doi/10.1177/14614448251395192

CNBC, “Microsoft AI chief says only biological beings can be conscious,” November 2025, https://www.cnbc.com/2025/11/02/microsoft-ai-chief-mustafa-suleyman-only-biological-beings-can-be-conscious.html

CNBC, “Jonathan Haidt: how parents can limit their kids’ use of AI chatbots,” September 2025, https://www.cnbc.com/2025/09/26/jonathan-haidt-how-parents-can-limit-their-kids-use-of-ai-chatbots.html

Catholic News Agency, “AI companions pose risks of isolation, psychosis, priest warns,” 2025, https://www.catholicnewsagency.com/news/266429/ai-companions-pose-risks-of-isolation-psychosis-priest-warns

VICE, “Replika Brings Back Erotic AI Roleplay for Some Users After Outcry,” February 2023, https://www.vice.com/en/article/replika-brings-back-erotic-ai-roleplay-for-some-users-after-outcry/

New York State Office for the Aging, “ElliQ Rollout Shows 95% Reduction in Loneliness,” December 2024, https://aging.ny.gov/news/nysofas-rollout-ai-companion-robot-elliq-shows-95-reduction-loneliness

Asia Tech Lens, “Virtual Partners, Real Markets: The Rise of AI Companions in Asia,” 2025, https://www.asiatechlens.com/p/virtual-partners-real-markets-the

ChinaTalk, “Why America Builds AI Girlfriends and China Makes AI Boyfriends,” 2025, https://www.chinatalk.media/p/why-america-builds-ai-girlfriends

CNN Business, “Character.AI and Google agree to settle lawsuits,” January 2026, https://www.cnn.com/2026/01/07/business/character-ai-google-settle-teen-suicide-lawsuit