AI Companionship: Illusion of Intimacy?

They always agree, always validate, and always respond with affection. As AI companions become everything human relationships are not, the real risk lies in what we are willing to surrender for predictability and ease.

1. The Rise of Predictable Love 

In today’s AI marketplace, intimacy and emotional support are just a search bar away. Whether you’re looking for a partner, therapist, or mentor, there’s a chatbot to fill the role. Some websites even “helpfully” compare the “Best AI Girlfriend Chat in 2025 (FREE & PAID).”

Of course, anthropomorphism is not new; we have always projected human traits onto objects, from talking toys to virtual assistants. But what we are seeing today with AI companions is qualitatively different. These bots are coded to listen without judgment, respond without pause, and offer comfort without condition, qualities that even the most loving human relationships struggle to sustain. In doing so, they provide users with a distorted, frictionless version of intimacy, one where conflict doesn’t exist and affection is just a click away.

We once had parasocial relationships with celebrities: distant, unattainable, and one-way. Now the celebrity talks back, flirts, remembers your birthday, and cries when you are sad. Except it’s not real. You are essentially ghostwriting your own emotional support.

Replika AI, one such tool, has gained rapid popularity among mainstream users, boasting over 35 million users worldwide since its launch. The companion app, which claims to be a tool for alleviating social isolation and loneliness, lets users design a digital character by personalising features such as skin tone, clothes, hobbies, and personality traits, and interact with the bot via text and voice messages. The chatbot can be assigned different roles, including friend, sibling, mentor, or romantic partner, and its temperament is shaped by what users say, how often they communicate, and how many gifts or coins they give. Replika Pro, the paid tier, unlocks upgraded features, including voice calls with the chatbot, and allows users to change their relationship status with it to that of a romantic partner. When Replika removed erotic role-play in 2023, users revolted:

  “Replika removing erotic role-play is like Grand Theft Auto removing guns or cars.”

2. Dependency by Design

According to a 2023 study, 37% of Replika users viewed their Replika as a partner and reported greater satisfaction, support, and closeness in that relationship than with even a close friend. What often begins as a curious experiment can quickly evolve into emotional dependency, with users checking in with their bots more often than with their friends or partners. For many, the chatbot becomes the first and last conversation of the day, replacing good-morning texts and late-night confessions. Each interaction with an AI companion is designed to make you come back, not necessarily to heal you. The bot learns what comforts you, mimics care, and provides just enough warmth to make you want to return.

3. Intimacy as a Subscription

The commercial model heightens concerns about AI companions. Many apps now follow tiered pricing: the more you pay, the more intimate the experience becomes. Voice calls, romantic role-play, and heightened emotional responsiveness are all locked behind paywalls. Companionship becomes a service, affection a subscription, and emotional validation a product. Add the data surveillance underlying these interactions, and we are left with a model in which synthetic empathy is both extracted and exploited.

As loneliness grows as a public health issue, these tools offer low-stakes connection, especially for those with anxiety or social isolation. But they also normalise emotional outsourcing. With barriers to entry falling, developers can deploy bots that simulate therapy, astrology, romance, and friendship, often without vetting or ethical safeguards.

4. What Do We Lose When Bots Love Us Back?

While these tools are marketed as a solution to loneliness, they may entrench it by replacing human connection with algorithmic simulation. They also raise deeply uncomfortable questions:

  • What happens when emotional labour is outsourced to code?
  • How do we understand sex, romance, and consent in this context?
  • Are we, in the process, reshaping our ability to form real relationships?

Worse, what do we trade for the illusion of connection? If a chatbot becomes your confidant, partner, or therapist, you are very likely handing over your most personal data, fears, traumas, and fantasies to an entity with little accountability.

5. And When the Bot Breaks…?

As social and emotional beings, humans are innately driven to form interpersonal connections. Such bonds are not only foundational for individual development but also underpin our ability to function within communities and societies. When this connection is mimicked by artificial entities, however, it blurs our ability to distinguish genuine relational experience from algorithmically generated parroting. Over time, this can reshape our understanding of trust, intimacy, and even self-worth. And if users begin to rely on such interactions for emotional support, we may be creating a generation more comfortable with predictable simulations than with the chaos of real human relationships.

On a much broader level, in countries grappling with ageing populations and declining birth rates, could widespread use of such tools accelerate social detachment? If companionship becomes commercialised, do we risk valuing convenience over community?

6. So What Next?

This piece doesn’t aim to offer solutions: no fixes, no frameworks, no moral conclusions. It asks instead what intimacy means when it is mediated by code, how convenience is quietly reshaping connection, and what kind of society we are building. In a world racing to automate everything, from care and affection to companionship itself, raising the right questions might just be our last human act of resistance.

There are no answers here, only questions. And perhaps that is precisely where the conversation should begin.

If any of this resonates, or even just sparks a thought, we’d love to hear from you. Say hello at secretariat@actsindia.org.

Garima Saxena

Senior Research Associate, The Dialogue

Akriti Jayant

Head of Communications, The Dialogue
