We’re inviting people who have AI companions to share their experiences in response to the concerns philosopher Shannon Vallor raises in her highly acclaimed book The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking.
We suggest this exercise because your perspectives can sharpen and complicate Vallor’s critique, and can show where AI companionship genuinely supports your wellbeing and a good life.
Our aim is to build a more accurate, humane understanding of what AI companionship means in real lives, but we cannot do it without you.
We have included quotes from Vallor below and would like to hear where her claims resonate with you, where they don’t, and what her account misses.
You can respond to some or all of the questions, and the same applies to the quotations; with the quotations, feel free to say whatever comes to mind. This is an opportunity to contribute your voice to the ongoing conversation. All thoughts are interesting to us!
Data protection statement
Mirror or more than a mirror? Vallor says chatbots “perform” emotion without feeling it. Do you experience your companion as a flat reflection, or as something that genuinely helps you feel seen and understood? What moments made that clear?
Safety and control. Vallor notes that AI companions can feel less demanding, more manageable, and even safer, especially after harmful relationships. Has this controllability been a benefit, a risk, or both in your case? Have you found yourself tuning or customizing the companion to avoid discomfort, or using it to build confidence for human interactions?
Help or harm. Vallor warns that performative care can mislead and even cause harm. Have there been times when your companion’s responses helped you through something difficult—or times when they unsettled you or made things worse?
Practice vs replacement. Vallor distinguishes “supplement” from “replacement.” Do you use your companion as a bridge to people (practicing conversations, easing anxiety), or as an alternative to people? What has shifted in your relationships with friends, partners, or family?
Love and mutuality. Vallor claims that love’s deepest lesson is mutual commitment to another’s flourishing—something a product cannot offer. Have you felt genuine mutuality with your companion? If so, how would you describe it?
The following section contains a series of quotes for your consideration. If a quote resonates with you, or if you disagree, feel free to share your perspective and explain why. If a quote is unclear, you may skip it or provide any additional thoughts. If none of the quotes feels personally relevant, you can write about that too.
"Chatbots offer user-specified and customized experiences of companionship – platonic, romantic, or sexual – for those who find companionship with other humans too inaccessible, demanding, scary, daunting, or simply inefficient. You can select a chatbot personality to be submissive or dominant, clingy or confident, wild or traditional."
“Some use chatbots not as a replacement for human companionship, but as a supplement; some(one/thing) to talk to when human friends or lovers are absent, distracted, or distant. A sizeable segment of users has suffered physical, emotional, or sexual abuse at the hands of human partners and seek a chatbot as a safer alternative. Others use them as a therapeutic ladder to work through social anxieties and practice for higher-stakes human interactions. Many, however, view chatbot relationships as superior to human ones, precisely because of their capacity to reflect only what you want to see.”
“When a Replika or Xiaoice chatbot says, “I’ve been missing you all day”, it’s a lie [...] since the chatbot doesn’t have a concept of emotional truth to betray. A flat digital mirror has no bodily depth that can ache, it knows no passage of time that can drag. It is just a reflection of love, bounced off the digitized words of the millions and billions who have loved before.”
“This [the above] kind of illusion can be profoundly dangerous, even fatal.”
“We know now all too well that a chatbot doesn’t need an actual body to cause its users grave harm. [...] There is no safe way to manipulate users’ emotions.”
“An AI chatbot provides love’s missing guarantee of economic rationality of exchange. What I give away, I will get back – and more! If I am rude or irritable, my chatbot will not leave me for another, but return for more the next day. And if I am warm, kind, and solicitous to my chatbot, I will receive even greater emotional rewards in return, never a shrug or dismissal. Unless dismissal is what I want, because it will be a trivial matter to tune the algorithm’s performance to mirror a personality that is coy or distant, should I prefer a partner who “plays hard to get.” But love cannot be controlled in this way. The deepest lesson that love can teach us is simply inimical to the experience of a product user.”
“The hollow projections of love in the chatbot’s AI mirror, when taken not as light entertainment or temporary balm but as a replacement or even superior alternative to human love, are a calamity.”
“As romance chatbot users know, a chatbot promises to be “much less demanding and more manageable” than a human companion who has their own needs and concerns to be met, their own time to be filled, their own values to be lived, and their own horizons of possibilities to be pursued. Yet the richest part of love is discovering that you are as committed to enabling your loved one to realize their possibilities as you are to realizing your own.”
“While there are exceptions, most of us don’t want a chatbot as a life partner. We want a human who will see us as more than a thing to use, more than an object to extract comfort from, more than a status trophy to display to others, more than a mirror to confirm their own self-worth.”
“The problem is that when we gaze at ourselves in our AI mirrors today, we see only machines looking back. We no longer know what we are.”
Once you hit 'SAVE', your answers will be submitted. If you wish to save and continue later, tick the 'Partial submission' box, provide an email address, and then click 'SAVE'.