AI ‘companions’ promise to combat loneliness, but history shows the dangers of one-way relationships

The United States is in the grip of a loneliness epidemic: Since 2018, about half of Americans have reported experiencing loneliness. Loneliness can be as dangerous to your health as smoking 15 cigarettes a day, according to a 2023 surgeon general’s report.

It is not just individual lives that are at risk. Democracy requires the capacity to feel connected to other citizens in order to work toward collective solutions.

In the face of this crisis, tech companies offer a technological cure: emotionally intelligent chatbots. These digital friends, they say, can help alleviate the loneliness that threatens individual and national health.

But as the pandemic showed, technology alone is not sufficient to address the complexities of public health. Science can produce miraculous vaccines, but if people are enmeshed in cultural and historical narratives that prevent them from taking the life-saving medicine, the cure sits on shelves and lives are lost. The humanities, with their expertise in human culture, history and literature, can play a key role in preparing society for the ways that AI might help – or harm – the capacity for meaningful human connection.

The power of stories to both predict and influence human behavior has long been validated by scientific research. Numerous studies demonstrate that the stories people embrace heavily influence the choices they make, from the vacations they plan, to how they approach climate change, to the programming choices security experts make.

Two tales

Two storylines shape how people are likely to behave as they enter the unknown territory of depending on AI for emotional sustenance: one promises love and connection, while the other warns of dehumanizing subjugation.

The first story, typically told by software designers and AI companies, urges people to say “I do” to AI and embrace bespoke friendship programmed on their behalf. AI company Replika, for instance, promises that it can provide everyone with a “companion who cares. Always here to listen and talk. Always on your side.”

There is a global appetite for such digital companionship. Microsoft’s digital chatbot Xiaoice has a global fan base of over 660 million people, many of whom consider the chatbot “a dear friend,” even a trusted confidante.

In the film “Her,” the protagonist develops a romantic relationship with a sophisticated AI chatbot.

In popular culture, films like “Her” depict lonely people becoming deeply attached to their digital assistants. For many, having a “dear friend” programmed to avoid difficult questions and demands seems like a huge improvement over the messy, challenging, vulnerable work of engaging with a human partner, especially given the misogynistic preference some users show for submissive, sycophantic companions.

To be sure, imagining a chummy relationship with a…

