AI is providing emotional support for employees – but is it a valuable tool or privacy threat?

As artificial intelligence tools like ChatGPT become an increasingly popular avenue for people seeking personal therapy and emotional support, the dangers that this can present – especially for young people – have made plenty of headlines. What hasn’t received as much attention is employers using generative AI to assess workers’ psychological well-being and provide emotional support in the workplace.

Since the pandemic-induced global shift to remote work, industries ranging from health care to human resources and customer service have seen a spike in employers using AI-powered systems designed to analyze the emotional state of employees, identify emotionally distressed individuals, and provide them with emotional support.

This new frontier is a large step beyond using general chat tools or individual therapy apps for psychological support. As researchers studying how AI affects emotions and relationships in the workplace, we are concerned with critical questions that this shift raises: What happens when your employer has access to your emotional data? Can AI really provide the kind of emotional support workers need? What happens if the AI malfunctions? And if something goes wrong, who’s responsible?

The workplace difference

Many companies have started by offering automated counseling programs that have many parallels with personal therapy apps, a practice that has shown some benefits. In preliminary studies, researchers found that in a doctor-patient-style virtual conversation setting, AI-generated responses actually make people feel more heard than human ones. A study comparing AI chatbots with human psychotherapists found the bots were “at least as empathic as therapist responses, and sometimes more so.”

This might seem surprising at first glance, but AI offers unwavering attention and consistently supportive responses. It doesn’t interrupt, doesn’t judge and doesn’t get frustrated when you repeat the same concerns. For some employees, especially those dealing with stigmatized issues like mental health or workplace conflicts, this consistency feels safer than human interaction.

But for others, it raises new concerns. A 2023 study found that workers were reluctant to participate in company-initiated mental health programs due to worries about confidentiality and stigma. Many feared that their disclosures could negatively affect their careers.

Other workplace AI systems go much deeper, analyzing employee communication as it happens – think emails, Slack conversations and Zoom calls. This analysis creates detailed records of employee emotional states, stress patterns and psychological vulnerabilities. All this data resides within corporate systems where privacy protections are typically unclear and often favor the interests of the employer.

Employees may experience AI emotional support systems less as care and more as workplace surveillance.
Malte Mueller/fStop via Getty Images
