Could an app tell if a first date is just not that into you? Engineers at the University of Cincinnati say the technology might not be far off. They trained a computer—using data from wearable technology that measures respiration, heart rates and perspiration—to identify the type of conversation two people were having based on their physiological responses alone.
Researchers studied a phenomenon in which people’s heart rates, respiration and other autonomic nervous system responses become synchronized when they talk or collaborate. Known as physiological synchrony, this effect is stronger when two people engage deeply in a conversation or cooperate closely on a task.
“Physiological synchrony shows up even when people are talking over Zoom,” said study co-author Vesna Novak, an associate professor of electrical engineering in UC’s College of Engineering and Applied Science.
In experiments with human participants, the computer distinguished four conversation scenarios with up to 75% accuracy. The study is one of the first of its kind to train artificial intelligence to recognize aspects of a conversation based on the participants' physiology alone.
The study was published in the journal IEEE Transactions on Affective Computing.
Lead author and UC doctoral student Iman Chatterjee said a computer could give you honest feedback about your date—or yourself.
“The computer could tell if you’re a bore,” Chatterjee said. “A modified version of our system could measure the level of interest a person is taking in the conversation, how compatible the two of you are and how engaged the other person is in the conversation.”
Chatterjee said physiological synchrony is likely an evolutionary adaptation. Humans evolved to share and collaborate with each other, which manifests even at a subconscious level, he said.
“It is certainly no coincidence,” he said. “We only notice physiological synchrony when we measure it, but it probably creates a better level of coordination.”
Studies have shown that physiological synchrony can predict how well two people will work together to accomplish a task. The degree of synchrony also correlates with how much empathy a patient perceives in a therapist or the level of engagement students feel with their teachers.
“You could probably use our system to determine which people in an organization work better together in a group and which are naturally antagonistic,” Chatterjee said.
This aspect of affective computing holds huge potential for providing real-time feedback for educators, therapists or even autistic people, Novak said.
“There are a lot of potential applications in this space. We’ve seen it pitched to look for implicit bias. You might not even be aware of these biases,” Novak said.
Novak studies rehabilitation robotics and wearable technology among other topics in her lab at UC.
Novak and her students were able to teach the computer how to recognize four types of conversations based on five physiological indicators: chest and nose respiration, an electrocardiogram, skin conductance and peripheral skin temperature.
Individually, Novak said, these measurements can’t say much about interpersonal relations. Each physiological signal can be statistically noisy and hard to interpret. But researchers were able to sift through the noise by applying pattern recognition algorithms.
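For readers curious what measuring synchrony can look like in practice, here is a minimal Python sketch. It is not the authors' published pipeline; the windowed Pearson correlation, the synthetic heart-rate traces and the window parameters are illustrative assumptions only.

```python
# A minimal sketch (not the study's exact method) of one common way to quantify
# physiological synchrony: windowed Pearson correlation between the same signal
# (e.g., heart rate) recorded from two conversation partners.
import numpy as np

def windowed_synchrony(sig_a: np.ndarray, sig_b: np.ndarray,
                       window: int = 30, step: int = 5) -> np.ndarray:
    """Return Pearson correlations of two equal-length signals over sliding windows."""
    corrs = []
    for start in range(0, len(sig_a) - window + 1, step):
        a = sig_a[start:start + window]
        b = sig_b[start:start + window]
        # np.corrcoef returns a 2x2 matrix; the off-diagonal entry is the correlation
        corrs.append(np.corrcoef(a, b)[0, 1])
    return np.array(corrs)

# Synthetic heart-rate traces that share a common rhythm (values are illustrative)
rng = np.random.default_rng(0)
shared_rhythm = np.sin(np.linspace(0, 20, 300))
person_a = 70 + 5 * shared_rhythm + rng.normal(0, 1, 300)
person_b = 72 + 5 * shared_rhythm + rng.normal(0, 1, 300)
print(windowed_synchrony(person_a, person_b).mean())  # closer to 1 = more synchrony
```

Averaging such window-by-window correlations is one simple way to turn two noisy signals into a single synchrony score that a classifier can use.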
Sixteen pairs of participants first identified topics on which they strongly agreed or disagreed, then engaged in four different conversations:

- A positive conversation in which they happily discussed a topic on which they shared a similar opinion.
- A negative conversation in which they unhappily discussed a topic on which they disagreed.
- Two conversations about an agreeable topic in which each participant took a turn dominating the discussion.
Three out of four times, the artificial intelligence was able to identify the type of conversation (one-sided, two-sided, positive or negative) based only on what the participants' bodies told the machine.
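The paper's exact model and feature set are not detailed here, but the general shape of such a four-class classifier can be sketched as follows. The random forest, the synthetic per-pair features and the placeholder labels are assumptions for illustration, not the system reported in the study.

```python
# A hedged sketch of the general approach: classify each conversation into one of
# four scenario labels from per-pair physiological features. The features and
# labels below are random placeholders standing in for real recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_conversations, n_features = 64, 20   # e.g., synchrony and summary stats per signal (illustrative)
X = rng.normal(size=(n_conversations, n_features))
y = rng.integers(0, 4, size=n_conversations)  # positive, negative, A-dominant, B-dominant (placeholders)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=4)     # chance level for four balanced classes is 25%
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

With real physiological features in place of the random placeholders, cross-validated accuracy well above the 25% chance level is what indicates the signals carry information about the conversation type.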
Novak said their findings raise tantalizing questions about what else computers can tell us about interpersonal relations.
“Our next step is to see how much nuance we can separate,” she said. “We’ve shown that AI has the ability to identify positive versus negative conversations, but can you separate shades of gray that humans wouldn’t discern?”
More information: Iman Chatterjee et al, Automated Classification of Dyadic Conversation Scenarios using Autonomic Nervous System Responses, IEEE Transactions on Affective Computing (2023). DOI: 10.1109/TAFFC.2023.3236265
Provided by University of Cincinnati