Computer science researchers at the University of Central Florida have developed a sarcasm detector.
Social media has become a dominant form of communication for individuals and for companies looking to market and sell their products and services. Properly understanding and responding to customer feedback on Twitter, Facebook and other social media platforms is critical for success, but doing so is incredibly labor-intensive.
That’s where sentiment analysis comes in. The term refers to the automated process of identifying the emotion (positive, negative or neutral) associated with a piece of text. Where much of artificial intelligence is concerned with logical analysis of data, sentiment analysis is about correctly identifying emotional communication. A UCF team has developed a technique that accurately detects sarcasm in social media text.
The team’s findings were recently published in the journal Entropy.
In effect, the team taught the computer model to find patterns that often indicate sarcasm, and combined that with training it to correctly pick out the cue words in a sequence that are most likely to signal sarcasm. The team trained the model by feeding it large data sets and then checked its accuracy.
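The article describes that workflow only at a high level: feed the model large labeled data sets, then check its accuracy on text it has not seen. The following is a generic, hypothetical Python sketch of that supervised pattern; the toy data and the simple TF-IDF plus logistic-regression baseline are placeholders standing in for (and much simpler than) the team's data sets and deep learning model.

```python
# Generic sketch of "train on labeled examples, then check accuracy" for text
# classification. The toy data and TF-IDF + logistic-regression baseline are
# placeholders; they are not the UCF team's data sets or deep learning model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

texts = [
    "Oh great, another Monday. Just what I needed.",      # sarcastic
    "Wow, waiting two hours on hold was so much fun.",     # sarcastic
    "The new update fixed the crash, thanks!",             # sincere
    "Loved the quick delivery and friendly support.",      # sincere
] * 10                                                     # repeated so the split has enough examples
labels = [1, 1, 0, 0] * 10                                 # 1 = sarcastic, 0 = not sarcastic

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.25, random_state=0)

vectorizer = TfidfVectorizer()
clf = LogisticRegression(max_iter=1000)
clf.fit(vectorizer.fit_transform(X_train), y_train)        # learn cue-word patterns from labeled examples

preds = clf.predict(vectorizer.transform(X_test))
print("held-out accuracy:", accuracy_score(y_test, preds)) # check the model on unseen posts
```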
“The presence of sarcasm in text is the main hindrance in the performance of sentiment analysis,” says Assistant Professor of Engineering Ivan Garibay ’00MS ’04PhD. “Sarcasm isn’t always easy to identify in conversation, so you can imagine it’s pretty challenging for a computer program to do it and do it well. We developed an interpretable deep learning model using multi-head self-attention and gated recurrent units. The multi-head self-attention module aids in identifying crucial sarcastic cue-words from the input, and the recurrent units learn long-range dependencies between these cue-words to better classify the input text.”
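As a rough illustration of the architecture Garibay describes, the sketch below wires token embeddings into multi-head self-attention (to weight the sarcastic cue words) and then into gated recurrent units (to model dependencies between them), finishing with a small classification head. This is a hypothetical PyTorch sketch inferred from the quote, not the authors' published model; the class name, layer sizes and output head are assumptions.

```python
# Hypothetical sketch (not the authors' released code): token embeddings pass
# through multi-head self-attention to weight sarcastic cue words, a GRU then
# models long-range dependencies between them, and a linear head classifies.
import torch
import torch.nn as nn

class SarcasmClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_heads=4, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Multi-head self-attention: query, key and value all come from the input text.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Gated recurrent units capture dependencies between the attended cue words.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, 2)      # sarcastic vs. not sarcastic

    def forward(self, token_ids):
        x = self.embed(token_ids)                            # (batch, seq_len, embed_dim)
        attended, attn_weights = self.attn(x, x, x)          # self-attention over the sequence
        _, h = self.gru(attended)                            # final hidden states, both directions
        h = torch.cat([h[-2], h[-1]], dim=-1)                # (batch, 2 * hidden_dim)
        return self.classifier(h), attn_weights              # logits + weights for interpretability

# Example: score a toy batch of two already-tokenized posts (the IDs are placeholders).
model = SarcasmClassifier(vocab_size=10000)
batch = torch.randint(0, 10000, (2, 20))
logits, weights = model(batch)
```

Returning the attention weights alongside the prediction is what makes a model of this kind interpretable: the weights indicate which words the model treated as sarcastic cues.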
The team, which includes computer science doctoral student Ramya Akula, began working on this problem under a DARPA grant that supports the organization’s Computational Simulation of Online Social Behavior program.
“Sarcasm has been a major hurdle to increasing the accuracy of sentiment analysis, especially on social media, since sarcasm relies heavily on vocal tones, facial expressions and gestures that cannot be represented in text,” says Brian Kettler, a program manager in DARPA’s Information Innovation Office (I2O). “Recognizing sarcasm in textual online communication is no easy task as none of these cues are readily available.”
This is one of the challenges Garibay’s Complex Adaptive Systems Lab (CASL) is studying. CASL is an interdisciplinary research group dedicated to the study of complex phenomena such as the global economy, the global information environment, innovation ecosystems, sustainability, and social and cultural dynamics and evolution. CASL scientists study these problems using data science, network science, complexity science, cognitive science, machine learning, deep learning, social sciences and team cognition, among other approaches.
“In face-to-face conversation, sarcasm can be identified effortlessly using facial expressions, gestures, and tone of the speaker,” Akula says. “Detecting sarcasm in textual communication is not a trivial task as none of these cues are readily available. Especially with the explosion of internet usage, sarcasm detection in online communications from social networking platforms is much more challenging.”
More information: Ramya Akula et al, Interpretable Multi-Head Self-Attention Architecture for Sarcasm Detection in Social Media, Entropy (2021). DOI: 10.3390/e23040394
Provided by University of Central Florida
Citation: Researchers develop artificial intelligence that can detect sarcasm in social media (2021, May 7), retrieved 8 May 2021 from https://techxplore.com/news/2021-05-artificial-intelligence-sarcasm-social-media.html