Part of the challenge in controlling the coronavirus pandemic is in identifying and isolating infected people quickly – not particularly easy when COVID-19 symptoms aren’t always noticeable, especially early on. Now scientists have developed a new artificial intelligence model that can detect the virus from a simple forced cough.
The researchers report that the AI can spot differences in coughing that can't be heard by the human ear, and if the detection system can be built into a device such as a smartphone, the team thinks it could become a useful early screening tool.
The work builds on existing research into detecting Alzheimer's through coughing and talking. Once the pandemic started to spread, the team turned its attention to COVID-19 instead, drawing on what had already been learned about how disease can cause very small changes to speech and to the other noises we make.
“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs,” says research scientist Brian Subirana, from the Massachusetts Institute of Technology (MIT).
“This means that when you talk, part of your talking is like coughing, and vice versa.”
“It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state. There’s in fact sentiment embedded in how you cough.”
The Alzheimer's research repurposed for COVID-19 involved a neural network known as ResNet50. It was trained first on a thousand hours of human speech, then on a dataset of words spoken in different emotional states, and finally on a database of coughs, to spot changes in lung and respiratory performance.
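To make that staged training idea concrete, here is a minimal, hypothetical sketch in PyTorch – not the authors' code – of one ResNet50 whose classification head is swapped out and which is then fine-tuned in succession on speech, emotionally labelled words, and cough recordings. The function names, hyperparameters and random stand-in data are all assumptions for illustration.

```python
# Hypothetical sketch of staged fine-tuning of a ResNet50 on audio spectrograms.
# All data here are random stand-in tensors shaped like mel-spectrogram batches.
import torch
import torch.nn as nn
from torchvision.models import resnet50

def make_audio_resnet(num_classes: int = 2) -> nn.Module:
    """ResNet50 backbone with its final layer replaced for a new classification task."""
    model = resnet50(weights=None)  # pretrained weights could be loaded here instead
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

def fine_tune_stage(model: nn.Module, spectrograms: torch.Tensor,
                    labels: torch.Tensor, steps: int = 2) -> nn.Module:
    """One fine-tuning stage; spectrograms are fed to the network as 3-channel 'images'."""
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(steps):
        optimiser.zero_grad()
        loss = loss_fn(model(spectrograms), labels)
        loss.backward()
        optimiser.step()
    return model

if __name__ == "__main__":
    batch = lambda: torch.randn(4, 3, 224, 224)   # stand-in spectrogram batch
    labels = lambda: torch.randint(0, 2, (4,))    # stand-in binary labels

    model = make_audio_resnet()
    model = fine_tune_stage(model, batch(), labels())  # stage 1: general speech
    model = fine_tune_stage(model, batch(), labels())  # stage 2: emotionally labelled words
    model = fine_tune_stage(model, batch(), labels())  # stage 3: cough recordings
```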
When the three models were combined, a layer of noise was used to separate stronger coughs from weaker ones. Across around 2,500 cough recordings from people confirmed to have COVID-19, the AI correctly identified 97.1 percent of them – and 100 percent of the asymptomatic cases.
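For a rough sense of what "combining three models" and measuring that hit rate might look like, here is a hedged sketch: three toy branch networks stand in for the vocal-cord, sentiment and lung/respiratory biomarker models, their features feed a shared classifier, and a simple sensitivity check counts the fraction of known-positive coughs flagged as positive. The class names, dimensions and the omission of the noise-filtering layer are all simplifications, not the published method.

```python
# Hypothetical sketch of combining three biomarker branches into one cough classifier.
import torch
import torch.nn as nn

class CombinedCoughClassifier(nn.Module):
    """Concatenates features from several branch models before a shared classifier head."""
    def __init__(self, branches, feat_dim: int = 128):
        super().__init__()
        self.branches = nn.ModuleList(branches)
        self.head = nn.Sequential(
            nn.Linear(feat_dim * len(branches), 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # two outputs: COVID-positive vs negative
        )

    def forward(self, spectrogram: torch.Tensor) -> torch.Tensor:
        feats = [branch(spectrogram) for branch in self.branches]
        return self.head(torch.cat(feats, dim=1))

def sensitivity(model: nn.Module, positive_coughs: torch.Tensor) -> float:
    """Fraction of known-positive cough recordings the model flags as positive."""
    model.eval()
    with torch.no_grad():
        preds = model(positive_coughs).argmax(dim=1)
    return (preds == 1).float().mean().item()

if __name__ == "__main__":
    # Three toy branches stand in for the vocal-cord, sentiment and respiratory models.
    def make_branch() -> nn.Module:
        return nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 128), nn.ReLU())

    model = CombinedCoughClassifier([make_branch() for _ in range(3)])
    fake_positive_coughs = torch.randn(16, 3, 64, 64)  # stand-in spectrograms
    print(f"Sensitivity on known positives: {sensitivity(model, fake_positive_coughs):.3f}")
```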
That's an impressive result, but there's more work to do yet. The researchers emphasise that the system's main value lies in spotting the difference between healthy and unhealthy coughs in asymptomatic people – not in actually diagnosing COVID-19, which would still require a proper test. In other words, it's an early warning system.
“The effective implementation of this group diagnostic tool could diminish the spread of the pandemic if everyone uses it before going to a classroom, a factory, or a restaurant,” says Subirana.
The fact that the test is non-invasive, virtually free to run and quick to apply adds to its potential usefulness. While it's not designed to diagnose people who are already showing COVID-19 symptoms, it could tell you whether you should isolate and get a proper test when no major signs of the virus are showing.
The researchers now want to test the engine on a more diverse set of data, and see if there are other factors involved in reaching such an impressively high detection rate. If it does make it to the phone app stage, there are obviously going to be privacy implications too, as few of us will want our devices constantly listening out for signs of ill health.
Once we start to put the coronavirus pandemic behind us, the new research could feed back into the study of coughs and Alzheimer's detection. The data show that the neural networks required only slight tweaking to be adapted to each condition.
“Our research uncovers a striking similarity between Alzheimer’s and COVID discrimination,” write the researchers in their published paper.
“The exact same biomarkers can be used as a discrimination tool for both, suggesting that perhaps, in addition to temperature, pressure or pulse, there are some higher-level biomarkers that can sufficiently diagnose conditions across specialties once thought mostly disconnected.”
The research has been published in the IEEE Open Journal of Engineering in Medicine and Biology.