AI transforms the humble chest X-ray into a better diagnostic tool

Researchers have used a deep-learning AI model to turn the humble chest X-ray into a more powerful tool for diagnosing heart problems. They say their novel approach could be used as a quick and accurate way of assessing heart function and checking for disease.

Chest X-rays are the most frequently performed radiological test in the world and a common way for health professionals to diagnose lung and heart conditions. But while a chest X-ray is quick and easy to perform, it's a static image that can't show how well the heart is actually functioning. For that, you need an echocardiogram.

An echocardiogram – commonly called an ‘echo’ – assesses how effectively the heart is pumping and whether the valves between the heart chambers are leaky or diseased. If the heart valves are diseased, the heart can’t pump blood effectively and has to work harder, which can lead to heart failure or sudden cardiac arrest and death. However, echocardiography requires a technician with specialized skills.

Now, researchers from Osaka Metropolitan University have recruited a deep-learning AI model to transform the humble chest X-ray into a more detailed diagnostic tool.

Deep learning is an AI technique that teaches computers to process data using layered artificial neural networks loosely modeled on the human brain. A deep-learning model can recognize complex patterns in images, text, sound and other data, and use those patterns to produce accurate insights and predictions.
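
For readers curious what that looks like in code, below is a minimal sketch – written in PyTorch and entirely hypothetical, not the model from the study – of the general recipe: an image goes in, and a handful of per-condition scores comes out.

```python
# Minimal, illustrative deep-learning model: one grayscale chest X-ray in,
# six per-condition scores out. A generic sketch of the idea only; the study's
# actual architecture, preprocessing and training code are not shown here.
import torch
import torch.nn as nn

class TinyXrayNet(nn.Module):
    def __init__(self, n_labels: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),               # collapse to one value per channel
        )
        self.classifier = nn.Linear(32, n_labels)  # one score per condition

    def forward(self, x):
        h = self.features(x).flatten(1)
        return torch.sigmoid(self.classifier(h))   # probabilities in [0, 1]

model = TinyXrayNet()
dummy_xray = torch.randn(1, 1, 224, 224)  # a batch of one 224 x 224 grayscale image
print(model(dummy_xray).shape)            # torch.Size([1, 6])
```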

The researchers trained the deep-learning model on 22,551 chest X-rays paired with 22,551 echocardiograms, obtained from 16,946 patients at four facilities between 2013 and 2021. They used data from multiple institutions to reduce the risk of the AI producing biased results.

The X-rays were set as the input data and the echocardiograms as the output data, and the model was trained to learn the features connecting the two. When the researchers tested their deep-learning model, they found it could accurately classify six types of valvular heart disease. The area under the curve (AUC) – a measure of how well a model distinguishes between classes, which ranges from 0 to 1, with values closer to 1 being better – was between 0.83 and 0.92 across the six conditions.
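
To make the AUC figure concrete, here is a small, self-contained sketch using scikit-learn on synthetic data – random numbers standing in for the study's predictions and echocardiogram-derived labels, with the fake scores separated just enough that the per-condition AUCs land near the reported range.

```python
# Illustrative only: synthetic stand-ins for echo-derived labels and model scores,
# used to show how a per-condition ROC AUC is computed. Not the study's data or code.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_patients, n_conditions = 2000, 6  # six valvular-disease labels (assumed multi-label setup)

# Ground truth: 1 = condition present on the echocardiogram, 0 = absent.
y_true = rng.integers(0, 2, size=(n_patients, n_conditions))
# Fake model scores: positives tend to score higher, with enough overlap to be imperfect.
y_score = rng.normal(loc=y_true.astype(float), scale=0.7)

# One AUC per condition: 0.5 is chance level, 1.0 is perfect separation.
for label, auc in enumerate(roc_auc_score(y_true, y_score, average=None), start=1):
    print(f"condition {label}: AUC = {auc:.2f}")
```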

The researchers say their novel AI approach could complement echocardiograms, especially when a quick diagnosis is needed or technicians are in short supply.

“It took us a very long time to get these results, but I believe this is significant research,” said Daiju Ueda, lead author of the study. “In addition to improving the efficiency of doctors’ diagnoses, the system might also be used in areas where there are no specialists, in night-time emergencies, and for patients who have difficulty undergoing echocardiography.”

The study was published in the journal The Lancet Digital Health.

Source: Osaka Metropolitan University
