Evolving Expectations

AI algorithms are coming to radiology, but how long will it be until they're integrated into workflows?

The rise of deep learning artificial intelligence (AI) in today's society has been swift; nearly every industry has benefited in some way from the innovative technology. That includes health care, where AI has begun to transform the ways that hospitals, physicians, and patients interact. However, while there is interesting research that will further accelerate AI adoption over the next few years, Paras Lakhani, MD, an assistant professor in the department of radiology at Thomas Jefferson University Hospital in Philadelphia, says that there are still many unknowns about the technology.

"For example, we don't know the best ways to integrate these algorithms into routine workflow, and [it is difficult to generalize workflow] at multiple institutions and with multiple vendors and variations in image processing," Lakhani says. "At Thomas Jefferson University Hospital, we are interested mainly in applied research, both in creation and deployment of AI algorithms, as well as consumption of algorithms produced by others."

AI technology is rapidly evolving, however, especially in the radiology field, where the interpretation process is heavily dependent on image-based findings. Elad Benjamin, cofounder and CEO of Zebra Medical Vision, says AI is beginning to provide tools that can have a real impact on automated reading of radiology images. "There is real underlying technology, which is fundamentally better than previous generations," Benjamin says. "Over the next few years, it will change how radiology data are viewed, analyzed, and treated."

Automated Assistants

Examples in which deep learning has been shown to be effective include the analysis of chest radiographs, fracture detection, and mammography. But deep learning could be widely used for many study types, assuming there are enough high-quality, labeled data to train the algorithm.

Daniel H. Kim, MSc, FRCR, MBChB, a radiologist in the South West of England, says there has been a lot of buzz around AI in digital radiography (DR), and the recent release of large data sets, including one of around 100,000 chest radiographs, is a huge plus.

"AI technology is ready, but access to well-curated data is the limiting factor at present. The release of more large data sets will be a major boost," Kim says. "Deep learning is excellent for pattern recognition. Plain radiographs are an ideal target for this kind of AI because some studies tend to be fairly repetitive and often single image. In the near term, AI in DR will be especially useful in situations where there is no access to an expert radiology opinion."

Kim recently published an article about using deep learning for automated fracture detection.

"We achieved some excellent results using transfer learning," Kim says. "I am now beginning some work in cancer detection and characterization, which is exciting and has the potential to reduce diagnostic delay, avoid unnecessary treatment, and improve management decisions."

It's generally agreed that AI is very good at mapping one input to another. For DR, some use cases could include improving image quality, such as by reducing artifacts; recognizing poor-quality images, eg, those with inadequate coverage, motion artifacts, patient rotation, or penetration problems; detecting abnormalities; classifying lesions; and reducing dose.
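To make one of these use cases concrete, the short sketch below shows how the output of an abnormality detector might be used to triage a reading worklist so that likely-positive studies are read first. The Study class, scores, and threshold are hypothetical and are not drawn from any product mentioned in this article.

    # Illustrative worklist triage based on a detector's abnormality scores.
    from dataclasses import dataclass

    @dataclass
    class Study:
        accession: str
        abnormality_score: float  # output of a trained detector, in [0, 1]

    def triage(worklist, threshold=0.5):
        """Split a worklist into flagged and routine studies, most suspicious first."""
        ranked = sorted(worklist, key=lambda s: s.abnormality_score, reverse=True)
        flagged = [s for s in ranked if s.abnormality_score >= threshold]
        routine = [s for s in ranked if s.abnormality_score < threshold]
        return flagged, routine

    # Example scores a deployed algorithm might return for three chest radiographs.
    worklist = [
        Study("CXR-001", 0.92),
        Study("CXR-002", 0.08),
        Study("CXR-003", 0.61),
    ]
    flagged, routine = triage(worklist)
    print([s.accession for s in flagged])   # ['CXR-001', 'CXR-003']
    print([s.accession for s in routine])   # ['CXR-002']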

"Right now, this is still in a stage of development," Lakhani says. "I think you'll start to see the initial work being done in screening mammography."

In November, Lunit, an AI-powered medical image analysis software company, introduced Lunit INSIGHT, a cloud-based AI solution for real-time image analysis. Lunit INSIGHT for chest radiography covers lung nodules/mass; consolidation, eg, pneumonia and tuberculosis; and pneumothorax. For mammography, it detects breast cancer lesions.

Brandon Suh, MD, MPH, CMO of Lunit, says that AI is very good at pattern recognition such as detecting abnormal lesions in images. He believes AI algorithms will augment the interpretation capabilities of radiologists, enabling them to detect and classify lesions they would otherwise have missed.

"At first, I think most products will be dealing with increasing the efficiency for radiologists, to do what they do faster, by optimizing the workflow or detecting lesions. These products require the least amount of work for regulatory approval, so it is a good first step for the companies," Suh says. "Increasingly, there will be products that are more deeply involved in the interpretation process, namely classification of lesions, rather than simple detection."

Zebra Medical Vision has developed a broad portfolio of automated tools that assist radiologists in creating more complete, accurate, and consistent reports, including multiple tools to automatically assist physicians in identifying bone, lung, liver, and cardiovascular disease.