From The Arcade To The Reading Room

NVIDIA's Project Clara promises to level up the capabilities of medical scanners.

Arcade to Reading Room

As electronic game enthusiasts know, winning a battle in any multiplayer, role-playing video game can come down to which player has the best graphics processing unit (GPU) in their computer. A good GPU—an electronic circuit that quickly manipulates memory to accelerate the creation of images—renders images, animations, and video with the greatest clarity in the shortest amount of time. The more quickly players can see their enemy, the faster they can defend themselves or devise a successful counterattack within their virtual reality world.

Seeing images clearly, accurately, and in a timely manner is just as important for radiologists. Whether they are looking for cancerous lesions or blockages in blood flow, a faster, more visually accurate read can lead to earlier detection and a successful clinical course of action.

California technology company NVIDIA, renowned for its gaming graphics and advances in AI technology, is making the connection between gaming visualization and radiology imaging quality with Project Clara. This virtual medical imaging supercomputer draws on current advances in GPU computation, which NVIDIA says can bring greater efficiency, value, and quality to medical imaging practices. "Gaming is pushing us to the edge in that it demands massive amounts of computation," says Abdul Hamid Halabi, global business development lead for health care and life sciences with NVIDIA. "The underlying computation required to create a virtual world is not that dissimilar from the computation required to acquire, analyze, and visualize a medical image."

NVIDIA, which has been involved in health care for about 10 years, is combining the computation, visualization, and deep learning capabilities of GPUs to create "smart instruments." Halabi says medical instruments are becoming more computational. Project Clara takes advantage of advancements in computation to renew the capabilities of existing machines, such as MRI and CT scanners. These instruments can then be used in the early detection of conditions including cancer and heart disease. "One of the best homes for a GPU is in the instrument," Halabi says. "However, sometimes the GPU is either not available or it does not have enough horsepower to handle new workloads. Project Clara can help medical imaging instruments, such as an MRI device or a CT scanner, function as virtual computers. GPUs outside of the instruments enable radiologists to take advantage of the latest algorithms."

How It Works

Project Clara, introduced by NVIDIA in March 2018 at the company's annual GPU Technology Conference, is a virtual medical imaging supercomputer based on NVIDIA's computing, AI, and visualization technology. AI is driving a revolution in medical image processing, and Project Clara is NVIDIA's platform for bringing that technology to the existing installed base of scanners around the world.

Updating existing imaging devices can be a lengthy process. Halabi says NVIDIA's Project Clara can provide a more efficient path, taking existing GPU-accelerated code, adding access to the latest code libraries, and creating even better imaging apps. The process becomes an extension of existing imaging devices; it's not a reinvention of the wheel.

"GPU computing can be used everywhere, by everyone," Halabi says. "It takes existing medical imaging instruments and improves image quality while speeding up the imaging process, which reduces radiation exposure and improves the patient experience at the same time.

"Just like in gaming, you want to play with the latest version, but you don't have the latest machine," he continues. "But, if the game operates out of the cloud, the age or functionality of your machine doesn't matter. The cloud, which is updated regularly, acts as a virtual machine. Project Clara works the same way, with universally applicable updates going to the cloud for use with any imaging device."

Halabi says that a big challenge with virtualizing medical instruments is latency in transferring data from the point of collection to the screen for viewing. NVIDIA's technology stack addresses how quickly information can be transferred from the scanning device, through computation, to the radiology reading room, a smartphone, or another viewing device.

The technology is also universal. Project Clara can run several computational instruments simultaneously and can perform computations for any imaging device—CT, MRI, ultrasound, X-ray, or mammography.

In addition, Project Clara leverages technology from the NVIDIA GPU Cloud. This adds to the technology's virtual nature, giving users the same look and functionality regardless of where the imaging partners place the computational tools—on a device, in a data center at the hospital, or on the cloud.

"Once the patient is scanned, you can run the latest AI and maybe some 3D extrapolations, then send that data back to an iPad or other viewing device," Halabi says.

NVIDIA's AI Evolution

Halabi says NVIDIA's history began 25 years ago with the development of multitasking processing technology that managed thousands of parallel tasks simultaneously. This is good for quick problem-solving, he says. In 1999, the company introduced the GPU, and, with that, video games started looking a whole lot better. "GPUs are great for speed. They are specialized accelerators," he says. "When we moved from CPU to GPU, some tasks that took weeks to compute could be completed in hours."
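
The CPU-to-GPU speedup Halabi describes can be illustrated with a minimal sketch; the example below assumes PyTorch and a CUDA-capable GPU (neither is named in the article) and simply times the same matrix multiplication on each processor.

```python
# Minimal sketch (not NVIDIA's code): time the same matrix multiplication
# on the CPU and, if available, on a CUDA GPU using PyTorch.
import time
import torch

n = 4096
a_cpu = torch.randn(n, n)
b_cpu = torch.randn(n, n)

# CPU baseline
start = time.perf_counter()
c_cpu = a_cpu @ b_cpu
cpu_seconds = time.perf_counter() - start

if torch.cuda.is_available():
    a_gpu = a_cpu.to("cuda")
    b_gpu = b_cpu.to("cuda")
    torch.cuda.synchronize()   # wait for the transfers to finish
    start = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()   # wait for the GPU kernel to complete
    gpu_seconds = time.perf_counter() - start
    print(f"CPU: {cpu_seconds:.3f}s  GPU: {gpu_seconds:.3f}s")
else:
    print(f"CPU: {cpu_seconds:.3f}s  (no CUDA device found)")
```

Because the GPU runs the many independent multiply-adds of the matrix product in parallel, the second timing is typically far smaller than the first, which is the same effect, at scale, behind "weeks to hours."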

The next step on NVIDIA's road to AI was high-performance computing, with simulations in areas such as weather, astronomy, and life sciences. Deep learning followed in 2012.

"Deep learning is a type of machine learning," Halabi says. "You hear about AI everywhere. That's where machines act like humans. We're not there yet. We're at machine learning, where you're teaching the computer to do something."

One method of machine learning utilizes "neural networks," he says. At a high level, they are modeled after the brain—neurons that represent data elements and synapses that represent the connection and affinity between these neurons. Using the human face as an example, Halabi describes how neural networks work.

"With neural networks, you show images of faces and the computer will learn to recognize a human face" he says. "Neural networks will learn the underlying features that make up a face, such as nose, mouth, eyes, and hair. It will simultaneously start building a strong connection between all these features and the face. The next time it sees these features, a light bulb lights up with the word 'Face' on top of it. The term 'deep learning' comes from the fact that the networks do this in many layers. For example, it first learns to recognize corners and lines, then shapes, and finally organs such as eyes, nose, and so on. And the cool thing is that GPUs can do this in minutes."

AI In The Industry

NVIDIA announced two partnerships at RSNA 2017 with early adopters of its deep learning capabilities. GE Healthcare is embedding the company's AI computing platform into its imaging instruments. NVIDIA will also power GE Healthcare's AI analytics platform to accelerate future creation, deployment, and consumption of deep learning algorithms for other imaging instruments.