NVIDIA launches new federated learning solution for training AI models

NVIDIA has unveiled a new solution at RSNA 2019 in Chicago that supports the development of new AI models while keeping patient data in the hands of healthcare providers.

NVIDIA Clara Federated Learning (Clara FL) is a reference application powered by the company’s EGX intelligent edge computing platform. It assures providers that their data is secure because it never actually leaves their own site.

Providers label their own patient data using NVIDIA’s AI algorithms, and then the EGX servers at each site train the global model on that local data.

“The local training results are shared back to the federated learning server over a secure link,” according to a new blog post on the NVIDIA website. “This approach preserves privacy by only sharing partial model weights and no patient records in order to build a new global model through federated averaging. The process repeats until the AI model reaches its desired accuracy. This distributed approach delivers exceptional performance in deep learning while keeping patient data secure and private.”
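To make that description concrete, here is a minimal sketch of federated averaging in Python. It is not NVIDIA’s implementation; every name in it (local_update, federated_average, the toy linear model and the synthetic data) is a hypothetical stand-in. The point it illustrates is the one in the quote above: each site trains the current global model on data that never leaves it, and only the resulting weights are sent back and averaged, weighted by dataset size, to form the next global model.

```python
# Minimal federated averaging sketch (illustrative only, not Clara FL code).
# Each "site" trains on its own data; only model weights are shared and averaged.
import numpy as np

rng = np.random.default_rng(0)


def local_update(weights, X, y, lr=0.1, epochs=5):
    """One round of local training (simple linear regression via gradient descent)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w


def federated_average(site_weights, site_sizes):
    """Combine local results into a new global model, weighted by dataset size."""
    total = sum(site_sizes)
    return sum(w * (n / total) for w, n in zip(site_weights, site_sizes))


# Synthetic "patient" data that stays at each of three sites.
true_w = np.array([2.0, -1.0])
sites = []
for n in (200, 150, 300):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    sites.append((X, y))

global_w = np.zeros(2)
for round_num in range(10):
    # Each site trains the current global model on its local data...
    local_ws = [local_update(global_w, X, y) for X, y in sites]
    # ...and only the resulting weights are aggregated by the server.
    global_w = federated_average(local_ws, [len(y) for _, y in sites])

print("learned weights:", global_w)  # converges toward [2.0, -1.0]
```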

The American College of Radiology, Partners Healthcare, UCLA Health and King’s College London are all currently exploring Clara FL, according to NVIDIA.

Michael Walter, Managing Editor

Michael has more than 18 years of experience as a professional writer and editor. He has written at length about cardiology, radiology, artificial intelligence and other key healthcare topics.
