AI tech behind deepfake videos creates bogus imaging results—and it could be big for radiology

Generative adversarial networks (GANs), a fairly new breakthrough in AI, are capable of creating fake images that look incredibly real. It’s the same technology, in fact, responsible for those deceptive “deepfake” videos that put words in the mouths of public figures such as President Trump and Facebook founder Mark Zuckerberg.

According to a new analysis published in Academic Radiology, GANs could also make a big impact on the future of healthcare research, especially in the field of radiology. They can generate fake yet realistic medical images that researchers are eager to explore.

“This recent innovative technology has the potential to be applied to a variety of radiology tasks,” wrote lead author Vera Sorin, of Chaim Sheba Medical Center in Israel, and colleagues. “These tasks include generation of fake images to increase datasets for training deep learning algorithms, translation of one image type to another and improving the quality of existing images. The radiology community can benefit from getting acquainted with this technology.”

The authors reviewed academic publications from 2017 to September 2019, focusing on any papers that detailed GAN applications in radiology. Overall, 33 studies made the cut, and they included research in four key areas: image reconstruction and denoising, data augmentation, transfer between modalities and image segmentation.

“Fourteen studies described GANs for image reconstruction and denoising,” the authors wrote. “These studies aimed to improve image quality and reduce radiation dose, an assignment that can greatly impact the availability and usage of imaging modalities for diagnostic and screening purposes.”

The researchers behind these studies reported significant successes. One team, for instance, trained a GAN to remove metallic artifacts from CT scans.
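
For readers who want a sense of how such a setup is wired together, here is a minimal sketch of an adversarial denoising loop in PyTorch: a generator maps a noisy, low-dose-style slice to a cleaned-up estimate while a discriminator judges whether the result looks like a real full-dose image. The architecture, noise model and tensor shapes are illustrative assumptions, not details taken from any study in the review.

```python
# Minimal sketch of adversarial CT denoising, assuming 64x64 single-channel
# slices and a simple synthetic noise model. Illustrative only; this is not
# the architecture of any study covered in the review.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a noisy (low-dose-style) slice to a denoised estimate."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores whether a slice looks like a real full-dose image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Flatten(), nn.Linear(64 * 16 * 16, 1),  # logit: real vs. fake
        )

    def forward(self, x):
        return self.net(x)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

clean = torch.rand(8, 1, 64, 64)               # stand-in for full-dose slices
noisy = clean + 0.1 * torch.randn_like(clean)  # simulated low-dose noise

for _ in range(100):
    # Discriminator step: real full-dose slices vs. the generator's output.
    opt_d.zero_grad()
    fake = G(noisy).detach()
    loss_d = bce(D(clean), torch.ones(8, 1)) + bce(D(fake), torch.zeros(8, 1))
    loss_d.backward()
    opt_d.step()

    # Generator step: fool the discriminator while staying close to the
    # clean target (an L1 term keeps the output anatomically faithful).
    opt_g.zero_grad()
    denoised = G(noisy)
    loss_g = bce(D(denoised), torch.ones(8, 1)) \
        + 10.0 * nn.functional.l1_loss(denoised, clean)
    loss_g.backward()
    opt_g.step()
```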

Data augmentation was another common topic for researchers working with GANs. Annotating medical images requires a lot of time, energy and expertise, but GANs can help researchers by creating fake images that can contribute to the development of AI algorithms.

“The main pitfall in generated images is that they sometimes struggle to compete with real ones,” the authors wrote. “Synthetic images may have low resolution or be blurred. For this reason, algorithm training is initially done using fake images, and then refined with real images, benefiting training and decreasing the number of required real images.”
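
A compact sketch of that two-stage recipe, assuming a toy binary image classifier in PyTorch (say, lesion vs. no lesion); the model, datasets and labels here are placeholder assumptions rather than anything specified by the authors:

```python
# Two-stage training sketch: pretrain on a large GAN-generated set, then
# fine-tune on a smaller set of real, expert-annotated images. Random
# tensors stand in for both datasets; everything here is illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(16, 2),  # two classes, e.g. lesion / no lesion
)
loss_fn = nn.CrossEntropyLoss()

def train(images, labels, lr, epochs):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()

# Stage 1: plentiful synthetic images produced by a trained GAN.
synthetic_x = torch.rand(256, 1, 64, 64)
synthetic_y = torch.randint(0, 2, (256,))
train(synthetic_x, synthetic_y, lr=1e-3, epochs=10)

# Stage 2: refine on the scarcer real images at a lower learning rate,
# so far fewer real, hand-annotated examples are needed.
real_x = torch.rand(32, 1, 64, 64)
real_y = torch.randint(0, 2, (32,))
train(real_x, real_y, lr=1e-4, epochs=10)
```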

GAN technology can also be used for “generating CT-like images based on MR images or generating MR images across different sequences.” This technique can improve an AI model’s effectiveness by transferring knowledge from one modality to achieve “improved differentiation” in another.
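
A common way to learn such translations without paired scans of the same patient is a CycleGAN-style setup. The sketch below, again a rough assumption rather than the method of any cited study, shows only the two generators and the cycle-consistency loss; a full CycleGAN would add per-domain discriminators and adversarial losses:

```python
# Cycle-consistency sketch for unpaired MR-to-CT translation. Discriminators
# are omitted for brevity; architectures and shapes are assumptions.
import torch
import torch.nn as nn

def make_translator():
    return nn.Sequential(
        nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 1, 3, padding=1),
    )

G_mr2ct = make_translator()  # MR slice -> CT-like slice
G_ct2mr = make_translator()  # CT slice -> MR-like slice
opt = torch.optim.Adam(
    list(G_mr2ct.parameters()) + list(G_ct2mr.parameters()), lr=2e-4
)

mr = torch.rand(4, 1, 64, 64)  # stand-in MR slices (unpaired with the CTs)
ct = torch.rand(4, 1, 64, 64)  # stand-in CT slices

for _ in range(100):
    opt.zero_grad()
    # Translate each way and back: the round trip should reproduce the
    # input, which is what lets training proceed without paired MR/CT
    # scans of the same patient.
    cycle_mr = G_ct2mr(G_mr2ct(mr))
    cycle_ct = G_mr2ct(G_ct2mr(ct))
    loss = nn.functional.l1_loss(cycle_mr, mr) \
        + nn.functional.l1_loss(cycle_ct, ct)
    # A full CycleGAN adds adversarial terms from per-domain discriminators.
    loss.backward()
    opt.step()
```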

The team added that this technology is still in the “proof-of-concept stage,” but researchers continue to explore the potential of GANs in radiology and other healthcare specialties.

“In conclusion, GANs are increasingly studied for various radiology applications,” the authors wrote. “They enable the creation of new data, which can be used for clinical care, education and research.”

Michael Walter, Managing Editor

