Leveraging Technology, Data and Patient Care: How Geisinger Is Interjecting Insight & Action
As an integrated health-delivery network comprising 13 hospital campuses, two research centers and a health plan with more than half a million subscribers, all sitting atop the largest biobank of whole-exome (DNA) sequence data in existence, Pennsylvania’s Geisinger Health System is one of the best-positioned institutions in the U.S. to explore the possibilities and early successes of AI in healthcare. The institution is bringing complex algorithmic concepts into everyday patient care and showing others the path forward.
There’s a lot to learn from Geisinger. The system serves more than 3 million residents in central, south-central and northeastern Pennsylvania plus southern New Jersey. Its data assets are formidable: as an early adopter of electronic health records, Geisinger has 20-plus years of longitudinal views of a stable population of 2 million people, a centralized data warehouse stretching back to 2007 and extensive imaging databases. DNA sequencing is now being added to the mix. Numerous AI-related studies are in progress, thanks to the massive homegrown datasets and closely aligned research and clinical teams seeking to improve the care of individual patients and populations. Geisinger’s leaders also want to spread the word that, in the age of artificial and augmented intelligence, every hospital team needs to get on board and use its own data to treat, support and improve the care and health of its patients.
Four leaders with key roles in carrying out this mission explained that emphasis to AI in Healthcare.
‘TRANSLATION FIRST’
“Everything we do is about translating research into clinical practice,” says Aalpen Patel, MD, chair of Geisinger’s Department of Radiology. “Research for research’s sake, there’s a place for that [and it’s needed to understand the basic science]. But our focus at Geisinger has been translation first.”
“If you look at the literature, you’ll find there are a lot of papers out there [with AI in the title],” says Brandon Fornwalt, MD, PhD, a radiologist who chairs Geisinger’s Department of Imaging Science and Innovation and also directs the Cardiac Imaging Technology Lab (CITL). “They get published, and they’re great papers. But there’s no implementation. And then review articles are written about those papers, which leads to the proliferation of the hype we’re seeing in the medical AI field right now.”
At the same time, underlying every translational initiative is a dedicated research project that helped get Geisinger to the point at which care quality can be significantly improved by machine learning based on scientific evidence. Christopher Haggerty, PhD, a biomedical engineer and Fornwalt’s co-director at the CITL, jumps in to make sure this aspect doesn’t get short shrift.
As an example, he describes Geisinger’s work to build an AI model that can predict disease severity for patients with heart failure while also predicting the potential benefit from specific targeted interventions, such as getting an annual flu shot. That work has expanded as the team behind it seeks to roll out the model to Geisinger’s pharmacy group, whose members will further test and deploy it for patients. This development built on many “preceding months of work to understand what are the important inputs that we need to be considering and curating,” Haggerty says. “How do we build the model, and what do we do with it once we have it?”
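The details of that model are not public, but the underlying idea of pairing an outcome prediction with an estimate of how much a specific intervention might help a given patient can be made concrete with a simple two-model sketch. Everything below, including the column names, the flu-shot flag and the gradient-boosting choice, is illustrative rather than a description of Geisinger’s actual pipeline.

```python
# Illustrative sketch only: predict heart-failure outcome risk and estimate the incremental
# benefit of an intervention (e.g., an annual flu shot) with two separate outcome models.
# Feature and column names are hypothetical, not Geisinger's actual data model.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

def fit_benefit_models(df: pd.DataFrame, features: list[str]):
    """Fit separate one-year admission models for vaccinated and unvaccinated patients."""
    treated = df[df["flu_shot"] == 1]
    untreated = df[df["flu_shot"] == 0]
    m_treated = GradientBoostingClassifier().fit(treated[features], treated["hf_admission_1yr"])
    m_untreated = GradientBoostingClassifier().fit(untreated[features], untreated["hf_admission_1yr"])
    return m_treated, m_untreated

def predicted_benefit(m_treated, m_untreated, patients: pd.DataFrame, features: list[str]):
    """Estimated absolute risk reduction if the patient receives the intervention."""
    risk_if_treated = m_treated.predict_proba(patients[features])[:, 1]
    risk_if_untreated = m_untreated.predict_proba(patients[features])[:, 1]
    return risk_if_untreated - risk_if_treated  # larger value = more expected benefit
```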
Adding the IT and enterprise perspective, Geisinger’s Chief Information Officer John Kravitz, CHCIO, MHA, stresses the importance of collaboration that bridges not only research labs and patient-care areas but also multidisciplinary teams that once worked in departmental silos. “We have the luxury, working in IT especially, of having the respect of clinicians, because we provide them with tools. It’s a bi-modal, bi-directional respect,” Kravitz says. “That goes a long way to help our patients.”
MULTI-MODAL DATA FEEDS
While Geisinger’s translational research spans myriad medical specialties and disease states, its work applying AI to help head off heart failure is illustrative. The effort began with vital signs, laboratory and diagnostic data, and measurements derived from echocardiograms, and it is now incorporating data from 12-lead electrocardiography (ECG) readings. The team is using data from 2.5 to 3 million historical ECG studies, and the hope is to also build predictive models for everything from atrial fibrillation to stroke.
Next, tapping the institution’s genomic data, the team will integrate data from all three sources—echocardiography, ECG and genomics—into a multi-modal machine learning algorithm.
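The article doesn’t describe the model architecture, but one common way to combine three such sources is late fusion: encode each modality separately, then concatenate the learned representations ahead of a shared prediction head. The PyTorch sketch below is a generic illustration under that assumption; the layer sizes, input shapes and prediction target are made up, not Geisinger’s design.

```python
# Generic late-fusion sketch (not Geisinger's actual architecture): separate encoders for
# 12-lead ECG waveforms, echo-derived measurements and genomic variants, concatenated
# into one risk-prediction head. All dimensions are illustrative.
import torch
import torch.nn as nn

class MultiModalRiskModel(nn.Module):
    def __init__(self, n_echo_features=40, n_variants=500):
        super().__init__()
        # 12-lead ECG: 1D convolutions over the waveform (12 channels x samples)
        self.ecg_encoder = nn.Sequential(
            nn.Conv1d(12, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),          # -> 64-dim summary per study
        )
        self.echo_encoder = nn.Sequential(nn.Linear(n_echo_features, 32), nn.ReLU())
        self.genomic_encoder = nn.Sequential(nn.Linear(n_variants, 32), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(64 + 32 + 32, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, ecg, echo, genomics):
        fused = torch.cat([self.ecg_encoder(ecg),
                           self.echo_encoder(echo),
                           self.genomic_encoder(genomics)], dim=1)
        return torch.sigmoid(self.head(fused))  # probability of the event, e.g. AFib within a year
```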
“If you can say a patient has a 70 percent probability that, within a year, he or she is going to develop atrial fibrillation based on the historical data, you can monitor that patient very closely,” says Patel. “This is important, because when you have undiagnosed atrial fibrillation, you’re at a much higher risk of stroke.”
Studies at Geisinger and elsewhere have already shown that a physician can’t predict future heart events nearly as well as a computer can, says Fornwalt, adding that 95 percent of medicine is about prediction. “We’re going to leverage machine learning’s predictive power to make a difference,” he says. “We can give you a heart monitor that you stick on your chest for 14 days, pick up the readings, put you on an anticoagulation med so you don’t present with a stroke that destroyed half your brain and now you can’t move the left side of your body. That’s a game changer.”
PREDICTIVE VALUES
Fornwalt is equally enthusiastic about the ECG project’s potential to aid in the initial diagnosis of asymptomatic heart disease. Among those 2.5 to 3 million historical ECGs is a subset of more than 100,000 that were labeled normal by the reading physician, yet some may not be so normal after all.
“The machine learning models can actually predict future events even in those ‘normal’ ECGs,” Fornwalt says. “This means they’re probably not normal. There’s something subtle inside of those ECGs that the physicians are not picking up on, and yet it predicts [a heart event] in the future. This is like AI for a whole new area of medicine that doesn’t even exist right now.”
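One way to check a claim like that is to score the model only on studies a physician read as normal and see whether its risk scores still separate patients who later had an event from those who didn’t. A minimal sketch follows, using a tiny synthetic table with hypothetical column names, not Geisinger’s data or results.

```python
# Illustrative evaluation sketch: does a model still discriminate future events within
# the subset of ECGs physicians labeled "normal"? Data and column names are hypothetical.
import pandas as pd
from sklearn.metrics import roc_auc_score

ecg_results = pd.DataFrame({
    "physician_label":  ["normal", "normal", "normal", "abnormal", "normal"],
    "event_within_1yr": [0, 1, 0, 1, 1],
    "model_risk_score": [0.08, 0.61, 0.12, 0.77, 0.55],
})

normal_subset = ecg_results[ecg_results["physician_label"] == "normal"]
auc = roc_auc_score(normal_subset["event_within_1yr"], normal_subset["model_risk_score"])
print(f"AUC among physician-normal ECGs: {auc:.3f}")  # > 0.5 suggests signal the human read missed
```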
CIO Kravitz shines light on how such predictive medicine, when extrapolated to multiple health conditions across Geisinger’s catchment area, can both improve health and drive down costs at the level of population health.
Geisinger provides care for about 40 percent of its 600,000 or so health plan subscribers, he notes. “It behooves us to keep those patients as well as possible, keeping them out of the acute-care setting,” Kravitz adds. “We can bring a patient in, maybe in an ambulatory setting, and take care of a problem. For health systems that are paid a certain amount, the better you can treat a patient through the use of AI and ML tools, the better you will do financially as a health system. And then, on the flip side, we can teach patients that holistic care does a lot more to help you have a better quality of life and a longer life.”
GOVERNING AI GROWTH
The four Geisinger experts agree that three things are key to making AI a difference-maker in any healthcare system—data governance, multidisciplinary collaboration and computing power.
“We look at everything from a data governance perspective to make sure the data are pristine and correct and mean one thing across the organization to everyone who’s going to use that data,” Kravitz says.
“We don’t have this all figured out by any stretch of the imagination,” Fornwalt adds. “We’ve had an EHR for more than 20 years, so we’re fairly far along. But we’re still going back and forth about how to define heart failure in the best way. So good data governance is absolutely critical.”
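A concrete example of why that definition matters: even a “simple” computable phenotype for heart failure has to commit to which diagnosis codes, how many coded encounters and what supporting evidence count. The rule below is purely illustrative and is not Geisinger’s definition; it only shows the kind of decision data governance has to pin down once, for everyone.

```python
# Purely illustrative computable-phenotype sketch; NOT Geisinger's definition of heart failure.
# One possible (and debatable) rule: >= 2 coded encounters plus a reduced ejection fraction.
from dataclasses import dataclass, field

HF_ICD10_PREFIX = "I50"  # ICD-10 heart failure codes fall under I50.*

@dataclass
class PatientRecord:
    encounter_dx_codes: list[list[str]] = field(default_factory=list)  # ICD-10 codes per encounter
    ejection_fractions: list[float] = field(default_factory=list)      # from echo reports (%)

def meets_hf_phenotype(p: PatientRecord) -> bool:
    coded = sum(any(dx.startswith(HF_ICD10_PREFIX) for dx in enc) for enc in p.encounter_dx_codes)
    low_ef = any(ef < 40 for ef in p.ejection_fractions)
    return coded >= 2 and low_ef
```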
Even the soundest data governance wouldn’t do much to enable AI if efforts were siloed, says Haggerty, adding that making sure the data are available and sharable by multi-disciplinary teams throughout the enterprise is vital.
“The people who are building the machine learning models and doing that end of the analysis tend to be the higher-profile individuals,” Haggerty says. “But we also highly value the analysts who understand the data that are available, the governance that goes under it, and the strengths and limitations thereof. Analysts are the intermediaries between the raw data and the data that the model builders are looking for.”
IDENTIFIABLE INFRASTRUCTURE
Also enabling the underlying processes that allow AI to sprout from research and bloom in patient care is pure computing “horsepower,” the four agree. This is nowhere more evident than when working with multi-modal data, as Geisinger has found with its ECG, echocardiography and genomic data flows, among others, Fornwalt says.
“We’ve found that you have to have very fast ways to process that multi-modal data with very fast storage,” he adds before underscoring that the storage must reside very close to where the data lives for daily clinical use such that clinical workflows are not disrupted.
To support all the work Geisinger is doing with machine learning, leadership is behind the concept of placing everything “in one spot,” Fornwalt says, so that all involved parties can tap the high-performance GPU compute clusters and do their best work.
“We are one of the only hospitals that has that type of infrastructure built inside the clinical network,” he says. “We keep it all in an identified space where it’s secure and protected, and we build that infrastructure inside of that space.”
At many other institutions, he continues, researchers and innovators “are sitting outside the clinical network, and they’re begging for data. Most places have a very slow pipeline to get that data out for the researchers and innovators.”
Part of the problem is that data are often de-identified to protect patient privacy, and the de-identification processes slow things down and sometimes compromise the integrity of the data. Fornwalt says Geisinger opted to build a space inside the clinical network to keep the data-sharing process simple and efficient without compromising privacy and security.
FAST COMPUTE
This was a logical choice, given Geisinger’s view of compute power as mission-critical.
“Typically in IT systems you have storage area networks, which is a big disk array,” Kravitz says. “Whether it’s flash memory [or some other solution], we need to have a high-volume cluster where the data and the compute power are locally attached, so it’s very fast in computing.”
To give an example, Kravitz describes Geisinger’s big-data platform. “We use tools that let us search over 30 million encounters from 2 million patients for a keyword such as myocardial infarction in less than 1 second. That’s how fast it functions,” he says. “It’s all super-indexed. That’s the way this process works, and it’s necessary to support our clinicians so they can get quick results and have tangible data to work from.”
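The article doesn’t name the tools behind that “super-indexed” platform, but a toy inverted index illustrates why keyword lookup over millions of encounter notes can return in well under a second: the heavy lifting happens at index time, and each query reduces to set lookups and intersections. This is a teaching sketch under that assumption, not the platform Geisinger actually uses.

```python
# Toy inverted index: term -> set of encounter IDs. Index once, then answer keyword
# queries such as "myocardial infarction" with fast set intersections.
import re
from collections import defaultdict

def _tokens(text: str) -> list[str]:
    return re.findall(r"[a-z0-9]+", text.lower())

class EncounterIndex:
    def __init__(self):
        self.postings = defaultdict(set)

    def add(self, encounter_id: int, note_text: str) -> None:
        for term in _tokens(note_text):
            self.postings[term].add(encounter_id)

    def search(self, phrase: str) -> set:
        terms = _tokens(phrase)
        if not terms:
            return set()
        result = self.postings[terms[0]].copy()
        for term in terms[1:]:
            result &= self.postings[term]
        return result

index = EncounterIndex()
index.add(1001, "Acute myocardial infarction, admitted via ED")
index.add(1002, "Follow-up for hypertension")
print(index.search("myocardial infarction"))  # {1001}
```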
Geisinger’s imaging computing sits next to a GPU cluster with 1 petaflop of processing power. “That’s a million billion floating point operations per second, or FLOPS,” Patel explains. “Just to put that into perspective, the world’s most powerful computer has 200 petaflops. So this is some serious compute power, and our teams are actually maxing it out. We always need more compute power. The data mandates it.”
EXCITINGLY TERRIFYING
To wrap up the discussion, AI in Healthcare asked each about their long-term goals with AI at Geisinger:
- Patel: “The ultimate question we need to be asking is: How do I help the patient? Part of that [equation] is going to be helping physicians take care of patients. But there will be other parts [variables] to taking care of the patient even better. Ultimately the shift will happen from physician centricity to patient centricity.”
- Fornwalt: “I became a physician-scientist because I wanted to change lives in a positive way. I believe that we at Geisinger can lead the implementation of AI and machine learning in healthcare and show the world what’s possible. I strongly believe that what we do here can be translated to other places. And we can improve the lives of our patients, helping them live longer and happier lives. That’s why I come to work.”
- Kravitz: “Not being a physician, but looking at it from a commonsense perspective, if you can treat a large population of patients cost-effectively, and improve their quality of life, isn’t that what we’re here for? It’s discrete and predictive medicine, to look at health problems that people aren’t even aware they have. If you can intercede in those problems and improve someone’s quality of life, that’s really what we want to achieve here.”
- Haggerty: “When you talk about the opportunities with the genomic data and the image data that we have—all of that is special and unique. To not leverage that and capitalize on that opportunity to make a difference and do so in a way that is both special and powerful would feel like a shame.”
The last word goes to Fornwalt, who says AI in healthcare is, at the present moment, equal parts exciting and terrifying. “I’m stressed out,” he says, “because I’m thinking: What’s hiding in these data that we’re not acting on, that we need to be doing right now to help our patients?” That’s a motivator for sure.