HIMSS14: Coordinated CDS program breaks down QI siloes
ORLANDO--An abundance of reporting and performance programs required by certification bodies, federal agencies, payers and organizations themselves demands a coordinated approach to ensure complete, accurate and consistent electronic data in support of quality care, said Elizabeth Crabtree, MPH, PhD, director of evidence-based practices and assistant professor at the Medical University of South Carolina, speaking at the Healthcare Information and Management Systems Society annual conference.
But getting there meant breaking down the siloes of the organization's many quality improvement initiatives and creating a unified structure for getting data out.
“We have so many quality measures flying around, and they all look the same but they are all different,” said Crabtree, who noted MUSC reports on 250 different measures. As such, organizations need to understand the nuances among measures and establish consistent definitions across the board. This is not easy, she said.
MUSC launched a coordinated clinical decision support (CDS) program to drive one single reporting structure at the enterprise level. This entailed:
- The development of a comprehensive organizational blueprint for quality measure requirements so “you can see everything on one page and isolate where things overlap and where they are filled.”
- Building a comprehensive workflow that collects data needed for all programs.
- Working with an established CDS Oversight Committee to design and implement CDS tools to integrate data capture into workflow in a meaningful way, resulting in quality measure linkage. “This tackles workflow, not just lists.”
To achieve these ends, MUSC established not only a CDS Oversight Committee but also a CDS Workgroup, an EHR development and operations committee, and ad-hoc content expert teams to: identify targets for new and improved CDS interventions; design, validate, develop and integrate CDS interventions into workflow; and evaluate intervention effects, said Itara Barnes, manager of regulatory analytics.
“It allowed us to shift focus not on the measurement for the sake of measurement, but to see how it is meaningfully implemented to improve care,” said Crabtree. “Everyone has a place in the process.”
MUSC also launched an evidence-based process for order sets, and provided guidance on the need to ask for more granular information from patients about their care.
Barnes said the changes are starting to work, but they are still waiting to learn their final reporting rates. “Performance rates matter. I expect we’ll see this reflected in the final performance rates,” she said.