Webinar: CMS' quality measure reporting specs take shape
As organizations prepare for Stage 1 meaningful use requirements, the standards community is hammering out an end-to-end process for quality measure reporting, said Robert H. Dolin, MD, chair of Health Level Seven (HL7). Dolin, who is also principal at Semantically Yours, discussed two key elements of this effort during a recent HIMSS-sponsored webinar, titled “Quality Measure Electronic Specifications for EHRs: A CMS perspective.”
The first are eMeasures—non-patient-specific, formal representations of quality measures—which express “paper measures” in a standard format (the HL7 Health Quality Measures Format, or HL7 eMeasure standard), said Dolin. “Ideally, in this end-to-end framework, providers [will be able to] push a button and import these eMeasures.”
An eMeasure can be turned into queries that run automatically against an EHR’s data stores and generate reports for internal use or for submission to external quality organizations. Based on the criteria specified in an eMeasure, a facility can determine whether patients met the numerator, denominator or exclusion criteria for a particular measure, and can automatically report that information internally or to quality organizations, said Dolin.
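To make the numerator/denominator/exclusion logic concrete, here is a minimal sketch in Python of how a site might classify a patient record against eMeasure-style criteria. This is illustrative only, not the HL7 eMeasure (HQMF) format itself; the field names and the example diabetes-control criteria are hypothetical.

```python
# Illustrative sketch: classify a patient against eMeasure-style
# criteria passed in as predicate functions. Field names are assumed.

def classify(patient, denominator, numerator, exclusion):
    """Return which measure populations a patient record falls into."""
    result = {"denominator": False, "numerator": False, "excluded": False}
    if not denominator(patient):
        return result              # not in the measured population at all
    result["denominator"] = True
    if exclusion(patient):
        result["excluded"] = True  # counted out of the denominator
        return result
    result["numerator"] = numerator(patient)
    return result

# Hypothetical criteria for a diabetes HbA1c control measure
in_denominator = lambda p: 18 <= p["age"] <= 75 and "diabetes" in p["problems"]
in_numerator   = lambda p: p.get("hba1c", 100.0) < 8.0
is_excluded    = lambda p: "hospice" in p["problems"]

patient = {"age": 63, "problems": ["diabetes"], "hba1c": 7.2}
print(classify(patient, in_denominator, in_numerator, is_excluded))
# {'denominator': True, 'numerator': True, 'excluded': False}
```

Passing the criteria in as functions mirrors the idea that the eMeasure (the criteria) is imported separately from the engine that evaluates it.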
The second element is a standard for quality reporting: HL7 CDA R2 Quality Reporting Document Architecture, or QRDA, said Dolin. “QRDA is an implementation guide for CDA electronic documents that allow you to communicate numerator/denominator data or individual patient data so a quality organization can do its own aggregation. We had multiple use cases for quality reporting,” said Dolin, so the QRDA standard enables sites to send quality reports that have individual patient-level data or aggregate data.
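The two reporting styles Dolin describes can be sketched as follows: a site either sends patient-level results so the quality organization does its own aggregation, or sends the pre-computed counts. The record layout below is illustrative, not the actual QRDA schema.

```python
# Sketch of the two QRDA use cases: patient-level records (Category 1
# style) versus a calculated, aggregate report (Category 3 style).
# Field names are assumptions for illustration.

patient_level = [  # one record per patient
    {"id": "p1", "denominator": True,  "numerator": True,  "excluded": False},
    {"id": "p2", "denominator": True,  "numerator": False, "excluded": False},
    {"id": "p3", "denominator": True,  "numerator": False, "excluded": True},
]

def aggregate(records):
    """Collapse patient-level results into a calculated report."""
    denom = [r for r in records if r["denominator"] and not r["excluded"]]
    return {
        "denominator": len(denom),
        "numerator": sum(1 for r in denom if r["numerator"]),
        "exclusions": sum(1 for r in records if r["excluded"]),
    }

print(aggregate(patient_level))
# {'denominator': 2, 'numerator': 1, 'exclusions': 1}
```

Either form can be produced from the same underlying data, which is why one standard can cover both use cases.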
“The process we’re developing is that the eMeasure gets pulled in [and] processed, and what gets pushed out is the corresponding QRDA.”
EHR certification plays a role in this process, according to Dolin. “A certified EHR system has to be able to generate an HL7 CCD document, and in that CCD document, the EHR has to be able to send out structured problems, medications, possibly procedures, allergies,” he said. Because it’s certified, “we assume that it has certain capabilities to ‘speak’ in terms of [these same things]. It stands to reason that if I write my eMeasure criteria to the level at which I know the EHR can speak, chances are the EHR will be able to import the eMeasure.”
The framework still needs testing, “but as we develop eMeasures and QRDA with EHRs, through meaningful use [requirements] and the ONC final rule, we are enabling incremental sophistication in the reports we can create,” said Dolin. “The way we envision nationwide sharing of eMeasures, ... is we have to define the core set of capabilities that all EHRs have.
“Once we’ve done that, then we can assume that if I write an eMeasure that bases my criteria on artifacts that the EHR can produce, that the EHR should be able to reason with those eMeasures.”
QRDA contains data elements for determining criteria based on individual patient data. Some quality organizations wanted to be responsible for receiving patient-level data and computing the numerator and denominator data. Therefore, the QRDA must have sufficient data to compute whether a patient meets numerator/denominator/exclusion criteria, according to Dolin.
“QRDA lays out the patient-level data elements sufficient to compute eMeasure criteria, such as age, encounter, encounter admit date, encounter discharge diagnosis, access to the problem list and discharge medications,” he said. Data criteria are the atoms, the building blocks of population criteria; population criteria are the molecules that take data criteria and assemble them, he explained.
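The atoms-and-molecules picture can be sketched in code: data criteria as small reusable predicates, and population criteria as boolean combinations of them. All names and codes below are illustrative assumptions, not actual measure content.

```python
# Sketch of composing data criteria (atoms) into population criteria
# (molecules) with boolean combinators.

def age_between(lo, hi):           # data criterion (atom)
    return lambda p: lo <= p["age"] <= hi

def has_problem(code):             # data criterion (atom)
    return lambda p: code in p["problems"]

def all_of(*criteria):             # population criterion (molecule): AND
    return lambda p: all(c(p) for c in criteria)

def any_of(*criteria):             # population criterion (molecule): OR
    return lambda p: any(c(p) for c in criteria)

# Hypothetical denominator: age 18-75 AND (type 1 OR type 2 diabetes)
denominator = all_of(
    age_between(18, 75),
    any_of(has_problem("dm-type1"), has_problem("dm-type2")),
)

print(denominator({"age": 40, "problems": ["dm-type2"]}))  # True
```

Because the atoms are reusable, many different measures can be assembled from the same small vocabulary of data criteria, which is the consistency Dolin describes.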
Many data criteria are built on the National Quality Forum’s Health IT Expert Panel (HITEP)-defined Quality Data Elements (QDEs), said Dolin.
“For eMeasures, we’re basing data criteria on HITEP QDEs, where we take generic QDEs defined by HITEP and couple them with code lists,” he said. “You wind up developing a regimented, consistent process for developing eMeasures. Because data criteria are built from a common set of building blocks, your ability to automatically import an eMeasure into your EHR and automatically process it starts to go way up,” Dolin said.
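Coupling a generic QDE with a code list might look like the following sketch: the QDE supplies the generic shape (“the patient has a diagnosis”), and the code list specializes it. The codes shown are placeholders, not a real value set.

```python
# Sketch of a generic "diagnosis" quality data element (QDE)
# specialized by the code list it is coupled with. Codes are
# placeholders for illustration only.

DIABETES_CODES = {"250.00", "250.02", "E11.9"}  # hypothetical code list

def qde_diagnosis(code_list):
    """Generic diagnosis QDE: true if any of the patient's diagnosis
    codes fall in the supplied code list."""
    return lambda patient: bool(set(patient["diagnosis_codes"]) & code_list)

has_diabetes = qde_diagnosis(DIABETES_CODES)
print(has_diabetes({"diagnosis_codes": ["E11.9", "I10"]}))  # True
```

The same `qde_diagnosis` builder would serve any measure; only the code list changes, which is what makes the authoring process regimented and repeatable.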
QRDA Category 1, the single-patient report, is an official HL7 draft standard for trial use. Category 2 (patient list report) and Category 3 (calculated report) are in draft form but have not been balloted, he said.
The next steps include:
- Developing eMeasure authoring guidelines “to remove some of the wiggle room in the measure step,” said Dolin.
- Clarifying where a QRDA begins and an eMeasure ends.
- Developing a “templated CDA” to define the EHR interface so that it can be both queried and reported from.
“We want to incrementally continue to raise the bar through templated CDA. This, over time, further enables our ability to query the system and report out of the system. If an EHR can communicate data, then theoretically the EHR can be queried for that data,” said Dolin.
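The idea that a document an EHR can emit is also a document that can be queried can be shown with a toy example. The XML below only mimics the shape of a structured CDA-style entry; it is not the real CCD/CDA schema, and the section and attribute names are assumptions.

```python
# Toy illustration: if an EHR emits structured XML, that same
# structure can be queried. This is NOT the real CDA/CCD schema.
import xml.etree.ElementTree as ET

ccd = """
<document>
  <section code="problems">
    <entry code="E11.9" display="Type 2 diabetes"/>
    <entry code="I10" display="Essential hypertension"/>
  </section>
</document>
"""

root = ET.fromstring(ccd)
problems = [e.get("code")
            for e in root.findall("./section[@code='problems']/entry")]
print(problems)  # ['E11.9', 'I10']
```

The same template that constrains what the EHR writes out is what makes the query predictable, which is the point of defining the interface through templated CDA.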