AHIMA: MU especially challenging for rural hospitals
CHICAGO--Rural hospitals will struggle more than other facilities to achieve the expected levels of Meaningful Use over the next few years, according to a panel of speakers at the American Health Information Management Association’s (AHIMA) 84th Annual Conference & Exhibition.
Quality reporting is the biggest concern in achieving Meaningful Use, said Ryan Sandefer, MA, chair of the health informatics and information management department at The College of St. Scholastica in Duluth, Minn., because 28 different data elements must be captured in a structured format across six different information systems, “and it gets pretty complicated.”
Another reporting challenge is that, of the 28 data elements, more than half on the hospital side come from structured clinical documentation. “That’s a huge transition,” he said. Hospitals not only have to implement the systems but also have to get physicians on board with better documentation.
The speakers detailed a $500,000, 18-month project with four priorities: health information exchange, privacy and security, standardization and quality improvement. Each participating facility was required to assemble a core project team with at least one member each from HIM, IT, nursing and quality, as well as a physician. The teams conducted a needs assessment and worked to establish a standardized process for electronic capture of clinical data, “the heart of the project.”
SISU Medical Solutions in Duluth manages the IT infrastructure and skeleton EHR for 17 rural hospitals, providing technical services, data services and consulting, explained Trina Lower, of quality and health information services at Mercy Hospital and Health Center in Moose Lake, Minn.
Mercy started its EHR journey back in 1999 amid Y2K fears, Lower said. As of 2010, “we consider ourselves fully electronic.”
Through documentation of the current environment, “we were surprised to find variation in data capture processes.” For example, there were numerous discrepancies around time measures: a given time could be captured in multiple places and under many different definitions. Does a surgery start as soon as the patient enters the operating room? Does a patient’s care begin when he or she is received in the emergency department or when he or she goes through the registration process? Those differences could result in wide variation, she said.
The team decided which times would serve as the true reporting times, increased the number of discrete data fields and boosted utilization of computerized physician order entry. The project team also developed a repeatable process for standardizing clinical data.
During the data element evaluation, the team mapped workflows to identify who could enter, revise and undo data at each element level, Lower said. There’s an assumption that just about anyone can do all three, and “that’s not the case.” There were situations where physicians could enter and revise information but not undo it. The team needed to define attributes and drill down into every data element. Facilities still using paper records posed a particular challenge for the process.
There were other project challenges, Lower said. Many team members wore multiple hats; some participating facilities had staffing issues and turnover; and software upgrades and other implementation schedules competed with the project’s timeline.
Questions such as who would house and maintain the quality measures going forward still required answers, said Brooke Palkie, MA, also of the health informatics and information management department at St. Scholastica.
The team learned that the process was decentralized and managed by different departments at the participating facilities. “We tried to standardize as much as we could,” said Palkie. “Data capture is a natural byproduct of clinical workflows but we didn’t really consider clinical workflows. We found they are important to consider when going through this process.”
For example, the concept of clinical trials alone raised numerous questions. What does “clinical trial” mean? Any trial, or only a trial that affects the patient’s current care? “How are you going to standardize that across your organization?” Palkie said.
Another challenge for the project team was that nearly half of the clinical quality measures contained in the Meaningful Use program apply only minimally to critical access hospitals (CAHs), she said. For example, one CAH in Minnesota admitted just three stroke cases as inpatients in fiscal 2012. “Most cases are transferred. It’s difficult to gain staff buy-in when volumes are so small.” Meaningful Use requires a lot of work for very small volumes, so facilities must think about how to approach their staffs to sell the effort and educate them.
Speaking of education, Palkie said the major challenge was educating clinicians about software changes, including explaining why they are required to collect a piece of information that appears clinically irrelevant yet is time-consuming to capture. Meaningful Use project leaders need to think about what each measure is trying to accomplish and how it can transition to the electronic environment, Lower said.
The intent of the Meaningful Use legislation is to reward high-performing facilities, she said, but many rural facilities don’t have a sufficient number of patients to track accurately. The Office of the National Coordinator for Health IT has “put out a call to save CAHs because they don’t have the capability to do attestation and are struggling with quality measures.”