QC integration optimizes digital imaging environment

The integration of quality control (QC) procedures for image and data integrity in a digital environment not only improves patient care but also reduces both professional and administrative costs for a medical imaging practice.

“Excellence in imaging is defined as the production of images utilizing accurate demographics, positioning, protocol and technique resulting in diagnostic quality images for interpretation,” said Kathy Tabor McEwan, executive director of imaging at Boca Raton Community Hospital in Boca Raton, Fla.

McEwan, along with Mary-Theresa Shore, director of clinical operations at Boston-based Massachusetts General Hospital (MGH), examined digital QC issues in a presentation last week at the 2009 Association for Medical Imaging Management (AHRA) meeting in Las Vegas.

In a digital imaging environment, McEwan noted six image integrity categories that can affect a practice:
•    Incomplete or non-transmitted exam;
•    Incorrect or incomplete demographics entered into RIS;
•    Mis- or non-labeling of an image;
•    Incorrect technique;
•    Incorrect protocol; and
•    Incorrect positioning.

“Excellence in quality data integrity is defined by the proper delivery of timely, accurate and complete clinical information and images to PACS,” said Shore.

She identified six categories of data integrity that can impact a practice:
•    Unverified (exam fails verification such as a medical record number and accession number mismatch);
•    Misscheduled (such as an exam archived to PACS with the incorrect exam code);
•    Misidentified (an exam associated with the wrong patient information);
•    Mislabeled (an image labeled incorrectly or not labeled at all);
•    Merged (two different exams merged under one accession number); and
•    Cancelled (exam performed and archived in PACS but has a cancelled status in RIS).

McEwan and Shore, who worked together at MGH, performed a problem assessment of data integrity errors at that institution over a one-year period. Over the course of the year, data integrity errors grew from 325 per month to 480 per month.

The pair then did a resource utilization analysis for a PACS analyst, discovering that 35 percent of the employee’s time was spent handling data integrity issues.

McEwan and Shore next looked at the time required to handle the data integrity errors and found that this, too, had increased, from 81 hours per month at the beginning of the year to 120 hours in the final month.

They developed a pilot project to improve image quality and data integrity for CT, MRI, ultrasound and general radiology exams in the MGH emergency radiology department.

“We created a multidisciplinary team to monitor QC/QA processes and workflow to achieve excellence in imaging outcomes and data integrity resulting in high-quality patient care,” McEwan said.

Prior to the implementation of their pilot project, image integrity issues required 86 hours of rework per month and data integrity issues 40 hours, for a total of 126 hours per month. Post-implementation, image integrity rework decreased to 27 hours and data integrity rework was cut to 15 hours, for a total of 42 hours per month.

Overall, they reported a 67 percent decrease in rework time, a 73 percent reduction in image integrity issues and a 76 percent decrease in data integrity errors.
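As a check on the headline figure, the 67 percent decrease follows directly from the monthly rework totals cited above (the image and data integrity percentages appear to be based on issue counts not broken out in the presentation summary):

(126 − 42) ÷ 126 ≈ 0.67, or roughly a 67 percent reduction in monthly rework time.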

Image and data errors carry costs that are both clinical and financial, Shore and McEwan noted. The integration of new technologies into a practice requires that processes be redefined and appropriate QC/QA tools be put in place to decrease errors at their source.
