Health Affairs: Cost-effectiveness analyses must have a role in quality assessment

Data on the costs and use of healthcare interventions must be part of efforts to compare the value and outcomes of interventions across institutions, both to improve quality-of-care measures and to reduce spending, according to an analysis and commentary published in the October issue of Health Affairs.

Alan M. Garber, MD, PhD, of the VA Palo Alto Health Care System in Palo Alto, Calif., and Harold C. Sox, MD, of Dartmouth Medical School in Lebanon, N.H., wrote that the Patient Protection and Affordable Care Act, which created a federal focus on comparative effectiveness research, actually limited the role of cost analysis performed by the Patient-Centered Outcomes Research Institute. But Garber and Sox argued that cost-effectiveness data are necessary and must play a larger part in the healthcare industry.

The act was intended to help patients make decisions about their health and healthcare. “But, knowing the clinical effectiveness and risks of a health intervention is not enough for many potential users. They also want to know whether the intervention’s health benefits are commensurate with its costs—that is, whether it delivers value,” Garber and Sox wrote.

However, the authors noted that the act prohibits Medicare from using cost-effectiveness as a factor in coverage and reimbursement decisions and restricts how the Medicare program can use comparative effectiveness research.

“How can congressionally mandated restrictions on the use of such research be reconciled with the need to address issues of value in healthcare?” Garber and Sox wrote.

The authors noted that cost-effectiveness analysis is less well established, particularly in the U.S., and is often controversial. Some opponents argue that the government should not mix cost considerations with clinical research, while proponents of comparative effectiveness research have countered that it is not a tool for controlling costs.

The law's language is limiting, Garber and Sox wrote, noting that it discourages the use of quality-adjusted life years (QALYs) “in ways that would disadvantage the elderly and people with disabilities.”

Because comparative effectiveness research relies on established methods, such as clinical research, outcomes research and health services research, Garber and Sox deemed the term itself “nothing new,” but added that “in another sense, everything is new.”

Garber and Sox said that they believe the research should be guided by four principles:
  • Real-world settings: Compare the alternatives that are most relevant to decision making in real-world clinical practice; a treatment should not, for example, be compared only with placebo;
  • Representative populations: Studies should represent patients who would receive the intervention in clinical settings so that results are more applicable to a wider population;
  • Personalized healthcare: The research should support personalized approaches to healthcare by identifying patient-specific characteristics that account for differences in the way individuals respond to therapy; and
  • Full information: Comparative effectiveness research should seek to measure all outcomes that are important to patients.

Garber and Sox urged that use and cost data be included in comparative effectiveness research because costs are important outcomes of interventions; however, they said the Patient-Centered Outcomes Research Institute should not routinely conduct cost-effectiveness analyses.

“Doing so might lead to the appearance, if not the reality, that the institute was attempting to define care standards for federal health insurance programs in the U.S., which the Patient Protection and Affordable Care Act discouraged,” they wrote.

"Growth in healthcare spending is a major threat to the economic vitality of the U.S.,” the authors wrote. “The federal comparative effectiveness research initiative should provide the most accurate and comprehensive evidence for decision makers. We cannot afford to ration information that will help us make better decisions about health and healthcare,” the researchers concluded.
