

Hospital 'report cards' found not effective for quality improvement

Published: 25 July 2005

Hospitals that were given feedback on their performance on certain quality indicators for treating heart attack patients did not show more improvement in those areas than hospitals that were provided with the feedback at a later date, according to a study in the July 20 issue of JAMA.

Acute myocardial infarction (AMI; heart attack) patients often do not receive recommended evidence-based treatments, according to background information in the article. There is increasing interest in implementing quality improvement strategies for AMI care. One strategy that has been suggested is feedback on "quality indicators" to hospitals and clinicians treating AMI patients. Quality indicators are summaries of clinical performance over a specified period. It has been suggested that "report cards" presenting quality indicators relevant to the care provided by individual hospitals can catalyze quality improvement at those hospitals. Ideally, hospital report cards give clinicians an accurate picture of the care they deliver and provide benchmarks for comparison, such as the care delivered at other hospitals or recommended target rates. Hospital report cards are increasingly being implemented in the United States and some parts of Canada as a quality improvement strategy in many areas of health care, despite a lack of strong evidence to support their use.

Christine A. Beck, MSc, of the McGill University Health Centre, Montreal, and colleagues conducted a study to determine whether hospital report cards produced using linked administrative databases are effective for improving AMI care. The study included patients with AMI who were admitted to 76 acute care hospitals in Quebec that treated at least 30 AMI patients per year between April 1, 1999, and March 31, 2003. The hospitals were randomly assigned to receive rapid (immediate; n = 38 hospitals and 2,533 patients) or delayed (14 months; n = 38 hospitals and 3,142 patients) confidential feedback on quality indicators developed using administrative data. The quality indicators pertained to processes of care and outcomes of patients admitted between 4 and 10 months after randomization. The primary indicator was the proportion of elderly survivors of AMI at each study hospital who filled a prescription for a beta-blocker within 30 days after discharge.
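For readers curious how such an indicator can be derived, the Python sketch below shows one way a hospital-level beta-blocker indicator might be computed from linked administrative data. The table layout, column names, and sample values are illustrative assumptions for demonstration only, not the study's actual databases or methods.

# Illustrative sketch only: the column names, table layout, and linkage keys
# below are assumptions for demonstration, not the study's actual data schema.
import pandas as pd

# Hypothetical discharge abstracts: one row per elderly AMI survivor.
discharges = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "hospital_id": ["A", "A", "B", "B"],
    "discharge_date": pd.to_datetime(
        ["2001-05-01", "2001-05-10", "2001-06-02", "2001-06-15"]),
})

# Hypothetical pharmacy claims: prescriptions filled after discharge.
claims = pd.DataFrame({
    "patient_id": [1, 3, 3],
    "drug_class": ["beta_blocker", "beta_blocker", "ace_inhibitor"],
    "fill_date": pd.to_datetime(["2001-05-15", "2001-08-01", "2001-06-20"]),
})

# Link the two administrative tables on the patient identifier.
linked = discharges.merge(
    claims[claims["drug_class"] == "beta_blocker"], on="patient_id", how="left")

# Flag a beta-blocker fill within 30 days of discharge (False if no fill).
linked["filled_30d"] = (
    (linked["fill_date"] - linked["discharge_date"]).dt.days.between(0, 30)
)

# Collapse to one row per patient (any qualifying fill counts), then
# compute the hospital-level quality indicator.
per_patient = linked.groupby(["hospital_id", "patient_id"])["filled_30d"].any()
indicator = per_patient.groupby("hospital_id").mean()
print(indicator)  # proportion of survivors with a fill within 30 days

In the study itself, the indicator was restricted to elderly survivors of AMI and presented alongside benchmarks; the sketch only illustrates the basic linkage-and-proportion idea.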

The researchers found that at follow-up, adjusted prescription rates within 30 days after discharge were similar in the rapid- and delayed-feedback groups for beta-blockers, angiotensin-converting enzyme (ACE) inhibitors, lipid-lowering drugs, and aspirin. In addition, the adjusted death rate was similar in both groups, as were length of in-hospital stay, physician visits after discharge, waiting times for invasive cardiac procedures, and readmissions for cardiac complications.
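As a rough illustration of what an "adjusted" comparison between trial arms can look like when patients are clustered within hospitals, the following Python sketch fits a generalized estimating equations (GEE) logistic model on synthetic data. The variable names, covariates, and modelling choices are assumptions for illustration and are not the authors' actual analysis.

# Illustrative sketch only: synthetic data; not the study's actual analysis.
# Compares a binary outcome between trial arms, adjusted for a patient-level
# covariate, while accounting for clustering of patients within hospitals.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "hospital": rng.integers(0, 76, n),   # cluster identifier (hypothetical)
    "age": rng.normal(75, 6, n),          # patient-level covariate
})
df["rapid_feedback"] = (df["hospital"] % 2 == 0).astype(int)  # trial arm

# Synthetic outcome: beta-blocker fill within 30 days of discharge,
# generated with no true arm effect.
logit = -0.5 + 0.0 * df["rapid_feedback"] - 0.01 * (df["age"] - 75)
df["filled_30d"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# GEE logistic model: arm effect adjusted for age, clustered by hospital.
model = smf.gee(
    "filled_30d ~ rapid_feedback + age",
    groups="hospital",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())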

"In this cluster randomized controlled trial, confidential feedback provided to hospitals in the form of report cards constructed using linked administrative data was not effective in improving quality of AMI care. Our results suggest that even if the United States eventually acquires these types of administrative data through the Medicare program, confidential feedback based on these data are unlikely to be a sufficient strategy for health care quality improvement," the authors write. "More intensive interventions, which could include chart review and continuous and/or public data feedback accompanied by other multimodal interventions, such as team workshops and standard orders, may be effective, but a need remains to study these interventions and their cost-benefit ratios in well-controlled randomized trials."

The researchers speculate that there could be several reasons for the lack of effectiveness of the study intervention, including "that the administrative data were perceived as invalid or irrelevant to practice. It is possible that report cards constructed using chart review data may be more effective than those constructed using administrative data because physicians are less skeptical of their data quality." (JAMA. 2005;294:309–317.)

Editor's Note: For funding/support information, please see the JAMA article.

The Research Institute of the McGill University Health Centre (RI MUHC) is a world-renowned biomedical and health care research centre. Located in Montreal, Quebec, the institute is the research arm of the MUHC, a university health centre affiliated with the Faculty of Medicine at McGill University. The institute supports over 500 researchers and nearly 1,000 graduate and postdoctoral students, and operates more than 300 laboratories devoted to a broad spectrum of fundamental and clinical research. The Research Institute operates at the forefront of knowledge, innovation and technology and is inextricably linked to the clinical programs of the MUHC, ensuring that patients benefit directly from the latest research-based knowledge.

The McGill University Health Centre (MUHC) is a comprehensive academic health institution with an international reputation for excellence in clinical programs, research and teaching. The MUHC is a merger of five teaching hospitals affiliated with the Faculty of Medicine at McGill University: the Montreal Children's, Montreal General, Royal Victoria, and Montreal Neurological Hospitals, as well as the Montreal Chest Institute. Building on the tradition of medical leadership of the founding hospitals, the goal of the MUHC is to provide patient care based on the most advanced knowledge in the health care field, and to contribute to the development of new knowledge.
