When It Comes to Quality Measures, Size Matters

You work in a small rural hospital. In one year, you admit six patients with acute myocardial infarction (AMI). You follow CMS and Hospital Quality Alliance guidelines for the eight process measures for AMI, and your hospital scores 100% for that year.

A neighboring hospital isn’t as lucky: One of its four AMI admits, a 99-year-old man, refuses a beta blocker at discharge. What could have been a perfect score (a beta blocker prescribed four out of a possible four times, or 100%) is now 75%.
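The arithmetic behind that swing is worth spelling out: a single missed measure costs a hospital 100/n percentage points, where n is the number of eligible cases. A brief sketch (illustrative numbers only, not drawn from any hospital's actual data) shows how the same one miss penalizes small denominators far more than large ones:

```python
# Illustrative sketch: the score impact of one missed process
# measure at different case volumes.
def score(met: int, eligible: int) -> float:
    """Percent of eligible cases in which the measure was met."""
    return 100.0 * met / eligible

for eligible in (4, 10, 50, 200):
    one_miss = score(eligible - 1, eligible)
    print(f"{eligible:>3} eligible cases, 1 miss -> {one_miss:.1f}%")
# With 4 eligible cases, one miss drops the score to 75.0%;
# with 200 eligible cases, the same miss leaves it at 99.5%.
```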

A study released in June by Duke University Medical Center elucidates the challenges faced by small hospitals when they report performance measures. Smaller hospitals, according to the study, are more likely to rate as top performers when reporting on the eight AMI process measures.1 However, the authors conclude, reports such as those required by Medicare, which ignore denominator size when assessing process performance, can unfairly reward or penalize hospitals.

“The scores can be very misleading,” says Randy Ferrance, DC, MD, a hospitalist at the 67-bed Riverside Tappahannock Hospital in Tappahannock, Va. “If we miss aspirin on discharge for one patient and everything else was perfect, we have the potential to slide into a lower percentile, whereas larger hospitals can miss aspirin at discharge and do just fine.”

Small Denominators, Big Differences

Doug Koekkoek, MD, is in a unique position to see how performance and quality metrics vary by hospital size. As chief medical officer of the Providence Hospitalist Programs in Oregon, Dr. Koekkoek oversees two tertiary facilities, Providence Portland Medical Center (483 beds) and Providence St. Vincent Medical Center (523 beds), as well as a 77-bed community hospital (Providence Milwaukie Hospital), a 40-bed community hospital (Providence Newberg Medical Center), and a 24-bed critical access hospital (Providence Seaside Hospital).

“When we do a roll-up, looking at our appropriate care score, which looks at all the CMS metrics for AMI, congestive heart failure, and pneumonia, we can see that in the bigger institutions, where you have a much bigger denominator of patients who qualify for each diagnosis, the trends are fairly even,” Dr. Koekkoek says. “But in the smaller hospitals, there is much greater variability.”

Rather than focus on each month’s scores, he looks at trends for several months to get a better sense of how his hospitals rate. “You can run at 100% on the heart-failure measures for nine months and then, if your denominator is 10 cases in a quarter and you miss only two or three of the measures, all of a sudden, you’re in the 80% or 70% performance percentile,” he says. “You don’t get a full picture unless you’re looking back over the last six, eight, or 10 months.”

The American Hospital Association (AHA) recommends presenting data to consumers in the same way. “We encourage our hospitals to not let the data themselves tell the story, but to help set them in context and portray to the communities they serve exactly what the data mean,” says Nancy Foster, AHA’s vice president for quality and patient safety.

Foster concedes that the issue raised in the Duke study, that quality data don’t account for low case volumes, has plagued the data-reporting process, but the AHA believes reporting should continue. “We firmly believe that all hospitals ought to be sharing good, reliable information on the quality of care they’re providing with the communities they serve,” she says.

Documentation Challenges

Conveying an accurate representation of your hospital starts with appropriate documentation, says Christian Voge, MD, a hospitalist with Central Coast Chest Consultants, which provides coverage to Sierra Vista Regional Medical Center and French Hospital Medical Center in San Luis Obispo, Calif.

He gives an example: An ACE inhibitor—one of the CMS care process measures for AMI—is contraindicated in a patient. “The way the rules are, if the physician does not document the reason for not giving the medication, this will look like you simply did not meet that measure and will show up as a deficiency.”

It’s similar to billing and coding processes, says hospitalist Joseph Babbitt, MD, who works at the 25-bed Blue Hill Memorial Hospital in Blue Hill, Maine. “It’s not about what you do. It’s about what you document,” he says. “You can provide ‘the best care,’ but if you didn’t write down why an ACE inhibitor was contraindicated and not given, this will not show up as ‘the best care.’ ”

Another complicating factor, says Matthew Szvetecz, MD, a hospitalist at the 142-bed St. Mary Medical Center, a rural hospital in Walla Walla, Wash., is that the severity indexes used to gauge patients’ underlying risk for complications and mortality “are very coarse. There could be small hospitals taking care of very sick patients that are not getting picked up because they do not have that level of detail in an interpretable format.”

More Accurate Results

It’s true smaller hospitals are more vulnerable to large swings in performance ratings. However, with fewer staff who need to buy into the process, these hospitals may have an advantage over larger institutions when launching quality improvement initiatives. Case in point: Gifford Medical Center, a 25-bed critical access hospital in Randolph, Vt.

Hospitalist Josh Plavin, MD, MPH, who is board certified in internal medicine and pediatrics, serves as Gifford’s medical director. The current hospitalist program consists of one hospitalist and three physician assistants who provide round-the-clock coverage. For the hospital’s quality improvement effort, all admitting staff, including the eight emergency room providers, must use the hospital’s systemwide, CMS-compliant order set. In addition, quality management staff participate in multidisciplinary rounds and help track performance measures for patients admitted to the hospital. According to Dr. Plavin, the hospital has been 100% compliant with CMS measures in the three quarters since instituting this system.

Dr. Voge agrees that smaller hospitals lend themselves better to quality improvement initiatives. “If you have only three or four hospitalists with a contractual arrangement with the hospital, they’re going to be a little more open to ensuring that their numbers, and the hospital’s numbers, look good,” he notes.

Reference

1. O’Brien SM, DeLong ER, Peterson ED. Impact of case volume on hospital performance assessment. Arch Intern Med. 2008;168(12):1277-1284.

Issue: The Hospitalist - 2008(11)
