How Am I Doing?

How hospitalists assess their performance and hone their skills is critical to patient care. Continuing medical education (CME), relicensure, specialty recertification, and lifelong learning are all linked to hospitalists’ abilities to assess and meet their learning needs.

But the preponderance of evidence suggests physicians have limited ability to accurately assess their own performance, according to a systematic review of physician self-assessment published in September 2006 in JAMA.1

“Self-assessment should be guided by tools designed by experts, based on standards, and aimed at filling gaps in knowledge, skills, and competencies—not simply the internally based self-rating of individual practitioners,” says C. Michael Fordis, MD, senior associate dean for continuing medical education at the Baylor College of Medicine in Houston, and one of the authors of the study.

“Hospitalists and other physicians are not doing themselves a service to rely on their own internal self-rated judgments of knowledge and performance,” Dr. Fordis says. “There’s too much to know, too much that’s changing, and too much that affects the implementation into practice of the knowledge that you have for any one person to be able to take care of patients and at the same time have some sense of whether there are gaps along that implementation pathway.”

“Guided” self-assessment reflects the thinking of many experts, who ask questions, weigh guidelines, and suggest tools to help physicians identify the gaps between what they think they are doing and their actual performance.

Regular, consistent self-assessment is imperative for a self-regulating profession such as medicine. How well are hospitalists doing—and what mechanisms or tools do they use?

HOW TO SELF-ASSESS

  • Develop a more holistic continuing professional development process (learning portfolios, documentation of practice-based learning and improvement activities), with more specific and detailed learning and practice objectives;
  • Reduce the variation between self- and external assessments by encouraging the internalization of objective measurements or benchmarks of performance;
  • Use multisource feedback evaluations, especially for improvement needs that are hard to assess (communication, psychosocial skills);
  • Consider using objective measures of competence and performance;
  • Increase the role of specialty societies by providing current evidence-based learning objectives on a regular basis to give members external markers of competence;
  • Make self-assessment an iterative process that particularly focuses on scope of practice; and
  • Use separate initiatives to identify physicians who require remediation. Although those professionals can also benefit from guided self-assessment, the process is designed primarily to support competent physicians who want to continuously improve their practice performance.—AS

Group Assessment

Hospital medicine groups are increasingly able to measure their clinical competence against that of other hospitals and hospitalist groups. SHM’s Benchmarks Committee has been working on performance assessment at the program level.

“When the JCAHO [Joint Commission on Accreditation of Healthcare Organizations] Core Measures were coming out a few years back, as a whole most docs when reflecting on their practice would say they do a fine job within these measures,” says Burke T. Kealey, MD, chairman of the Benchmarks Committee from 2006-07. “For instance, [they might say] ‘I always send people out on ACE inhibitors and beta-blockers,’ or, ‘We always start people on aspirin when they come into the ER,’ but when you looked at the data, you found that their self-assessment was not as accurate as we hoped it would be.”

A lot of hard work went into discovering why their self-assessment was inaccurate. “We found there were documentation problems; they didn’t really incorporate a lot of the contraindications when giving their answer about self-assessment,” says Dr. Kealey, who leads the hospital medicine program at Regions Hospital and HealthPartners Medical Group in St. Paul, Minn.

If patients had kidney dysfunction or kidney failure, for example, they were not discharged on ACE inhibitors.

“But we as doctors didn’t do a great job of explaining why we weren’t doing that,” Dr. Kealey says. “We were not transparent in our reasoning, but the core measures caused us to become more transparent, to explain what we were thinking and what we were doing in a way that the public could see.”
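
Dr. Kealey’s example comes down to how a measure’s denominator is built: a contraindication keeps a patient out of the denominator only if it is documented in the chart. The sketch below is a minimal illustration of that arithmetic, assuming a simplified two-field chart record; the field names and figures are hypothetical, not taken from any actual core-measure specification.

```python
# Illustrative sketch (not from the article): how a core-measure compliance
# rate changes once documented contraindications are excluded from the
# denominator. Field names and data are hypothetical.

def compliance_rate(patients):
    """Share of eligible patients discharged on an ACE inhibitor.

    Patients with a *documented* contraindication are removed from the
    denominator; undocumented contraindications still count as misses.
    """
    eligible = [p for p in patients if not p["contraindication_documented"]]
    if not eligible:
        return None
    treated = sum(1 for p in eligible if p["ace_inhibitor_at_discharge"])
    return treated / len(eligible)

# Hypothetical discharges: two patients with renal failure were correctly
# not prescribed an ACE inhibitor, but only one chart documented why.
discharges = [
    {"ace_inhibitor_at_discharge": True,  "contraindication_documented": False},
    {"ace_inhibitor_at_discharge": True,  "contraindication_documented": False},
    {"ace_inhibitor_at_discharge": False, "contraindication_documented": True},   # documented renal failure
    {"ace_inhibitor_at_discharge": False, "contraindication_documented": False},  # undocumented renal failure
]

print(round(compliance_rate(discharges), 2))  # 0.67
```

On this hypothetical data the measured rate is 67% even though three of the four decisions were clinically appropriate; the undocumented contraindication counts as a miss, which is exactly the documentation gap the group uncovered.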

At SHM’s annual meeting in May, the Benchmarks Committee released the white paper “Measuring Hospitalist Performance: Metrics, Reports, and Dashboards” to help hospitals and hospital medicine programs develop or improve their performance monitoring and reporting.

“Hospitalists in general could do a better job of assessing themselves,” says Arpana Vidyarthi, MD, an assistant professor in the division of hospital medicine at the University of California, San Francisco (UCSF). “Self-assessment for those of us in cognitive specialties, like internists, is more complicated than in procedural specialties like surgery, partly because these procedural specialties have very specific outcomes that are linked to the procedure and that level of skill. With the new drivers of quality improvement and patient safety, and the dramatic increase of quality indicators for hospitals overall, this is now trickling down to thinking about how we truly assess the doctors themselves.”

The quality indicators that hospitalist groups are benchmarking may not be linked to the individual, she says. Dr. Vidyarthi, also director of quality for the Inpatient General Medicine Service at UCSF Medical Center, provides an example. “Pneumovax as a quality indicator is part of the Joint Commission core measures,” says Dr. Vidyarthi. “You can go online where it is publicly reported and choose this or other indicators to compare one hospital to another. That is the sort of benchmarking that some hospitalist groups are doing.”

But using that kind of evaluation for individual assessment misses the mark.

“Does the fact that the patient does not get Pneumovax reflect upon me and my abilities as a hospitalist? Not at all,” she says, “because my institution and those institutions who have done well with this specific indicator have taken it out of the hands of the doctors. It’s an automated sort of thing. At our hospital, the pharmacists do it.”

Although the American Board of Internal Medicine asks that individual physicians assess their own care as part of recertification, that’s more difficult for a hospitalist than for an outpatient internist. Hospitalists don’t have a panel of diabetic patients, for instance, whose outcomes data can be easily analyzed.

Hospitalists as a group also haven’t had a tradition of self-assessment or peer assessment. Further, hospitalist groups differ as to how they handle assessments of individual physicians.

“In general if you ask our [UCSF] hospitalists, the way that we assess competency is generally through hospital privileging,” Dr. Vidyarthi says. The hospital as a whole reviews the competency of all the doctors who work there, and that process, known as “privileging,” has consisted of asking a couple of colleagues to write letters of recommendation. “The division is changing this, but that is just on the cusp.

“We’ve built a new system for our quality committee in which one layer is peer assessment, looking at just the individual cases that bubble up from an incident report or a root-cause analysis or other sources. We’re looking at and identifying both systems issues and individual issues and trying to build a way to feed back those assessments.”

But that’s just half the equation, she says; the flip side is continual self-assessment of what a hospitalist is doing well.

To Dr. Kealey, self-assessment plays a significant role in helping physicians with their career goals and ensuring that their careers are on track and on target.

At HealthPartners, physicians fill out a self-evaluation form on which they list all activities they’ve been involved in over the previous year. Then they are asked what they got out of these activities, what their career goals are, and whether they are meeting them. They’re also asked how the group can help them reach those goals.

“We ask them to pause and reflect on where they’re headed with their career and their life, and put it down in writing so that in that moment they take the time to ask, ‘What is it that I’m ultimately after?’ ” says Dr. Kealey.

Day to day, they are immersed in patient care and focused on doing a good job. “But in the trajectory of where they are headed—the committees, projects, and educational activities they are involved in—are they all aligned and pointing in the same direction and the right direction?” Dr. Kealey asks.

The process, which HealthPartners hospitalists have been using for about 10 years, was modified from the American College of Physician Executives course “Managing Physician Performance.”

“It is a tool to help hospitalists pause and reflect on their career and how to move it forward,” Dr. Kealey says.

Marc B. Westle, DO, FACP, president and managing partner of the Asheville Hospitalist Group, PA, in Asheville, N.C., relies on ongoing conversations to assess how his physicians are doing. The group also uses Crimson’s Physician Management Software to track group quality and cost indicators, looking at the data from as many angles as possible.

“It’s an excellent tool to look at a group; it is a poor tool to look at an individual,” Dr. Westle says. “Although the insurance companies like to say you can apply it to the individual, in reality there is no good way to attribute that data down to the physician level.”

Within the group data it may be possible to recognize underperformers, but that judgment is still anecdotal, based on experience and interaction.

“Under ‘How am I doing?’ there is an objective category in the software where there are hard end-points and measures you can look at,” says Dr. Westle.

On the subjective side, Dr. Westle collects data on relative value units (RVUs), the non-monetary numeric values Medicare uses to represent the relative amount of physician time, resources, and expertise needed to provide various services to patients. They review total RVUs as well as the individual components that make up total RVUs.

“I’ll track how many simple, moderate, or complex follow-up visits were made, how many simple or moderate histories and physicals or consultations, how many procedures are they doing,” Dr. Westle says. “I’ll track every statistic that way for every individual and give them that feedback so they can see how they’re doing from a performance and a work standard, compared to their peers within the group, and nationally as published by Medicare.”
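
As a rough illustration of that kind of report (a sketch, not the group’s actual Crimson output), the code below totals RVUs per physician by encounter type and compares each physician’s total with the group mean. The encounter categories, names, and values are all invented for the example.

```python
# Illustrative sketch: totaling billed RVUs per physician by encounter type
# and comparing each physician to the group mean. All data are hypothetical.
from collections import defaultdict
from statistics import mean

# (physician, encounter_type, rvu) -- e.g., pulled from billing extracts
encounters = [
    ("dr_a", "complex_followup", 1.5), ("dr_a", "consult", 3.0),
    ("dr_b", "simple_followup", 0.75), ("dr_b", "consult", 3.0),
    ("dr_b", "procedure", 2.1),
]

totals = defaultdict(float)                         # total RVUs per physician
by_type = defaultdict(lambda: defaultdict(float))   # component breakdown
for doc, etype, rvu in encounters:
    totals[doc] += rvu
    by_type[doc][etype] += rvu

group_mean = mean(totals.values())
for doc, total in sorted(totals.items()):
    print(f"{doc}: {total:.2f} RVUs ({total / group_mean:.0%} of group mean)")
    for etype, rvu in sorted(by_type[doc].items()):
        print(f"  {etype}: {rvu:.2f}")
```

Comparing each physician against the group first, and only then against national Medicare figures, reflects Dr. Westle’s point below: the group’s case mix may differ from that of the average Medicare patient, so within-group comparisons are the fairer starting point.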

Dr. Westle uses charts and graphs to drive his points home.

“It just gives them an idea about where they are,” he says. “It doesn’t mean they’re doing a bad job. Our patients may be sicker than some other patients. And that is why we do it as a group, too, because their patients should be similar to the group’s patients and the group’s patients may be different than the average Medicare patient.”

They also look at hospitalists’ quality of life, their schedules, and the quantity of work the average physician is doing compared with hospitalists around the country. They discuss scheduling, income, disposable income, and the kind of work they’re doing in the hospital. “All this comes into a discussion of where they are in their lives and are they happy with what they’re doing,” Dr. Westle says. TH

Andrea Sattinger is a medical writer based in North Carolina.

Reference

  1. Davis DA, Mazmanian PE, Fordis M, et al. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094-1102.