Health Systems Education Leadership: Learning From the VA Designated Education Officer Role


The US Department of Veterans Affairs (VA) operates the largest integrated health care system in the United States, providing physical and mental health care to more than 9 million enrolled veterans each year through a national system of inpatient, outpatient, and long-term care settings.1 As 1 of its 4 statutory missions, the VA conducts the largest training effort for health professionals in the nation, in cooperation with affiliated academic institutions. From 2016 through 2020, an average of 123,000 trainees from various professions received training at the VA annually.2 Physician residents comprised the largest trainee group (37%), followed by nursing professionals (21%) and associated health students and residents (20%).2 In the VA, associated health professions include all health care disciplines other than allopathic and osteopathic medicine, dentistry, and nursing; they encompass about 40 specialties, including audiology, dietetics, physical and occupational therapy, optometry, pharmacy, podiatry, psychology, and social work.

The VA also trains a smaller number of advanced fellows to address specialties important to the nation and to veterans' health that are not sufficiently addressed by standard accredited professional training.3 The VA Advanced Fellowship programs include 22 postresidency, postdoctoral, and postmaster's fellowships for physicians, dentists, and associated health professionals, including psychologists, social workers, and pharmacists.3 From 2015 to 2019, 57% to 61% of medical students reported a VA clinical training experience during medical school.4 Of current VA employees, 20% of registered nurses, 64% of physicians, 73% of podiatrists and optometrists, and 81% of psychologists reported VA training prior to employment.5

Health professions education is led by the designated education officer (DEO) at each VA facility.6 Also known as the associate chief of staff for education (ACOS/E), the DEO holds a leadership position accountable both to local VA facility executive leadership and to the national Office of Academic Affiliations (OAA), which directs all VA health professions training across the US.6 At most VA facilities, the DEO oversees clinical training and education, reporting directly to the facility chief of staff. At the same time, the DEO is accountable to the OAA for ensuring adherence to national education directives and policy. The DEO oversees trainee programs through collaboration with training program directors, faculty, academic affiliates, and accreditation agencies across > 40 health professions.

The DEO is expected to possess expertise in leadership attributes identified by the US Office of Personnel Management as essential to building a federal corporate culture that drives results, serves customers, and builds successful teams and coalitions within and outside the VA.7 These leadership attributes include leading change, leading people, driving results, business acumen, and building coalitions.7 The OAA operationalizes them as 4 domains of expertise required to lead education across multiple professions: (1) creating and sustaining an organizational work environment that supports learning, discovery, and continuous improvement; (2) aligning and managing fiscal, human, and capital resources to meet organizational learning needs; (3) driving learning and performance results to impact organizational success; and (4) leading change and transformation by positioning and implementing innovative learning and education strategies (Table 1).6

Table 1. Designated Education Officer Domains of Expertise and Task Examples

In this article, we describe the VA DEO leadership role and the tasks required to lead education across multiple professions within the VA health care system. Given the broad scope of leading educational programs across multiple clinical professions and the varied professional backgrounds of DEOs across the VA, we evaluated DEOs' self-perceived effectiveness in impacting educational decisions and behaviors by professional discipline. Our evaluation question was: Can individuals from different professional education and practice backgrounds effectively lead all health professions training programs? Finally, we describe DEOs' perceptions of facilitators of and barriers to performing the DEO role within the VA.

Methods

We conducted a mixed methods analysis of data collected by the OAA to assess DEO needs within a multiprofessional clinical learning environment. The needs assessment was conducted by an OAA evaluator (NH) with input on instrument development and data analysis from OAA leadership (KS, MB). This evaluation is categorized as an operations activity under VA Handbook 1200.21, in which the information generated is used for business operations and quality improvement.8 The overall project was therefore subject to administrative rather than institutional review board oversight.

A needs assessment tool was developed based on the OAA domains of expertise.6 Prior to its administration, the tool was piloted with 8 DEOs in the field, and the survey was shortened based on their feedback. DEOs were asked about individual professional characteristics (eg, clinical profession, academic appointment, types of health professions training programs at the VA site) and rated their self-perceived effectiveness in impacting educational decisions and behaviors on general and profession-specific tasks within each of the 4 domains of expertise on a 5-point Likert scale (1, not effective; 5, very effective).6,9 The needs assessment also included an open-ended question asking respondents to comment on any issues they felt were important to understanding DEO role effectiveness.
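
To make the instrument's structure concrete, the minimal sketch below shows one way item-level Likert responses could be grouped into the 4 domains and summarized as mean effectiveness scores. The domain labels follow Table 1; the item names, data values, and code are illustrative assumptions, not the actual OAA instrument or scoring procedure.

```python
# Illustrative only: hypothetical items and responses, not the OAA instrument.
import pandas as pd

# Map assumed item names to the 4 OAA domains of expertise (Table 1).
DOMAIN_ITEMS = {
    "enabling_learning": ["apply_trainee_policies", "maintain_accreditation"],
    "aligning_resources": ["manage_education_budget", "allocate_staff"],
    "driving_results": ["monitor_learning_outcomes"],
    "leading_change": ["implement_new_programs"],
}

# One row per respondent; values are 5-point Likert ratings
# (1 = not effective, 5 = very effective). Numbers are made up.
responses = pd.DataFrame([
    {"apply_trainee_policies": 4, "maintain_accreditation": 5,
     "manage_education_budget": 3, "allocate_staff": 2,
     "monitor_learning_outcomes": 3, "implement_new_programs": 4},
    {"apply_trainee_policies": 5, "maintain_accreditation": 4,
     "manage_education_budget": 4, "allocate_staff": 3,
     "monitor_learning_outcomes": 4, "implement_new_programs": 3},
])

# Mean self-rated effectiveness per domain, averaged across items and respondents.
for domain, items in DOMAIN_ITEMS.items():
    print(f"{domain}: {responses[items].to_numpy().mean():.2f}")
```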

The needs assessment was administered online via SurveyMonkey, with email invitations sent to 132 DEOs in September and October 2019. These DEOs represented 148 of 160 VA facilities with health professions education; 14 DEOs covered > 1 VA facility, and 12 positions were vacant. Email reminders were sent to nonresponders after 1 week. At 2 weeks, nonresponders received telephone reminders and personalized follow-up emails from OAA staff. The response rate at the end of 3 weeks was 96%.

Data Analysis

Mixed methods analyses included quantitative analyses to identify differences by DEO profession in general and profession-specific self-ratings of effectiveness in influencing educational decisions and behaviors, and qualitative analyses to further understand DEOs' perceptions of facilitators of and barriers to DEO task effectiveness.10,11 Quantitative analyses included descriptive statistics for all variables, followed by nonparametric tests (χ2 and Mann-Whitney U) to assess differences between physician and other professional DEOs in descriptive characteristics and self-perceived effectiveness on general and profession-specific tasks. Quantitative analyses were conducted using SPSS software, version 26. Qualitative analyses consisted of rapid assessment procedures, conducted in ATLAS.ti, version 8, to identify facilitators and barriers to DEO effectiveness by profession; this involved reviewing responses to the open-ended question and assigning each response to predetermined categories based on the organizational level to which it applied (eg, individual DEO, VA facility, or external to the organization).12,13 Responses within categories were then summarized to identify the main themes.
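
As a rough illustration of the group comparisons described above, the following sketch applies the same two nonparametric tests using SciPy. The authors' analysis was performed in SPSS, version 26; the grouping variable, item name, and data shown here are hypothetical.

```python
# Illustrative sketch only: hypothetical data, not the authors' SPSS analysis.
import pandas as pd
from scipy import stats

# Hypothetical survey extract: one row per DEO respondent.
df = pd.DataFrame({
    "profession": ["physician"] * 8 + ["other"] * 4,            # assumed grouping variable
    "has_faculty_appt": [1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0],   # yes/no characteristic
    "align_resources_score": [3, 4, 2, 3, 4, 3, 3, 2, 4, 3, 4, 5],  # 1-5 Likert rating
})

# Chi-square test of independence for a descriptive characteristic
# (profession vs faculty appointment).
contingency = pd.crosstab(df["profession"], df["has_faculty_appt"])
chi2, p_chi, dof, _ = stats.chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_chi:.3f}")

# Mann-Whitney U test comparing Likert self-ratings between the two groups.
physician_scores = df.loc[df["profession"] == "physician", "align_resources_score"]
other_scores = df.loc[df["profession"] == "other", "align_resources_score"]
u_stat, p_mw = stats.mannwhitneyu(physician_scores, other_scores, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_mw:.3f}")
```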

Results 

Completed surveys were received from 127 respondents representing 139 VA facilities. Eighty percent were physicians and 20% were other professionals, including psychologists, pharmacists, dentists, dietitians, nurses, and nonclinicians. There were no statistically significant differences between physician and other professional DEOs in the percentage working full time or in length of time in the position. About one-third of the sample had been in the position for < 2 years, one-third for 2 to < 5 years, and one-third for ≥ 5 years. Eighty percent reported having a faculty appointment with an academic affiliate. While 92% of physician DEOs had a faculty appointment, only 40% of other professional DEOs did (P < .001). Most faculty appointments for both groups were with a school of medicine. More physician DEOs than other professionals had training programs at their site for physicians (P = .003) and dentists (P < .001), but there were no statistically significant differences in having associated health, nursing, or advanced fellowship training programs at their sites. Across all DEOs, 98% reported training programs at their site for associated health professions, 95% for physician training, 93% for nursing training, 59% for dental training, and 48% for advanced fellowships.

Self-Perceived Effectiveness

There were no statistically significant differences between physician and other professional DEOs in self-perceived effectiveness in impacting educational decisions or behaviors for general tasks applicable across professions (Table 2). This result held even after controlling for length of time in the position and whether the DEO had an academic appointment. Generally, both groups reported being effective on tasks in the enabling learning domain, including applying policies and procedures related to trainees who rotate through the VA and maintaining adherence to accreditation agency standards across health professions. Mean scores for physician and other professional DEOs, respectively, indicated moderate effectiveness on aligning resources questions (range, 2.45-3.72 vs 2.75-3.76), driving results questions (3.02-3.60 vs 3.39-3.48), and leading change questions (3.12-3.50 vs 3.42-3.80).

For profession-specific tasks, differences in effectiveness ratings between the 2 groups were generally not statistically significant for medical, dental, and advanced fellowship training programs (Table 3). For associated health and nursing training programs, however, there was a pattern of statistically significant differences between physician and other professional DEOs on tasks across the 4 domains of expertise, with physicians having lower mean ratings than other professionals. Generally, physician DEOs had higher task effectiveness ratings than other professionals for medical training programs, and other professionals had higher task effectiveness ratings than physicians for associated health and nursing training programs.

Facilitators and Barriers

Seventy responses related to facilitators of and barriers to DEO effectiveness were received (59 from physicians and 11 from other professionals). Most responses were categorized as individual-level facilitators or barriers (53% for physicians and 64% for other professionals). Only 3% of comments were categorized as external to the organization (all made by physicians). The themes were similar for both groups and were aggregated in Table 4. Facilitators included continuing education; having a mentor who works at a similar type of facility; maintaining balance and time management when working with different training programs; learning to work and develop relationships with training program directors; developing an overall picture of each type of health professions training program; holding regular meetings with all health training programs and academic affiliates; having a formal education service line with budget and staffing; facility executive leadership who are knowledgeable about the education mission and DEO role; having a national oversight body; and the DEO's relationships with academic affiliates.

Barriers to role effectiveness at the individual DEO level included the assignment of multiple roles and a focus on regulation and monitoring that left little time for development of new programs and strategic planning. Organizational-level barriers included difficulty getting core services to engage with health professions trainees and siloed education leadership.

Discussion

DEOs oversee multiple health professions training programs within local facilities. The DEO is accountable to both local VA facility leadership and a national education office to lead health professions education at the local facility and to integrate these educational activities across the national VA system.

The VA DEO role is similar to that of the Accreditation Council for Graduate Medical Education designated institutional official (DIO), except that the VA DEO provides oversight of > 40 health professions training programs.14,15 The VA DEO therefore has broader oversight than the DIO, whose role focuses only on graduate medical education. Similar to the DIO role, the VA DEO role initially emphasized the enabling learning and aligning resources domains to provide oversight and administration of health professions training programs. Over time, both roles have expanded to include defining and ensuring healthy clinical learning environments; aligning educational resources and training with institutional mission, workforce, and societal needs; and creating continuous educational improvement models.6,16,17 To accomplish these expanded goals, both the DEO and the DIO work closely with other educational leaders at the academic affiliate and the VA facility. As health professions education advances, increased emphasis will be placed on delivering educational programs that improve clinical practice and health care outcomes.18

Our finding that DEO profession did not influence self-ratings of effectiveness in influencing educational decisions or behaviors on general tasks applicable across health professions suggests that education and practice background do not drive these self-ratings; neither did length of time in the position nor academic appointment. Because the DEO is a senior leadership position, candidates for the position may already possess managerial and leadership skills. In our sample, several individuals commented that they had held prior education leadership positions (eg, training program director) or had years of experience working in the VA. Similarly, having an academic appointment may not be important for the performance of general administrative tasks. However, an academic appointment may be important for effective performance of educational tasks, such as clinical teaching, didactic training, and curriculum development, which were not measured in this study.

The finding of differences in self-ratings between physicians and other professionals on profession-specific tasks for associated health and nursing suggests that physicians may benefit from additional curricula to enhance their knowledge of managing educational programs in other professions. For nursing specifically, this finding could also reflect substantial input from the facility's lead nurse executive. DEOs also identified practical ways to facilitate their work with multiple health professions that could immediately be put into practice, including developing relationships and enhancing communication with the training program directors, faculty, and academic affiliates of each profession.

Taken together, the quantitative and qualitative findings indicate that despite differences in professional background, DEOs have high self-ratings of their effectiveness in influencing educational decisions and behaviors on the general tasks they are expected to accomplish. There are some profession-specific tasks for which professional background does influence self-perceived effectiveness; ie, physicians have higher self-ratings on physician-specific tasks, and other professionals have higher self-ratings on associated health or nursing tasks. These perceived differences may be mitigated by increasing the facilitators and decreasing the barriers identified at the individual DEO level, within the organization, and external to the organization.

Limitations

Our findings should be interpreted with the following limitations in mind. The self-report nature of the data raises the possibility of self-report bias or Dunning-Kruger effects, whereby respondents could have overestimated their effectiveness.21 Although respondents were assured of their anonymity and that results would be reported only in the aggregate, there is potential for more positive responses on a needs assessment administered by the national education program office. We recommend further work to validate the needs assessment tool against other data collection methods, such as actual outcomes of educational effectiveness. Our study did not incorporate measures of educational effectiveness to determine whether self-perceived DEO effectiveness translates to better trainee or learning outcomes. Before this can happen, educational policymakers must identify the most important facility-level learning outcomes; because the DEO is a facility-level educational administrator, learning effectiveness must be defined at the facility level. The qualitative findings could also be expanded through the application of more detailed qualitative methods, such as in-depth interviews. Finally, the tasks rated by DEOs were based on OAA's current definition of the DEO role.6 As the field advances, DEO tasks will also evolve.22-24

Conclusions

The DEO is a senior educational leadership role that oversees all health professions training in the VA. Our findings support individuals from various health disciplines serving in the VA DEO role, with responsibilities that span multiple health professions training programs. We recommend further work to validate the instrument used in this study, as well as the application of qualitative methods such as in-depth interviews, to further our understanding of the DEO role.

References

1. US Department of Veterans Affairs, Veterans Health Administration. Updated April 18, 2022. Accessed May 6, 2022. https://www.va.gov/health/aboutvha.asp

2. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Health professions education: academic year 2019-2020. Published 2020. Accessed May 6, 2022. https://www.va.gov/OAA/docs/OAA_Statistics_2020.pdf

3. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Advanced fellowships and professional development. Updated November 26, 2021. Accessed May 6, 2022. https://www.va.gov/oaa/advancedfellowships/advanced-fellowships.asp

4. Association of American Medical Colleges. Medical school graduation questionnaire, 2019 all schools summary report. Published July 2019. Accessed May 6, 2022. https://www.aamc.org/system/files/2019-08/2019-gq-all-schools-summary-report.pdf

5. US Department of Veterans Affairs, National Center for Organization Development. VA all employee survey. Published 2019. Accessed May 6, 2022. https://www.va.gov/NCOD/VAworkforcesurveys.asp

6. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Education leaders in the VA: the role of the designated education officer (DEO). Published December 2019. Accessed May 6, 2022. https://www.va.gov/OAA/docs/DEO_Learning_Leader_2019.pdf

7. US Office of Personnel Management. Policy, data oversight: guide to senior executive service qualifications. Published 2010. Accessed May 6, 2022. https://www.opm.gov/policy-data-oversight/senior-executive-service/executive-core-qualifications/

8. US Department of Veterans Affairs, Office of Research and Development. Program guide: 1200.21 VHA operations activities that may constitute research. Published January 9, 2019. Accessed May 6, 2022. https://www.research.va.gov/resources/policies/ProgramGuide-1200-21-VHA-Operations-Activities.pdf

9. Riesenberg LA, Rosenbaum PF, Stick SL. Competencies, essential training, and resources viewed by designated institutional officials as important to the position in graduate medical education [published correction appears in Acad Med. 2006 Dec;81(12):1025]. Acad Med. 2006;81(5):426-431. doi:10.1097/01.ACM.0000222279.28824.f5

10. Palinkas LA, Mendon SJ, Hamilton AB. Innovations in mixed methods evaluations. Annu Rev Public Health. 2019;40:423-442. doi:10.1146/annurev-publhealth-040218-044215

11. Tashakkori A, Creswell JW. Exploring the nature of research questions in mixed methods research. J Mix Methods Res. 2007;1(3):207-211. doi:10.1177/1558689807302814

12. Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855-866. doi:10.1177/104973230201200611

13. Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516.

14. Bellini L, Hartmann D, Opas L. Beyond must: supporting the evolving role of the designated institutional official. J Grad Med Educ. 2010;2(2):147-150. doi:10.4300/JGME-D-10-00073.1

15. Riesenberg LA, Rosenbaum P, Stick SL. Characteristics, roles, and responsibilities of the Designated Institutional Official (DIO) position in graduate medical education [published correction appears in Acad Med. 2006 Dec;81(12):1025] [published correction appears in Acad Med. 2006 Mar;81(3):274]. Acad Med. 2006;81(1):8-19. doi:10.1097/00001888-200601000-00005

16. Group on Resident Affairs Core Competency Task Force. Institutional GME leadership competencies. Published 2015. Accessed May 6, 2022. https://www.aamc.org/system/files/c/2/441248-institutionalgmeleadershipcompetencies.pdf

17. Weiss KB, Bagian JP, Nasca TJ. The clinical learning environment: the foundation of graduate medical education. JAMA. 2013;309(16):1687-1688. doi:10.1001/jama.2013.1931

18. Beliveau ME, Warnes CA, Harrington RA, et al. Organizational change, leadership, and the transformation of continuing professional development: lessons learned from the American College of Cardiology. J Contin Educ Health Prof. 2015;35(3):201-210. doi:10.1002/chp.21301

19. World Health Organization. Framework for Action on Interprofessional Education and Collaborative Practice. Published September 1, 2020. Accessed May 10, 2022. https://www.who.int/publications/i/item/framework-for-action-on-interprofessional-education-collaborative-practice

20. Weiss K, Passiment M, Riordan L, Wagner R for the National Collaborative for Improving the Clinical Learning Environment IP-CLE Report Work Group. Achieving the optimal interprofessional clinical learning environment: proceedings from an NCICLE symposium. Published January 18, 2019. Accessed May 6, 2022. doi:10.33385/NCICLE.0002

21. Althubaiti A. Information bias in health research: definition, pitfalls, and adjustment methods. J Multidiscip Healthc. 2016;9:211-217. Published 2016 May 4. doi:10.2147/JMDH.S104807

22. Gilman SC, Chokshi DA, Bowen JL, Rugen KW, Cox M. Connecting the dots: interprofessional health education and delivery system redesign at the Veterans Health Administration. Acad Med. 2014;89(8):1113-1116. doi:10.1097/ACM.0000000000000312

23. Health Professions Accreditors Collaborative. Guidance on developing quality interprofessional education for the health professions. Published February 1, 2019. Accessed May 6, 2022. https://healthprofessionsaccreditors.org/wp-content/uploads/2019/02/HPACGuidance02-01-19.pdf

24. Watts BV, Paull DE, Williams LC, Neily J, Hemphill RR, Brannen JL. Department of Veterans Affairs Chief Resident in Quality and Patient Safety Program: a model to spread change. Am J Med Qual. 2016;31(6):598-600. doi:10.1177/1062860616643403

Author and Disclosure Information

Nancy D. Harada, PhD, MPA, PTa,b; Karen M. Sanders, MDa,c; and Marjorie A. Bowman, MD, MPAa,d,e

aUS Department of Veterans Affairs, Office of Academic Affiliations
bDavid Geffen School of Medicine, University of California, Los Angeles
cVirginia Commonwealth University School of Medicine, Richmond
dUniversity of Pennsylvania, Philadelphia
eWright State University, Fairborn, Ohio

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Ethics and consent

This evaluation was determined to be an operations activity based on VA Handbook 1200.21 and was subject to administrative rather than institutional review board oversight.

Author and Disclosure Information

Nancy D. Harada, PhD, MPA, PTa,b; Karen M. Sanders, MDa,c; and Marjorie A. Bowman, MD, MPAa,d,e

aUS Department of Veterans Affairs, Office of Academic Affiliations
bDavid Geffen School of Medicine, University of California, Los Angeles
cVirginia Commonwealth University School of Medicine, Richmond
dUniversity of Pennsylvania, Philadelphia
eWright State University, Fairborn, Ohio

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Ethics and consent

This evaluation was determined to be an operations activity based on VA Handbook 1200.21 and was subject to administrative rather than institutional review board oversight.

Author and Disclosure Information

Nancy D. Harada, PhD, MPA, PTa,b; Karen M. Sanders, MDa,c; and Marjorie A. Bowman, MD, MPAa,d,e

aUS Department of Veterans Affairs, Office of Academic Affiliations
bDavid Geffen School of Medicine, University of California, Los Angeles
cVirginia Commonwealth University School of Medicine, Richmond
dUniversity of Pennsylvania, Philadelphia
eWright State University, Fairborn, Ohio

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Ethics and consent

This evaluation was determined to be an operations activity based on VA Handbook 1200.21 and was subject to administrative rather than institutional review board oversight.

Article PDF
Article PDF

The US Department of Veterans Affairs (VA) operates the largest integrated health care system in the United States, providing physical and mental health care to more than 9 million veterans enrolled each year through a national system of inpatient, outpatient, and long-term care settings.1 As 1 of 4 statutory missions, the VA conducts the largest training effort for health professionals in cooperation with affiliated academic institutions. From 2016 through 2020, an average of 123,000 trainees from various professions received training at the VA.2 Physician residents comprised the largest trainee group (37%), followed by associated health students and residents (20%), and nursing professionals (21%).2 In VA, associated health professions include all health care disciplines other than allopathic and osteopathic medicine, dentistry, and nursing. The associated health professions encompass about 40 specialties, including audiology, dietetics, physical and occupational therapy, optometry, pharmacy, podiatry, psychology, and social work. 

The VA also trains a smaller number of advanced fellows to address specialties important to the nation and veterans health that are not sufficiently addressed by standard accredited professional training.3 The VA Advanced Fellowship programs include 22 postresidency, postdoctoral, and postmasters fellowships to physicians and dentists, and associated health professions, including psychologists, social workers, and pharmacists. 3 From 2015 to 2019, 57 to 61% of medical school students reported having a VA clinical training experience during medical school.4 Of current VA employees, 20% of registered nurses, 64% of physicians, 73% of podiatrists and optometrists, and 81% of psychologists reported VA training prior to employment.5

Health professions education is led by the designated education officer (DEO) at each VA facility.6 Also known as the associate chief of staff for education (ACOS/E), the DEO is a leadership position that is accountable to local VA facility executive leadership as well as the national Office of Academic Affiliations (OAA), which directs all VA health professions training across the US.6 At most VA facilities, the DEO oversees clinical training and education reporting directly to the facility chief of staff. At the same time, the ACOS/E is accountable to the OAA to ensure adherence with national education directives and policy. The DEO oversees trainee programs through collaboration with training program directors, faculty, academic affiliates, and accreditation agencies across > 40 health professions.

The DEO is expected to possess expertise in leadership attributes identified by the US Office of Personnel Management as essential to build a federal corporate culture that drives results, serves customers, and builds successful teams and coalitions within and outside the VA.7 These leadership attributes include leading change, leading people, driving results, business acumen, and building coalitions.7 They are operationalized by OAA as 4 domains of expertise required to lead education across multiple professions, including: (1) creating and sustaining an organizational work environment that supports learning, discovery, and continuous improvement; (2) aligning and managing fiscal, human, and capital resources to meet organizational learning needs; (3) driving learning and performance results to impact organizational success; and (4) leading change and transformation through positioning and implementing innovative learning and education strategies (Table 1).6

Designated Education Officer Domains of Expertise and Task Examples

In this article we describe the VA DEO leadership role and the tasks required to lead education across multiple professions within the VA health care system. Given the broad scope of leading educational programs across multiple clinical professions and the interprofessional backgrounds of DEOs across the VA, we evaluated DEO self-perceived effectiveness to impact educational decisions and behavior by professional discipline. Our evaluation question is: Are different professional education and practice backgrounds functionally capable of providing leadership over all education of health professions training programs? Finally, we describe DEOs perceptions of facilitators and barriers to performing their DEO role within the VA.

Methods

We conducted a mixed methods analysis of data collected by OAA to assess DEO needs within a multiprofessional clinical learning environment. The needs assessment was conducted by an OAA evaluator (NH) with input on instrument development and data analysis from OAA leadership (KS, MB). This evaluation is categorized as an operations activity based on VA Handbook 1200 where information generated is used for business operations and quality improvement. 8 The overall project was subject to administrative rather than institutional review board oversight.

A needs assessment tool was developed based on the OAA domains of expertise.6 Prior to its administration, the tool was piloted with 8 DEOs in the field and the survey shortened based on their feedback. DEOs were asked about individual professional characteristics (eg, clinical profession, academic appointment, type of health professions training programs at the VA site) and their self-perceived effectiveness in impacting educational decisions and behaviors on general and profession-specific tasks within each of the 4 domains of expertise on a 5-point Likert scale (1, not effective; 5, very effective). 6,9 The needs assessment also included an open-ended question asking respondents to comment on any issues they felt important to understanding DEO role effectiveness.

The needs assessment was administered online via SurveyMonkey to 132 DEOs via email in September and October 2019. The DEOs represented 148 of 160 VA facilities with health professions education; 14 DEOs covered > 1 VA facility, and 12 positions were vacant. Email reminders were sent to nonresponders after 1 week. At 2 weeks, nonresponders received telephone reminders and personalized follow-up emails from OAA staff. The response rate at the end of 3 weeks was 96%.

Data Analysis

Mixed methods analyses included quantitative analyses to identify differences in general and profession-specific self-ratings of effectiveness in influencing educational decisions and behaviors by DEO profession, and qualitative analyses to further understand DEO’s perceptions of facilitators and barriers to DEO task effectiveness.10,11 Quantitative analyses included descriptive statistics for all variables followed by nonparametric tests including χ2 and Mann- Whitney U tests to assess differences between physician and other professional DEOs in descriptive characteristics and selfperceived effectiveness on general and profession- specific tasks. Quantitative analyses were conducted using SPSS software, version 26. Qualitative analyses consisted of rapid assessment procedures to identify facilitators and barriers to DEO effectiveness by profession using Atlas.ti version 8, which involved reviewing responses to the open-ended question and assigning each response to predetermined categories based on the organizational level it applied to (eg, individual DEO, VA facility, or external to the organization).12,13 Responses within categories were then summarized to identify the main themes.

Results 

Completed surveys were received from 127 respondents representing 139 VA facilities. Eighty percent were physicians and 20% were other professionals, including psychologists, pharmacists, dentists, dieticians, nurses, and nonclinicians. There were no statistically significant differences between physician and other professional DEOs in the percent working full time or length of time spent working in the position. About one-third of the sample had been in the position for < 2 years, one-third had been in the position for 2 to < 5 years, and one-third had been in the role for ≥ 5 years. Eighty percent reported having a faculty appointment with an academic affiliate. While 92% of physician DEOs had a faculty appointment, only 40% of other professional DEOs did (P < .001). Most faculty appointments for both groups were with a school of medicine. More physician DEOs than other professionals had training programs at their site for physicians (P = .003) and dentists (P < .001), but there were no statistically significant differences for having associated health, nursing, or advanced fellowship training programs at their sites. Across all DEOs, 98% reported training programs at their site for associated health professions, 95% for physician training, 93% for nursing training, 59% for dental training, and 48% for advanced fellowships.

Self-Perceived Effectiveness

There were no statistically significant differences between physician and other professional DEOs on self-perceived effectiveness in impacting educational decisions or behaviors for general tasks applicable across professions (Table 2). This result held even after controlling for length of time in the position and whether the DEO had an academic appointment. Generally, both groups reported being effective on tasks in the enabling learning domain, including applying policies and procedures related to trainees who rotate through the VA and maintaining adherence with accreditation agency standards across health professions. Mean score ranges for both physician and other professional DEOs reported moderate effectiveness in aligning resources effectiveness questions (2.45-3.72 vs 2.75-3.76), driving results questions (3.02-3.60 vs 3.39-3.48), and leading change questions (3.12-3.50 vs 3.42-3.80).

For profession-specific tasks, effectiveness ratings between the 2 groups were generally not statistically significant for medical, dental, and advanced fellowship training programs (Table 3). There was a pattern of statistically significant differences between physician and other professional DEOs for associated health and nursing training programs on tasks across the 4 domains of expertise with physicians having lower mean ratings compared with other professionals. Generally, physician DEOs had higher task effectiveness when compared with other professionals for medical training programs, and other professionals had higher task effectiveness ratings than did physicians for associated health or nursing training programs.

Facilitators and Barriers

Seventy responses related to facilitators and barriers to DEO effectiveness were received (59 from physicians and 11 from other professionals). Most responses were categorized as individual level facilitators or barriers (53% for physician and 64% for other professionals). Only 3% of comments were categorized as external to the organization (all made by physicians). The themes were similar for both groups and were aggregated in Table 4. Facilitators included continuing education, having a mentor who works at a similar type of facility, maintaining balance and time management when working with different training programs, learning to work and develop relationships with training program directors, developing an overall picture of each type of health professions training program, holding regular meetings with all health training programs and academic affiliates, having a formal education service line with budget and staffing, facility executive leadership who are knowledgeable of the education mission and DEO role, having a national oversight body, and the DEO’s relationships with academic affiliates.

Barriers to role effectiveness at the individual DEO level included assignment of multiple roles and a focus on regulation and monitoring with little time for development of new programs and strategic planning. The organizational level barriers included difficulty getting core services to engage with health professions trainees and siloed education leadership. 

Discussion

DEOs oversee multiple health professions training programs within local facilities. The DEO is accountable to local VA facility leadership and a national education office to lead local health professions education at local facilities and integrate these educational activities across the national VA system.

The VA DEO role is similar to the Accreditation Council for Graduate Medical Education designated institutional official (DIO) except that the VA DEO provides oversight of > 40 health professions training programs.14,15 The VA DEO, therefore, has broader oversight than the DIO role that focuses only on graduate physician education. Similar to the DIO, the VA DEO role initially emphasized the enabling learning and aligning resources domains to provide oversight and administration of health professions training programs. Over time, both roles have expanded to include defining and ensuring healthy clinical learning environments, aligning educational resources and training with the institutional mission, workforce, and societal needs, and creating continuous educational improvement models.6,16,17 To accomplish these expanded goals, both the DEO and the DIO work closely with other educational leaders at the academic affiliate and the VA facility. As health professions education advances, there will be increased emphasis placed on delivering educational programs to improve clinical practice and health care outcomes.18

Our findings that DEO profession did not influence self-ratings of effectiveness to influence educational decisions or behaviors on general tasks applicable across health professions suggest that education and practice background are not factors influencing selfratings. Nor were self-ratings influenced by other factors. Since the DEO is a senior leadership position, candidates for the position already may possess managerial and leadership skills. In our sample, several individuals commented that they had prior education leadership positions, eg, training program director or had years of experience working in the VA. Similarly, having an academic appointment may not be important for the performance of general administrative tasks. However, an academic appointment may be important for effective performance of educational tasks, such as clinical teaching, didactic training, and curriculum development, which were not measured in this study.

The finding of differences in self-ratings between physicians and other professionals on profession-specific tasks for associated health and nursing suggests that physicians may require additional curriculum to enhance their knowledge in managing other professional educational programs. For nursing specifically, this finding could also reflect substantial input from the lead nurse executive in the facility. DEOs also identified practical ways to facilitate their work with multiple health professions that could immediately be put into practice, including developing relationships and enhancing communication with training program directors, faculty, and academic affiliates of each profession.

Taken together, the quantitative and qualitative findings indicate that despite differences in professional backgrounds, DEOs have high self-ratings of their own effectiveness to influence educational decisions and behaviors on general tasks they are expected to accomplish. There are some professionspecific tasks where professional background does influence self-perceived effectiveness, ie, physicians have higher self-ratings on physician-specific tasks and other professionals have higher self-ratings on associated health or nursing tasks. These perceived differences may be mitigated by increasing facilitators and decreasing barriers identified for the individual DEO, within the organization, and external to the organization.

Limitations Our findings should be interpreted with the following limitations in mind. The selfreport nature of the data opens the possibility of self-report bias or Dunning-Kruger effects where effectiveness ratings could have been overestimated by respondents.21 Although respondents were assured of their anonymity and that results would only be reported in the aggregate, there is potential for providing more positive responses on a needs assessment administered by the national education program office. We recommend further work be conducted to validate the needs assessment tool against other data collection methods, such as actual outcomes of educational effectiveness. Our study did not incorporate measures of educational effectiveness to determine whether self-perceived DEO effectiveness is translated to better trainee or learning outcomes. Before this can happen, educational policymakers must identify the most important facility-level learning outcomes. Since the DEO is a facility level educational administrator, learning efeffectiveness must be defined at the facility level. The qualitative findings could also be expanded through the application of more detailed qualitative methods, such as indepth interviews. The tasks rated by DEOs were based on OAA’s current definition of the DEO role.6 As the field advances, DEO tasks will also evolve.22-24

Conclusions

The DEO is a senior educational leadership role that oversees all health professions training in the VA. Our findings are supportive of individuals from various health disciplines serving in the VA DEO role with responsibilities that span multiple health profession training programs. We recommend further work to validate the instrument used in this study, as well as the application of qualitative methods like indepth interviews to further our understanding of the DEO role.

The US Department of Veterans Affairs (VA) operates the largest integrated health care system in the United States, providing physical and mental health care to more than 9 million veterans enrolled each year through a national system of inpatient, outpatient, and long-term care settings.1 As 1 of 4 statutory missions, the VA conducts the largest training effort for health professionals in cooperation with affiliated academic institutions. From 2016 through 2020, an average of 123,000 trainees from various professions received training at the VA.2 Physician residents comprised the largest trainee group (37%), followed by associated health students and residents (20%), and nursing professionals (21%).2 In VA, associated health professions include all health care disciplines other than allopathic and osteopathic medicine, dentistry, and nursing. The associated health professions encompass about 40 specialties, including audiology, dietetics, physical and occupational therapy, optometry, pharmacy, podiatry, psychology, and social work. 

The VA also trains a smaller number of advanced fellows to address specialties important to the nation and veterans health that are not sufficiently addressed by standard accredited professional training.3 The VA Advanced Fellowship programs include 22 postresidency, postdoctoral, and postmasters fellowships to physicians and dentists, and associated health professions, including psychologists, social workers, and pharmacists. 3 From 2015 to 2019, 57 to 61% of medical school students reported having a VA clinical training experience during medical school.4 Of current VA employees, 20% of registered nurses, 64% of physicians, 73% of podiatrists and optometrists, and 81% of psychologists reported VA training prior to employment.5

Health professions education is led by the designated education officer (DEO) at each VA facility.6 Also known as the associate chief of staff for education (ACOS/E), the DEO is a leadership position that is accountable to local VA facility executive leadership as well as the national Office of Academic Affiliations (OAA), which directs all VA health professions training across the US.6 At most VA facilities, the DEO oversees clinical training and education reporting directly to the facility chief of staff. At the same time, the ACOS/E is accountable to the OAA to ensure adherence with national education directives and policy. The DEO oversees trainee programs through collaboration with training program directors, faculty, academic affiliates, and accreditation agencies across > 40 health professions.

The DEO is expected to possess expertise in leadership attributes identified by the US Office of Personnel Management as essential to build a federal corporate culture that drives results, serves customers, and builds successful teams and coalitions within and outside the VA.7 These leadership attributes include leading change, leading people, driving results, business acumen, and building coalitions.7 They are operationalized by OAA as 4 domains of expertise required to lead education across multiple professions, including: (1) creating and sustaining an organizational work environment that supports learning, discovery, and continuous improvement; (2) aligning and managing fiscal, human, and capital resources to meet organizational learning needs; (3) driving learning and performance results to impact organizational success; and (4) leading change and transformation through positioning and implementing innovative learning and education strategies (Table 1).6

Designated Education Officer Domains of Expertise and Task Examples

In this article we describe the VA DEO leadership role and the tasks required to lead education across multiple professions within the VA health care system. Given the broad scope of leading educational programs across multiple clinical professions and the interprofessional backgrounds of DEOs across the VA, we evaluated DEO self-perceived effectiveness to impact educational decisions and behavior by professional discipline. Our evaluation question is: Are different professional education and practice backgrounds functionally capable of providing leadership over all education of health professions training programs? Finally, we describe DEOs perceptions of facilitators and barriers to performing their DEO role within the VA.

Methods

We conducted a mixed methods analysis of data collected by OAA to assess DEO needs within a multiprofessional clinical learning environment. The needs assessment was conducted by an OAA evaluator (NH) with input on instrument development and data analysis from OAA leadership (KS, MB). This evaluation is categorized as an operations activity based on VA Handbook 1200 where information generated is used for business operations and quality improvement. 8 The overall project was subject to administrative rather than institutional review board oversight.

A needs assessment tool was developed based on the OAA domains of expertise.6 Prior to its administration, the tool was piloted with 8 DEOs in the field and the survey shortened based on their feedback. DEOs were asked about individual professional characteristics (eg, clinical profession, academic appointment, type of health professions training programs at the VA site) and their self-perceived effectiveness in impacting educational decisions and behaviors on general and profession-specific tasks within each of the 4 domains of expertise on a 5-point Likert scale (1, not effective; 5, very effective). 6,9 The needs assessment also included an open-ended question asking respondents to comment on any issues they felt important to understanding DEO role effectiveness.

The needs assessment was administered online via SurveyMonkey to 132 DEOs via email in September and October 2019. The DEOs represented 148 of 160 VA facilities with health professions education; 14 DEOs covered > 1 VA facility, and 12 positions were vacant. Email reminders were sent to nonresponders after 1 week. At 2 weeks, nonresponders received telephone reminders and personalized follow-up emails from OAA staff. The response rate at the end of 3 weeks was 96%.

Data Analysis

Mixed methods analyses included quantitative analyses to identify differences in general and profession-specific self-ratings of effectiveness in influencing educational decisions and behaviors by DEO profession, and qualitative analyses to further understand DEO’s perceptions of facilitators and barriers to DEO task effectiveness.10,11 Quantitative analyses included descriptive statistics for all variables followed by nonparametric tests including χ2 and Mann- Whitney U tests to assess differences between physician and other professional DEOs in descriptive characteristics and selfperceived effectiveness on general and profession- specific tasks. Quantitative analyses were conducted using SPSS software, version 26. Qualitative analyses consisted of rapid assessment procedures to identify facilitators and barriers to DEO effectiveness by profession using Atlas.ti version 8, which involved reviewing responses to the open-ended question and assigning each response to predetermined categories based on the organizational level it applied to (eg, individual DEO, VA facility, or external to the organization).12,13 Responses within categories were then summarized to identify the main themes.

Results 

Completed surveys were received from 127 respondents representing 139 VA facilities. Eighty percent were physicians and 20% were other professionals, including psychologists, pharmacists, dentists, dieticians, nurses, and nonclinicians. There were no statistically significant differences between physician and other professional DEOs in the percent working full time or length of time spent working in the position. About one-third of the sample had been in the position for < 2 years, one-third had been in the position for 2 to < 5 years, and one-third had been in the role for ≥ 5 years. Eighty percent reported having a faculty appointment with an academic affiliate. While 92% of physician DEOs had a faculty appointment, only 40% of other professional DEOs did (P < .001). Most faculty appointments for both groups were with a school of medicine. More physician DEOs than other professionals had training programs at their site for physicians (P = .003) and dentists (P < .001), but there were no statistically significant differences for having associated health, nursing, or advanced fellowship training programs at their sites. Across all DEOs, 98% reported training programs at their site for associated health professions, 95% for physician training, 93% for nursing training, 59% for dental training, and 48% for advanced fellowships.

Self-Perceived Effectiveness

There were no statistically significant differences between physician and other professional DEOs on self-perceived effectiveness in impacting educational decisions or behaviors for general tasks applicable across professions (Table 2). This result held even after controlling for length of time in the position and whether the DEO had an academic appointment. Generally, both groups reported being effective on tasks in the enabling learning domain, including applying policies and procedures related to trainees who rotate through the VA and maintaining adherence with accreditation agency standards across health professions. Mean score ranges for both physician and other professional DEOs reported moderate effectiveness in aligning resources effectiveness questions (2.45-3.72 vs 2.75-3.76), driving results questions (3.02-3.60 vs 3.39-3.48), and leading change questions (3.12-3.50 vs 3.42-3.80).

For profession-specific tasks, effectiveness ratings between the 2 groups were generally not statistically significant for medical, dental, and advanced fellowship training programs (Table 3). There was a pattern of statistically significant differences between physician and other professional DEOs for associated health and nursing training programs on tasks across the 4 domains of expertise with physicians having lower mean ratings compared with other professionals. Generally, physician DEOs had higher task effectiveness when compared with other professionals for medical training programs, and other professionals had higher task effectiveness ratings than did physicians for associated health or nursing training programs.

Facilitators and Barriers

Seventy responses related to facilitators and barriers to DEO effectiveness were received (59 from physicians and 11 from other professionals). Most responses were categorized as individual level facilitators or barriers (53% for physician and 64% for other professionals). Only 3% of comments were categorized as external to the organization (all made by physicians). The themes were similar for both groups and were aggregated in Table 4. Facilitators included continuing education, having a mentor who works at a similar type of facility, maintaining balance and time management when working with different training programs, learning to work and develop relationships with training program directors, developing an overall picture of each type of health professions training program, holding regular meetings with all health training programs and academic affiliates, having a formal education service line with budget and staffing, facility executive leadership who are knowledgeable of the education mission and DEO role, having a national oversight body, and the DEO’s relationships with academic affiliates.

Barriers to role effectiveness at the individual DEO level included assignment of multiple roles and a focus on regulation and monitoring with little time for development of new programs and strategic planning. The organizational level barriers included difficulty getting core services to engage with health professions trainees and siloed education leadership. 

Discussion

DEOs oversee multiple health professions training programs within local facilities. The DEO is accountable to local VA facility leadership and a national education office to lead local health professions education at local facilities and integrate these educational activities across the national VA system.

The VA DEO role is similar to the Accreditation Council for Graduate Medical Education designated institutional official (DIO) except that the VA DEO provides oversight of > 40 health professions training programs.14,15 The VA DEO, therefore, has broader oversight than the DIO role that focuses only on graduate physician education. Similar to the DIO, the VA DEO role initially emphasized the enabling learning and aligning resources domains to provide oversight and administration of health professions training programs. Over time, both roles have expanded to include defining and ensuring healthy clinical learning environments, aligning educational resources and training with the institutional mission, workforce, and societal needs, and creating continuous educational improvement models.6,16,17 To accomplish these expanded goals, both the DEO and the DIO work closely with other educational leaders at the academic affiliate and the VA facility. As health professions education advances, there will be increased emphasis placed on delivering educational programs to improve clinical practice and health care outcomes.18

Our finding that DEO profession did not influence self-ratings of effectiveness in shaping educational decisions or behaviors on general tasks applicable across health professions suggests that education and practice background do not drive self-ratings on these tasks; nor were self-ratings influenced by the other factors we examined. Since the DEO is a senior leadership position, candidates for the position may already possess managerial and leadership skills. In our sample, several individuals commented that they had held prior education leadership positions (eg, training program director) or had years of experience working in the VA. Similarly, having an academic appointment may not be important for the performance of general administrative tasks. However, an academic appointment may be important for the effective performance of educational tasks, such as clinical teaching, didactic training, and curriculum development, which were not measured in this study.

The differences in self-ratings between physicians and other professionals on profession-specific tasks for associated health and nursing suggest that physicians may require additional curricula to enhance their knowledge of managing educational programs in other professions. For nursing specifically, this finding could also reflect substantial input from the facility's lead nurse executive. DEOs also identified practical ways to facilitate their work with multiple health professions that could be put into practice immediately, including developing relationships and enhancing communication with the training program directors, faculty, and academic affiliates of each profession.

Taken together, the quantitative and qualitative findings indicate that despite differences in professional backgrounds, DEOs have high self-ratings of their effectiveness in influencing educational decisions and behaviors on the general tasks they are expected to accomplish. There are some profession-specific tasks where professional background does influence self-perceived effectiveness: physicians have higher self-ratings on physician-specific tasks, and other professionals have higher self-ratings on associated health or nursing tasks. These perceived differences may be mitigated by increasing the facilitators and decreasing the barriers identified for the individual DEO, within the organization, and external to the organization.

Limitations

Our findings should be interpreted with the following limitations in mind. The self-report nature of the data opens the possibility of self-report bias or Dunning-Kruger effects, whereby respondents could have overestimated their effectiveness.21 Although respondents were assured of their anonymity and that results would be reported only in the aggregate, there is potential for more positive responses on a needs assessment administered by the national education program office. We recommend further work to validate the needs assessment tool against other data collection methods, such as actual measures of educational effectiveness. Our study did not incorporate measures of educational effectiveness to determine whether self-perceived DEO effectiveness translates into better trainee or learning outcomes. Before this can happen, educational policymakers must identify the most important facility-level learning outcomes. Since the DEO is a facility-level educational administrator, learning effectiveness must be defined at the facility level. The qualitative findings could also be expanded through the application of more detailed qualitative methods, such as in-depth interviews. The tasks rated by DEOs were based on OAA’s current definition of the DEO role.6 As the field advances, DEO tasks will also evolve.22-24

Conclusions

The DEO is a senior educational leadership role that oversees all health professions training in the VA. Our findings support individuals from various health disciplines serving in the VA DEO role, with responsibilities that span multiple health professions training programs. We recommend further work to validate the instrument used in this study, as well as the application of qualitative methods, such as in-depth interviews, to further our understanding of the DEO role.

References

1. US Department of Veterans Affairs, Veterans Health Administration. Updated April 18, 2022. Accessed May 6, 2022. https://www.va.gov/health/aboutvha.asp

2. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Health professions education: academic year 2019-2020. Published 2020. Accessed May 6, 2022. https://www.va.gov/OAA/docs/OAA_Statistics_2020.pdf

3. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Advanced fellowships and professional development. Updated November 26, 2021. Accessed May 6, 2022. https://www.va.gov/oaa/advancedfellowships/advanced-fellowships.asp

4. Association of American Medical Colleges. Medical school graduation questionnaire, 2019 all schools summary report. Published July 2019. Accessed May 6, 2022. https://www.aamc.org/system/files/2019-08/2019-gq-all-schools-summary-report.pdf

5. US Department of Veterans Affairs, National Center for Organization Development. VA all employee survey. Published 2019. Accessed May 6, 2022. https://www.va.gov/NCOD/VAworkforcesurveys.asp

6. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Education leaders in the VA: the role of the designated education officer (DEO). Published December 2019. Accessed May 6, 2022. https://www.va.gov/OAA/docs/DEO_Learning_Leader_2019.pdf

7. US Office of Personnel Management. Policy, data oversight: guide to senior executive service qualifications. Published 2010. Accessed May 6, 2022. https://www.opm.gov/policy-data-oversight/senior-executive-service/executive-core-qualifications/

8. US Department of Veterans Affairs, Office of Research and Development. Program guide: 1200.21 VHA operations activities that may constitute research. Published January 9, 2019. Accessed May 6, 2022. https://www.research.va.gov/resources/policies/ProgramGuide-1200-21-VHA-Operations-Activities.pdf

9. Riesenberg LA, Rosenbaum PF, Stick SL. Competencies, essential training, and resources viewed by designated institutional officials as important to the position in graduate medical education [published correction appears in Acad Med. 2006 Dec;81(12):1025]. Acad Med. 2006;81(5):426-431. doi:10.1097/01.ACM.0000222279.28824.f5

10. Palinkas LA, Mendon SJ, Hamilton AB. Innovations in mixed methods evaluations. Annu Rev Public Health. 2019;40:423-442. doi:10.1146/annurev-publhealth-040218-044215

11. Tashakkori A, Creswell JW. Exploring the nature of research questions in mixed methods research. J Mix Methods Res. 2007;1(3):207-211. doi:10.1177/1558689807302814

12. Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855-866. doi:10.1177/104973230201200611

13. Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516.

14. Bellini L, Hartmann D, Opas L. Beyond must: supporting the evolving role of the designated institutional official. J Grad Med Educ. 2010;2(2):147-150. doi:10.4300/JGME-D-10-00073.1

15. Riesenberg LA, Rosenbaum P, Stick SL. Characteristics, roles, and responsibilities of the designated institutional official (DIO) position in graduate medical education [published correction appears in Acad Med. 2006 Dec;81(12):1025] [published correction appears in Acad Med. 2006 Mar;81(3):274]. Acad Med. 2006;81(1):8-19. doi:10.1097/00001888-200601000-00005

16. Group on Resident Affairs Core Competency Task Force. Institutional GME leadership competencies. 2015. Accessed May 6, 2022. https://www.aamc.org/system/files/c/2/441248-institutionalgmeleadershipcompetencies.pdf

17. Weiss KB, Bagian JP, Nasca TJ. The clinical learning environment: the foundation of graduate medical education. JAMA. 2013;309(16):1687-1688. doi:10.1001/jama.2013.1931

18. Beliveau ME, Warnes CA, Harrington RA, et al. Organizational change, leadership, and the transformation of continuing professional development: lessons learned from the American College of Cardiology. J Contin Educ Health Prof. 2015;35(3):201-210. doi:10.1002/chp.21301

19. World Health Organization. Framework for Action on Interprofessional Education and Collaborative Practice. Published September 1, 2020. Accessed May 10, 2022. https://www.who.int/publications/i/item/framework-for-action-on-interprofessional-education-collaborative-practice

20. Weiss K, Passiment M, Riordan L, Wagner R; for the National Collaborative for Improving the Clinical Learning Environment IP-CLE Report Work Group. Achieving the optimal interprofessional clinical learning environment: proceedings from an NCICLE symposium. Published January 18, 2019. Accessed May 6, 2022. doi:10.33385/NCICLE.0002

21. Althubaiti A. Information bias in health research: definition, pitfalls, and adjustment methods. J Multidiscip Healthc. 2016;9:211-217. Published May 4, 2016. doi:10.2147/JMDH.S104807

22. Gilman SC, Chokshi DA, Bowen JL, Rugen KW, Cox M. Connecting the dots: interprofessional health education and delivery system redesign at the Veterans Health Administration. Acad Med. 2014;89(8):1113-1116. doi:10.1097/ACM.0000000000000312

23. Health Professions Accreditors Collaborative. Guidance on developing quality interprofessional education for the health professions. Published February 1, 2019. Accessed May 6, 2022. https://healthprofessionsaccreditors.org/wp-content/uploads/2019/02/HPACGuidance02-01-19.pdf

24. Watts BV, Paull DE, Williams LC, Neily J, Hemphill RR, Brannen JL. Department of Veterans Affairs Chief Resident in Quality and Patient Safety Program: a model to spread change. Am J Med Qual. 2016;31(6):598-600. doi:10.1177/1062860616643403

The Expansion of Associated Health Training in the VA

The US Department of Veterans Affairs (VA) is the largest health care delivery system in the United States, comprising 1293 sites of care, including 171 medical centers.1 One of the VA's 4 statutory missions is to train health care professionals (HCPs) to meet the needs of the VA and the nation.2 Through partnerships with more than 1800 accredited colleges, universities, and training programs, the VA annually provides training to nearly 118,000 health professions trainees (HPTs) across a variety of health care professions, all of whom provide direct clinical care to veterans.3

In the VA, the Office of Academic Affiliations (OAA) is charged with overseeing health professions training and the VA’s partnership with medical and associated health (AH) professions schools, which was first codified in Policy Memorandum No. 2 in 1946.4,5 Given the scope and breadth of health professions education offered through the VA, OAA is in a unique position to address health care shortage areas as well as influence the educational standards for certain professions.

Many of these health care professions fall under the rubric of AH, which includes mental health (MH) specialties, rehabilitative specialties, and others. These professions are critical as health care in the United States expands, with increased specialization and a growing emphasis on coordinating care through interprofessional teams. During the 2019/2020 academic year, the VA provided clinical training to approximately 21,000 AH HPTs from > 40 professions, with just over 20% receiving financial support through the OAA. Of the HPTs who train at the VA without compensation, most spend shorter amounts of time in VA clinical rotations, are in pregraduate-degree education programs where payment for clinical rotations is not expected, and may not be eligible for hire immediately on completion of their clinical training experience. The 17 funded professions have been strategically selected by the OAA to ensure a robust pipeline of HCPs to meet the needs of veterans and the nation.

To meet the demand for AH professionals (AHPs), the OAA has implemented targeted expansion over the past 10 years. While not exhaustive, this paper describes several expansion efforts based on VA special initiatives, including enhancing clinical access in rural settings and shifting toward postgraduate-degree training and specialization. By aligning expansion with VA priorities as well as broader trends in health care, the OAA can help ensure a supply of well-trained AHPs who have developed the requisite competencies to contribute to the nation's health care needs. Further, expansion can help train and recruit health professionals who can be hired into VA positions ready to care for the complex needs of veterans.

Associated Health Professionals

Overseen by the OAA, AH expansion is designed to address the specific needs of the VA and the US health care system. Data from VA Workforce Management and Consulting (WMC) show that VA employment of AHPs grew from 87,351 in fiscal year (FY) 2010 to 119,120 as of April 2020, representing an average yearly growth rate of 3.4% and total growth of 36%. Bureau of Labor Statistics projections for 2019/2029 suggest that certain AH professions will have 10-year growth rates of 20% or more to meet the changing health care needs of patients, especially as the population ages; the growth rates for many AHPs far surpass the anticipated 4% growth rate for physicians (Table).6,7 Based on the 10-year average growth rate, the VA WMC expects an additional 52,283 AHPs to be hired by the VA by FY 2030 (Kali Clark, Veterans Health Administration Workforce Management and Consulting Office, email communication, May 28, 2020).
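
As a rough check, the growth percentages above can be reproduced from the reported head counts. The sketch below is illustrative only; it assumes a simple (not compounded) average over an approximately 10-year window, since the exact method WMC used is not stated here.

```python
# Illustrative arithmetic check of the AHP workforce figures cited above.
# Assumes a simple average over an approximately 10-year window; WMC's exact
# averaging window and method (simple vs compound) are not stated in the text.
fy2010_ahps = 87_351
apr2020_ahps = 119_120

total_growth = (apr2020_ahps - fy2010_ahps) / fy2010_ahps   # ~0.36, ie, ~36%
avg_yearly_growth = total_growth / 10                        # simple per-year average

print(f"Total growth: {total_growth:.1%}")
print(f"Approximate average yearly growth: {avg_yearly_growth:.1%}")
```

Depending on the exact window and averaging method used, the simple yearly average lands in roughly the 3.4% to 3.6% range, consistent with the reported figure.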

OAA AH Education Funded Professions: Academic Year 2020/2021 and Projected Growth table

One of the driving forces behind this growth is the move toward using AHPs to supplement health care for a variety of health conditions.8,9 Examples include the integration of rehabilitation professionals, alternative care professionals (eg, massage therapists, practitioners who offer training in yoga and meditation), chiropractors, MH professionals, and pharmacists into the treatment of chronic pain; the use of a wider range of professionals in the treatment of MH conditions; and the integration of MH professionals into traditional medical settings, such as primary care. This intentional move to a better-integrated model of interprofessional care is apparent in many other health care systems throughout the United States. Within the VA, the shift may be most evident in the introduction of the Whole Health model of care, which uses an interprofessional team to assess and care for veterans through a personalized health plan addressing medical and MH conditions as well as behavioral, social, or spiritual concerns.10 The Whole Health model provides veterans with access to a variety of health care services, including but not limited to MH services, spiritual interventions, exercise-based programs, yoga, meditation, and nutrition counseling.

The OAA and its AH education division have focused expansion on meeting the increased need for MH and rehabilitation providers, enhancing interprofessional education, and emphasizing postgraduate-degree clinical training. This focus reflects trends seen in health care training throughout the nation and serves as a model for how to address those trends intentionally. Specific to the VA, focused expansion plans have allowed the OAA to address VA strategic initiatives such as pain management and caring for rural veterans.

Funded Training Positions

As a result of recent AH expansion efforts, there has been a 33% increase in stipend-funded positions during the past 10 years, a rate that directly corresponds with the growth of AHPs in the VA. These expansion efforts can have a particularly positive impact in highly rural and underserved areas, where recruiting providers remains challenging.

The OAA launched the Mental Health Education Expansion (MHEE) initiative in 2012, which has now added 782 funded training slots across 10 health professions, 8 of which are psychology, pharmacy, chaplaincy, professional MH counseling, marriage and family therapy (MFT), social work (SW), occupational therapy (OT), and physician assistant (PA). Through the MHEE initiative, the VA has established funded internships for licensed professional mental health counselors and marriage and family therapists, as these professions are targeted for expanding the overall MH workforce in the VA. The OAA currently funds more than 50 total HPT positions for these 2 professions with an aim of increasing their recruitment to the VA MH workforce over the next decade. The MHEE is aligned with specified VA priorities to train a future VA workforce prepared for interprofessional collaboration and clinical care in an increasingly integrated and complex environment. This expansion effort also aligns with an increasing understanding of the importance of addressing the MH needs of our nation by ensuring there is an adequate supply of competent, well-trained clinicians entering the workforce.

The OAA has created and expanded residencies and fellowships in multiple rehabilitation professions, including chiropractic, physical therapy (PT), and OT. With the increased focus on the management of chronic pain in the nation combined with a specific emphasis on this clinical need in the VA, chiropractors have been deemed essential HCPs. In 2014, the VA established 5 chiropractic residency programs while partnering with the Council on Chiropractic Education to develop accreditation standards for residency training. OAA’s efforts have yielded 5 accredited residency programs, the first in the United States. In 2020, the VA doubled the number of available chiropractic residency programs, and future expansion is anticipated. Since 2010, PT residencies have expanded from 1 to 28 programs (42 funded positions) across 4 board certification specialties: cardiovascular-pulmonary, geriatric, neurologic, and orthopedic. Similarly, the VA was one of the first organizations to achieve accreditation for OT fellowships; there are currently 5 accredited OT fellowship programs across 3 areas of practice: assistive technology, MH, and physical rehabilitation. The VA OT fellowship program focused on assistive technology is the only program in the United States at this time.

Interprofessional Education

As one of the primary focus areas for AH expansion, interprofessional education (IPE) has been recognized as increasingly important for the provision of health care and the development of HPT programs. IPE develops professionals who appreciate the roles of diverse professions and who can use teamwork to enhance clinical outcomes for patients.11 A growing number of professional organizations, many representing AHPs, also support the Interprofessional Education Collaborative.12 Collaboration across HCPs is an important way of reducing health care costs by enhancing clinical outcomes, communication, and teamwork.13-16 The VA and the nation's health care system benefit from the by-products of interprofessional collaboration through investment in targeted training programs. In each phase of AH expansion, special consideration was given to applicant programs offering unique and innovative clinical and educational experiences consistent with the promotion of interprofessional care. As a result, increased numbers of AH HPTs have engaged in team-based clinical care.

Pain Management Pharmacy

The AH education division's efforts to align expansion with high-priority, agency-wide initiatives have resulted in the growth of pharmacy residency positions focused on pain management. Pharmacy postgraduate year (PGY) 2 residencies focusing on opioid reduction are an example of VA efforts to improve the management of chronic pain and the long-term risks of opioid use during this national public health crisis.17 These residency programs focus on strategies to reduce the use of opioid medications in clinical settings and on teaching effective clinical interventions to reduce rates of opioid addiction in veterans while still recognizing the need to identify and treat chronic pain. Before expansion efforts in 2018, there were 6 pharmacy residency programs in the VA focused on opioid use reduction; 8 pharmacy PGY2 residency positions were added in academic year 2019/2020, and an additional 5 positions are being added in academic year 2021/2022 with the explicit goal of managing patients with high-risk chronic pain.

Rural Health

The lack of MH providers in rural areas has received much attention and is particularly important to the VA because veterans are more likely to live in less populated areas.18 The VA mandate to address this population was codified by the creation of the Office of Rural Health in 2006 via 38 USC § 7308.19 Creating health professions training programs in rural settings provides HPTs the opportunity to learn professional competencies and train with faculty knowledgeable about this population, all of which provides a comprehensive training experience and serves as a recruitment pathway for hiring HPTs into staff positions at these sites.19

When the MHEE initiative began, not all regions of the country had funded VA psychology training programs, and this geographic gap in psychology training contributed to recruitment difficulties for psychologists in rural areas. As a result, the OAA's request for proposal process highlighted and incentivized rurality when considering funding for new training programs. The OAA defined rurality as the number of patients served by the proposed health care facility who lived in a rural or highly rural zip code according to VA Support Service Center Capital Assets data.20 Consequently, VA psychology doctoral internships expanded to be available in all states, the District of Columbia, and Puerto Rico, and MH training programs were started in the highly rural states of Montana and Wyoming. These expansion efforts promise to be an essential component of addressing gaps in coverage in rural settings, as noted in recent research.21
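
The rurality criterion described above is essentially a count (or share) of a facility's patients whose home zip codes are classified as rural or highly rural. The sketch below is a hypothetical illustration of that kind of calculation; the field names and zip-code classification table are invented for the example and do not come from VA Support Service Center Capital Assets data.

```python
# Hypothetical illustration of a facility-level rurality measure: the share of a
# facility's patients living in zip codes classified as rural or highly rural.
# The classification table and patient records below are invented for the example.
RURAL_CLASSES = {"rural", "highly_rural"}

zip_classification = {          # zip code -> classification (hypothetical)
    "59001": "highly_rural",
    "82001": "rural",
    "10001": "urban",
}

patients = [                     # one record per patient served by the facility
    {"id": 1, "zip": "59001"},
    {"id": 2, "zip": "10001"},
    {"id": 3, "zip": "82001"},
    {"id": 4, "zip": "59001"},
]

rural_count = sum(
    1 for p in patients
    if zip_classification.get(p["zip"]) in RURAL_CLASSES
)
rural_share = rural_count / len(patients)
print(f"{rural_count} of {len(patients)} patients ({rural_share:.0%}) "
      "live in rural or highly rural zip codes")
```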

Pregraduate to Postgraduate Programs

The OAA AH education division supports a significant number of pregraduate-degree and postgraduate-degree training positions. Some professions, such as psychology, pharmacy, SW, PT, speech pathology, OT, and nutrition/dietetics, receive funding at both levels of training. More recently, the OAA has started to move funding from pregraduate-degree to postgraduate-degree positions, specifically within professions where pregraduate funding is uncommon for both federal and nonfederal training positions. The effort is designed to better align stipend-paid training programs with the VA Professional Qualification Standards and the final level of training required for employment in the VA.22 This means that HPTs receive stipend support during the highest level of their clinical training, whether before degree conferral and eligibility for VA employment or while participating in a postgraduate-degree residency or fellowship.

Additionally, this shift in focus and the resulting internal assessment of professions have allowed the OAA to fund more specialized training opportunities, which sometimes go beyond what is required by accrediting bodies or for recruitment into VA positions. For example, the OAA is supporting SW fellowship programs and PA residency positions to allow for greater specialization within these professions; the accrediting agencies for both professions have recently finalized their accreditation standards, and the OAA played a role in moving these standards forward.

While postgraduate residencies and fellowships are not required for all AH HPTs or for employment in the VA, some professions are shifting toward postgraduate training to build advanced competencies in specialized areas. Participation in a residency or fellowship training program affords HPTs additional time and diverse clinical experiences to acquire clinical skills, all while under the supervision of a highly trained practitioner. This additional training also allows for a longitudinal assessment of the HPT to ensure that the HPT's knowledge, skills, and abilities align with expectations should they pursue VA employment.

In academic year 2019/2020, the OAA AH education division in conjunction with the PA national program office transitioned the entirety of the PA pregraduate-degree student positions (415 funded positions) to residency positions, increasing residency positions from 19 to 32 funded positions. This shift in emphasis for funding did not negatively impact the total number of pregraduate PA students receiving training in the VA and has created a pipeline of residency graduates who are ready to enter VA staff positions. To date, the VA has 14 PA residency programs across 3 specialties: emergency medicine (EM), MH, and primary care/geriatrics. Of these tracks, the VA offers 5 EM and 4 MH residencies that position graduates to be eligible for specialty certification. The National Commission on Certification of Physician Assistants established Certificates of Added Qualifications (CAQ) to recognize and document specialty knowledge, skills, and experience. The VA MH residency programs have been established to align with the CAQ expectations, and residents immediately qualify to take the CAQ examination after the completion of training.

Currently, the same process of moving pregraduate to postgraduate funding is being implemented for PT and OT. Within the PT profession, there is increased momentum toward residency and fellowship training programs to respond to the changing complexity of the health care system and to reduce the need for complex care to be provided by non-VA providers in the community.23 Both PT and OT have entered the initial phases of transitioning to residency or fellowship-funded positions. The OAA is partnering with these professions to move positions to the postgraduate-degree level within the next 3 years, with a commensurate increase in funding. Initial data indicate that 80% of graduated VA PT residents are board-certification eligible, and 89% of those who are eligible passed the examination on their first attempt.
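
Those two percentages combine as conditional proportions: if roughly 80% of graduates are board-certification eligible and 89% of that eligible group pass on the first attempt, then roughly 71% of all graduates pass on their first attempt. A minimal check of that arithmetic:

```python
# Combining the reported PT residency outcomes: the pass rate is conditional on
# eligibility, so the overall first-attempt pass share is the product of the two.
eligible_share = 0.80          # graduates who are board-certification eligible
first_attempt_pass = 0.89      # of those eligible, share passing on first attempt

overall_first_attempt = eligible_share * first_attempt_pass
print(f"{overall_first_attempt:.0%} of all graduates pass on the first attempt")  # ~71%
```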

Since 2013, VA psychology training has also seen growth in postgraduate-degree residencies. Psychology residency positions have increased 99%, to 453 funded positions. This growth reflects increased specialization in neuropsychology, geropsychology, rehabilitation psychology, and health psychology. Additionally, postgraduate residencies meet most jurisdictional requirements for postdoctoral supervised experience and better prepare HPTs to enter the specialty staff positions necessary to care for aging veterans.

Additional professions are being targeted for postgraduate-degree training programs, including dietetics and speech pathology, to align with upcoming changes in the qualification standards for employment. While the process to transition positions to postgraduate-degree training programs can take 3 to 5 years, the outcomes are expected to result in better prepared HPTs who can fill staff vacancies in the VA.

Conclusions

Through its funding and oversight of numerous professions, the OAA is uniquely situated to adapt its portfolio to meet the needs of the VA and the nation. Over the past 10 years, the OAA has expanded its total number of HPT positions to enhance interprofessional care, respond to the VA’s strategic initiatives, address the care needs of rural veterans, and shift positions to postgraduate training programs. The OAA’s investment in high-quality training programs builds a strong health care workforce ready to meet the needs of an increasingly complex and integrated health care environment.

The OAA anticipates future expansion, especially related to promoting rural training opportunities and shifting to postgraduate training programs as a means of promoting advanced health care and health system competencies while continuing to align with workforce projections. Furthermore, while the VA All Employee Survey (AES) provides data on the percentage of VA staff who participated in an OAA training program, the range across AH professions is wide: for example, about 37% of rehabilitative staff and 72% of VA psychologists reported having an OAA training experience. To maximize the hiring of HPTs, the OAA will continue its partnership with the WMC to enact programs aimed at streamlining the hiring process so that veterans have access to HCPs who are specifically trained to work with them.

References

1. US Department of Veterans Affairs. Providing health care for veterans. Updated April 23, 2021. Accessed July 15, 2021. https://www.va.gov/health

2. Veterans’ Benefits. 38 USC §7301 and §7302 (1991). Accessed May 18, 2020. https://www.govinfo.gov/content/pkg/USCODE-2018-title38/pdf/USCODE-2018-title38-partV-chap73-subchapI-sec7302.pdf

3. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Health professions education: academic year 2019-2020. Published 2021. Accessed July 15, 2021. https://www.va.gov/OAA/docs/OAA_Statistics_2020.pdf

4. US Department of Veterans Affairs, VHA Office of Academic Affiliations. VA Policy Memorandum # 2. Policy in association of veterans’ hospitals with medical schools. Published January 30, 1946. Accessed October 13, 2020. https://www.va.gov/oaa/Archive/PolicyMemo2.pdf

5. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Mission of the office of academic affiliations. Updated September 24, 2019. Accessed July 15, 2021. https://www.va.gov/oaa/oaa_mission.asp

6. US Bureau of Labor Statistics, Office of Occupational Statistics and Employment Projections. Occupational outlook handbook: healthcare occupations. Updated May 14, 2021. Accessed July 15, 2021. https://www.bls.gov/ooh/healthcare/home.htm

7. Windmill IM, Freeman BA. Demand for audiology services: 30-yr projections and impact on academic programs. J Am Acad Audiol. 2013;24(5):407-416. doi:10.3766/jaaa.24.5.7

8. US Department of Health and Human Services, Health Resources and Services Administration, Bureau of Health Workforce. HRSA health workforce: behavioral health workforce projections, 2017-2030. Accessed July 15, 2021. https://bhw.hrsa.gov/sites/default/files/bureau-health-workforce/data-research/bh-workforce-projections-fact-sheet.pdf

9. Centers for Disease Control and Prevention, National Center for Health Statistics. NCHS data brief, No. 325. Use of yoga, meditation, and chiropractors among US adults aged 18 and over. Published November 2018. Accessed September 24, 2020. https://www.cdc.gov/nchs/data/databriefs/db325-h.pdf

10. US Department of Veterans Affairs, Veterans Health Administration Whole Health. Updated July 6, 2021. Accessed July 15, 2021. https://www.va.gov/wholehealth

11. Clark KM. Interprofessional education: making our way out of the silos. Respir Care. 2018;63(5): 637-639. doi:10.4187/respcare.06234

12. Interprofessional Education Collaborative. What is interprofessional education (IPE)? Accessed July 15, 2021. https://www.ipecollaborative.org/about-us

13. Nester J. The importance of interprofessional practice and education in the era of accountable care. N C Med J. 2016;77(2):128-132. doi:10.18043/ncm.77.2.128

14. Hardin L, Kilian A, Murphy E. Bundled payments for care improvement: preparing for the medical diagnosis-related groups. J Nurs Adm. 2017;47(6):313-319. doi:10.1097/NNA.0000000000000492

15. Guraya SY, Barr H. The effectiveness of interprofessional education in healthcare: a systematic review and meta-analysis. Kaohsiung J Med Sci. 2018;34(2):125-184. doi:10.1016/j.kjms.2017.12.009

16. Ateah CA, Snow W, Wenter P, et al. Stereotyping as a barrier to collaboration: does interprofessional education make a difference? Nurse Educ Today. 2011;31(2):208-213. doi:10.1016/j.nedt.2010.06.004

17. US Department of Veterans Affairs, US Department of Defense. VA/DoD Clinical Practice Guideline for Managing Opioid Therapy for Chronic Pain. Published May 7, 1991. Updated February 2017. Accessed July 15, 2021. https://www.va.gov/HOMELESS/nchav/resources/docs/mental-health/substance-abuse/VA_DoD-CLINICAL-PRACTICE-GUIDELINE-FOR-OPIOID-THERAPY-FOR-CHRONIC-PAIN-508.pdf

18. US Department of Veterans Affairs, Office of Rural Health. VHA office of rural health. Updated March 17, 2021. Accessed July 15, 2021. https://www.ruralhealth.va.gov

19. Curran V, Rourke J. The role of medical education in the recruitment and retention of rural physicians. Med Teach. 2004;26(3):265-272. doi:10.1080/0142159042000192055

20. US Department of Veterans Affairs. VHA Support Service Center Capital Assets. Updated December 1, 2020. Accessed July 15, 2021. https://www.data.va.gov/dataset/VHA-Support-Service-Center-Capital-Assets-VSSC-/2fr5-sktm

21. Domino ME, Lin CC, Morrisey JP, et al. Training psychologists for rural practice: exploring opportunities and constraints. J Rural Health. 2019;35(1):35-41. doi:10.1111/jrh.12299

22. US Department of Veterans Affairs. VA Directive 5005: Staffing. Published March 4, 2020. Accessed July 15, 2021. https://www.va.gov/vapubs/viewPublication.asp?Pub_ID=1140&FType=2

23. Furze JA, Freeman BA. Physical therapy and fellowship education: reflections on the past, present, and future. Phys Ther. 2016;96(7):949-960. doi:10.2522/ptj.20150473

Author and Disclosure Information

Erin Patel is an Acting Chief, Health Professions Education; Jeffrey Bates is an Acting Director, Associated Health; Jocelyn Holguin and Stacy Pommer are National Affiliations Officers, Associated Health; Samuel King is a Statistician, Associated Health; Paul Greenberg is an Acting Chief Academic Affiliations Officer; Karen Sanders is a Senior Advisor; all in Office of Academic Affiliations, Veterans Health Administration, US Department of Veterans Affairs (VA). Anthony Albanese is Chief of Medicine, VA Northern California Health Care System. Marjorie Bowman is an Acting Assistant Under Secretary for Health, Discovery, Education and Affiliate Networks, Veterans Health Administration, US Department of Veterans Affairs. Paul Greenberg is a Professor of Surgery (Ophthalmology), Alpert Medical School, Brown University in Providence, Rhode Island. Anthony Albanese is a Clinical Professor of Medicine (Gastroenterology, Hepatology, Addiction Medicine) at UC Davis School of Medicine in Sacramento, California. Karen Sanders is a Professor, Internal Medicine, Division of Rheumatology, Allergy and Immunology at Virginia Commonwealth University School of Medicine in Richmond, Virginia. Marjorie Bowman is an Emeritus Professor at University of Pennsylvania in Philadelphia.
Correspondence: Erin Patel (erin.patel@va.gov)

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.


Currently, the same process to move pregraduate to postgraduate funding is being implemented for PT and OT. Within the PT profession, there is increased momentum toward residency and fellowship training programs to respond to the changing complexity of the health care systemand reduce the need of complex care to be provided by non-VA providers in the community.23 Both PT and OT have entered the initial phases of transitioning to residency or fellowship-funded positions. The OAA is partnering with these professions to move positions to postgraduate degree within the next 3 years with a commensurate increase in funding. The initial data indicate that 80% of graduated VA PT residents are board-certification eligible, and 89% of those who are eligible passed the examination on their first attempt.

Since 2013, the VA psychology training also has realized a growth in postgraduate-degree residencies. Psychology residency positions have increased 99% to 453 funded positions. This growth represents increased specialization in neuropsychology, geropsychology, rehabilitation psychology, and health psychology. Additionally, postgraduate residencies meet most jurisdictional requirements for postdoctoral supervised experience and better prepare HPTs to enter specialty staff positions that are necessary to care for aging veterans.

Additional professions are being targeted for postgraduate-degree training programs, including dietetics and speech pathology, to align with upcoming changes in the qualification standards for employment. While the process to transition positions to postgraduate-degree training programs can take 3 to 5 years, the outcomes are expected to result in better prepared HPTs who can fill staff vacancies in the VA.

Conclusions

Through its funding and oversight of numerous professions, the OAA is uniquely situated to adapt its portfolio to meet the needs of the VA and the nation. Over the past 10 years, the OAA has expanded its total number of HPT positions to enhance interprofessional care, respond to the VA’s strategic initiatives, address the care needs of rural veterans, and shift positions to postgraduate training programs. The OAA’s investment in high-quality training programs builds a strong health care workforce ready to meet the needs of an increasingly complex and integrated health care environment.

The OAA anticipates future expansion, especially related to promoting rural training opportunities and shifting to postgraduate training programs as a means of promoting advanced health care and health system competencies while continuing to align with workforce projections. Furthermore, while there are data on the percentage of VA staff who participated in OAA training program through the VA All Employee Survey (AES), the range for AH professions is wide. For example, about 37% of rehabilitative staff reported participating in an OAA training program, and 72% of VA psychologists reported having an OAA training experience. To maximize the hiring of HPTs, OAA will continue its partnership with WMC to enact programs aimed at streamlining the hiring process so that veterans have access to HCPs who are specifically trained to work with them.

The US Department of Veterans Affairs (VA) is the largest health care delivery system in the United States, comprising 1293 sites of care, including 171 medical centers.1 One of the 4 statutory missions of the VA is to train health care professionals (HCPs) to meet the needs of the VA and the nation.2 Through partnerships with more than 1800 accredited colleges, universities, and training programs, the VA provides training annually to nearly 118,000 health professions trainees (HPTs) across a variety of health care professions, all of whom provide direct clinical care to veterans.3

In the VA, the Office of Academic Affiliations (OAA) is charged with overseeing health professions training and the VA’s partnership with medical and associated health (AH) professions schools, which was first codified in Policy Memorandum No. 2 in 1946.4,5 Given the scope and breadth of health professions education offered through the VA, OAA is in a unique position to address health care shortage areas as well as influence the educational standards for certain professions.

Many of these health care professions fall under the rubric of AH, which includes mental health (MH) specialties, rehabilitative specialties, and others. These professions are critical in the expanding world of US health care, with its increased specialization and emphasis on coordination of care by interprofessional teams. During the 2019/2020 academic year, the VA provided clinical training to approximately 21,000 AH HPTs from > 40 professions, with just over 20% receiving financial support through the OAA. Of the HPTs who train at the VA without compensation, most spend shorter amounts of time in clinical rotations in the VA, are in pregraduate-degree education programs where payment for clinical rotations is not expected, and may not be eligible for hire immediately on completion of their clinical training experience. The 17 funded professions have been strategically selected by the OAA to ensure a robust pipeline of HCPs to meet the needs of veterans and the nation.

To meet the demand for AH professionals (AHPs), the OAA has implemented targeted expansion over the past 10 years. While not exhaustive, this paper describes several expansion efforts based on VA special initiatives, including enhancing clinical access in rural settings and shifting toward postgraduate-degree training and specialization. By aligning expansion with VA priorities as well as broader trends in health care, the OAA can ensure a supply of well-trained AHPs who have developed the requisite competencies to contribute to our nation’s health care needs. Further, expansion can help train and recruit health professionals who can be hired into VA positions ready to care for the complex needs of veterans.

Associated Health Professionals

Overseen by the OAA, AH expansion is designed to address the specific needs of the VA and the US health care system. Data from VA Workforce Management and Consulting (WMC) show that VA employment of AHPs has grown from 87,351 AHPs in fiscal year (FY) 2010 to 119,120 as of April 2020. This represents an average yearly growth rate of 3.4% and a total growth rate of 36%. Bureau of Labor Statistics projections for 2019/2029 suggest that certain AHPs are expected to have 10-year growth rates of 20% or more to meet the changing health care needs of patients, especially as the population ages; the growth rates for many AHPs far surpass that of physicians, which is anticipated to be 4% (Table).6,7 The VA WMC expects an additional 52,283 AHPs will be hired by the VA by FY 2030 based on the 10-year average growth rate (Kali Clark, Veterans Health Administration Workforce Management and Consulting Office, email communication, May 28, 2020).

Table. OAA AH Education Funded Professions: Academic Year 2020/2021 and Projected Growth
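For readers who wish to verify the arithmetic, the short sketch below recomputes the growth figures from the numbers quoted above (87,351 AHPs in FY 2010 and 119,120 as of April 2020) and projects forward under an assumed constant compound growth rate. The time span and the projection method are simplifying assumptions; the WMC projection of 52,283 additional hires comes from its own workforce model rather than this calculation.

```python
# Rough reproduction of the AHP workforce growth figures quoted above.
# Inputs are the numbers stated in the text; the projection method is an
# assumption (simple compound growth), not the WMC's actual model.

fy2010 = 87_351      # AHPs employed in FY 2010
apr2020 = 119_120    # AHPs employed as of April 2020
years = 9.5          # approximate span from FY 2010 to April 2020

total_growth = apr2020 / fy2010 - 1                    # ~0.36, ie, ~36% total growth
annual_rate = (apr2020 / fy2010) ** (1 / years) - 1    # roughly 3.3%-3.4% per year

# Hypothetical FY 2030 projection under continued compound growth.
projected_fy2030 = apr2020 * (1 + annual_rate) ** 10
additional_hires = projected_fy2030 - apr2020

print(f"Total growth: {total_growth:.1%}")
print(f"Average annual growth: {annual_rate:.1%}")
# Note: this simple approach yields fewer additional hires than the quoted
# 52,283, so the WMC projection presumably rests on a different basis.
print(f"Additional AHPs by FY 2030 (assumed compound growth): {additional_hires:,.0f}")
```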

One of the driving forces behind this growth is the move toward using AHPs to supplement health care for a variety of health conditions.8,9 Examples include the integration of rehabilitation professionals, alternative care professionals (eg, massage therapists, practitioners who offer training in yoga and meditation), chiropractors, MH professionals, and pharmacists in the treatment of chronic pain; the use of a wider range of professionals in the treatment of MH conditions; and the integration of MH professionals into traditional medical settings, such as primary care. This intentional move to a better-integrated model of interprofessional care is apparent in many other health care systems throughout the United States. Within the VA, this shift may be most evident in the introduction of the Whole Health model of care, which uses an interprofessional team to assess and care for veterans through a personalized health plan addressing medical and MH conditions as well as behavioral, social, or spiritual concerns.10 The model provides veterans with access to a variety of health care services, including but not limited to MH services, spiritual interventions, exercise-based programs, yoga, meditation, and nutrition counseling.

The OAA AH education division has focused expansion to meet the increased need for MH and rehabilitation providers, to enhance interprofessional education, and to emphasize postgraduate-degree clinical training. This focus reflects trends in health care training throughout the nation, and the intentional pivot serves as a model for how to address those trends deliberately. Specific to the VA, focused expansion plans have allowed the OAA to address VA strategic initiatives such as pain management and caring for rural veterans.

Funded Training Positions

As a result of recent AH expansion efforts, there has been a 33% increase in stipend-funded positions during the past 10 years, a rate that directly corresponds with the growth of AHPs in the VA. These expansion efforts can have a particularly positive impact in highly rural and underserved areas, where recruiting providers remains challenging.

 

 

The OAA launched the Mental Health Education Expansion (MHEE) initiative in 2012, which has now added 782 funded training slots across 10 health professions, 8 of which are psychology, pharmacy, chaplaincy, professional MH counseling, marriage and family therapy (MFT), social work (SW), occupational therapy (OT), and physician assistant (PA). Through the MHEE initiative, the VA has established funded internships for licensed professional mental health counselors and marriage and family therapists, as these professions are targeted for expanding the overall MH workforce in the VA. The OAA currently funds more than 50 total HPT positions for these 2 professions with an aim of increasing their recruitment to the VA MH workforce over the next decade. The MHEE is aligned with specified VA priorities to train a future VA workforce prepared for interprofessional collaboration and clinical care in an increasingly integrated and complex environment. This expansion effort also aligns with an increasing understanding of the importance of addressing the MH needs of our nation by ensuring there is an adequate supply of competent, well-trained clinicians entering the workforce.

The OAA has created and expanded residencies and fellowships in multiple rehabilitation professions, including chiropractic, physical therapy (PT), and OT. With the increased focus on the management of chronic pain in the nation combined with a specific emphasis on this clinical need in the VA, chiropractors have been deemed essential HCPs. In 2014, the VA established 5 chiropractic residency programs while partnering with the Council on Chiropractic Education to develop accreditation standards for residency training. OAA’s efforts have yielded 5 accredited residency programs, the first in the United States. In 2020, the VA doubled the number of available chiropractic residency programs, and future expansion is anticipated. Since 2010, PT residencies have expanded from 1 to 28 programs (42 funded positions) across 4 board certification specialties: cardiovascular-pulmonary, geriatric, neurologic, and orthopedic. Similarly, the VA was one of the first organizations to achieve accreditation for OT fellowships; there are currently 5 accredited OT fellowship programs across 3 areas of practice: assistive technology, MH, and physical rehabilitation. The VA OT fellowship program focused on assistive technology is the only program in the United States at this time.

Interprofessional Education

As one of the primary focus areas for AH expansion, interprofessional education (IPE) has been recognized as increasingly important for the provision of health care and the development of HPT programs. IPE can develop professionals who appreciate the roles of diverse professions and can use teamwork to enhance clinical outcomes for patients.11 A growing number of professional organizations, many representing AHPs, support the Interprofessional Education Collaborative.12 Collaboration across HCPs is an important way of reducing health care costs by enhancing clinical outcomes, communication, and teamwork.13-16 The VA and the nation’s health care system benefit from the by-products of interprofessional collaboration through investment in targeted training programs. In each phase of the AH expansion, special consideration was given to applicant programs offering unique and innovative clinical and educational experiences consistent with the promotion of interprofessional care. As a result, increased numbers of AH HPTs have engaged in team-based clinical care.

Pain Management Pharmacy

The AH education division’s efforts to align expansion with high-priority, agency-wide initiatives have resulted in the growth of pharmacy residency positions focused on pain management. Pharmacy postgraduate year (PGY) 2 residencies focusing on opioid reduction are an example of VA efforts to improve the response to chronic pain and to the long-term risks of opioid use during this national public health crisis.17 These residency programs focus on strategies to reduce the use of opioid medications in the clinical setting and on teaching effective clinical interventions for reducing the rates of opioid addiction in veterans while still recognizing the need to identify and treat chronic pain. Before expansion efforts in 2018, there were 6 pharmacy residency programs focused on opioid use reduction in the VA; 8 pharmacy PGY2 residency positions were added in academic year 2019/2020, and an additional 5 positions are being added in academic year 2021/2022 with the explicit goal of managing patients with high-risk chronic pain.

Rural Health

The lack of MH providers in rural areas has received much attention and is particularly important in the VA because veterans are more likely to live in less populated areas.18 The VA mandate to address this population was codified by the creation of the Office of Rural Health in 2006 via 38 USC § 7308.19 Creating health professions training programs in rural settings provides HPTs the opportunity to learn professional competencies and train with faculty knowledgeable about this population—all of which provide a comprehensive training experience and serve as a recruitment pathway to hire HPTs into staff positions at these sites.19

When MHEE was initiated, not all regions of the country had funded VA psychology training programs, and this geographic gap in psychology training contributed to recruitment difficulties for psychologists in rural areas. As a result, the OAA’s request for proposal process highlighted and incentivized rurality when considering funding for new training programs. The OAA defined rurality as the number of patients served by the proposed health care facility who lived in a rural or highly rural zip code according to VA Support Service Center Capital Assets data.20 Consequently, VA psychology doctoral internships are now available in all states, the District of Columbia, and Puerto Rico, and MH training programs were started in the highly rural states of Montana and Wyoming. These expansion efforts promise to be an essential component of addressing gaps in coverage in rural settings, as noted in recent research.21
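As an illustration of how the rurality criterion could be operationalized, the sketch below counts the patients served by a proposed facility who live in rural or highly rural zip codes. The zip code classifications and patient records are hypothetical placeholders; actual determinations rely on VA Support Service Center Capital Assets data.

```python
# Hypothetical illustration of the rurality measure used to prioritize
# training-program proposals: count the patients served by a facility who
# live in a rural or highly rural zip code. The classifications below are
# placeholders, not VA Support Service Center Capital Assets data.

RURAL_ZIPS = {
    "59301": "highly rural",   # placeholder classification
    "82604": "rural",
    "59718": "rural",
    "80202": "urban",
}

patients = [
    {"id": 1, "zip": "59301"},
    {"id": 2, "zip": "80202"},
    {"id": 3, "zip": "82604"},
    {"id": 4, "zip": "59718"},
]

def rurality_count(patients, zip_lookup):
    """Number of patients living in zip codes classified as rural or highly rural."""
    return sum(
        1 for p in patients
        if zip_lookup.get(p["zip"]) in ("rural", "highly rural")
    )

score = rurality_count(patients, RURAL_ZIPS)
print(f"{score} of {len(patients)} patients live in rural or highly rural zip codes")
```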

Pregraduate to Postgraduate Programs

The OAA AH education division supports a significant number of pregraduate-degree and postgraduate-degree training positions. Some professions, such as psychology, pharmacy, SW, PT, speech pathology, OT, and nutrition/dietetics, receive funding at both levels of training. More recently, the OAA has started to move funding from pregraduate-degree to postgraduate-degree positions, specifically within professions where pregraduate funding is uncommon for both federal and nonfederal training positions. The effort is designed to better align stipend-paid training programs with the VA Professional Qualification Standards and the final level of training required for employment in the VA.22 This means that HPTs receive stipend support during the highest level of their clinical training, whether before degree conferral and eligibility for VA employment or while participating in a postgraduate-degree residency or fellowship.

 

 

Additionally, this shift in focus and the resulting internal assessment of professions has allowed the OAA to fund more specialized training opportunities, which sometimes go beyond what is required by accrediting bodies or for recruitment into VA positions. For example, the OAA is supporting SW fellowship programs and PA residency positions to allow for greater specialization within these professions; the accrediting agencies for both professions have recently finalized their accreditation standards, and the OAA played a role in moving these standards forward.

While postgraduate residencies and fellowships are not required for all AH HPTs or for employment in the VA, there is a shift in some professions to encourage postgraduate training in advanced competencies in specialized areas. Participation in a residency or fellowship training program affords HPTs additional time and diverse clinical experiences to acquire clinical skills, all while under the supervision of a highly trained practitioner. This additional training also allows for a longitudinal assessment of HPTs to ensure that their knowledge, skills, and abilities align with expectations should they pursue VA employment.

In academic year 2019/2020, the OAA AH education division in conjunction with the PA national program office transitioned the entirety of the PA pregraduate-degree student positions (415 funded positions) to residency positions, increasing residency positions from 19 to 32 funded positions. This shift in emphasis for funding did not negatively impact the total number of pregraduate PA students receiving training in the VA and has created a pipeline of residency graduates who are ready to enter VA staff positions. To date, the VA has 14 PA residency programs across 3 specialties: emergency medicine (EM), MH, and primary care/geriatrics. Of these tracks, the VA offers 5 EM and 4 MH residencies that position graduates to be eligible for specialty certification. The National Commission on Certification of Physician Assistants established Certificates of Added Qualifications (CAQ) to recognize and document specialty knowledge, skills, and experience. The VA MH residency programs have been established to align with the CAQ expectations, and residents immediately qualify to take the CAQ examination after the completion of training.

Currently, the same process of moving pregraduate to postgraduate funding is being implemented for PT and OT. Within the PT profession, there is increased momentum toward residency and fellowship training programs to respond to the changing complexity of the health care system and to reduce the need for complex care to be provided by non-VA providers in the community.23 Both PT and OT have entered the initial phases of transitioning to residency- or fellowship-funded positions. The OAA is partnering with these professions to move positions to the postgraduate-degree level within the next 3 years with a commensurate increase in funding. Initial data indicate that 80% of graduated VA PT residents are board-certification eligible, and 89% of those who are eligible passed the examination on their first attempt.

Since 2013, VA psychology training also has seen growth in postgraduate-degree residencies. Psychology residency positions have increased 99% to 453 funded positions. This growth represents increased specialization in neuropsychology, geropsychology, rehabilitation psychology, and health psychology. Additionally, postgraduate residencies meet most jurisdictional requirements for postdoctoral supervised experience and better prepare HPTs to enter the specialty staff positions that are necessary to care for aging veterans.

Additional professions are being targeted for postgraduate-degree training programs, including dietetics and speech pathology, to align with upcoming changes in the qualification standards for employment. While the process to transition positions to postgraduate-degree training programs can take 3 to 5 years, the outcomes are expected to result in better prepared HPTs who can fill staff vacancies in the VA.

Conclusions

Through its funding and oversight of numerous professions, the OAA is uniquely situated to adapt its portfolio to meet the needs of the VA and the nation. Over the past 10 years, the OAA has expanded its total number of HPT positions to enhance interprofessional care, respond to the VA’s strategic initiatives, address the care needs of rural veterans, and shift positions to postgraduate training programs. The OAA’s investment in high-quality training programs builds a strong health care workforce ready to meet the needs of an increasingly complex and integrated health care environment.

The OAA anticipates future expansion, especially related to promoting rural training opportunities and shifting to postgraduate training programs, as a means of promoting advanced health care and health system competencies while continuing to align with workforce projections. Furthermore, while the VA All Employee Survey (AES) provides data on the percentage of VA staff who participated in an OAA training program, the range for AH professions is wide: about 37% of rehabilitative staff reported participating in an OAA training program, whereas 72% of VA psychologists reported having an OAA training experience. To maximize the hiring of HPTs, the OAA will continue its partnership with WMC to enact programs aimed at streamlining the hiring process so that veterans have access to HCPs who are specifically trained to work with them.

References

1. US Department of Veterans Affairs. Providing health care for veterans. Updated April 23, 2021. Accessed July 15, 2021. https://www.va.gov/health

2. Veterans’ Benefits. 38 USC §7301 and §7302 (1991). Accessed May 18, 2020. https://www.govinfo.gov/content/pkg/USCODE-2018-title38/pdf/USCODE-2018-title38-partV-chap73-subchapI-sec7302.pdf

3. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Health professions education: academic year 2019-2020. Published 2021. Accessed July 15, 2021. https://www.va.gov/OAA/docs/OAA_Statistics_2020.pdf

4. US Department of Veterans Affairs, VHA Office of Academic Affiliations. VA Policy Memorandum # 2. Policy in association of veterans’ hospitals with medical schools. Published January 30, 1946. Accessed October 13, 2020. https://www.va.gov/oaa/Archive/PolicyMemo2.pdf

5. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Mission of the office of academic affiliations. Updated September 24, 2019. Accessed July 15, 2021. https://www.va.gov/oaa/oaa_mission.asp

6. US Bureau of Labor Statistics, Office of Occupational Statistics and Employment Projections Occupational Outlook Handbook. Healthcare occupations. Updated May 14, 2021. Accessed July 15, 2021. https://www.bls.gov/ooh/healthcare/home.htm

7. Windmill IM, Freeman BA. Demand for audiology services: 30-yr projections and impact on academic programs. J Am Acad Audiol. 2013;24(5):407-416. doi:10.3766/jaaa.24.5.7

8. US Department of Health and Human Services, Health Resources and Services Administration, Bureau of Health Workforce. HRSA health workforce: behavioral health workforce projections, 2017-2030. Accessed July 15, 2021. https://bhw.hrsa.gov/sites/default/files/bureau-health-workforce/data-research/bh-workforce-projections-fact-sheet.pdf

9. Centers for Disease Control and Prevention, National Center for Health Statistics. NCHS data brief, No. 325. Use of yoga, meditation, and chiropractors among US adults aged 18 and over. Published November 2018. Accessed September 24, 2020. https://www.cdc.gov/nchs/data/databriefs/db325-h.pdf

10. US Department of Veterans Affairs, Veterans Health Administration Whole Health. Updated July 6, 2021. Accessed July 15, 2021. https://www.va.gov/wholehealth

11. Clark KM. Interprofessional education: making our way out of the silos. Respir Care. 2018;63(5): 637-639. doi:10.4187/respcare.06234

12. Interprofessional Education Collaborative. What is interprofessional education (IPE)? Accessed July 15, 2021. https://www.ipecollaborative.org/about-us

13. Nester J. The importance of interprofessional practice and education in the era of accountable care. N C Med J. 2016;77(2):128-132. doi:10.18043/ncm.77.2.128

14. Hardin L, Kilian A, Murphy E. Bundled payments for care improvement: preparing for the medical diagnosis-related groups. J Nurs Adm. 2017;47(6):313-319. doi:10.1097/NNA.0000000000000492

15. Guraya SY, Barr H. The effectiveness of interprofessional education in healthcare: a systematic review and meta-analysis. Kaohsiung J Med Sci. 2018;34(2):125-184. doi:10.1016/j.kjms.2017.12.009

16. Ateah CA, Snow W, Wenter P, et al. Stereotyping as a barrier to collaboration: does interprofessional education make a difference? Nurse Educ Today. 2011;31(2):208-213. doi:10.1016/j.nedt.2010.06.004

17. US Department of Veterans Affairs, US Department of Defense. VA/DoD Clinical Practice Guideline for Managing Opioid Therapy for Chronic Pain. Published May 7, 1991. Updated February 2017. Accessed July 15, 2021. https://www.va.gov/HOMELESS/nchav/resources/docs/mental-health/substance-abuse/VA_DoD-CLINICAL-PRACTICE-GUIDELINE-FOR-OPIOID-THERAPY-FOR-CHRONIC-PAIN-508.pdf

18. US Department of Veterans Affairs, Office of Rural Health. VHA office of rural health. Updated March 17, 2021. Accessed July 15, 2021. https://www.ruralhealth.va.gov

19. Curran V, Rourke J. The role of medical education in the recruitment and retention of rural physicians. Med Teach. 2004;26(3):265-272. doi:10.1080/0142159042000192055

20. US Department of Veterans Affairs. VHA Support Service Center Capital Assets. Updated December 1, 2020. Accessed July 15, 2021. https://www.data.va.gov/dataset/VHA-Support-Service-Center-Capital-Assets-VSSC-/2fr5-sktm

21. Domino ME, Lin CC, Morrisey JP, et al. Training psychologists for rural practice: exploring opportunities and constraints. J Rural Health. 2019;35(1):35-41. doi:10.1111/jrh.12299

22. US Department of Veterans Affairs. VA Directive 5005: Staffing. Published March 4, 2020. Accessed July 15, 2021. https://www.va.gov/vapubs/viewPublication.asp?Pub_ID=1140&FType=2

23. Furze JA, Freeman BA. Physical therapy and fellowship education: reflections on the past, present, and future. Phys Ther. 2016;96(7):949-960. doi:10.2522/ptj.20150473



Changes in Functional Status Related to Health Maintenance Visits to Family Physicians

Article Type
Changed
Mon, 01/14/2019 - 11:10

 

BACKGROUND: Physicians do not provide preventive care at the level recommended by national organizations. This may be because of physicians’ lack of training or low level of confidence or because of patients’ fears, beliefs, and lack of health knowledge.

METHODS: We used an observational prospective cohort study in an academic family practice office to investigate changes in patients’ functional status associated with receiving recommendations to change behavior from family physicians. Patients 18 years and older presenting for health maintenance visits to family physicians completed a functional status instrument and a brief intake questionnaire by telephone before their visit. After the visit patients were randomized to a debriefing interview or an observation-only group. The interview included the Patient/Doctor Interaction Scale and an assessment of whether patients received a recommendation to change behavior.

RESULTS: One hundred thirty-two patients were randomized to the debriefing group, and of those, 92% completed assessments at 3 months. Patients reporting recommendations to change behavior had lower scores at 1 and 3 months for mental health, social health, and self-esteem and higher anxiety and depression scores than patients not receiving these recommendations.

CONCLUSIONS: There are declines in social and emotional functional status in patients presenting to family practice clinicians for health maintenance visits during which recommendations for behavioral change were made. Such declines may inhibit physicians from making recommendations for behavioral change or patients from accepting them.

Preventive care is not sought by patients or provided by physicians at the level recommended by national organizations.1 This may be because of inadequate attention resulting from a physician’s lack of training,2-4 forgetting to provide preventive care,5 a negative attitude toward such care,6 or low confidence in its effectiveness.7 The low level of preventive services delivery may also be caused by inadequate reimbursement to physicians,8 out-of-pocket costs to patients, patient fears of finding disease, patients’ health beliefs,9 lack of agreement between the physician and the patient regarding the need for behavioral change,10 and lack of information given to the patient.11 It is likely that the interaction among multiple factors results in the lack of provision of preventive care. Our previous work11 suggested an additional explanation: We found statistically significant declines in emotional and social functioning of patients who had been advised to change health-related behaviors, even though no organic illness was diagnosed. Similar results were reported by Stoate,12 who found that patients with no acute complaints felt worse after receiving routine preventive care. Assuming the declines in functioning are representative of a widespread phenomenon, this may explain patients’ resistance to purely preventive care and provide insight into why physicians cite overall lack of gratification and satisfaction with providing it.2,3

These findings may be an extension of other known negative effects of preventive medicine. It is known that the diagnosis (or labeling) of asymptomatic patients with diseases is associated with negative outcomes. For example, the diagnosis of asymptomatic hypertension has been associated with a greater number of sick days,13 as well as lower income.14,15 The investigators did not find a decrease in psychological well-being, however. It is likely that similar functional changes could occur with the diagnoses of other conditions. For example, the cessation of smoking can cause physical symptoms (withdrawal) and be associated with the onset of depression.16 Starting a health promotion habit, such as physical exercise, may cause temporary symptoms as well. Patients may feel guilty if not engaging in the healthy behavior that was recommended or disappointed if results from behavioral change are not immediate. Changes in the way that a family functions may result from the knowledge of a new diagnosis or new behaviors, such as dietary changes.

We hypothesized that the pressure exerted by the physician’s advice challenges patients with limited confidence in their ability to manage change and causes a decrease in social and emotional functioning. To investigate this phenomenon, we conducted an observational study involving patients presenting for health maintenance in an academic family practice center.

Methods

Our study was completed in 3 phases: recruitment and baseline data collection, postvisit data collection, and telephone follow-up of patients at 1 month and 3 months. Patients from all socioeconomic strata aged 18 years and older presenting for health maintenance visits at the Family Practice Center of Bowman Gray School of Medicine were eligible. The Family Practice Center is an academic office where family-physician faculty, residents, and physician assistants care for patients. Appointment lists were screened to identify likely candidates, excluding those who were younger than 18 years and those presenting for acute care. Patients who met the inclusion criteria were contacted by telephone before their clinic visits, and after providing informed consent were given the Duke Health Profile (DUKE)17 and a brief intake questionnaire. We used the intake questionnaire to gather information on the reasons for visiting the clinician, previous experience with the clinician, and visit expectations. The DUKE profile is a 17-item questionnaire with 6 health measures (physical, mental, social, general, perceived health, and self-esteem) and 4 dysfunction measures (anxiety, depression, pain, and disability). The DUKE takes a broad view of health, has been validated in family practice populations, and is easy to administer.18

 

 

Enrolled patients were randomized into 2 groups in a 1:2 ratio: an observation-only group and a group that received a debriefing interview after their visit to the family practice clinician. As they left the examination room, the interview group was asked to complete a brief questionnaire that included a patient satisfaction instrument, the Patient/Doctor Interaction Scale (PDIS),19 and a debriefing instrument. The debriefing instrument addressed the patient’s views about the visit, specific behavioral changes recommended by the provider (with no preset response set), patient’s perception of the need for behavioral change, methods suggested to accomplish the change, and the patient’s perception of the likelihood of success in accomplishing the change. The PDIS is a 17-item patient satisfaction scale that assesses the portion of patient satisfaction involving interactions with the physician; we modified it slightly by adding 3 more general satisfaction questions. It was developed and validated in a family practice office and has been shown to be related to higher recall rates.20 The scale has balanced positive and negative questions, uses a 5-point scale, and has an adequate completion rate. The clinicians were also asked to complete a brief questionnaire characterizing their perceptions of encounters with patients enrolled in both groups.

There was telephone follow-up of all patients at 1 month and 3 months after the visit to the clinician. A maximum of 6 attempts was made to contact participants. The telephone calls included repeat administration of the DUKE to assess functional status, questions about additional visits to the clinician or other healthcare providers, and about progress toward achieving recommended behavioral changes. We included all data in the analysis, in concordance with the intention-to-treat principle.

Data from completed forms were entered into a database by a trained, experienced research assistant. Before entry, each form was inspected for completeness, ambiguity of responses, or other irregularities. All unclear responses were referred to the investigators. Range checks were conducted periodically as data were entered to detect errors and were repeated as part of the data cleaning procedures before analysis. Descriptive statistics were calculated on all variables, including the DUKE subscale scores and the PDIS scores. Initial statistical analysis was carried out to test for differences in DUKE means between patients randomized to the observation-only group and the debriefing group. Differences in the PDIS and DUKE subscale means were tested at baseline, 1 month, and 3 months using a repeated measures approach (SAS subroutine PROC MIXED, SAS Institute, Cary, NC).
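The repeated measures analysis was performed in SAS with PROC MIXED. As an analogous illustration, the sketch below specifies a comparable mixed model in Python with statsmodels, treating the baseline, 1-month, and 3-month DUKE scores as repeated observations within patients. The data file, column names, and the inclusion of a recommendation-group term are assumptions for illustration and do not reproduce the original SAS code.

```python
# Analogous repeated measures analysis in Python (the study used SAS PROC MIXED).
# Each patient contributes up to 3 observations (baseline, 1 month, 3 months);
# a random intercept per patient accounts for within-patient correlation.
# File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per patient per assessment.
df = pd.read_csv("duke_long.csv")  # columns: patient_id, time, mental_health,
                                   # recommendation, race, age, sex, education

model = smf.mixedlm(
    "mental_health ~ C(time) * recommendation + C(race) + age + C(sex) + education",
    data=df,
    groups=df["patient_id"],   # repeated observations nested within patient
)
result = model.fit()
print(result.summary())
```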

Results

We recruited participants during a 9-month period beginning in September 1995. Including the 3-month follow-up period, data collection was completed in 12 months. Of the 208 patients recruited, 68 (34%) were randomized to the observation-only group, and 132 (66%) were assigned to receive the debriefing interview. In the observation-only group, 64 (94%) patients were successfully contacted for the 1-month assessment, and 62 (91%) for the 3-month assessment. Of the 132 patients assigned to the debriefing group, 2 refused to complete the debriefing interview. Of those completing the debriefing interview, 123 (93%) were successfully contacted for the 1-month assessment, and 122 (92%) for the 3-month assessment.

The average age of the enrolled patients was 47.4 years (standard deviation [SD]=11.9, range=19-76 years), and 68.0% were women; 32.5% were African American, 65.0% were white, and the rest represented a variety of ethnic groups (Table 1). The average educational level of the patients was 14.2 years (SD=3.0). Overall, 12.8% reported annual family incomes of less than $15,000, 13.4% reported incomes between $15,000 and $25,000, and more than half reported incomes of $35,000 or more. The interview group had higher income levels than the observation group; otherwise there were no significant differences.

The reasons for patient visits were Papanicolaou tests, pelvic examinations, and routine health maintenance, although some of these visits incorporated follow-up of a chronic condition. No statistically significant differences were found between the study groups by reason for visit. After the visits, 63 (48%) of the 132 patients in the interview group reported that their clinician recommended a specific type of behavioral change. Of the patients reporting having been given a recommendation, 11 were asked to quit smoking; 15 to change medications or the way that medications were taken; 33 to alter their diet or exercise level, or to lose weight; and 4 received recommendations related to stress reduction. Additional miscellaneous changes were also recommended. Interestingly, patients sometimes reported some form of mental health behavioral change, but alcohol was mentioned only rarely, although we had expected behavioral changes related to alcohol use or abuse to be mentioned frequently.

 

 

PDIS assessment only occurred at baseline. The mean PDIS score for all observations was 55.8 (SD=5.5, range=35-66). No significant differences in satisfaction were found between the observation and interview groups, and there was no difference in patient satisfaction found between patients who reported receiving a recommendation for behavioral change and those who did not.

We performed the analysis of the outcomes (ie, changes in functional status by recommendation to change behavior) in 2 steps. In the first step, we examined the effect of the debriefing interview by comparing the DUKE subscale scores for the interview and observation groups at 1 month and 3 months. No significant differences were found for any of the DUKE subscales. Based on results indicating that the debriefing interviews had no effect on outcomes, we created 2 groups for subsequent analyses. One group consisted of patients who reported that they received behavioral change recommendations. The second group included all other patients enrolled in the study. This is a conservative method that would tend to underestimate differences because a portion of the observation group may have received behavioral change recommendations. Analysis of differences between these 2 groups across the 3 observations was carried out using a repeated measures approach. A repeated measures model (using SAS PROC MIXED) was fit for each of the DUKE subscale means. The model included time of assessment (baseline, 1 month, 3 months), race, age, sex, and educational level as independent variables. Table 2 shows the means for each of the DUKE subscales at each assessment for patients who did and did not receive recommendations to change behavior. As Table 2 indicates, no significant differences (P <.05) were found between patients who did and did not receive a recommendation to change behavior at baseline. At the 1-month follow-up assessment, the mean scores for mental health, social health, and self-esteem were lower for patients who received a behavioral change recommendation. At 3 months, the differences in mental health, social health, and self-esteem scores found at 1 month persisted, and the mean scores for anxiety, anxiety/depression, and depression were worse for patients receiving recommendations. Race and sex were not significantly associated with differences for any of the DUKE subscale scores shown in Table 2. Education, however, was significantly associated with every subscale. Age was associated with self-esteem. The functional status scores for patients who received recommendations to change behavior declined as educational level increased. Older, better-educated patients who received behavioral change recommendations were the most likely to report reduced functional status and self-esteem during the 3-month follow-up period. Finally, as the results in Table 2 show, the effects observed in the patients who received recommendations were consistent at the 1-month and 3-month assessments. The differences were all in the negative direction except for the pain subscale score. Reports of pain decreased from the baseline to the 1-month assessment and then were higher at the 3-month assessment.

The Effect of Specific Recommendations

To further investigate influences on DUKE subscale scores, we evaluated the effect of the type of change recommended by the clinician for patients given a specific type of recommendation.

Four categories of recommended changes were found: medication compliance, diet and exercise, smoking cessation, and stress control. Table 3 shows the mean DUKE subscale scores by the type of change recommended. As Table 3 indicates, statistically significant differences among the mean scores by type of change recommended were found at the 3-month assessment for disability, mental health, and self-esteem. Post hoc examination of the mean scores in these cases suggested that disability scores for patients who were asked to stop smoking were worse than those of patients asked to improve medication compliance or change their diet. Mental health scores for patients who were asked to stop smoking were significantly poorer than for those asked to try to control stress. No pairwise group differences were identified for self-esteem scores in the post hoc comparisons.

Discussion

The results from our study provide further support for the finding that certain elements of patient functioning decline 3 months after behavioral change is recommended by a clinician. These results confirm our previous findings11 and those reported by others.12 We found that social and emotional functioning varied according to whether the patient reported that their clinician recommended that they change a behavior. The observation group had functional declines similar to those of the entire interview group, falling between those of patients who reported being asked to make behavioral changes (who had functional declines) and those who reported no behavioral change recommendations (who had no functional declines). This suggests that the debriefing after the office visit had no impact on the outcomes and reinforces that it was the behavioral change recommendation that led to the declines in self-reported functional status.

 

 

Negative reactions by patients to recommendations for behavioral change may be one reason that such changes are not recommended more often by clinicians. Patients may also resist getting recommendations for behavioral change by not raising the subject with the clinician or by avoiding visits where such recommendations would be likely to occur (such as physical examinations). Physicians may avoid recommending changes to reduce potential conflict with the patient.

Declines in functioning were greatest when the physician recommended that the patient stop smoking, and potential declines were seen for patients with diet or exercise recommendations. No detectable functional decline occurred for patients who had been given recommendations to make medication-related changes. This suggests that behavioral changes are perceived as being more difficult for patients than simple medication changes and that being faced with a recommendation to make a behavioral change is associated with lower levels of functioning. The patients who reported being told to quit smoking had marked increases in depression, anxiety, and overall disability at 3 months after the visit. These results are consistent with the literature showing that smoking and depression are interlinked.16 Our study also supports the use of antidepressants such as bupropion21 for helping patients to quit smoking, but our study was completed before the drug was approved and commonly used for this indication.

Limitations

The results from our study are subject to limitations and should be interpreted cautiously. First, we did not enroll sufficient numbers to allow testing of the association between success in making the specific behavioral changes and functional decline. It is possible that the decline in functional status is limited to patients who were not successful in changing behavior. It is also possible that the declines are limited to a select group of behavioral changes, such as smoking. Further study is needed to test such associations. Second, data were collected at only 1 family practice center. The patient population is diverse but not necessarily representative of the community at large. Our study was also completed in North Carolina, a state known for tobacco consumption, which may have affected the results pertaining to smoking-related behavioral changes and functional decline. Further studies should examine whether functional declines reverse over a longer time frame and whether there is a relationship with successful behavioral change. Research should also consider whether physician behaviors can have a positive impact on a patient’s functional status.

Acknowledgments

Our research was supported by a grant from the American Academy of Family Physicians Foundation (#G9609). We would like to thank Dottie Greek for excellence in project management.

References

1. Taira DA, Safran DG, Seto TB, Rogers WH, Tarlov AR. The relationship between patient income and physician discussion of health risk behaviors. JAMA 1997;278:1412-7.

2. WB, Belcher DW, Inui TS. Implementing preventive care in clinical practice: problems for managers, clinicians, and patients. Med Care 1981;38:195-216.

3. AS. Encouraging the practice of preventive medicine and health promotion. Public Health Rep 1982;97:216-9.

4. B. Preventive medicine in general practice. Br J Med 1982;284:921-2.

5. MA. Promoting health and preventing disease in health care settings: an analysis of barriers. Prev Med 1987;16:119-30.

6. CT, George LK, Fupt JL, Brodie KH. Health promotion in primary care: a survey of US family practitioners. Prev Med 1985;14:63-7.

7. C, Sobal J, Muncie H, Levine D, Antlitz A. Health promotion: physicians’ beliefs, attitudes and practices. Am J Prev Med 1986;2:82-8.

8. MP, Green LW, Fultz FG. Principles of changing health behaviors. Cancer 1988;62:1768-75.

9. Becker MH, Maiman LA. Sociobehavioral determinants of compliance with health and medical care recommendations. Med Care 1975;13:10-24.

10. SK, Hickam DH. How health professionals influence health behavior: patient-provider interaction and health care outcomes. In: Glanz K, Lewis FM, Rimer BK, eds. Health behavior and health education: theory, research, and practice. San Francisco, Calif: Jossey-Bass Publishers, 1990.

11. Bowman MA, Herndon A, Sharp PC, Dignan MB. Assessment of the Patient-Doctor Interaction Scale (PDIS) for measuring patient satisfaction. Patient Educ Couns 1992;19:75-80.

12. Stoate HG. Can health screening damage your health? J R Coll Gen Pract 1989;39:193-5.

13. Haynes RB, Sackett DL, Taylor DW, et al. Increased absenteeism from work after detection and labeling of hypertensive patients. N Engl J Med 1978;299:741-4.

14. DW, Haynes RB, Sackett DL, Gibson ES. Long-term follow-up of absenteeism among working men following the detection and treatment of their hypertension. Clin Invest Med 1981;4:173-7.

15. ME, Gibson ES, Terry CW, et al. Effects of labeling on income, work and social function among hypertensive employees. J Chron Dis 1984;37:417-23.

16. Glasgow RE, Lichtenstein E. Long-term effects of behavioral smoking cessation interventions. Behav Res Ther 1987;18:297-324.

17. Parkerson GR, Broadhead WE, Tse C-Kj. The Duke Health Profile, a 17-item measure of health and dysfunction. Med Care 1990;28:1056-72.

18. Parkerson GR, Broadhead WE, Tse C-Kj. Development of the 17-item Duke Health Profile. Fam Pract 1991;8:396-401.

19. DR, Smith JK. Assessing residents’ behavioral science skills: patients’ views of physician-patient interaction. J Fam Pract 1983;17:479-83.

20. D, Tippy P. Communicating information to patients: patient satisfaction and adherence as associated with resident skill. J Fam Pract 1988;26:643-7.

21. Hurt RD, Sachs DP, Glover ED, et al. A comparison of sustained-release bupropion and placebo for smoking cessation. N Engl J Med 1997;337:1195-202.

Author and Disclosure Information

 

Marjorie A. Bowman, MD, MPA
Mark Dignan, PHD, MPH
Sonia Crandall, PHD
Monika Baier, MS
Philadelphia, Pennsylvania; Denver, Colorado; and Winston-Salem, North Carolina
Submitted, revised, December 14, 1999.
From the Department of Family Practice and Community Medicine, University of Pennsylvania, Philadelphia (M.A.B.); the School of Public Health, University of Alabama, Birmingham (M.D.); the Center for Research Methodology and Biometrics (M.B.), AMC Cancer Research Center, Denver; and the Department of Family and Community Medicine, Wake Forest University School of Medicine, Winston-Salem (S.C.). Reprint requests should be addressed to Marjorie A. Bowman, MD, MPA, Department of Family Practice and Community Medicine, 2 Gates, University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104-4283.



