Video-Based Coaching for Dermatology Resident Surgical Education


To the Editor:

Video-based coaching (VBC), in which a surgeon records a surgery and then reviews the video with a surgical coach, is a form of education that is gaining popularity among surgical specialties.1 Video-based education remains underutilized in dermatology residency training.2 We conducted a pilot study at our dermatology residency program to evaluate the efficacy and feasibility of VBC.

The University of Texas at Austin Dell Medical School institutional review board approved this study. All 4 first-year dermatology residents were recruited to participate. Participants filled out a prestudy survey assessing their surgical experience, confidence in performing surgery, and attitudes toward VBC. Participants used a head-mounted point-of-view camera to record themselves performing a wide local excision on the trunk or extremities of a live human patient. Participants then reviewed the recording on their own and scored themselves using the Objective Structured Assessment of Technical Skills (OSATS) scoring table, a validated tool for assessing surgical skills in which each element is scored from 1 to 5, with 5 being the highest possible score (eTable 1).3 Because no assistants participated in the surgeries, the use-of-assistants element of the OSATS scoring table was excluded, yielding a maximum possible score of 30 and a minimum possible score of 6. After scoring themselves, participants had a 1-on-1 coaching session with a fellowship-trained dermatologic surgeon (M.F. or T.H.) via online teleconferencing.
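The scoring scheme lends itself to a simple computational check. The sketch below is an illustrative Python fragment, not part of the study; the element labels are assumed from the published OSATS global rating scale (the wording in the authors' eTable 1 may differ), and the function simply sums the 6 retained elements, each scored 1 to 5, to give the 6-to-30 total described above.

```python
# Illustrative sketch of the modified OSATS total used in this study.
# Element labels are assumed from the published OSATS global rating scale;
# the use-of-assistants element is omitted because no assistants participated
# in the recorded surgeries, leaving 6 elements and a total range of 6-30.

OSATS_ELEMENTS = [
    "respect for tissue",
    "time and motion",
    "instrument handling",
    "knowledge of instruments",
    "flow of operation and forward planning",
    "knowledge of specific procedure",
]

def osats_total(scores: dict) -> int:
    """Sum the 6 retained OSATS elements, each scored 1 (lowest) to 5 (highest)."""
    total = 0
    for element in OSATS_ELEMENTS:
        value = scores[element]
        if not 1 <= value <= 5:
            raise ValueError(f"{element!r} must be scored 1 to 5, got {value}")
        total += value
    return total  # ranges from 6 (all 1s) to 30 (all 5s)

# Hypothetical example: five elements scored 2 and one scored 1 gives a total of 11.
example = dict.fromkeys(OSATS_ELEMENTS, 2)
example["knowledge of specific procedure"] = 1
assert osats_total(example) == 11
```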


During the coaching session, participants and coaches reviewed the video together. The surgical coaches also scored the residents using the OSATS, and residents and coaches then discussed how the resident could improve, using the OSATS scores as a guide. The residents then completed a poststudy survey assessing their surgical experience, confidence in performing surgery, and attitudes toward VBC. Descriptive statistics were reported.


On average, residents spent 31.3 minutes reviewing their own surgeries and scoring themselves. The average time for a coaching session, which included time spent scoring, was 13.8 minutes. Residents scored themselves lower than the surgical coaches did by an average of 5.25 points (eTable 2): residents gave themselves an average total score of 10.5, while their respective surgical coaches gave them an average score of 15.75. There was a trend toward higher OSATS scores among residents with greater surgical experience (Figure). After the coaching session, 3 of 4 residents reported that they felt more confident in their surgical skills. All residents felt more confident in assessing their surgical skills and felt that VBC was an effective teaching method. All residents agreed that VBC should be continued as part of their residency training.

Figure. Surgical experience of dermatology residents and surgical coaches vs their reported Objective Structured Assessment of Technical Skills (OSATS) score for video-based coaching. The 7 elements were each scored from 1 to 5, with 5 being the highest possible score for each element and 35 being the highest possible total score.

Video-based coaching has the potential to provide several benefits for dermatology trainees. Because intraoperative feedback often is distracting and incomplete, video review allows the surgeon to focus on performing the surgery and to concentrate on learning later while reviewing the recording.1,4 Feedback also can be more comprehensive and delivered without concern for time constraints or disturbing clinic flow as well as without the additional concern of the patient overhearing comments and feedback.3 Although independent video review in the absence of coaching can lead to improvement in surgical skills, the addition of VBC provides even greater potential educational benefit.4 During the COVID-19 pandemic, VBC allowed coaches to provide feedback without additional exposures. We utilized dermatologic surgery faculty as coaches, but this format of training also would apply to general dermatology faculty.

Another goal of VBC is to enhance a trainee’s ability to perform self-directed learning, which requires accurate self-assessment.4 Accurately assessing one’s own strengths empowers a trainee to act with appropriate confidence, while understanding one’s own weaknesses allows a trainee to effectively balance confidence and caution in daily practice.5 Interestingly, in our study all residents scored themselves lower than surgical coaches, but with 1 coaching session, the residents subsequently reported greater surgical confidence.

Time constraints can be a potential barrier to surgical coaching.4 Our study demonstrates that VBC requires minimal time investment. Increasing the speed of video playback allowed for efficient evaluation of resident surgeries without compromising the coach’s ability to provide comprehensive feedback. Our feedback sessions were performed virtually, which allowed for ease of scheduling between trainees and coaches.

Our pilot study demonstrated that VBC is relatively easy to implement in a dermatology residency training setting, leverages low-cost technologies, and provides a means of learning that residents felt was effective. Video-based coaching requires minimal time investment from both trainees and coaches and has the potential to enhance surgical confidence. Our study is limited by its small sample size. Future studies should include follow-up recordings and assess the efficacy of VBC in enhancing surgical skills.

References
  1. Greenberg CC, Dombrowski J, Dimick JB. Video-based surgical coaching: an emerging approach to performance improvement. JAMA Surg. 2016;151:282-283.
  2. Dai J, Bordeaux JS, Miller CJ, et al. Assessing surgical training and deliberate practice methods in dermatology residency: a survey of dermatology program directors. Dermatol Surg. 2016;42:977-984.
  3. Chitgopeker P, Sidey K, Aronson A, et al. Surgical skills video-based assessment tool for dermatology residents: a prospective pilot study. J Am Acad Dermatol. 2020;83:614-616.
  4. Bull NB, Silverman CD, Bonrath EM. Targeted surgical coaching can improve operative self-assessment ability: a single-blinded nonrandomized trial. Surgery. 2020;167:308-313.
  5. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(10 suppl):S46-S54.
Author and Disclosure Information

Dr. Arffa is from Bennett Surgery Center, Santa Monica, California. Drs. Leszczynska, Fox, and Hollmig are from the Division of Dermatology, Department of Internal Medicine, The University of Texas at Austin Dell Medical School.

Drs. Arffa, Leszczynska, and Fox report no conflict of interest. Dr. Hollmig is a board director for Venus Concept and a speaker for Lumenis and Sciton.

The eTables are available in the Appendix online at www.mdedge.com/dermatology.

Correspondence: Matthew Lee Arffa, MD, 1301 20th St, Ste 570, Santa Monica, CA 90404 (mattarffa@gmail.com).


PRACTICE POINTS

  • Video-based coaching (VBC) for surgical procedures is an up-and-coming form of medical education that allows a “coach” to provide thoughtful and in-depth feedback while reviewing a recording with the surgeon in a private setting. This format has potential utility in teaching dermatology resident surgeons being coached by a dermatology faculty member.
  • We performed a pilot study demonstrating that VBC can be performed easily with a minimal time investment for both the surgeon and the coach. Dermatology residents not only felt that VBC was an effective teaching method but also that it should become a formal part of their education.

Improving Diagnostic Accuracy in Skin of Color Using an Educational Module

IN COLLABORATION WITH THE SKIN OF COLOR SOCIETY

Dermatologic disparities disproportionately affect patients with skin of color (SOC). Two studies assessing the diagnostic accuracy of medical students have shown disparities in diagnosing common skin conditions presenting in darker skin compared to lighter skin at early stages of training.1,2 This knowledge gap could be attributed to the underrepresentation of SOC in dermatologic textbooks, journals, and educational curricula.3-6 It is important for dermatologists as well as physicians in other specialties and ancillary health care workers involved in treating or triaging dermatologic diseases to recognize common skin conditions presenting in SOC. We sought to evaluate the effectiveness of a focused educational module for improving diagnostic accuracy and confidence in treating SOC among interprofessional health care providers.

Methods

Interprofessional health care providers—medical students, residents/fellows, attending physicians, advanced practice providers, and nurses practicing across various medical specialties—at The University of Texas at Austin Dell Medical School and Ascension Medical Group (both in Austin, Texas) were invited to participate in an institutional review board–exempt study involving a virtual SOC educational module from February through May 2021. The 1-hour module involved a pretest, a 15-minute lecture, an immediate posttest, and a 3-month posttest. All tests included the same 40 multiple-choice questions covering 20 dermatologic conditions, each portrayed in both lighter and darker skin types using photographs from VisualDx.com; participants were asked to identify the condition in each photograph. Questions appeared one at a time in a randomized order, and answers could not be changed once submitted.

For analysis, the dermatologic conditions were categorized into 4 groups: cancerous, infectious, inflammatory, and SOC-associated conditions. Cancerous conditions included basal cell carcinoma, squamous cell carcinoma, and melanoma. Infectious conditions included herpes zoster, tinea corporis, tinea versicolor, staphylococcal scalded skin syndrome, and verruca vulgaris. Inflammatory conditions included acne, atopic dermatitis, pityriasis rosea, psoriasis, seborrheic dermatitis, contact dermatitis, lichen planus, and urticaria. Skin of color–associated conditions included hidradenitis suppurativa, acanthosis nigricans, keloid, and melasma. Two questions utilizing a 5-point Likert scale assessed confidence in diagnosing conditions in light and dark skin.
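To make the grouping concrete, the sketch below shows one way the per-category accuracy comparisons could be computed. It is a hypothetical Python fragment written for illustration, not the authors' analysis code, and the response-record format (condition, skin tone, correct or not) is an assumption.

```python
from collections import defaultdict

# Hypothetical grouping of the 20 tested conditions into the 4 analysis
# categories described above (illustration only, not the authors' code).
CATEGORY = {
    "basal cell carcinoma": "cancerous",
    "squamous cell carcinoma": "cancerous",
    "melanoma": "cancerous",
    "herpes zoster": "infectious",
    "tinea corporis": "infectious",
    "tinea versicolor": "infectious",
    "staphylococcal scalded skin syndrome": "infectious",
    "verruca vulgaris": "infectious",
    "acne": "inflammatory",
    "atopic dermatitis": "inflammatory",
    "pityriasis rosea": "inflammatory",
    "psoriasis": "inflammatory",
    "seborrheic dermatitis": "inflammatory",
    "contact dermatitis": "inflammatory",
    "lichen planus": "inflammatory",
    "urticaria": "inflammatory",
    "hidradenitis suppurativa": "SOC-associated",
    "acanthosis nigricans": "SOC-associated",
    "keloid": "SOC-associated",
    "melasma": "SOC-associated",
}

def percent_correct_by_category(responses):
    """responses: iterable of (condition, skin_tone, is_correct) tuples.

    Returns {(category, skin_tone): percent correct}, mirroring the
    lighter-vs-darker comparisons reported for each disease category.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for condition, skin_tone, is_correct in responses:
        key = (CATEGORY[condition], skin_tone)
        total[key] += 1
        correct[key] += int(is_correct)
    return {key: round(100 * correct[key] / total[key]) for key in total}
```

Pooling all pretest responses in this way would yield category-level percentages of the kind reported below (eg, 72% vs 50% for cancerous conditions in lighter vs darker skin).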

The pre-recorded 15-minute video lecture was given by 2 dermatology residents (P.L.K. and C.P.), and the learning objectives covered morphologic differences in lighter skin and darker skin, comparisons of common dermatologic diseases in lighter skin and darker skin, diseases more commonly affecting patients with SOC, and treatment considerations for conditions affecting skin and hair in patients with SOC. Photographs from the diagnostic accuracy assessment were not reused in the lecture. Detailed explanations on morphology, diagnostic pearls, and treatment options for all conditions tested were provided to participants upon completion of the 3-month posttest.

Statistical Analysis—Test scores were compared between conditions shown in lighter and darker skin types and from the pretest to the immediate posttest and 3-month posttest. Multiple linear regression was used to assess for intervention effects on lighter and darker skin scores while controlling for provider type and specialty. All statistical tests were 2-sided, with significance set at P<.05. Analyses were conducted using Stata 17.
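The analyses were run in Stata 17; the fragment below sketches an equivalent multiple linear regression in Python with statsmodels under assumed data organization. The long-format table and its column names (score, phase, skin_tone, provider_type, specialty) are hypothetical, and the model is one plausible reading of "intervention effects controlling for provider type and specialty," not the authors' code.

```python
# Hedged sketch of the regression described above (not the authors' Stata code).
# Assumes a long-format DataFrame with one row per participant, test phase, and
# skin tone, using hypothetical column names:
#   score          - number of the 20 conditions identified correctly
#   phase          - "pretest", "post_immediate", or "post_3month"
#   skin_tone      - "lighter" or "darker"
#   provider_type  - student, resident/fellow, attending, APP, or nurse
#   specialty      - dermatology, internal medicine, etc.
import pandas as pd
import statsmodels.formula.api as smf

def fit_intervention_model(df: pd.DataFrame, tone: str):
    """Fit an OLS model of score on test phase for one skin tone,
    controlling for provider type and specialty."""
    subset = df[df["skin_tone"] == tone]
    return smf.ols(
        "score ~ C(phase, Treatment(reference='pretest'))"
        " + C(provider_type) + C(specialty)",
        data=subset,
    ).fit()

# Usage (hypothetical): the phase coefficients estimate the change from pretest
# to each posttest; 2-sided P values below .05 would indicate significance.
# darker_model = fit_intervention_model(df, "darker")
# print(darker_model.summary())
```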

Results

One hundred participants completed the pretest and immediate posttest, 36 of whom also completed the 3-month posttest (Table). There was no significant difference in baseline characteristics between the pretest and 3-month posttest groups.


Test scores were correlated with provider type and specialty but not age, sex, or race/ethnicity. Specializing in dermatology and being a resident or attending physician were independently associated with higher test scores. Mean pretest diagnostic accuracy and confidence scores were higher for skin conditions shown in lighter skin compared with those shown in darker skin (13.6 vs 11.3 and 2.7 vs 1.9, respectively; both P<.001). Pretest diagnostic accuracy was significantly higher for skin conditions shown in lighter skin compared with darker skin for cancerous, inflammatory, and infectious conditions (72% vs 50%, 68% vs 55%, and 57% vs 47%, respectively; P<.001 for all)(Figure 1). Skin of color–associated conditions were not associated with significantly different scores for lighter skin compared with darker skin (79% vs 75%; P=.059).

FIGURE 1. Pretest percentage correct score in lighter skin compared with darker skin categorized by type of skin condition. Asterisk indicates P<.001.

After controlling for provider type and specialty, immediate posttest scores showed significantly improved diagnostic accuracy compared with pretest scores for conditions shown in both lighter and darker skin types (lighter: 15.2 vs 13.6; darker: 13.3 vs 11.3; both P<.001)(Figure 2). The immediate posttest demonstrated higher mean diagnostic accuracy and confidence scores for skin conditions shown in lighter skin compared with darker skin (diagnostic accuracy: 15.2 vs 13.3; confidence: 3.0 vs 2.6; both P<.001), but the disparity between scores was smaller than in the pretest.

FIGURE 2. Mean scores for diagnostic accuracy overall and in lighter and darker skin following pretest, immediate posttest, and 3-month posttest. Single asterisk indicates P<.05; double asterisk, P<.01; triple asterisk, P<.001.

At the 3-month posttest, diagnostic accuracy remained improved for both lighter and darker skin types compared with the pretest, but the difference remained significant only for conditions shown in darker skin (mean scores, 11.3 vs 13.3; P<.01). Similarly, confidence in diagnosing conditions in both lighter and darker skin improved at the immediate posttest (mean scores, 2.7 vs 3.0 and 1.9 vs 2.6; both P<.001), and this improvement remained significant only for darker skin at the 3-month posttest (mean scores, 1.9 vs 2.3; P<.001). Despite these improvements, diagnostic accuracy and confidence remained higher for skin conditions shown in lighter skin compared with darker skin (diagnostic accuracy: 14.7 vs 13.3; P<.01; confidence: 2.8 vs 2.3; P<.001), though the disparity between scores was again smaller than in the pretest.

Comment

Our study showed that diagnostic disparities between lighter and darker skin types exist among interprofessional health care providers. Education on SOC should extend to interprofessional health care providers and other medical specialties involved in treating or triaging dermatologic diseases. A focused educational module may provide long-term improvements in diagnostic accuracy and confidence for conditions presenting in SOC.

Differences in diagnostic accuracy between conditions shown in lighter and darker skin types were noted for the infectious, cancerous, and inflammatory disease categories but not for conditions more frequently seen in patients with SOC. Learning resources for SOC-associated conditions are more likely to have greater representation of images depicting darker skin types.7 In our study, the pretest scores for conditions shown in darker skin were lowest among infectious and cancerous conditions. For infections, certain morphologic clues such as erythema are important for diagnosis but may be more subtle or difficult to discern in darker skin. It also is possible that providers may be less likely to suspect skin cancer in patients with SOC given that the morphologic presentation and/or anatomic site of involvement for skin cancers in SOC differ from those in lighter skin. Future educational interventions targeting disparities in diagnostic accuracy should therefore focus on conditions that are not specifically associated with SOC.

Limitations of our study included the small number of participants, a study population drawn from a single institution, and possible selection bias toward providers interested in dermatology.

Conclusion

Diagnostic disparities exist among interprofessional health care providers when assessing conditions in patients with lighter skin compared to darker skin. An educational module for health care providers may provide long-term improvements in diagnostic accuracy and confidence for conditions presenting in patients with SOC.

References
  1. Fenton A, Elliott E, Shahbandi A, et al. Medical students’ ability to diagnose common dermatologic conditions in skin of color. J Am Acad Dermatol. 2020;83:957-958. doi:10.1016/j.jaad.2019.12.078
  2. Mamo A, Szeto MD, Rietcheck H, et al. Evaluating medical student assessment of common dermatologic conditions across Fitzpatrick phototypes and skin of color. J Am Acad Dermatol. 2022;87:167-169. doi:10.1016/j.jaad.2021.06.868
  3. Guda VA, Paek SY. Skin of color representation in commonly utilized medical student dermatology resources. J Drugs Dermatol. 2021;20:799. doi:10.36849/JDD.5726
  4. Wilson BN, Sun M, Ashbaugh AG, et al. Assessment of skin of color and diversity and inclusion content of dermatologic published literature: an analysis and call to action. Int J Womens Dermatol. 2021;7:391-397. doi:10.1016/j.ijwd.2021.04.001
  5. Ibraheim MK, Gupta R, Dao H, et al. Evaluating skin of color education in dermatology residency programs: data from a national survey. Clin Dermatol. 2022;40:228-233. doi:10.1016/j.clindermatol.2021.11.015
  6. Gupta R, Ibraheim MK, Dao H Jr, et al. Assessing dermatology resident confidence in caring for patients with skin of color. Clin Dermatol. 2021;39:873-878. doi:10.1016/j.clindermatol.2021.08.019
  7. Chang MJ, Lipner SR. Analysis of skin color on the American Academy of Dermatology public education website. J Drugs Dermatol. 2020;19:1236-1237. doi:10.36849/JDD.2020.5545
Author and Disclosure Information

Drs. Kojder, Leszczynska, Riddle, Diaz, and Ahmed are from The University of Texas at Austin Dell Medical School. Drs. Kojder, Riddle, Diaz, and Ahmed are from the Division of Dermatology and Dermatologic Surgery, Department of Internal Medicine, and Dr. Leszczynska is from the Division of Pediatric Dermatology, Department of Pediatrics. Dr. Pisano is from the Department of Dermatology, Harvard Medical School, Boston, Massachusetts.

The authors report no conflict of interest.

Correspondence: Ammar M. Ahmed, MD, Division of Dermatology, The University of Texas at Austin Dell Medical School, 1601 Trinity St, Ste 7.802, Austin, TX 78701 (amahmed@ascension.org).

Issue
Cutis - 112(1)
Publications
Topics
Page Number
12-15
Sections
Author and Disclosure Information

Drs. Kojder, Leszczynska, Riddle, Diaz, and Ahmed are from The University of Texas at Austin Dell Medical School. Drs. Kojder, Riddle, Diaz, and Ahmed are from the Division of Dermatology and Dermatologic Surgery, Department of Internal Medicine, and Dr. Leszczynska is from the Division of Pediatric Dermatology, Department of Pediatrics. Dr. Pisano is from the Department of Dermatology, Harvard Medical School, Boston, Massachusetts.

The authors report no conflict of interest.

Correspondence: Ammar M. Ahmed, MD, Division of Dermatology, The University of Texas at Austin Dell Medical School, 1601 Trinity St, Ste 7.802, Austin, TX 78701 (amahmed@ascension.org).

Author and Disclosure Information

Drs. Kojder, Leszczynska, Riddle, Diaz, and Ahmed are from The University of Texas at Austin Dell Medical School. Drs. Kojder, Riddle, Diaz, and Ahmed are from the Division of Dermatology and Dermatologic Surgery, Department of Internal Medicine, and Dr. Leszczynska is from the Division of Pediatric Dermatology, Department of Pediatrics. Dr. Pisano is from the Department of Dermatology, Harvard Medical School, Boston, Massachusetts.

The authors report no conflict of interest.

Correspondence: Ammar M. Ahmed, MD, Division of Dermatology, The University of Texas at Austin Dell Medical School, 1601 Trinity St, Ste 7.802, Austin, TX 78701 (amahmed@ascension.org).

Article PDF
Article PDF
IN COLLABORATION WITH THE SKIN OF COLOR SOCIETY
IN COLLABORATION WITH THE SKIN OF COLOR SOCIETY

Dermatologic disparities disproportionately affect patients with skin of color (SOC). Two studies assessing the diagnostic accuracy of medical students have shown disparities in diagnosing common skin conditions presenting in darker skin compared to lighter skin at early stages of training.1,2 This knowledge gap could be attributed to the underrepresentation of SOC in dermatologic textbooks, journals, and educational curricula.3-6 It is important for dermatologists as well as physicians in other specialties and ancillary health care workers involved in treating or triaging dermatologic diseases to recognize common skin conditions presenting in SOC. We sought to evaluate the effectiveness of a focused educational module for improving diagnostic accuracy and confidence in treating SOC among interprofessional health care providers.

Methods

Interprofessional health care providers—medical students, residents/fellows, attending physicians, advanced practice providers (APPs), and nurses practicing across various medical specialties—at The University of Texas at Austin Dell Medical School and Ascension Medical Group (both in Austin, Texas) were invited to participate in an institutional review board–exempt study involving a virtual SOC educational module from February through May 2021. The 1-hour module involved a pretest, a 15-minute lecture, an immediate posttest, and a 3-month posttest. All tests included the same 40 multiple-choice questions of 20 dermatologic conditions portrayed in lighter and darker skin types from VisualDx.com, and participants were asked to identify the condition in each photograph. Questions appeared one at a time in a randomized order, and answers could not be changed once submitted.

For analysis, the dermatologic conditions were categorized into 4 groups: cancerous, infectious, inflammatory, and SOC-associated conditions. Cancerous conditions included basal cell carcinoma, squamous cell carcinoma, and melanoma. Infectious conditions included herpes zoster, tinea corporis, tinea versicolor, staphylococcal scalded skin syndrome, and verruca vulgaris. Inflammatory conditions included acne, atopic dermatitis, pityriasis rosea, psoriasis, seborrheic dermatitis, contact dermatitis, lichen planus, and urticaria. Skin of color–associated conditions included hidradenitis suppurativa, acanthosis nigricans, keloid, and melasma. Two questions utilizing a 5-point Likert scale assessing confidence in diagnosing light and dark skin also were included.

The pre-recorded 15-minute video lecture was given by 2 dermatology residents (P.L.K. and C.P.), and the learning objectives covered morphologic differences in lighter skin and darker skin, comparisons of common dermatologic diseases in lighter skin and darker skin, diseases more commonly affecting patients with SOC, and treatment considerations for conditions affecting skin and hair in patients with SOC. Photographs from the diagnostic accuracy assessment were not reused in the lecture. Detailed explanations on morphology, diagnostic pearls, and treatment options for all conditions tested were provided to participants upon completion of the 3-month posttest.

Statistical Analysis—Test scores were compared between conditions shown in lighter and darker skin types and from the pretest to the immediate posttest and 3-month posttest. Multiple linear regression was used to assess for intervention effects on lighter and darker skin scores controlling for provider type and specialty. All tests were 2-sided with significance at P<.05. Analyses were conducted using Stata 17.

Results

One hundred participants completed the pretest and immediate posttest, 36 of whom also completed the 3-month posttest (Table). There was no significant difference in baseline characteristics between the pretest and 3-month posttest groups.

CT112001012_Table.jpg

Test scores were correlated with provider type and specialty but not age, sex, or race/ethnicity. Specializing in dermatology and being a resident or attending physician were independently associated with higher test scores. Mean pretest diagnostic accuracy and confidence scores were higher for skin conditions shown in lighter skin compared with those shown in darker skin (13.6 vs 11.3 and 2.7 vs 1.9, respectively; both P<.001). Pretest diagnostic accuracy was significantly higher for skin conditions shown in lighter skin compared with darker skin for cancerous, inflammatory, and infectious conditions (72% vs 50%, 68% vs 55%, and 57% vs 47%, respectively; P<.001 for all)(Figure 1). Skin of color–associated conditions were not associated with significantly different scores for lighter skin compared with darker skin (79% vs 75%; P=.059).

Kojder_1.jpg
%3Cp%3E%3Cstrong%3EFIGURE%201.%3C%2Fstrong%3E%20Pretest%20percentage%20correct%20score%20in%20lighter%20skin%20compared%20with%20darker%20skin%20categorized%20by%20type%20of%20skin%20condition.%20Asterisk%20indicates%20%3Cem%3EP%3C%2Fem%3E%26lt%3B.001.%3C%2Fp%3E

 

 

Controlling for provider type and specialty, significantly improved diagnostic accuracy was seen in immediate posttest scores compared with pretest scores for conditions shown in both lighter and darker skin types (lighter: 15.2 vs 13.6; darker: 13.3 vs 11.3; both P<.001)(Figure 2). The immediate posttest demonstrated higher mean diagnostic accuracy and confidence scores for skin conditions shown in lighter skin compared with darker skin (diagnostic accuracy: 15.2 vs 13.3; confidence: 3.0 vs 2.6; both P<.001), but the disparity between scores was less than in the pretest.

Kojder_2.jpg
%3Cp%3E%3Cstrong%3EFIGURE%202.%3C%2Fstrong%3E%20Mean%20scores%20for%20diagnostic%20accuracy%20overall%20and%20in%20lighter%20and%20darker%20skin%20following%20pretest%2C%20immediate%20posttest%2C%20and%203-month%20posttest.%20Single%20asterisk%20indicates%20%3Cem%3EP%3C%2Fem%3E%26lt%3B.05%3B%20double%20asterisk%2C%20%3Cem%3EP%3C%2Fem%3E%26lt%3B.01%3B%20triple%20asterisk%2C%20P%26lt%3B.001.%3C%2Fp%3E

Following the 3-month posttest, improvement in diagnostic accuracy was noted among both lighter and darker skin types compared with the pretest, but the difference remained significant only for conditions shown in darker skin (mean scores, 11.3 vs 13.3; P<.01). Similarly, confidence in diagnosing conditions in both lighter and darker skin improved following the immediate posttest (mean scores, 2.7 vs 3.0 and 1.9 vs 2.6; both P<.001), and this improvement remained significant for only darker skin following the 3-month posttest (mean scores, 1.9 vs 2.3; P<.001). Despite these improvements, diagnostic accuracy and confidence remained higher for skin conditions shown in lighter skin compared with darker skin (diagnostic accuracy: 14.7 vs 13.3; P<.01; confidence: 2.8 vs 2.3; P<.001), though the disparity between scores was again less than in the pretest.

Comment

Our study showed that there are diagnostic disparities between lighter and darker skin types among interprofessional health care providers. Education on SOC should extend to interprofessional health care providers and other medical specialties involved in treating or triaging dermatologic diseases. A focused educational module may provide long-term improvements in diagnostic accuracy and confidence for conditions presenting in SOC. Differences in diagnostic accuracy between conditions shown in lighter and darker skin types were noted for the disease categories of infectious, cancerous, and inflammatory conditions, with the exception of conditions more frequently seen in patients with SOC. Learning resources for SOC-associated conditions are more likely to have greater representation of images depicting darker skin types.7 Future educational interventions may need to focus on dermatologic conditions that are not preferentially seen in patients with SOC. In our study, the pretest scores for conditions shown in darker skin were lowest among infectious and cancerous conditions. For infections, certain morphologic clues such as erythema are important for diagnosis but may be more subtle or difficult to discern in darker skin. It also is possible that providers may be less likely to suspect skin cancer in patients with SOC given that the morphologic presentation and/or anatomic site of involvement for skin cancers in SOC differs from those in lighter skin. Future educational interventions targeting disparities in diagnostic accuracy should focus on conditions that are not specifically associated with SOC.

Limitations of our study included the small number of participants, the study population came from a single institution, and a possible selection bias for providers interested in dermatology.

Conclusion

Disparities exist among interprofessional health care providers when treating conditions in patients with lighter skin compared to darker skin. An educational module for health care providers may provide long-term improvements in diagnostic accuracy and confidence for conditions presenting in patients with SOC.

Dermatologic disparities disproportionately affect patients with skin of color (SOC). Two studies assessing the diagnostic accuracy of medical students have shown that, even at early stages of training, common skin conditions are diagnosed less accurately when presenting in darker skin compared with lighter skin.1,2 This knowledge gap could be attributed to the underrepresentation of SOC in dermatologic textbooks, journals, and educational curricula.3-6 It is important for dermatologists as well as physicians in other specialties and ancillary health care workers involved in treating or triaging dermatologic diseases to recognize common skin conditions presenting in SOC. We sought to evaluate the effectiveness of a focused educational module for improving diagnostic accuracy and confidence in treating SOC among interprofessional health care providers.

Methods

Interprofessional health care providers—medical students, residents/fellows, attending physicians, advanced practice providers (APPs), and nurses practicing across various medical specialties—at The University of Texas at Austin Dell Medical School and Ascension Medical Group (both in Austin, Texas) were invited to participate in an institutional review board–exempt study involving a virtual SOC educational module from February through May 2021. The 1-hour module involved a pretest, a 15-minute lecture, an immediate posttest, and a 3-month posttest. All tests included the same 40 multiple-choice questions covering 20 dermatologic conditions, each portrayed in both a lighter and a darker skin type using photographs from VisualDx.com, and participants were asked to identify the condition in each photograph. Questions appeared one at a time in randomized order, and answers could not be changed once submitted.

For analysis, the dermatologic conditions were categorized into 4 groups: cancerous, infectious, inflammatory, and SOC-associated conditions. Cancerous conditions included basal cell carcinoma, squamous cell carcinoma, and melanoma. Infectious conditions included herpes zoster, tinea corporis, tinea versicolor, staphylococcal scalded skin syndrome, and verruca vulgaris. Inflammatory conditions included acne, atopic dermatitis, pityriasis rosea, psoriasis, seborrheic dermatitis, contact dermatitis, lichen planus, and urticaria. Skin of color–associated conditions included hidradenitis suppurativa, acanthosis nigricans, keloid, and melasma. Two additional questions using a 5-point Likert scale assessed confidence in diagnosing conditions in lighter and darker skin.
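
As a concrete illustration of this grouping, the Python sketch below tallies percent correct per category and skin tone from item-level responses. It is not the authors' analysis code; the data structure, function name, and example responses are hypothetical, and only the category assignments follow the groupings listed above.

```python
# Illustrative only: per-category accuracy tally for the 40-item test.
# Category assignments mirror the groupings described in the Methods;
# all variable and function names here are hypothetical.
from collections import defaultdict

CATEGORY = {
    "basal cell carcinoma": "cancerous", "squamous cell carcinoma": "cancerous",
    "melanoma": "cancerous",
    "herpes zoster": "infectious", "tinea corporis": "infectious",
    "tinea versicolor": "infectious",
    "staphylococcal scalded skin syndrome": "infectious",
    "verruca vulgaris": "infectious",
    "acne": "inflammatory", "atopic dermatitis": "inflammatory",
    "pityriasis rosea": "inflammatory", "psoriasis": "inflammatory",
    "seborrheic dermatitis": "inflammatory", "contact dermatitis": "inflammatory",
    "lichen planus": "inflammatory", "urticaria": "inflammatory",
    "hidradenitis suppurativa": "SOC-associated", "acanthosis nigricans": "SOC-associated",
    "keloid": "SOC-associated", "melasma": "SOC-associated",
}

def category_accuracy(responses):
    """responses: iterable of (condition, skin_tone, is_correct) tuples,
    where skin_tone is 'lighter' or 'darker'. Returns percent correct
    for each (category, skin_tone) pair."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for condition, tone, is_correct in responses:
        key = (CATEGORY[condition], tone)
        total[key] += 1
        correct[key] += int(is_correct)
    return {key: 100 * correct[key] / total[key] for key in total}

# Minimal example: one participant answering 4 of the 40 items.
example = [
    ("melanoma", "lighter", True), ("melanoma", "darker", False),
    ("keloid", "lighter", True), ("keloid", "darker", True),
]
print(category_accuracy(example))
```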

The pre-recorded 15-minute video lecture was given by 2 dermatology residents (P.L.K. and C.P.), and the learning objectives covered morphologic differences in lighter skin and darker skin, comparisons of common dermatologic diseases in lighter skin and darker skin, diseases more commonly affecting patients with SOC, and treatment considerations for conditions affecting skin and hair in patients with SOC. Photographs from the diagnostic accuracy assessment were not reused in the lecture. Detailed explanations on morphology, diagnostic pearls, and treatment options for all conditions tested were provided to participants upon completion of the 3-month posttest.

Statistical Analysis—Test scores were compared between conditions shown in lighter and darker skin types and from the pretest to the immediate and 3-month posttests. Multiple linear regression was used to assess intervention effects on lighter- and darker-skin scores, controlling for provider type and specialty. All tests were 2-sided with significance set at P<.05. Analyses were conducted using Stata 17.
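
The article specifies only that the regression was run in Stata 17; as a rough, non-authoritative analogue, the Python/statsmodels sketch below fits a model of darker-skin score on test phase with provider type and specialty as covariates. The long data layout, column names, and toy values are assumptions made for illustration.

```python
# Illustrative analogue of the regression described above (the authors used
# Stata 17); the data layout, column names, and values are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

rows = []
toy = [  # (participant, provider_type, specialty, pretest_darker, posttest_darker)
    (1, "resident", "dermatology",       13, 16),
    (2, "resident", "internal medicine", 11, 14),
    (3, "nurse",    "internal medicine",  9, 12),
    (4, "nurse",    "family medicine",   10, 12),
    (5, "APP",      "family medicine",   11, 13),
    (6, "APP",      "dermatology",       12, 15),
]
for pid, ptype, spec, pre, post in toy:
    rows.append({"id": pid, "provider_type": ptype, "specialty": spec,
                 "phase": "pretest", "darker_score": pre})
    rows.append({"id": pid, "provider_type": ptype, "specialty": spec,
                 "phase": "posttest", "darker_score": post})
df = pd.DataFrame(rows)  # one row per participant per test phase

# Intervention effect on darker-skin scores, adjusting for provider type and
# specialty; 2-sided tests with alpha = .05, as stated in the article.
fit = smf.ols("darker_score ~ C(phase, Treatment('pretest')) "
              "+ C(provider_type) + C(specialty)", data=df).fit()
print(fit.params)
print(fit.pvalues)
```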

Results

One hundred participants completed the pretest and immediate posttest, 36 of whom also completed the 3-month posttest (Table). There was no significant difference in baseline characteristics between the pretest and 3-month posttest groups.
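
The article does not state how baseline characteristics were compared between the full pretest group and the 3-month completers; one plausible approach is a chi-square test on a contingency table of provider types, sketched below with entirely hypothetical counts.

```python
# Illustrative only: compare provider-type distributions between 3-month
# completers (n=36) and participants lost to follow-up (n=64).
# The counts below are hypothetical, not study data.
import numpy as np
from scipy import stats

table = np.array([
    [12, 18],  # medical students
    [ 9, 16],  # residents/fellows
    [ 7, 13],  # attending physicians
    [ 8, 17],  # APPs and nurses
])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, P = {p:.3f}")
```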


Test scores were correlated with provider type and specialty but not age, sex, or race/ethnicity. Specializing in dermatology and being a resident or attending physician were independently associated with higher test scores. Mean pretest diagnostic accuracy and confidence scores were higher for skin conditions shown in lighter skin compared with those shown in darker skin (13.6 vs 11.3 and 2.7 vs 1.9, respectively; both P<.001). Pretest diagnostic accuracy was significantly higher for skin conditions shown in lighter skin compared with darker skin for cancerous, inflammatory, and infectious conditions (72% vs 50%, 68% vs 55%, and 57% vs 47%, respectively; P<.001 for all)(Figure 1). Skin of color–associated conditions were not associated with significantly different scores for lighter skin compared with darker skin (79% vs 75%; P=.059).
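
The article likewise does not name the test behind these paired lighter-versus-darker comparisons; a paired t test on each participant's two subscores (with a Wilcoxon signed rank test as a nonparametric alternative) is one conventional choice, sketched below on simulated scores.

```python
# Illustrative paired comparison of lighter- vs darker-skin subscores for the
# same participants; the scores are simulated, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100  # pretest participants
lighter = rng.normal(13.6, 3.0, n).round().clip(0, 20)              # out of 20
darker = (lighter - rng.normal(2.3, 1.5, n)).round().clip(0, 20)    # out of 20

t_stat, p_t = stats.ttest_rel(lighter, darker)   # paired t test
w_stat, p_w = stats.wilcoxon(lighter - darker)   # nonparametric alternative
print(f"paired t: t = {t_stat:.2f}, P = {p_t:.2g}; Wilcoxon: P = {p_w:.2g}")
```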

FIGURE 1. Pretest percentage correct score in lighter skin compared with darker skin, categorized by type of skin condition. Asterisk indicates P<.001.
After controlling for provider type and specialty, diagnostic accuracy improved significantly from pretest to immediate posttest for conditions shown in both lighter and darker skin types (lighter: 15.2 vs 13.6; darker: 13.3 vs 11.3; both P<.001)(Figure 2). On the immediate posttest, mean diagnostic accuracy and confidence scores remained higher for skin conditions shown in lighter skin than in darker skin (diagnostic accuracy: 15.2 vs 13.3; confidence: 3.0 vs 2.6; both P<.001), but the disparity between scores was smaller than on the pretest.

FIGURE 2. Mean scores for diagnostic accuracy overall and in lighter and darker skin following pretest, immediate posttest, and 3-month posttest. Single asterisk indicates P<.05; double asterisk, P<.01; triple asterisk, P<.001.

On the 3-month posttest, diagnostic accuracy remained improved for both lighter and darker skin types compared with the pretest, but the difference remained significant only for conditions shown in darker skin (mean scores, 11.3 vs 13.3; P<.01). Similarly, confidence in diagnosing conditions in both lighter and darker skin improved from pretest to immediate posttest (mean scores, 2.7 vs 3.0 and 1.9 vs 2.6, respectively; both P<.001), and this improvement remained significant only for darker skin on the 3-month posttest (mean scores, 1.9 vs 2.3; P<.001). Despite these improvements, diagnostic accuracy and confidence on the 3-month posttest remained higher for skin conditions shown in lighter skin than in darker skin (diagnostic accuracy: 14.7 vs 13.3; P<.01; confidence: 2.8 vs 2.3; P<.001), though the disparity between scores was again smaller than on the pretest.

Comment

Our study showed that diagnostic disparities between lighter and darker skin types exist among interprofessional health care providers. Education on SOC therefore should extend beyond dermatologists to the other specialties and health care providers involved in treating or triaging dermatologic diseases, and a focused educational module may provide long-term improvements in diagnostic accuracy and confidence for conditions presenting in SOC.

Differences in diagnostic accuracy between conditions shown in lighter and darker skin types were noted for infectious, cancerous, and inflammatory conditions but not for conditions more frequently seen in patients with SOC, likely because learning resources for SOC-associated conditions already have greater representation of images depicting darker skin types.7 In our study, the pretest scores for conditions shown in darker skin were lowest for infectious and cancerous conditions. For infections, certain morphologic clues such as erythema are important for diagnosis but may be more subtle or difficult to discern in darker skin. Providers also may be less likely to suspect skin cancer in patients with SOC because the morphologic presentation and/or anatomic site of involvement for skin cancers in SOC can differ from those in lighter skin. Future educational interventions targeting these disparities therefore may need to focus on dermatologic conditions that are not preferentially seen in patients with SOC.

Limitations of our study included the small number of participants, the single-institution study population, and possible selection bias toward providers interested in dermatology.

Conclusion

Disparities exist among interprofessional health care providers in diagnosing conditions presenting in lighter skin compared with darker skin. A focused educational module for health care providers may provide long-term improvements in diagnostic accuracy and confidence for conditions presenting in patients with SOC.

References
  1. Fenton A, Elliott E, Shahbandi A, et al. Medical students’ ability to diagnose common dermatologic conditions in skin of color. J Am Acad Dermatol. 2020;83:957-958. doi:10.1016/j.jaad.2019.12.078
  2. Mamo A, Szeto MD, Rietcheck H, et al. Evaluating medical student assessment of common dermatologic conditions across Fitzpatrick phototypes and skin of color. J Am Acad Dermatol. 2022;87:167-169. doi:10.1016/j.jaad.2021.06.868
  3. Guda VA, Paek SY. Skin of color representation in commonly utilized medical student dermatology resources. J Drugs Dermatol. 2021;20:799. doi:10.36849/JDD.5726
  4. Wilson BN, Sun M, Ashbaugh AG, et al. Assessment of skin of color and diversity and inclusion content of dermatologic published literature: an analysis and call to action. Int J Womens Dermatol. 2021;7:391-397. doi:10.1016/j.ijwd.2021.04.001
  5. Ibraheim MK, Gupta R, Dao H, et al. Evaluating skin of color education in dermatology residency programs: data from a national survey. Clin Dermatol. 2022;40:228-233. doi:10.1016/j.clindermatol.2021.11.015
  6. Gupta R, Ibraheim MK, Dao H Jr, et al. Assessing dermatology resident confidence in caring for patients with skin of color. Clin Dermatol. 2021;39:873-878. doi:10.1016/j.clindermatol.2021.08.019
  7. Chang MJ, Lipner SR. Analysis of skin color on the American Academy of Dermatology public education website. J Drugs Dermatol. 2020;19:1236-1237. doi:10.36849/JDD.2020.5545

Practice Points

  • Disparities exist among interprofessional health care providers when diagnosing conditions in patients with lighter and darker skin, specifically for infectious, cancerous, or inflammatory conditions vs conditions that are preferentially seen in patients with skin of color (SOC).
  • A focused educational module for health care providers may provide long-term improvements in diagnostic accuracy and confidence for conditions presenting in patients with SOC.