Health care–specific (eg, Healthgrades, Zocdoc, Vitals, WebMD) and general consumer websites (eg, Google, Yelp) are popular platforms for patients to find physicians, schedule appointments, and review their experiences with physicians. Patients find ratings on these websites more trustworthy than standardized surveys distributed by hospitals, but many physicians do not trust the reviews on these sites. For example, in a survey of both physicians (n=828) and patients (n=494), 36% of physicians trusted online reviews compared with 57% of patients.1 The objective of this study was to determine whether health care–specific or general consumer websites more accurately reflect overall patient sentiment. This knowledge can help physicians seeking to improve the patient experience understand which websites host more accurate and trustworthy reviews.

Methods

A list of dermatologists from the 10 most and 10 least dermatologist-dense areas in the United States was compiled to examine different physician populations.2 Equal numbers of male and female dermatologists were randomly selected from the most dense areas; all physicians from the least dense areas were included because of the limited sample size. Ratings were collected from the websites most likely to appear on the first page of a Google search for a physician's name, as these are the results patients are most likely to see. Descriptive statistics were generated to describe the study population: mean and median physician rating (on a scale of 1–5); SD; and minimum, maximum, and interquartile range. Spearman correlation coefficients were calculated to examine the strength of association between ratings from each pair of websites. P<.05 was considered statistically significant. Analyses were performed in R version 3.6.2 for Windows (The R Foundation).
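
As a concrete illustration, the following minimal sketch reproduces this analytic workflow in R, the environment named above. The data frame, its column names, and the simulated ratings are hypothetical placeholders, not the study's actual code or data. Spearman (rank) correlation suits this setting because 1 to 5 star ratings are ordinal and typically not normally distributed.

# Minimal sketch of the descriptive and correlation analyses described above.
# The `ratings` data frame is a hypothetical placeholder: one row per physician,
# one simulated 1-5 rating per website. Illustrative only, not study data.
set.seed(1)
n <- 167  # number of physicians in the sample
ratings <- data.frame(
  healthgrades = round(runif(n, 1, 5), 1),
  zocdoc       = round(runif(n, 1, 5), 1),
  vitals       = round(runif(n, 1, 5), 1),
  webmd        = round(runif(n, 1, 5), 1),
  google       = round(runif(n, 1, 5), 1),
  yelp         = round(runif(n, 1, 5), 1)
)

# Descriptive statistics per website: mean, median, SD, minimum, maximum, IQR
describe <- function(x) c(mean = mean(x), median = median(x), sd = sd(x),
                          min = min(x), max = max(x), iqr = IQR(x))
round(sapply(ratings, describe), 2)

# Spearman correlation coefficient and P value for every pair of websites
site_pairs <- combn(names(ratings), 2)  # 2 x 15 matrix of site-name pairs
results <- lapply(seq_len(ncol(site_pairs)), function(i) {
  a <- site_pairs[1, i]; b <- site_pairs[2, i]
  test <- cor.test(ratings[[a]], ratings[[b]],
                   method = "spearman", exact = FALSE)
  data.frame(pair = paste(a, b, sep = " vs "),
             rho  = round(unname(test$estimate), 3),
             p    = signif(test$p.value, 3))
})
do.call(rbind, results)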

Results

A total of 167 physicians were included in the analysis: 141 from the most dermatologist-dense areas and 26 from the least dense areas. The lowest average ratings for the entire sample and for the most dermatologist-dense areas were found on Yelp (3.61 and 3.60, respectively), whereas the lowest average rating in the least dermatologist-dense areas was found on Google (3.45)(Table 1). Correlation coefficients were lowest for Zocdoc and Healthgrades (0.263) and highest for Vitals and WebMD (0.963)(Table 2). Ratings on the health care–specific sites were closer to the overall average (4.06) than those on the general consumer sites (eFigure).
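
To illustrate the eFigure comparison, the lines below (continuing the hypothetical sketch from the Methods section) show how each site's mean rating could be compared with the pooled mean across all sites; the data remain simulated placeholders.

# Distance of each site's mean rating from the pooled mean across all sites
site_means <- colMeans(ratings, na.rm = TRUE)      # per-site averages
overall    <- mean(unlist(ratings), na.rm = TRUE)  # pooled mean (4.06 in the study)
round(abs(site_means - overall), 2)  # smaller values = closer to overall sentiment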

Comment

Although dermatologist ratings on each site spanned a broad range, we found that patients more often reported negative experiences on general consumer websites than on health care–specific websites. When the ratings of the same group of dermatologists were compared across sites, health care–specific sites correlated more strongly with one another: physician ratings were more similar between 2 health care–specific sites than between a health care–specific site and a general consumer site. This pattern held in both dermatologist-dense and dermatologist-poor areas, despite patients having varying levels of access to dermatologic care and medical resources and potentially different regional preferences for consumer websites. Taken together, these findings suggest that health care–specific websites more consistently reflect overall patient sentiment.

A 2016 study comparing reviews of dermatology practices on Zocdoc and Yelp also demonstrated lower average ratings on Yelp3; our study suggests that this trend is not isolated to those 2 sites but appears broadly when comparing health care–specific sites vs general consumer sites.

Our study compared dermatologist ratings across popular websites to determine which are most representative of patient attitudes toward physicians. These findings are important because online reviews reflect the entire patient experience, not just the patient-physician interaction, which may explain why physician scores on standardized questionnaires, such as Press Ganey surveys, do not correlate well with online reviews.4 In a study comparing 98 physicians with negative online ratings to 82 physicians in similar departments with positive ratings, there was no significant difference in scores on patient-physician interaction questions on the Press Ganey survey.5 However, physicians who received negative online reviews scored lower on Press Ganey questions related to nonphysician interactions (eg, office cleanliness, interactions with staff).

The current study had limitations. Our analysis included all randomly selected physicians without accounting for those with a greater online presence, who may be more cognizant of their ratings and attempt to manipulate them through a reputation-management company or public relations consultant.

Conclusion

Our study suggests that consumer websites are not used primarily by disgruntled patients wishing to express grievances; on average, most physicians received positive reviews. Furthermore, health care–specific websites show a higher degree of concordance with one another and may more accurately reflect overall patient attitudes toward physicians than general consumer sites. Reviews on health care–specific sites therefore may be more helpful than those on general consumer websites in allowing physicians to understand patient sentiment and improve the patient experience.

References
  1. Frost C, Mesfin A. Online reviews of orthopedic surgeons: an emerging trend. Orthopedics. 2015;38:e257-e262. doi:10.3928/01477447-20150402-52
  2. Waqas B, Cooley V, Lipner SR. Association of sex, location, and experience with online patient ratings of dermatologists. J Am Acad Dermatol. 2020;83:954-955.
  3. Smith RJ, Lipoff JB. Evaluation of dermatology practice online reviews: lessons from qualitative analysis. JAMA Dermatol. 2016;152:153-157. doi:10.1001/jamadermatol.2015.3950
  4. Chen J, Presson A, Zhang C, et al. Online physician review websites poorly correlate to a validated metric of patient satisfaction. J Surg Res. 2018;227:1-6.
  5. Widmer RJ, Maurer MJ, Nayar VR, et al. Online physician reviews do not reflect patient satisfaction survey responses. Mayo Clin Proc. 2018;93:453-457.
Author and Disclosure Information

From Weill Cornell Medical College, New York, New York. Ms. Cooley is from the Clinical and Translational Science Center. Dr. Lipner is from the Department of Dermatology.

The authors report no conflict of interest.

This research was supported in part by a Clinical and Translational Science Center grant at Weill Cornell Medical College (1-UL1-TR002384-01).

The eFigure is available in the Appendix online at www.mdedge.com/dermatology.

Correspondence: Shari R. Lipner, MD, PhD, 1305 York Ave, 9th Floor, New York, NY 10021 (shl9032@med.cornell.edu).

Practice Points

  • Online physician-rating websites are commonly used by patients to find physicians and review experiences.
  • Health care–specific sites may more accurately reflect patient sentiment than general consumer sites.
  • Dermatologists can use health care–specific sites to understand patient sentiment and learn how to improve patient experiences.