Association Between Physician Online Rating and Quality of Care

Original Paper

1Department of Orthopedic Surgery, Kaiser Permanente Moanalua Medical Center, Honolulu, HI, United States

2University of Colorado, Boulder, CO, United States

3Iolani School, Honolulu, HI, United States

4Department of Patient Experience, University of Massachusetts Memorial Healthcare, Worcester, MA, United States

*all authors contributed equally

Corresponding Author:

Kanu Okike, MPH, MD

Department of Orthopedic Surgery

Kaiser Permanente Moanalua Medical Center

3288 Moanalua Road

Honolulu, HI, 96821

United States

Phone: 1 808 432 7326

Fax: 1 808 432 8330

Email: okike@post.harvard.edu


Background: Patients are increasingly using physician review websites to find “a good doctor.” However, to our knowledge, no prior study has examined the relationship between online rating and an accepted measure of quality.

Objective: The purpose of this study was to assess the association between online physician rating and an accepted measure of quality: 30-day risk-adjusted mortality rate following coronary artery bypass graft (CABG) surgery.

Methods: In the US states of California, Massachusetts, New Jersey, New York, and Pennsylvania—which together account for over one-quarter of the US population—risk-adjusted mortality rates are publicly reported for all cardiac surgeons. From these reports, we recorded the 30-day mortality rate following isolated CABG surgery for each surgeon practicing in these 5 states. For each surgeon listed in the state reports, we then conducted Internet-based searches to determine his or her online rating(s). We then assessed the relationship between physician online rating and risk-adjusted mortality rate.

Results: Of the 614 surgeons listed in the state reports, we found 96.1% (590/614) to be rated online. The average online rating was 4.4 out of 5, and 78.7% (483/614) of the online ratings were 4 or higher. The median number of reviews used to formulate each rating was 4 (range 1-89), and 32.70% (503/1538) of the ratings were based on 2 or fewer reviews. Overall, there was no correlation between surgeon online rating and risk-adjusted mortality rate (P=.13). Risk-adjusted mortality rates were similar for surgeons across categories of average online rating (P>.05), and surgeon average online rating was similar across quartiles of surgeon risk-adjusted mortality rate (P>.05).

Conclusions: In this study of cardiac surgeons practicing in the 5 US states that publicly report outcomes, we found no correlation between online rating and risk-adjusted mortality rates. Patients using online rating websites to guide their choice of physician should recognize that these ratings may not reflect actual quality of care as defined by accepted metrics.

J Med Internet Res 2016;18(12):e324

doi:10.2196/jmir.6612

Introduction

Consumers have long used reviews of goods and services to inform their choices. Recently, these trends have spread to the health care arena in the form of online physician review websites [1-14]. According to a recent survey, 65% of respondents were aware of physician rating websites, and 35% had sought online physician reviews within the past year [15]. The survey also found that these online reviews were influential: among those who sought physician ratings information online, 35% reported selecting a physician based on good ratings and 37% reported avoiding a physician with bad ratings [15].

While patients are increasingly using physician review websites to find “a good doctor,” it remains unclear whether online physician ratings actually reflect quality of care. Segal et al analyzed online ratings in relation to surgeon case volume, which they considered to be a proxy for quality of care, and found no correlation between numerical rating and number of procedures performed [13]. Similarly, Gao and colleagues analyzed ratings from the RateMDs.com website in comparison with data obtained from the Virginia Board of Medicine, and found no correlation between physician rating and malpractice claims [8]. However, to our knowledge, no prior study has examined the relationship between online ratings and an accepted measure of quality.

The purpose of this study was to assess the degree to which online physician ratings reflect quality of care. In the US states of New York, New Jersey, Massachusetts, Pennsylvania, and California—which together account for over one-quarter of the US population [16]—risk-adjusted mortality rates are publicly reported for all cardiac surgeons. By analyzing the online ratings of these surgeons in comparison with their clinical outcomes, we sought to assess the degree to which online ratings correlate with quality of care.


Methods

In June 2015, we accessed the cardiac surgeon “report cards” for all 5 states that publicly report risk-adjusted cardiac surgery mortality rates (ie, California [17], Massachusetts [18], New Jersey [19], New York [20], and Pennsylvania [21]). From the online reports, we recorded the names of all cardiac surgeons practicing in these states, as well as their institutions. For each surgeon listed, we also recorded the 30-day risk-adjusted mortality rate following isolated coronary artery bypass graft (CABG) surgery.

To calculate the risk-adjusted mortality rate, the observed mortality rate is divided by the expected mortality rate and then multiplied by the statewide mortality rate. (For reference, the observed mortality rate is the observed number of deaths divided by the total number of cases, and the expected mortality rate is the sum of predicted probabilities of death for all patients divided by the total number of patients.)
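As a compact restatement of this calculation (the notation below is ours, not taken from the state reports), for a given surgeon:

\[
\mathrm{RAMR} = \frac{\mathrm{OMR}}{\mathrm{EMR}} \times \mathrm{SMR},
\qquad
\mathrm{OMR} = \frac{d}{n},
\qquad
\mathrm{EMR} = \frac{1}{n}\sum_{i=1}^{n} \hat{p}_i,
\]

where \(d\) is the observed number of deaths, \(n\) is the total number of cases, \(\hat{p}_i\) is the predicted probability of death for patient \(i\), and \(\mathrm{SMR}\) is the statewide mortality rate.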

For each surgeon listed in the state reports, we conducted Internet-based searches between July and September 2015 to determine his or her online rating(s). Searches were conducted using surgeon name, location, and specialty. For each online rating identified, we recorded the name of the website, the overall rating, and the number of reviews used to formulate the rating. Online ratings were out of 5. The individuals performing these searches (TKPB and KCX) were blinded to the surgeons’ clinical outcomes.

We assessed the association between surgeon online rating and risk-adjusted mortality rate using the Pearson correlation coefficient. In addition, surgeons were grouped on the basis of average online rating, and risk-adjusted mortality rates were compared using the Student t test. Surgeons were also grouped on the basis of risk-adjusted mortality rate quartile, and online ratings were compared using the Student t test. P<.05 was considered statistically significant, and all tests were 2-sided. Statistical analysis was performed using SAS version 9 (SAS Institute Inc).
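For illustration only, the following is a minimal sketch of these analyses in Python (the original analysis was performed in SAS; the input file name and the column names avg_rating and ramr are assumed for this sketch and do not come from the paper):

# Minimal sketch of the analyses described above (not the authors' SAS code).
# Assumes one row per rated surgeon, with an average online rating ("avg_rating",
# on a 1-5 scale) and a 30-day risk-adjusted mortality rate ("ramr", in percent).
import pandas as pd
from scipy.stats import pearsonr, ttest_ind

df = pd.read_csv("surgeon_ratings.csv")  # hypothetical input file

# 1. Pearson correlation between average online rating and risk-adjusted mortality
r, p = pearsonr(df["avg_rating"], df["ramr"])
print(f"Pearson r = {r:.2f}, P = {p:.3f}")

# 2. Compare risk-adjusted mortality across rating categories (2-sided Student t test),
#    eg, surgeons rated 5.00 versus those rated 4.00-4.99
five = df.loc[df["avg_rating"] == 5.0, "ramr"]
four = df.loc[(df["avg_rating"] >= 4.0) & (df["avg_rating"] < 5.0), "ramr"]
print(ttest_ind(five, four))

# 3. Group surgeons into quartiles of risk-adjusted mortality and compare the average
#    online rating of each quartile with that of the lowest ("very low") quartile
df["quartile"] = pd.qcut(df["ramr"], 4, labels=["very low", "low", "medium", "high"])
reference = df.loc[df["quartile"] == "very low", "avg_rating"]
for label in ["low", "medium", "high"]:
    group = df.loc[df["quartile"] == label, "avg_rating"]
    print(label, ttest_ind(reference, group))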


Results

There were 614 cardiac surgeons with risk-adjusted mortality rates listed in the 5 state reports (California: 271; Massachusetts: 52; New Jersey: 36; New York: 135; and Pennsylvania: 120). For all states combined, the average 30-day risk-adjusted mortality rate after isolated CABG surgery was 1.68% (SD 1.98, median 1.22%, range 0.00%-16.98%).

We found 96.1% (590/614) of the surgeons to be rated online, including on Healthgrades (n=540) [22], Vitals (n=495) [23], UCompareHealthCare (n=366) [24], and RateMDs (n=103) [25]. We found that 74 of the surgeons were rated on a single website, while 170 were rated on 2 websites, 266 were rated on 3 websites, and 80 were rated on 4 or more websites. The average online rating for the cardiac surgeons was 4.4 on a scale of 1 (lowest) to 5 (highest). As Table 1 shows, 78.7% (483/614) of the scores were 4 out of 5 or better. The median number of reviews per surgeon was 4, with a wide range (1-89 reviews).

Figure 1 depicts a scatterplot of surgeon risk-adjusted mortality rate versus average online rating. Surgeon online rating did not correlate with risk-adjusted mortality rate (Pearson correlation coefficient –.06, P=.13). Risk-adjusted mortality rates were similar for surgeons across categories of average online rating (P>.05; Figure 2). Similarly, surgeon average online rating was similar across quartiles of surgeon risk-adjusted mortality rate (P>.05; Table 2).

Table 1. Average online ratings^a of cardiac surgeons who had risk-adjusted mortality rates listed, July-September 2015.

Average online rating     n (%)
5.00                      159 (25.9)
4.00-4.99                 324 (52.8)
3.00-3.99                 94 (15.3)
2.00-2.99                 8 (1.3)
1.00-1.99                 5 (0.8)
Not rated online          24 (3.9)
Total                     614 (100.0)

^a Ratings are out of 5.

Figure 1. Average online rating versus risk-adjusted mortality rate. Ratings are out of 5.
Figure 2. Risk-adjusted mortality rate, by average online rating. Note that the categories of average online rating differ in size. Error bars indicate 95% CIs, which vary in magnitude due to the number of ratings in each category (n=13 for 1.00-2.99, n=94 for 3.00-3.99, n=324 for 4.00-4.99, and n=159 for 5.00). Ratings are out of 5. There were no significant differences between the groups (P>.05).
Table 2. Average online rating, by quartile of surgeon risk-adjusted mortality rate.

Quartile of risk-adjusted mortality rate     Range            n      Average online rating^a, mean (SD)     P value^b
Very low                                     0.00%-0.41%      148    4.4 (0.7)                              (reference)
Low                                          0.45%-1.20%      147    4.5 (0.5)                              .34
Medium                                       1.23%-2.31%      148    4.4 (0.6)                              .49
High                                         ≥2.34%           147    4.4 (0.7)                              .40

^a Ratings are out of 5.

^b Compared with surgeons with risk-adjusted mortality rates categorized as “very low.”


Discussion

In this study of cardiac surgeons practicing in the 5 US states that publicly report outcomes, we found no correlation between online rating and risk-adjusted mortality rates.

We are not aware of any prior study assessing the correlation between physician online rating and accepted measures of quality. However, 2 prior studies have examined the relationship between patients’ subjective assessments of care and objective measures of quality. In these 2 studies, both of which were conducted among individuals over the age of 65 years, the subjective ratings given by patients were not found to correlate with the accepted quality measures [26,27].

Our study is not without its limitations. We used 30-day risk-adjusted mortality rates to measure quality, and it is possible that our results could have differed had we examined long-term mortality rates or rates of major morbidity (such as renal failure or stroke). However, 30-day risk-adjusted mortality is the most commonly accepted measure of quality in the field [28]. In addition, since we investigated cardiac surgeons in 5 US states, it is unclear whether the findings can be generalized to other fields of medicine or other locations.

For physicians, who have long argued that online ratings do not reflect clinical competency [29], the results of our study may not be surprising. However, our findings serve as a reminder that the provision of high-quality medical care may not necessarily translate into higher online ratings.

Our study also has important implications for patients. Consumers are increasingly using online reviews to guide their selection of goods and services, and health care is no exception [15]. Based on the results of our study, patients using online rating websites to guide their choice of physician should recognize that these ratings may not reflect actual quality of care as defined by accepted metrics. In contrast, they may be more reflective of factors such as clinic wait times [30] or bedside manner [31].

In summary, this study of cardiac surgeons practicing in the 5 US states that publicly report outcomes found no correlation between online rating and risk-adjusted mortality rates. Patients using online rating websites to guide their choice of physician should recognize that these ratings may not reflect actual quality of care as defined by accepted metrics.

Acknowledgments

The authors gratefully acknowledge Mr Donovan Delgado for his assistance with data collection. Publication of this paper was funded by the University of Colorado Boulder Libraries Open Access Fund.

Conflicts of Interest

K Okike receives educational meeting support from DePuy, Stryker Corporation, Synthes, and Zimmer, and teaching honoraria from Synthes. TK Peter-Bibb, KC Xie, and ON Okike report no conflicts of interest.

  1. Atkinson S. Current status of online rating of Australian doctors. Aust J Prim Health 2014;20(3):222-223. [CrossRef] [Medline]
  2. Bakhsh W, Mesfin A. Online ratings of orthopedic surgeons: analysis of 2185 reviews. Am J Orthop (Belle Mead NJ) 2014 Aug;43(8):359-363. [Medline]
  3. Black EW, Thompson LA, Saliba H, Dawson K, Black NM. An analysis of healthcare providers' online ratings. Inform Prim Care 2009;17(4):249-253 [FREE Full text] [Medline]
  4. Detz A, López A, Sarkar U. Long-term doctor-patient relationships: patient perspective from online reviews. J Med Internet Res 2013;15(7):e131 [FREE Full text] [CrossRef] [Medline]
  5. Ellimoottil C, Hart A, Greco K, Quek ML, Farooq A. Online reviews of 500 urologists. J Urol 2013 Jun;189(6):2269-2273. [CrossRef] [Medline]
  6. Emmert M, Meier F. An analysis of online evaluations on a physician rating website: evidence from a German public reporting instrument. J Med Internet Res 2013;15(8):e157 [FREE Full text] [CrossRef] [Medline]
  7. Emmert M, Meier F, Heider A, Dürr C, Sander U. What do patients say about their physicians? an analysis of 3000 narrative comments posted on a German physician rating website. Health Policy 2014 Oct;118(1):66-73. [CrossRef] [Medline]
  8. Gao GG, McCullough JS, Agarwal R, Jha AK. A changing landscape of physician quality reporting: analysis of patients' online ratings of their physicians over a 5-year period. J Med Internet Res 2012;14(1):e38 [FREE Full text] [CrossRef] [Medline]
  9. Kadry B, Chu LF, Kadry B, Gammas D, Macario A. Analysis of 4999 online physician ratings indicates that most patients give physicians a favorable rating. J Med Internet Res 2011;13(4):e95 [FREE Full text] [CrossRef] [Medline]
  10. López A, Detz A, Ratanawongsa N, Sarkar U. What patients say about their doctors online: a qualitative content analysis. J Gen Intern Med 2012 Jun;27(6):685-692 [FREE Full text] [CrossRef] [Medline]
  11. Merrell JG, Levy BH, Johnson DA. Patient assessments and online ratings of quality care: a “wake-up call” for providers. Am J Gastroenterol 2013 Nov;108(11):1676-1685. [CrossRef] [Medline]
  12. Sabin JE. Physician-rating websites. Virtual Mentor 2013 Nov;15(11):932-936 [FREE Full text] [CrossRef] [Medline]
  13. Segal J, Sacopulos M, Sheets V, Thurston I, Brooks K, Puccia R. Online doctor reviews: do they track surgeon volume, a proxy for quality of care? J Med Internet Res 2012;14(2):e50 [FREE Full text] [CrossRef] [Medline]
  14. Wallace BC, Paul MJ, Sarkar U, Trikalinos TA, Dredze M. A large-scale quantitative analysis of latent factors and sentiment in online doctor reviews. J Am Med Inform Assoc 2014;21(6):1098-1103 [FREE Full text] [CrossRef] [Medline]
  15. Hanauer DA, Zheng K, Singer DC, Gebremariam A, Davis MM. Public awareness, perception, and use of online physician rating sites. JAMA 2014 Feb 19;311(7):734-735. [CrossRef] [Medline]
  16. United States Census Bureau. Table 1: annual estimates of the resident population for the United States, regions, states, and Puerto Rico: April 1, 2010 to July 1, 2015. 2015.   URL: http://www.census.gov/popest/data/state/totals/2015/tables/NST-EST2015-01.xlsx [accessed 2016-12-11] [WebCite Cache]
  17. State of California Office of Statewide Health Planning and Development. CABG surgery in California. 2016.   URL: http://www.oshpd.ca.gov/HID/CABG-Report.html [accessed 2016-08-21] [WebCite Cache]
  18. Massachusetts Data Analysis Center (Mass-DAC). Cardiac study: annual reports. 2016.   URL: http://www.massdac.org/index.php/reports/cardiac-study-annual/ [accessed 2016-08-22] [WebCite Cache]
  19. State of New Jersey Department of Health. New Jersey hospital performance report.   URL: http://web.doh.state.nj.us/apps2/hpr/index.aspx [accessed 2016-08-22] [WebCite Cache]
  20. New York State Department of Health. Cardiovascular disease data and statistics.   URL: https://www.health.ny.gov/statistics/diseases/cardiovascular/ [accessed 2016-08-22] [WebCite Cache]
  21. Pennsylvania Health Care Cost Containment Council. Public reports: cardiac care.   URL: http://www.phc4.org/reports/cabg/ [accessed 2016-08-22] [WebCite Cache]
  22. Healthgrades: better health begins here. Denver, CO: Healthgrades Operating Company; 2016.   URL: https://www.healthgrades.com [accessed 2016-12-09] [WebCite Cache]
  23. Vitals: find better care. Lyndhurst, NJ: Vitals; 2016.   URL: http://www.vitals.com [accessed 2016-12-12] [WebCite Cache]
  24. UCompareHealthCare: search and compare for better care. Marlborough, MA: UCompare Holdings, LLC; 2016.   URL: http://www.ucomparehealthcare.com [accessed 2016-12-11] [WebCite Cache]
  25. RateMDs: doctors you can trust. San Jose, CA: RateMDs Inc; 2016.   URL: https://www.ratemds.com/co/colorado-springs/ [accessed 2016-12-12] [WebCite Cache]
  26. Rao M, Clarke A, Sanderson C, Hammersley R. Patients' own assessments of quality of primary care compared with objective records based measures of technical quality of care: cross sectional study. BMJ 2006 Jul 1;333(7557):19 [FREE Full text] [CrossRef] [Medline]
  27. Chang JT, Hays RD, Shekelle PG, MacLean CH, Solomon DH, Reuben DB, et al. Patients' global ratings of their health care are not associated with the technical quality of their care. Ann Intern Med 2006 May 2;144(9):665-672. [Medline]
  28. Brown DL, Clarke S, Oakley J. Cardiac surgeon report cards, referral for cardiac surgery, and the ethical responsibilities of cardiologists. J Am Coll Cardiol 2012 Jun 19;59(25):2378-2382. [CrossRef] [Medline]
  29. Johnson C. Survey finds physicians very wary of doctor ratings. Physician Exec 2013;39(1):6-8, 10, 12. [Medline]
  30. Bleustein C, Rothschild DB, Valen A, Valatis E, Schweitzer L, Jones R. Wait times, patient satisfaction scores, and the perception of care. Am J Manag Care 2014 May;20(5):393-400. [Medline]
  31. Uhas AA, Camacho FT, Feldman SR, Balkrishnan R. The relationship between physician friendliness and caring, and patient satisfaction: findings from an internet-based survey. Patient 2008 Apr 1;1(2):91-96. [Medline]


Abbreviations

CABG: coronary artery bypass graft


Edited by G Eysenbach; submitted 04.09.16; peer-reviewed by D Hanauer, S Atkinson; comments to author 29.09.16; revised version received 04.10.16; accepted 24.10.16; published 13.12.16

Copyright

©Kanu Okike, Taylor K Peter-Bibb, Kristal C Xie, Okike N Okike. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 13.12.2016.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.