Published on 30.04.2019 in Vol 21, No 4 (2019): April

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/11646, first published .
“But His Yelp Reviews Are Awful!”: Analysis of General Surgeons’ Yelp Reviews


Original Paper

The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States

*these authors contributed equally

Corresponding Author:

Yosef Nasseri, MD

The Surgery Group of Los Angeles

Research Foundation

Suite 880W

8635 W 3rd Street

Los Angeles, CA, 90048

United States

Phone: 1 310 289 1518

Fax: 1 310 289 1526

Email: yosefnasseri@gmail.com


Background: Patients use Web-based platforms to review general surgeons. However, little is known about the free-form text and structured content of the reviews or how they relate to the physicians’ characteristics or their practices.

Objective: This observational study aimed to analyze the Web-based reviews of general surgeons on the west side of Los Angeles.

Methods: Demographics, practice characteristics, and Web-based presence were recorded. We evaluated frequency and types of Yelp reviews and assigned negative remarks to 5 categories. Tabulated results were evaluated using independent t test, one-way analysis of variance, and Pearson correlation analysis to determine associations between the number of total and negative reviews with respect to practice structure and physician characteristics.

Results: Of the 146 general surgeons, 51 (35%) had at least 1 review and 29 (20%) had at least 1 negative review. There were 806 total reviews, 679 (84.2%) positive reviews and 127 (15.8%) negative reviews. The negative reviews contained a total of 376 negative remarks, categorized into physician demeanor (124/376, 32.9%), clinical outcomes (81/376, 22%), office or staff (83/376, 22%), scheduling (44/376, 12%), and billing (44/376, 12%). Surgeons with a professional website had significantly more reviews than those without (P=.003). Surgeons in private practice had significantly more reviews (P=.002) and more negative reviews (P=.03) than surgeons who were institution employed. A strong and direct correlation was found between a surgeon’s number of reviews and number of negative reviews (P<.001).

Conclusions: As the most common category of complaints was about physician demeanor, surgeons may optimize their Web-based reputation by improving their bedside manner. A surgeon’s Web presence, private practice, and the total number of reviews are significantly associated with both positive and negative reviews.

J Med Internet Res 2019;21(4):e11646

doi:10.2196/11646


Introduction

One of the most ubiquitous business review websites, Yelp was established because its founder was unable to find recommendations for local physicians on the Web [1]. As customer feedback has become increasingly accessible, Web-based rating sites now have a strong influence on patients’ impressions and decisions, with as many as 68% of patients turning to these resources to research or review physicians [2]. Physicians are beginning to realize that the reactions and ratings detailed on these websites may affect which and how many patients visit them, as well as their overall reputation [3-5].

There are at least 33 websites where patients can describe their experience at hospitals, clinics, or clinical practices [6]. These websites range from general consumer rating websites to websites geared specifically toward the medical field. The structure of the all-purpose websites tends to afford more freedom to the commenters, whereas medical-based websites generally have a more structured format. In addition to premade surveys, several medical-based rating websites also allow reviewers to make unique remarks [6]. Yelp is one of the most used Web-based resources to review physicians [7-12].

The content of Web-based reviews of general surgeons, including their free-form text, has not been systematically described. We investigated the Yelp reviews of general surgeons in a defined region to categorize the content of the negative reviews and to determine whether the number of reviews and the number of negative reviews correlated with the characteristics of the physicians. Los Angeles was chosen as the site of this study because it is home to a large variety of practices and institutions.


Methods

We identified general surgeons practicing on the west side of Greater Los Angeles using the Medical Board of California’s Web-based database [13] and InfoUSA (Papillion, Nebraska), a marketing company that provides contact databases and mailing lists. The physicians practicing in the 31 zip codes on the west side of Los Angeles were examined. The active practice status of each physician was determined using the Medical Board of California’s License Verification Web-based tool [13]. Surgeons who were still in training, were retired, or did not have an active medical license were excluded. Physician gender, years since graduation from medical school, and medical school attended were identified. A Google search was performed to determine whether the physician had a medical practice website.
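To make the screening step concrete, the following is a minimal sketch of how the stated inclusion and exclusion criteria could be applied. The record fields, helper names, and zip codes are illustrative assumptions, not the actual Medical Board of California or InfoUSA schema.

```python
from dataclasses import dataclass

# Hypothetical record; field names are assumptions for illustration only.
@dataclass
class SurgeonRecord:
    name: str
    zip_code: str
    license_active: bool
    in_training: bool
    retired: bool

# Placeholder subset of the 31 west-side zip codes examined in the study.
WESTSIDE_ZIP_CODES = {"90048", "90025", "90064"}

def include_in_cohort(record: SurgeonRecord) -> bool:
    """Apply the study's stated inclusion and exclusion criteria."""
    if record.zip_code not in WESTSIDE_ZIP_CODES:
        return False  # practices outside the west-side zip codes
    if record.in_training or record.retired or not record.license_active:
        return False  # excluded: trainees, retirees, inactive licenses
    return True
```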

A physician was considered to have a Yelp page if they could be found and reviewed on a page designated for an institution, a clinical practice, or the physician. The following were documented: presence of a Yelp page, number of reviews, number of positive reviews, and number of negative reviews. Yelp users rate an institution or physician on a 5-star scale, with 5 stars defined as “Woohoo! As good as it gets!”, 4 stars as “Yay! I’m a fan,” 3 stars as “A-OK,” and 1 star as “Eek! Methinks not.” Because even a 4-star rating falls short of a perfect score, a negative review was defined as a review that contained at least 1 negative remark and had a rating of 4 stars or fewer; a 3-star (“A-OK”) rating, not being a 4- or 5-star review, implies a level of mediocrity [14]. A positive review was defined as any 4- or 5-star review that contained no negative remarks.
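These definitions amount to a small decision rule. The sketch below expresses it directly; how reviews of 3 stars or fewer with no negative remarks were counted is not stated in the study, so their handling here is an assumption.

```python
def classify_review(stars: int, has_negative_remark: bool) -> str:
    """Classify a Yelp review using the study's definitions.

    Negative: at least 1 negative remark and a rating of 4 stars or fewer.
    Positive: a 4- or 5-star review containing no negative remarks.
    """
    if has_negative_remark and stars <= 4:
        return "negative"
    if stars >= 4 and not has_negative_remark:
        return "positive"
    # Assumption: reviews outside both definitions (eg, 3 stars with no
    # negative remarks) are left unclassified.
    return "unclassified"
```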

Negative reviews were further characterized. Each commenter’s username, date of review, and star rating were noted. Free-form reviews were manually tabulated and categorized by 2 independent reviewers, who resolved any discrepancies between them, and negative remarks were empirically divided into 5 categories modified from previously defined categories [15]: scheduling (doctor availability and punctuality), billing, office and staff (staff friendliness or professionalism, staff presence, and office décor or location), clinical outcome (correct diagnosis or treatment, technical skill, treatment of unforeseen complications, need for additional treatment, and follow-up care), and physician demeanor (education, empathy, bedside manner, professionalism, preparedness or organization, time with the doctor, communication skills, shared decision making, and general impression). Remarks within negative reviews were noted as negative, neutral, or positive.
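As an illustration of the tabulation step, the sketch below tallies a handful of invented category codes and reports each category as a count and percentage of all negative remarks, mirroring the n/376 figures reported in the Results.

```python
from collections import Counter

# Hypothetical category codes assigned to individual negative remarks.
coded_remarks = [
    "physician demeanor", "billing", "office and staff",
    "physician demeanor", "clinical outcome", "scheduling",
]

counts = Counter(coded_remarks)
total = sum(counts.values())
for category, n in counts.most_common():
    print(f"{category}: {n}/{total} ({100 * n / total:.1f}%)")
```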

Once the raw data were collected, summary statistics were calculated using univariable analyses. An independent t test, which determines whether there is a statistically significant difference between the means of 2 unrelated groups, was run to determine whether certain factors (possession of a professional website, private practice or institution employment, gender, and medical school outside of the United States) affected the number of total reviews or negative reviews of a surgeon. One-way analysis of variance, which determines whether there is a statistically significant difference between the means of more than 2 unrelated groups, was used to determine whether zip code or the number of years since graduating from medical school affected a surgeon’s number of total reviews or negative reviews. A Pearson correlation, which measures the linear correlation between 2 variables, was performed to determine whether there was any relationship between a given surgeon’s total number of reviews and number of negative reviews. Statistical Package for the Social Sciences (SPSS) software was used.
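For readers who want to reproduce this kind of analysis outside SPSS, the sketch below runs the same 3 tests with SciPy on simulated per-surgeon review counts. The simulated data, group sizes, and the 3-way split standing in for zip-code groups are assumptions for illustration only, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated per-surgeon review counts (illustrative only).
reviews_private = rng.poisson(lam=18.0, size=55)   # private practice surgeons
reviews_employed = rng.poisson(lam=1.0, size=91)   # institution-employed surgeons
all_reviews = np.concatenate([reviews_private, reviews_employed])
negative_reviews = rng.binomial(all_reviews, 0.15)  # roughly 15% of reviews negative

# Independent t test: compare mean review counts between 2 unrelated groups.
t_stat, p_ttest = stats.ttest_ind(reviews_private, reviews_employed)

# One-way ANOVA: compare means across more than 2 groups
# (here, 3 arbitrary groups standing in for zip-code groupings).
g1, g2, g3 = np.array_split(all_reviews, 3)
f_stat, p_anova = stats.f_oneway(g1, g2, g3)

# Pearson correlation: linear association between total and negative review counts.
r, p_corr = stats.pearsonr(all_reviews, negative_reviews)

print(f"t test P={p_ttest:.3f}; ANOVA P={p_anova:.3f}; Pearson r={r:.2f}, P={p_corr:.3g}")
```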


Results

We identified 146 practicing general surgeons on the west side of Greater Los Angeles. A total of 33 (22.6%, 33/146) surgeons were female and 113 (77.4%, 113/146) were male. Moreover, 55 (37.7%, 55/146) surgeons were in private practice and 91 (62.3%, 91/146) were institution employed. In addition, 19 (13.0%, 19/146) surgeons went to medical school outside of the United States. Furthermore, 42 (28.8%, 42/146) physicians had a professional website (Table 1).

A total of 59 (40.4%, 59/146) surgeons had a Yelp page. Moreover, 51 physicians (34.9% of all surgeons and 86% of those with a Yelp page) had at least 1 review. Of the 59 physicians who had a Yelp page, 29 (49%, 29/59) had at least 1 negative review and 48 (81%, 48/59) had at least 1 positive review (Figure 1).

There was a total of 806 documented reviews. Of these reviews, 679 (84.2%) were positive and 126 (15.6%) were negative. Within the 126 negative reviews, there were 376 negative remarks: 124 (32.9%) concerning physician demeanor, 81 (21.5%) on clinical outcomes, 83 (22.0%) regarding the office or staff, 44 (11.7%) about scheduling, and 44 (11.7%) in relation to billing (Table 1).

Table 1. Demographics (N=146).
Characteristics: n (%)
Male: 113 (77.4)
Private practice: 55 (37.7)
Institution employed: 91 (62.3)
Physicians with website: 42 (28.8)
Foreign medical school graduate: 19 (13.0)
Physicians with a Yelp page: 59 (40.4)
Physicians reviewed on Yelp: 51 (34.9)
Physicians with negative reviews: 29 (19.8)
Figure 1. Distribution of reviews. The chart on the left shows the breakdown of positive reviews among the 59 physicians who had a Yelp page, whereas the chart on the right shows the breakdown of negative reviews. The number of negative and positive reviews is shown along with the number of physicians who had that number of negative and positive reviews, respectively.
Table 2. Statistical analysis via 2-tailed Pearson correlation analysis.
Reviews: number of reviews, mean (SD); P value
Total reviews (n=806): 5.6 (18.08); P=.001
Negative reviews (n=126): 0.86 (4.33); P=.001

The existence of a professional website as well as the type of employment had a significant impact on the total number of reviews (Tables 2-4). Those with a professional website had significantly more overall reviews than those without (mean 16.36, SD 31.0 vs mean 1.14, SD 3.23; P=.003). Roughly two-thirds of physicians in private practice (34/55) had at least 1 review, compared with roughly one-fifth of physicians employed by an institution (17/91), and private practice physicians had significantly more reviews overall (mean 18.0, SD 31.4 vs mean 0.96, SD 3.18; P=.002). Being in private practice was also significantly associated with the number of negative reviews (mean 3.11, SD 8.13 vs mean 0.09, SD 0.35; P=.03). As such, private practice was significantly associated with both more overall reviews and more negative reviews. Furthermore, there was a trend toward a correlation between a greater number of reviews and a greater number of years since graduation from medical school (P=.05).

A 2-tailed Pearson correlation was performed to evaluate the association between the total number of reviews and the number of negative reviews. The correlation between the total number of reviews and negative reviews was strong (r=.862; P<.001).

The total number of reviews was not impacted by a surgeon’s gender (male: mean 4.2, SD 11.6 vs female: mean 9.9, SD 31.3; P=.32), whether they attended medical school outside the United States (non-United States: mean 1.6, SD 3.20 vs United States: mean 6.1, SD 19.3; P=.32), or the practice zip code (P=.20).

The number of negative reviews did not differ significantly between surgeons with and without a professional website (mean 2.3, SD 7.80 vs mean 0.28, SD 0.98; P=.10). Similarly, the number of negative reviews was not significantly affected by a surgeon’s gender (male: mean 0.48, SD 1.22 vs female: mean 2.18, SD 8.79; P=.28), whether or not they attended medical school outside the United States (non-United States: mean 0.6, SD 1.54 vs United States: mean 0.91, SD 4.6; P=.76), the zip code they practiced in (P=.77), or the number of years since the surgeon graduated from medical school (P=.85).

Table 3. Statistical analysis via independent t test. Each row lists the number of physicians with any reviews, the total number of reviews with mean (SD) and P value, and the total number of negative reviews with mean (SD) and P value.
Private practice physicians (n=55): 34 with any reviews; 720 reviews, mean 18 (SD 31.4), P=.002; 118 negative reviews, mean 3.11 (SD 8.13), P=.03
Institution employed physicians (n=91): 17 with any reviews; 86 reviews, mean 0.96 (SD 3.18), P=.002; 8 negative reviews, mean 0.09 (SD 0.35), P=.03
Medical school out of United States (n=19): 5 with any reviews; 31 reviews, mean 1.6 (SD 3.20), P=.32; 11 negative reviews, mean 0.6 (SD 1.54), P=.76
Medical school in United States (n=127): 46 with any reviews; 775 reviews, mean 6.1 (SD 19.3), P=.32; 115 negative reviews, mean 0.91 (SD 4.60), P=.76
Male (n=113): 38 with any reviews; 480 reviews, mean 4.2 (SD 11.6), P=.32; 54 negative reviews, mean 0.48 (SD 1.22), P=.28
Female (n=33): 13 with any reviews; 326 reviews, mean 9.9 (SD 31.3), P=.32; 72 negative reviews, mean 2.18 (SD 8.79), P=.28
Possess professional website (n=42): 28 with any reviews; 687 reviews, mean 16 (SD 31.0), P=.003; 97 negative reviews, mean 2.3 (SD 7.80), P=.10
Do not possess professional website (n=104): 23 with any reviews; 119 reviews, mean 1.1 (SD 3.23), P=.003; 29 negative reviews, mean 0.28 (SD 0.98), P=.10
Table 4. Statistical analysis via 1-way analysis of variance.
Variable: P value comparing total number of reviews; P value comparing number of negative reviews
Number of years since graduating from medical school: .05; .85
Zip code: .20; .77

Discussion

Our finding that the majority of Web-based reviews were positive supports prior findings [10,16]. Unfortunately, negative reviews do exist and can have a damaging effect on a physician’s reputation and practice [3,5,9,17,18]. The most common category of Yelp complaints was physician demeanor. As such, surgeons may optimize their Web-based reputation by improving their bedside manner. A surgeon’s type of practice (ie, private practice or institution employed) and the total number of reviews were significantly associated with more negative reviews. A surgeon’s type of practice and the possession of a personal website were significantly associated with more total Yelp reviews.

Of the negative reviews, the most common category was criticism of physician demeanor. Patient perception of physician demeanor is a factor about which physicians may have some control. Thus, it might be possible to improve one’s reviews by enhancing patient-physician interactions. This includes being better prepared for each consultation, spending sufficient time with patients, clearly communicating the plan of care and disease processes affecting each patient, and displaying empathy. Over one-fifth of the complaints centered around clinical outcomes and another one-fifth concentrated on the office or staff. Although physicians are not always in control of clinical outcomes, keeping one’s skills up to date and practicing evidence-based medicine may improve clinical outcomes. To reduce the number of negative Web-based reviews, it is probably important to treat unforeseen complications quickly and empathically. In addition, hiring courteous office staff and optimizing the aesthetic appearance of one’s practice environment may enhance the quality of Web-based reviews.

Our study found that Web presence significantly affected a physician’s total number of reviews: the presence of a website, whether personal or for a private practice, had a significant effect on the total number of reviews a physician received. The mean number of reviews for surgeons with a professional website was significantly greater than for surgeons without one, suggesting that a greater Web presence is associated with more Web-based reviews. General surgeons in private practice had significantly more reviews overall (P=.002) and significantly more negative reviews (P=.03) than those who were institution employed. In addition, the total number of reviews was strongly correlated with the number of negative reviews. This suggests that although a Web-based presence may be important in enhancing a surgeon’s reputation, it may also be detrimental, depending on the content of the individual reviews.

There are several limitations to this study. Restricting the physicians examined to a single geographic area is descriptive of that region but limits the size and scope of the study. Because of the presence of large health institutions on the west side of Los Angeles, the number of surgeons with reviews may be artificially low, as institution-based physicians may have a much more limited Web presence and thus fewer Web-based reviews. Yelp was the only Web-based rating website examined, which excludes any reviews and Web-based presence surgeons may have had on other rating websites [7]. There is also inherent response bias, as patients make an active decision about whether to write a review. Finally, this study aimed only to examine patients’ perceptions of physicians and any problems they perceived in their experiences; as such, we cannot comment on how these reviews reflect the quality of the physicians themselves [17,19].

Future studies are needed to determine whether the trends and correlations found here apply to surgeons practicing in other and larger geographic regions. Other specialties also need to be examined to determine whether the number and type of reviews and the extent of Web presence differ across specialties. Most importantly, the impact of Yelp reviews on a surgeon’s practice needs to be assessed.

Acknowledgments

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Conflicts of Interest

None declared.

  1. Yelp Official Blog: Yelp, Inc; 2013. Study: Yelp ratings work for hospitals, too   URL: https://www.yelpblog.com/2013/03/study-yelp-ratings-work-for-hospitals-too [accessed 2019-03-17] [WebCite Cache]
  2. Rastegar-Mojarad M, Ye Z, Wall D, Murali N, Lin S. Collecting and analyzing patient experiences of health care from social media. JMIR Res Protoc 2015;4(3):e78 [FREE Full text] [CrossRef] [Medline]
  3. Emmert M, Meszmer N, Sander U. Do health care providers use online patient ratings to improve the quality of care? Results from an online-based cross-sectional study. J Med Internet Res 2016 Sep 19;18(9):e254 [FREE Full text] [CrossRef] [Medline]
  4. Emmert M, Sauter L, Jablonski L, Sander U, Taheri-Zadeh F. Do physicians respond to web-based patient ratings? An analysis of physicians' responses to more than one million web-based ratings over a six-year period. J Med Internet Res 2017 Jul 26;19(7):e275 [FREE Full text] [CrossRef] [Medline]
  5. Holliday AM, Kachalia A, Meyer GS, Sequist TD. Physician and patient views on public physician rating websites: a cross-sectional study. J Gen Intern Med 2017 Jun;32(6):626-631. [CrossRef] [Medline]
  6. Lagu T, Hannon NS, Rothberg MB, Lindenauer PK. Patients' evaluations of health care providers in the era of social networking: an analysis of physician-rating websites. J Gen Intern Med 2010 Sep;25(9):942-946 [FREE Full text] [CrossRef] [Medline]
  7. Lagu T, Metayer K, Moran M, Ortiz L, Priya A, Goff SL, et al. Website characteristics and physician reviews on commercial physician-rating websites. J Am Med Assoc 2017 Dec 21;317(7):766-768 [FREE Full text] [CrossRef] [Medline]
  8. Geletta S. Measuring patient satisfaction with medical services using social media generated data. Int J Health Care Qual Assur 2018 Mar 12;31(2):96-105. [CrossRef] [Medline]
  9. Dorfman RG, Purnell C, Qiu C, Ellis MF, Basu CB, Kim JYS. Happy and unhappy patients: a quantitative analysis of online plastic surgeon reviews for breast augmentation. Plast Reconstr Surg 2018 Dec;141(5):663e-673e. [CrossRef] [Medline]
  10. Ellimoottil C, Hart A, Greco K, Quek ML, Farooq A. Online reviews of 500 urologists. J Urol 2013 Jun;189(6):2269-2273. [CrossRef] [Medline]
  11. Ramkumar PN, Navarro SM, Chughtai M, La T, Fisch E, Mont MA. The patient experience: an analysis of orthopedic surgeon quality on physician-rating sites. J Arthroplasty 2017 Dec;32(9):2905-2910. [CrossRef] [Medline]
  12. Kadry B, Chu LF, Kadry B, Gammas D, Macario A. Analysis of 4999 online physician ratings indicates that most patients give physicians a favorable rating. J Med Internet Res 2011;13(4):e95 [FREE Full text] [CrossRef] [Medline]
  13. Medical Board of California. 2018. Medical Board of California Breeze License Verification   URL: http://www.mbc.ca.gov/Breeze/License_Verification.aspx [accessed 2019-03-14] [WebCite Cache]
  14. Woolf M. minimaxir. The Statistical Difference Between 1-Star and 5-Star Reviews on Yelp   URL: https://minimaxir.com/2014/09/one-star-five-stars/ [accessed 2019-03-14] [WebCite Cache]
  15. Asanad K, Parameshwar PS, Houman J, Spiegel BM, Daskivich TJ, Anger JT. Online physician reviews in female pelvic medicine and reconstructive surgery: what do patients really want? Female Pelvic Med Reconstr Surg 2018;24(2):109-114. [CrossRef] [Medline]
  16. Cloney M, Hopkins B, Shlobin N, Dahdaleh NS. Online ratings of neurosurgeons: an examination of web data and its implications. Neurosurgery 2018 Apr 03;83(6):1143-1152. [CrossRef] [Medline]
  17. Chen J, Presson A, Zhang C, Ray D, Finlayson S, Glasgow R. Online physician review websites poorly correlate to a validated metric of patient satisfaction. J Surg Res 2018 Jul;227:1-6. [CrossRef] [Medline]
  18. Emmert M, Meier F, Pisch F, Sander U. Physician choice making and characteristics associated with using physician-rating websites: cross-sectional study. J Med Internet Res 2013;15(8):e187 [FREE Full text] [CrossRef] [Medline]
  19. Haskins IN, Krpata DM, Rosen MJ, Perez AJ, Tastaldi L, Butler RS, et al. Online surgeon ratings and outcomes in hernia surgery: an Americas Hernia Society quality collaborative analysis. J Am Coll Surg 2017 Nov;225(5):582-589. [CrossRef] [Medline]

Edited by G Eysenbach; submitted 23.07.18; peer-reviewed by T Daskivich, S Zheng; comments to author 04.10.18; revised version received 11.11.18; accepted 23.01.19; published 30.04.19

Copyright

©Cynthia Liu, Meka Uffenheimer, Yosef Nasseri, Jason Cohen, Joshua Ellenhorn. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 30.04.2019.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.