J Neurosurg Spine. 2017 Feb;26(2):235-242. doi: 10.3171/2016.7.SPINE16183. Epub 2016 Sep 23.

An assessment of data and methodology of online surgeon scorecards.

Author information

1. Department of Neurosurgery, Stanford University School of Medicine, Palo Alto, California.
2. Department of Neurosurgery, Mayo Clinic, Rochester, Minnesota.
3. Department of Neurosurgery, Wake Forest University School of Medicine, Winston-Salem, North Carolina.
4. Department of Neurosurgery, Massachusetts General Hospital, Boston, Massachusetts.

Abstract

OBJECTIVE Recently, 2 surgeon rating websites (Consumers' Checkbook and ProPublica) were published to allow the public to compare surgeons by reporting surgeon volume and complication rates. Among neurosurgeons and orthopedic surgeons, only cervical and lumbar spine, hip, and knee procedures were included in this assessment.

METHODS The authors examined the methodology of each website to assess potential sources of inaccuracy. Each online tool was queried for reports on neurosurgeons specializing in spine surgery and orthopedic surgeons specializing in spine, hip, or knee surgery. Surgeons were chosen from top-ranked hospitals in the US, as recorded by a national consumer publication ranking system, within the fields of neurosurgery and orthopedic surgery. The results were compared for accuracy and surgeon representation, and the results of the 2 websites were also compared with each other.

RESULTS The methodology of each site was found to have opportunities for bias and limited risk adjustment. The end points assessed by each site were not actual complications, but proxies for complication occurrence. A search of 510 surgeons (401 orthopedic surgeons [79%] and 109 neurosurgeons [21%]) showed that only 28% and 56% of surgeons had data represented on Consumers' Checkbook and ProPublica, respectively. There was a significantly higher chance of finding surgeon data on ProPublica (p < 0.001). Of the surgeons from top-ranked programs with data available, 17% were reported to have high complication rates, 13% were reported to have lower volume than other surgeons, and 79% had a 3-star out of 5-star rating. No significant correlation was found between the number of stars a surgeon received on Consumers' Checkbook and his or her adjusted complication rate on ProPublica.

CONCLUSIONS Both the Consumers' Checkbook and ProPublica websites have significant methodological issues. Neither site assessed complication occurrence, but rather readmissions or prolonged length of stay. Risk adjustment was limited or nonexistent. A substantial number of neurosurgeons and orthopedic surgeons from top-ranked hospitals have no ratings on either site, or have data suggesting that they are low-volume surgeons or have higher complication rates. Consumers' Checkbook and ProPublica produced different results, with little correlation between the 2 websites in how surgeons were graded. Given the significant methodological issues, incomplete data, and lack of appropriate risk stratification of patients, the featured websites may provide erroneous information to the public.

KEYWORDS:

CABG = coronary artery bypass grafting; CI = confidence interval; CMS = Centers for Medicare and Medicaid Services; Consumers' Checkbook; LOS = length of stay; NSQIP = National Surgical Quality Improvement Program; ProPublica; orthopedic surgery; surgeon ratings; website

PMID: 27661563
DOI: 10.3171/2016.7.SPINE16183
[Indexed for MEDLINE]
