%0 Journal Article %@ 1438-8871 %I JMIR Publications %V 23 %N 12 %P e31042 %T Measuring and Improving Evidence-Based Patient Care Using a Web-Based Gamified Approach in Primary Care (QualityIQ): Randomized Controlled Trial %A Burgon,Trever %A Casebeer,Linda %A Aasen,Holly %A Valdenor,Czarlota %A Tamondong-Lachica,Diana %A de Belen,Enrico %A Paculdo,David %A Peabody,John %+ QURE Healthcare, 450 Pacific Ave, Suite 200, San Francisco, CA, 94133, United States, 1 4153213388 ext 101, jpeabody@qurehealthcare.com %K quality improvement %K physician engagement %K MIPS %K case simulation %K feedback %K value-based care %K care standardization %K simulation %K gamification %K medical education %K continuing education %K outcome %K serious game %K decision-support %D 2021 %7 23.12.2021 %9 Original Paper %J J Med Internet Res %G English %X Background: Unwarranted variability in clinical practice is a persistent challenge, leading to poor outcomes for patients and low-value care for providers, payers, and patients. Objective: In this study, we introduced a novel tool, QualityIQ, and determined the extent to which it helps primary care physicians align care decisions with the latest best practices included in the Merit-Based Incentive Payment System (MIPS). Methods: We developed the fully automated QualityIQ patient simulation platform with real-time evidence-based feedback and gamified peer benchmarking. Each case included workup, diagnosis, and management questions with explicit evidence-based scoring criteria. We recruited practicing primary care physicians across the United States via the web and conducted a cross-sectional study of their clinical decisions, randomizing participants to continuing medical education (CME) and non-CME study arms. Physicians “cared” for 8 weekly cases that covered typical primary care scenarios.
We measured participation rates, changes in quality scores (including MIPS scores), self-reported practice change, and physician satisfaction with the tool. The primary outcomes for this study were evidence-based care scores within each case, adherence to MIPS measures, and variation in clinical decision-making among the primary care providers caring for the same patient. Results: We found strong, scalable engagement with the tool, with 75% of participants (61 non-CME and 59 CME) completing at least 6 of the 8 total cases. We saw significant improvement in evidence-based clinical decisions across multiple conditions, such as diabetes (+8.3%, P<.001) and osteoarthritis (+7.6%, P=.003), and in MIPS-related quality measures, such as diabetes eye examinations (+22%, P<.001), depression screening (+11%, P<.001), and asthma medications (+33%, P<.001). Although CME availability did not increase study enrollment, participants who were offered CME credits were more likely to complete at least 6 of the 8 cases. Conclusions: Although CME availability did not prove to be important, the short, clinically detailed case simulations with real-time feedback and gamified peer benchmarking did lead to significant improvements in evidence-based care decisions among all practicing physicians. Trial Registration: ClinicalTrials.gov NCT03800901; https://clinicaltrials.gov/ct2/show/NCT03800901 %M 34941547 %R 10.2196/31042 %U https://www.jmir.org/2021/12/e31042 %U https://doi.org/10.2196/31042 %U http://www.ncbi.nlm.nih.gov/pubmed/34941547