Published on 19.03.2024 in Vol 26 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/52071.
An Electronic Health Record–Based Automated Self-Rescheduling Tool to Improve Patient Access: Retrospective Cohort Study


Original Paper

1Department of Medicine, University of California San Francisco, San Francisco, CA, United States

2Department of Urology, University of California San Francisco, San Francisco, CA, United States

3UCSF Health Faculty Practices, University of California San Francisco, San Francisco, CA, United States

*these authors contributed equally

Corresponding Author:

Smitha Ganeshan, MD, MBA

Department of Medicine

University of California San Francisco

505 Parnassus Avenue, #M1493

San Francisco, CA, 94117

United States

Phone: 1 415 514 1000

Email: smitha.ganeshan@ucsf.edu


Background: In many large health centers, patients face long appointment wait times and difficulties accessing care. Last-minute cancellations and patient no-shows leave unfilled slots in a clinician’s schedule, further exacerbating delays in care caused by poor access. The mismatch between the supply of outpatient appointments and patient demand has led health systems to adopt a range of tools and strategies to minimize appointment no-show rates and fill the open slots left by patient cancellations.

Objective: We evaluated an electronic health record (EHR)–based self-scheduling tool, Fast Pass, at a large academic medical center to understand the impacts of the tool on the ability to fill cancelled appointment slots, patient access to earlier appointments, and clinical revenue from visits that may otherwise have gone unscheduled.

Methods: In this retrospective cohort study, we extracted Fast Pass appointment offers and scheduling data, including patient demographics, from the EHR between June 18, 2022, and March 9, 2023. We analyzed the outcomes of Fast Pass offers (accepted, declined, expired, and unavailable) and the outcomes of scheduled appointments resulting from accepted Fast Pass offers (completed, canceled, and no-show). We stratified outcomes based on appointment specialty. For each specialty, the patient service revenue from appointments filled by Fast Pass was calculated using the visit slots filled, the payer mix of the appointments, and the contribution margin by payer.

Results: From June 18, 2022, to March 9, 2023, a total of 60,660 Fast Pass offers were sent to patients for 21,978 available appointments. Of these offers, 6703 (11%) were accepted across all departments, and 5399 (8.9%) visits were completed. Patients were seen a median (IQR) of 14 (4-33) days sooner for their appointments. In a multivariate logistic regression model with Fast Pass offer acceptance as the primary outcome, patients who were aged 65 years or older (vs younger than 40 years; P=.005; odds ratio [OR] 0.86, 95% CI 0.78-0.96), of other ethnicity (vs White; P<.001; OR 0.84, 95% CI 0.77-0.91), primarily Chinese speakers (P<.001; OR 0.62, 95% CI 0.49-0.79), and other language speakers (vs English speakers; P=.001; OR 0.71, 95% CI 0.57-0.87) were less likely to accept an offer. Fast Pass added 2576 patient service hours to the clinical schedule, with a median (IQR) of 251 (216-322) hours per month. The estimated value of the professional fees from visits scheduled during 9 months of Fast Pass use at our institution was US $3 million.

Conclusions: Self-scheduling tools that give patients the opportunity to schedule into cancelled or unfilled appointment slots have the potential to improve patient access and to efficiently capture revenue from slots that would otherwise go unfilled. The demographics of the patients accepting these offers suggest that such digital tools may exacerbate inequities in access.

J Med Internet Res 2024;26:e52071

doi:10.2196/52071


Introduction

Health care access continues to be a concern as patients endure long wait times to access care [1,2]. In an industry survey, more than half of hospital leaders said that it takes more than 2 weeks to schedule patients on average and that access to specialty care is worse than before the COVID-19 pandemic [3]. Many health systems lack an efficient process to manage unfilled slots from cancellations. Unfilled cancellations and no-shows may exacerbate patient access issues. The process of scheduling and rescheduling appointments often requires significant time and manual labor, as staff members spend time calling patients to schedule, confirm, and reschedule appointments [4].

At busy medical centers, the high volumes of patient visits make this manual process of maintaining waitlists and filling cancelled appointments impractical. More recently, health care systems have used a range of tools to improve appointment completion rates and the process of scheduling patients into unfilled slots. Electronic health record (EHR)–based scheduling tools, phone call reminders, and automated SMS text message appointment reminders can all help ensure patients select an appointment time that works with their schedule, have the option to reschedule, and are prompted to confirm their appointment. EHR-based scheduling tools that allow patients to view available appointments as well as schedule, cancel, and reschedule their appointments through an app or a web-based portal saw a dramatic rise in use during the COVID-19 pandemic, with studies showing improved patient satisfaction [5-10]. EHR self-scheduling tools have been deployed in a variety of use cases, including COVID-19 vaccination, radiology imaging, and well-child primary care visits [10,11]. Tools that allow patients to view and schedule themselves into available appointments may allow earlier and easier access for patients, reduce staff burden, and increase clinic volume. More advanced self-scheduling tools can allow for the maintenance of an electronic waitlist that can provide patients with notifications when an earlier appointment is available. Despite these reported benefits, data quantifying the efficacy of these tools, their impact on patient access, and their financial value for health systems are needed to help spur adoption and investment [12].

The University of California, San Francisco (UCSF) Medical Center implemented an EHR module, Fast Pass (Epic Systems Corporation). The Fast Pass module, which is available through Epic’s suite of self-scheduling tools, allows a patient to add themselves to a waitlist for an earlier appointment slot. When a slot becomes available, patients receive an automated notification through SMS text message or email prompting them to log into their patient portal (MyChart) to self-schedule into the new, earlier appointment slot or to keep their existing appointment slot. Because Fast Pass is an existing module within the Epic EHR, included as part of the MyChart patient portal, limited integration work was necessary compared with what would be required for a third-party vendor. While the Fast Pass tool had previously been available through Epic, there had been no enterprise-wide implementation effort at UCSF to facilitate its use.

The primary objective of this study was to describe the uptake of the enterprise-wide Fast Pass scheduling tool among patients, understand the impacts of Fast Pass on patient wait times for appointments, and determine the potential incremental dollar value of visits that may have otherwise remained unscheduled. Our secondary aim was to study the uptake of the tool by specialty, given the unique patient needs and workflows in each specialty.


Methods

Setting

UCSF Health is a large academic health system with 3 campuses, over 1000 inpatient beds, and 9 primary care practices serving approximately 90,000 patients. UCSF Health has approximately 45,000 hospital admissions and 1.7 million outpatient visits annually. As of January 2022, approximately 89% of adult ambulatory care patients were enrolled in UCSF’s EHR-tethered patient portal. There are a total of 1538 different department entities across all service areas at UCSF. Before November 2022, 103 different departments were sending offers through Fast Pass, and by January 2023, the number of departments sending Fast Pass offers had increased to 220.

Fast Pass Implementation at UCSF

To use Fast Pass, a patient opts into the program through the patient portal and elects to receive notifications through email, SMS text messaging, or both when earlier appointment slots become available. Fast Pass offers are sent in batches in the evening to multiple patients for a single appointment slot, and patients have 12 hours to sign up for the earlier slot, after which the offer expires. When patients log into their patient portal, they can self-schedule into the earlier appointment or decline it. When a patient declines, they remain eligible to receive another offer at a different time. Beginning in November 2022, a central multidisciplinary team was formed to implement the Fast Pass tool across all departments at UCSF. Clinic staff were trained on using and implementing the tool in their respective clinics. A tip sheet was developed to facilitate patient education and awareness, and the tool was promoted on UCSF digital channels and websites.
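To make the offer lifecycle concrete, the sketch below illustrates the batching and 12-hour expiry logic described above in R, the language used for our analysis. The data frame and variable names are hypothetical, and the sketch is an illustration of the described workflow rather than Epic's implementation.

```r
# Illustrative sketch of the Fast Pass offer lifecycle (hypothetical data;
# not Epic's implementation). Offers are batched in the evening and expire
# 12 hours after they are sent if the patient does not respond.
offers <- data.frame(
  offer_id = 1:3,
  sent_at  = as.POSIXct(c("2022-07-01 19:00", "2022-07-01 19:00",
                          "2022-07-02 19:00"), tz = "America/Los_Angeles"),
  response = c("accepted", NA, "declined"),
  stringsAsFactors = FALSE
)

offers$expires_at <- offers$sent_at + 12 * 60 * 60  # 12-hour response window

# Classify each unanswered offer as pending or expired at a given review time.
# In practice an offer can also become "unavailable" if another patient
# accepts the same slot first.
now <- as.POSIXct("2022-07-03 09:00", tz = "America/Los_Angeles")
offers$status <- ifelse(!is.na(offers$response), offers$response,
                        ifelse(now > offers$expires_at, "expired", "pending"))
offers[, c("offer_id", "status")]
```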

Data Acquisition

Fast Pass offer data from June 18, 2022, to March 9, 2023, for the 220 departments included in the Fast Pass implementation were extracted from the EHR. The demographic and scheduling data associated with the Fast Pass offers were included in the data extraction. Demographic data included patient age, gender, race and ethnicity, insurance financial class, marital status, and primary language. Fast Pass data included the offering department, provider name, visit type, whether the patient had an existing appointment, offer sent date and time, offered slot date and time, visit length, and the date and time the offer was first viewed. Visit types were stratified by in-person office visits (including radiology and procedures), video visits, and nonbillable phone calls. Nonbillable phone calls were excluded from the analysis. Clinics offering the Fast Pass feature were grouped into their respective specialties by a physician-informaticist familiar with the UCSF system and EHR department entities.
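For illustration only, the following R sketch shows how such an extract could be loaded and prepared, excluding nonbillable phone calls and grouping departments into specialties; the file names and column names are hypothetical placeholders rather than the actual UCSF extract.

```r
# Hypothetical sketch of preparing the Fast Pass extract; file and column
# names are placeholders, not the actual UCSF EHR export.
offers <- read.csv("fastpass_offers_2022-06-18_to_2023-03-09.csv",
                   stringsAsFactors = FALSE)

# Exclude nonbillable phone calls, as described above
offers <- subset(offers, visit_type != "Nonbillable phone call")

# Group clinics into specialties using a curated department-to-specialty map
dept_map <- read.csv("department_specialty_map.csv", stringsAsFactors = FALSE)
offers   <- merge(offers, dept_map, by = "department", all.x = TRUE)
```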

The primary exposure of our retrospective cohort study was the dichotomous variable of whether the clinic had access to the enterprise Fast Pass tool. We only analyzed results from clinics that had access to Fast Pass in this study. Fast Pass appointment offers were also stratified according to their 4 possible categorical outcomes: accepted, declined, expired, or unavailable (offer accepted by another patient first). Additionally, Fast Pass–accepted appointments were stratified by 3 possible categorical appointment outcome end points: completed, no-show, and cancelled.

Data Analysis

Descriptive statistics were performed. We compared differences in the patient cohort stratified by Fast Pass offer acceptance status (accepted vs not accepted) using the chi-square test for categorical features. A multivariate logistic regression model was built with offer acceptance status as the primary outcome to identify predictors of acceptance and to estimate the size of the associations between the outcome and the demographic factors extracted from the EHR, which were included as independent variables chosen on theoretical grounds. No stepwise regression was conducted. Offer acceptance status was used as the primary outcome because Fast Pass is designed primarily to increase the number of appointment slots filled rather than to decrease the number of no-shows or cancellations. All analyses were conducted in R (version 3.5.1; R Foundation for Statistical Computing), and a value of P<.05 was considered significant.
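A minimal R sketch of this analytic approach is shown below; the data frame and variable names are hypothetical placeholders, and the actual covariates follow Tables 1 and 2.

```r
# Hypothetical sketch of the descriptive comparison and the multivariate
# logistic regression described above; column names are placeholders.

# Chi-square test comparing a categorical characteristic by acceptance status
chisq.test(table(offers$age_group, offers$accepted))

# Multivariate logistic regression with offer acceptance as the outcome;
# independent variables were chosen on theoretical grounds (no stepwise
# selection was performed)
fit <- glm(accepted ~ age_group + sex + ethnicity + language + insurance +
             visit_type + provider_type + appt_length,
           family = binomial, data = offers)

# Odds ratios and 95% CIs, as reported in Table 2
exp(cbind(OR = coef(fit), confint.default(fit)))
```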

Analysis of Fast Pass by Departments

Descriptive analyses were conducted on the outcomes of offers as well as the outcomes of accepted appointments, stratified by specialty.

Visit Dollar Value Calculation

We extracted the Fast Pass offers that were accepted and completed over the 9-month period and calculated the dollar value of the professional fees for these visits. We used the number of visits that Fast Pass added to the schedule within 7 days of the appointment during the 9 months of implementation, the payer mix for these visits, and the average professional fee collections for the visits by specialty and by payer to estimate the revenue of the added visits. We hypothesized that slots filled within 7 days of the appointment would otherwise have been more likely to remain unfilled. These revenue figures include only physician evaluation and management revenue, not facility or technical fees, and do not include downstream revenue associated with patient visits. While these Fast Pass visits do not represent true net new patient service revenue, they fill a proportion of cancelled slots that may otherwise have gone unfilled and may create downstream capacity for new patients.
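Conceptually, the estimate is a sum over specialty and payer of completed Fast Pass visit counts multiplied by the average professional fee collections per visit. A minimal R sketch of that calculation is shown below, assuming hypothetical input data frames: completed, with one row per completed Fast Pass visit and its specialty and payer, and avg_fee, with average professional fee collections per visit by specialty and payer.

```r
# Hypothetical sketch of the professional fee estimate; input data frames,
# column names, and fee figures are placeholders and institution specific.

# Count completed Fast Pass visits by specialty and payer
visit_counts <- aggregate(visit_id ~ specialty + payer, data = completed,
                          FUN = length)
names(visit_counts)[names(visit_counts) == "visit_id"] <- "n_visits"

# Multiply visit counts by average professional fee collections per visit
revenue <- merge(visit_counts, avg_fee, by = c("specialty", "payer"))
revenue$dollars <- revenue$n_visits * revenue$avg_collection_per_visit

# Totals by specialty (cf. Table 3) and overall
aggregate(dollars ~ specialty, data = revenue, FUN = sum)
sum(revenue$dollars)
```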

Ethical Considerations

This study was reviewed and approved by the UCSF institutional review board (22-35948) and was determined not to be human participants research. Epic Systems, a paid vendor of UCSF, did not fund this study or participate in the analysis.


Results

From June 18, 2022, to March 9, 2023, a total of 60,660 Fast Pass offers were sent to patients for 21,978 available appointments. As of March 17, 2023, 182 offers were still active or had been deleted by the clinic, leaving 60,478 offers for analysis. The median (IQR) number of offers sent for each appointment was 2 (1-3). Of the 60,478 offers, 6703 (11%) were accepted, for a monthly median of 139 accepted visits across all departments. Of the 21,978 available appointments, 6703 were filled by an accepted offer, a fill rate of 30.5%. Of the accepted offers, 294 fell beyond the time period of this analysis and were excluded, leaving 6409 accepted offers for analysis. Of the accepted offers within the study period, 5399 appointments were completed, an overall completion rate of 84.2% for appointments scheduled through Fast Pass offers.

The cohort of patients who accepted their offer was significantly more likely to be younger, male, White, and English speaking and to have been offered a video visit, a physician visit, and a longer appointment (Table 1). There was no difference in Fast Pass offer acceptance by insurance class. In addition, among patients who accepted a Fast Pass offer, those who were older, male, of White race or ethnicity, had commercial insurance, saw a physician provider, and had a shorter appointment time were more likely to complete their appointment. There were no differences in appointment completion rates by visit type or primary language.

In a multivariate regression model with Fast Pass appointment offer acceptance as the primary outcome, patients who were aged 65 years or older (odds ratio [OR] 0.86, 95% CI 0.78-0.96) versus patients aged younger than 40 years, of other ethnicity (OR 0.84, 95% CI 0.77-0.91) versus White patients, and primarily Chinese speakers (OR 0.62, 95% CI 0.49-0.79) and other language speakers (OR 0.71, 95% CI 0.57-0.87) versus English-speaking patients were less likely to accept an offer (Table 2). Male patients (OR 1.09, 95% CI 1.03-1.15) were more likely to accept their offer compared with female patients. In terms of the Fast Pass offer details, compared with office visits, phone calls were 41% less likely to be accepted (OR 0.59, 95% CI 0.38-0.86), while video visits were 29% more likely to be accepted (OR 1.29, 95% CI 1.20-1.38). A Fast Pass offer with a physician was 55% more likely to be accepted (OR 1.55, 95% CI 1.46-1.63) compared with a nonphysician offer, while appointments longer than 45 minutes (OR 1.20, 95% CI 1.09-1.32) and between 20 and 45 minutes (OR 1.15, 95% CI 1.09-1.22) were both more likely to be accepted than those shorter than 20 minutes. Among patients who completed a visit scheduled through a Fast Pass offer, patients were seen a median (IQR) of 14 (7-33) days before their originally scheduled appointment (Figure 1).

Table 1. Demographics of Fast Pass offers stratified by offer acceptance.

Characteristic | Accepted (n=6703), n (%) | Not accepted (n=53,775), n (%) | P value
Patient age (years) | | | .01
  18-40 | 1637 (24.4) | 12,307 (22.9) |
  40-64 | 2839 (42.4) | 23,033 (42.8) |
  ≥65 | 2227 (33.2) | 18,435 (34.3) |
Gender | | | .004
  Male | 2585 (38.6) | 19,649 (36.6) |
  Female | 4038 (60.2) | 33,418 (62.1) |
  Other or unknown | 80 (1.2) | 708 (1.3) |
Ethnicity | | | <.001
  White | 3807 (56.8) | 28,825 (53.6) |
  Black or African American | 318 (4.7) | 2717 (5.1) |
  Hispanic or Latino | 649 (9.7) | 5102 (9.5) |
  Asian, Native Hawaiian, or other Pacific Islander | 1137 (17.0) | 9976 (18.6) |
  Other or unknown | 792 (11.8) | 7155 (13.3) |
Primary language | | | <.001
  English | 6462 (96.4) | 50,832 (94.5) |
  Spanish | 68 (1.0) | 677 (1.3) |
  Chinese | 76 (1.1) | 1110 (2.1) |
  Other | 97 (1.4) | 1156 (2.1) |
Insurance | | | .43
  Commercial | 3805 (56.9) | 30,217 (56.3) |
  Medicare | 1994 (29.8) | 16,099 (30.0) |
  Medicaid | 703 (10.5) | 5715 (10.6) |
  Other | 186 (2.8) | 1676 (3.1) |
Provider type | | | <.001
  Physician | 3994 (59.6) | 25,842 (48.1) |
  Nonphysician | 2709 (40.4) | 27,933 (51.9) |
Offered visit type | | | <.001
  Office visit, radiology, or procedure | 5212 (77.9) | 45,109 (84.0) |
  Video visit | 1465 (21.9) | 8375 (15.6) |
  Nonbillable phone call | 26 (0.4) | 291 (0.5) |
Offered appointment length (minutes) | | | <.001
  <20 | 2813 (42.1) | 24,580 (45.8) |
  20-45 | 3183 (47.5) | 24,215 (45.0) |
  >45 | 700 (10.4) | 4965 (9.2) |
Table 2. Multivariate predictive model for Fast Pass use.
Term | OR^a | 95% CI | P value
Patient age (vs <40 years) | | |
  40-64 | 0.96 | 0.89-1.01 | .11
  ≥65 | 0.86 | 0.78-0.96 | .005
Sex (vs female) | | |
  Male | 1.09 | 1.03-1.15 | .002
  Other or unknown | 0.91 | 0.71-1.15 | .45
Ethnicity (vs White) | | |
  Black or African American | 0.92 | 0.81-1.04 | .19
  Hispanic or Latino | 1.03 | 0.94-1.14 | .49
  Asian, Native Hawaiian, or other Pacific Islander | 0.97 | 0.90-1.04 | .39
  Other or unknown | 0.84 | 0.77-0.91 | <.001
Primary language (vs English) | | |
  Spanish | 0.79 | 0.60-1.02 | .09
  Chinese | 0.62 | 0.49-0.79 | <.001
  Other | 0.71 | 0.57-0.87 | .001
Insurance (vs commercial) | | |
  Medicare | 1.06 | 0.97-1.16 | .21
  Medicaid | 1.04 | 0.94-1.14 | .46
  Other | 0.93 | 0.79-1.08 | .36
Visit type (vs office visit) | | |
  Nonbillable phone call | 0.59 | 0.38-0.86 | <.001
  Video visit | 1.29 | 1.20-1.38 | <.001
Provider type (vs nonphysician) | | |
  Physician | 1.55 | 1.46-1.63 | <.001
Appointment length in minutes (vs <20) | | |
  20-45 | 1.15 | 1.09-1.22 | <.001
  >45 | 1.20 | 1.09-1.32 | <.001

^aOR: odds ratio.

Figure 1. Distribution of the number of days of improvement between the existing appointment and the rescheduled appointment obtained through Fast Pass.

In a multivariate regression model with appointment completion after offer acceptance as the primary outcome, older patients (aged 40-64 years: OR 1.32, 95% CI 1.11-1.56; P=.001; aged 65 years or older: OR 1.68, 95% CI 1.27-2.22; P<.001) versus patients aged younger than 40 years and male patients (OR 1.35, 95% CI 1.17-1.57; P<.001) were more likely to complete their appointment. Patients with Medicaid (OR 0.51, 95% CI 0.42-0.63; P<.001), Medicare (OR 0.64, 95% CI 0.50-0.84; P<.001), and other insurance (OR 0.47, 95% CI 0.33-0.68; P<.001) versus commercial insurance, and patients with appointments longer than 45 minutes (OR 0.79, 95% CI 0.63-1.00; P=.05) compared with appointments shorter than 20 minutes, were less likely to complete their appointment. There were no observed differences in appointment completion by race or ethnicity, primary language, visit type, or provider type.

There was differential uptake of Fast Pass among the specialties at our institution, both by offers accepted and by rates of completed appointments from Fast Pass offers (Figure 2). Oncology (49/211, 23.2%), nephrology (78/431, 18.1%), and pulmonology (70/394, 17.8%) were the specialties with the highest rates of accepted Fast Pass scheduling offers, while optometry (127/1831, 6.9%), radiology (1459/19,134, 7.6%), and integrative medicine (114/283, 8.9%) had the lowest percentages of accepted Fast Pass offers. Rheumatology (139/155, 89.6%), endocrinology (69/77, 89.6%), and nephrology (66/75, 88.0%) had the highest rates of completed appointments from Fast Pass offers, while obstetrics and gynecology (314/429, 73.2%), infectious diseases (31/42, 73.8%), and optometry (92/124, 74.1%) had the lowest percentages (Figure 3).

In the Fast Pass revenue analysis, a total of 5387 of the 5399 completed visits added in the first 9 months of implementation across 25 specialties had payer information available and were included in the analysis (Table 3). These visits represent an estimated US $3 million in professional fees.

Figure 2. Outcomes of Fast Pass offers: unavailable, expired, declined, and accepted.
Figure 3. Outcomes of accepted Fast Pass offers: no-show, canceled, and completed.
Table 3. Dollar value of physician fees from Fast Pass visits.
Specialty | Commercial, n | Medicare, n | Medicaid, n | Other, n | Total, N | Total revenue (US $)
Allergy and immunology | 14 | 5 | 1 | 0 | 20 | 7819.00
Cardiology | 126 | 187 | 15 | 1 | 329 | 103,446.66
Dermatology | 183 | 21 | 4 | 1 | 209 | 72,043.80
Endocrinology | 41 | 15 | 10 | 2 | 68 | 25,550.09
Family medicine | 66 | 33 | 10 | 10 | 119 | 29,710.25
Gastroenterology | 93 | 51 | 25 | 6 | 175 | 67,427.70
General internal medicine | 326 | 120 | 28 | 11 | 485 | 130,770.54
Hematology | 4 | 4 | 0 | 0 | 8 | 2760.16
Infectious diseases | 11 | 17 | 3 | 0 | 31 | 9159.47
Integrative medicine | 40 | 29 | 8 | 2 | 79 | 24,713.67
Nephrology | 19 | 42 | 5 | 0 | 66 | 16,908.58
Neurology | 76 | 81 | 13 | 4 | 174 | 53,757.85
Obstetrics and gynecology | 244 | 29 | 29 | 11 | 313 | 113,294.91
Occupational health | 1 | 0 | 0 | 0 | 1 | 275.89
Oncology | 23 | 10 | 2 | 2 | 37 | 14,267.10
Ophthalmology | 48 | 58 | 13 | 5 | 124 | 26,609.56
Optometry | 37 | 32 | 16 | 6 | 91 | 20,016.01
Other | 121 | 63 | 22 | 9 | 215 | 67,280.05
Otolaryngology | 16 | 6 | 0 | 0 | 22 | 7907.58
Palliative care | 2 | 4 | 1 | 0 | 7 | 1848.01
Pulmonology | 22 | 27 | 7 | 1 | 57 | 17,875.52
Radiology | 819 | 240 | 177 | 36 | 1272 | 1,656,417.00
Rheumatology | 82 | 40 | 17 | 0 | 139 | 49,700.57
Surgery | 563 | 340 | 73 | 20 | 996 | 308,187.04
Transplant | 1 | 2 | 1 | 0 | 4 | 1079.55
Urology | 159 | 149 | 31 | 7 | 346 | 103,267.57
Total | 3137 | 1605 | 511 | 134 | 5387 | 2,932,094

Discussion

Overview

Fast Pass is an EHR-based tool that allows patients to schedule into earlier appointment slots and allows clinics to fill open appointments without the significant manual staff time needed to call and reschedule patients in a timely manner. This study found that patients’ rescheduled appointments occurred a median of 14 days earlier than their existing appointments, which supports the ability of self-scheduling tools to facilitate quicker patient access to care. From June 18, 2022, to March 9, 2023, there were a total of 60,660 Fast Pass offers sent to patients. Of these offers, 6703 (11%) were accepted, for a monthly median of 139 accepted visits across all departments. The median (IQR) number of offers sent for each appointment slot was 2 (1-2), which further highlights that this automated tool can free up staff time for direct patient care tasks or for assisting patients who are less familiar with the EHR patient portal. This study builds on the existing literature on the Fast Pass EHR tool by supporting its feasibility and implementation in a large tertiary health system and by adding data on patient use of the tool, impacts across specialties, and potential revenue impacts [12].

When we analyzed Fast Pass offers by specialty, we found differential uptake of Fast Pass among the specialties at our institution. Oncology, nephrology, and pulmonology were the specialties with the highest rates of accepted Fast Pass scheduling offers, while optometry, radiology, and integrative medicine had the lowest percentages of accepted Fast Pass offers. It is possible that our oncology, nephrology, and pulmonology clinics have longer wait times for ambulatory visits or higher cancellation rates, and that these clinics and their patients therefore gain more value from automated Fast Pass offers and scheduling. Additional qualitative research may help us understand the drivers of differential uptake to better facilitate enterprise-wide implementation efforts for scheduling tools.

In our analysis, Fast Pass added 2576 patient service hours to the clinical schedule, with a median (IQR) of 251 (216-322) hours per month. The dollar value of physician fees from the visits scheduled through Fast Pass over the 9 months of implementation at our institution was approximately US $3 million. While this dollar value does not necessarily equate to the net additional revenue impact of Fast Pass, it helps quantify the financial impact of self-scheduling tools, particularly because Fast Pass is an add-on tool in the EHR with minimal incremental cost [4]. Collecting additional longitudinal data will help us better understand the overall impacts of Fast Pass on ambulatory visit volumes, template use, patient access, and no-show rates.

This study builds on existing demographic data on digital tools, which show that patients who are White, younger, and commercially insured are more likely to use such tools [11,13-17]. These data raise concerns about equity in access to digital tools as health systems increasingly rely on EHR applications to facilitate front- and back-office functions, including triage and scheduling. Our data suggest that Fast Pass and similar tools may limit access to earlier appointments for older, non-White patients with a preferred language other than English. Given the personnel cost savings, these digital tools are likely to increase in use, and resources need to be channeled into supporting equitable adoption by all patients or, in the case of scheduling, into ensuring that a system is in place for offering patients earlier appointments based on clinical urgency. Possible interventions to improve health equity in self-scheduling include outreach in multiple languages and product design that increases accessibility for patients with low technological literacy.

Limitations

Our study has several limitations. This experience at an urban academic medical center with a wide clinical catchment area across California may not be generalizable to other regions of the country or to other institutions, especially facilities with fewer subspecialty referrals or with EHRs that lack a scheduling tool directly connected to their patient portal. This was also designed as a retrospective study. Future prospective natural experiments that test different modes of patient and clinician engagement could identify the key factors necessary for successful enterprise-wide implementation of this program. In addition, the use of Fast Pass was limited to patients who had access to their MyChart portal. Previous data have demonstrated inequities in the ability to access and use MyChart, which may further compound inequities in using the Fast Pass tool. We did not have data to assess patient digital literacy, which may play an important role in Fast Pass use. Further qualitative data are needed to better understand the specific patient and clinical workflow factors that lead to differential use of Fast Pass across specialties and departments.

Conclusions

Fast Pass, an EHR-based self-scheduling tool, afforded patients the opportunity for an earlier appointment and enabled our medical center to efficiently capture revenue from cancelled appointment slots. We found that the patients accepting these offers were more likely to be younger, male, and English speaking, suggesting that these digital tools could exacerbate inequities in access.

Authors' Contributions

SG and AWL had full access to all of the data in the study and took responsibility for the integrity of the data and the accuracy of the data analysis. SG, AWL, AYO, and MM conceived of the study concept and design and reviewed and provided critical feedback on manuscript drafts. AK, PA, RS, AR, and DV were instrumental in building and implementing the Fast Pass tool, provided feedback on the study concept and design, and also provided critical feedback on drafts. All authors read and approved the final manuscript.

Conflicts of Interest

None declared.

  1. Kyle MA, Tipirneni R, Thakore N, Dave S, Ganguli I. Primary care access during the COVID-19 pandemic: a simulated patient study. J Gen Intern Med. Dec 2021;36(12):3766-3771. [FREE Full text] [CrossRef] [Medline]
  2. Berchick ER, Hood E, Barnett JC. Health insurance coverage in the United States: 2017 current population reports. United States Census Bureau. 2018. URL: https://www.census.gov/library/publications/2018/demo/p60-264.html [accessed 2024-01-30]
  3. Hospital patient volumes move towards 2019 levels. McKinsey & Company. URL: https://www.mckinsey.com/industries/healthcare/our-insights/survey-us-hospital-patient-volumes-move-back-towards-2019-levels [accessed 2023-07-27]
  4. Judson TJ, Odisho AY, Neinstein AB, Chao J, Williams A, Miller C, et al. Rapid design and implementation of an integrated patient self-triage and self-scheduling tool for COVID-19. J Am Med Inform Assoc. Jun 01, 2020;27(6):860-866. [FREE Full text] [CrossRef] [Medline]
  5. Cao W, Wan Y, Tu H, Shang F, Liu D, Tan Z, et al. A web-based appointment system to reduce waiting for outpatients: a retrospective study. BMC Health Serv Res. 2011;11:318. [FREE Full text] [CrossRef] [Medline]
  6. Zhang X, Yu P, Yan J. Patients' adoption of the e-appointment scheduling service: a case study in primary healthcare. Stud Health Technol Inform. 2014;204:176-181. [CrossRef]
  7. Lyles CR, Nelson EC, Frampton S, Dykes PC, Cemballi AG, Sarkar U. Using electronic health record portals to improve patient engagement: research priorities and best practices. Ann Intern Med. Jun 02, 2020;172(11 Suppl):S123-S129. [FREE Full text] [CrossRef] [Medline]
  8. North F, Nelson EM, Majerus RJ, Buss RJ, Thompson MC, Crum BA. Impact of web-based self-scheduling on finalization of well-child appointments in a primary care setting: retrospective comparison study. JMIR Med Inform. Mar 18, 2021;9(3):e23450. [FREE Full text] [CrossRef] [Medline]
  9. Han HR, Gleason KT, Sun CA, Miller HN, Kang SJ, Chow S, et al. Using patient portals to improve patient outcomes: systematic review. JMIR Hum Factors. 2019;6(4):e15038. [FREE Full text] [CrossRef] [Medline]
  10. Zhao P, Yoo I, Lavoie J, Lavoie BJ, Simoes E. Web-based medical appointment systems: a systematic review. J Med Internet Res. 2017;19(4):e134. [FREE Full text] [CrossRef] [Medline]
  11. Wallace LS, Angier H, Huguet N, Gaudino JA, Krist A, Dearing M, et al. Patterns of electronic portal use among vulnerable patients in a nationwide practice-based research network: from the OCHIN Practice-based Research Network (PBRN). J Am Board Fam Med. 2016;29(5):592-603. [FREE Full text] [CrossRef] [Medline]
  12. Chung S, Martinez MC, Frosch DL, Jones VG, Chan AS. Patient-centric scheduling with the implementation of health information technology to improve the patient experience and access to care: retrospective case-control analysis. J Med Internet Res. Jun 10, 2020;22(6):e16451. [FREE Full text] [CrossRef] [Medline]
  13. Denizard-Thompson NM, Feiereisel KB, Stevens SF, Miller DP, Wofford JL. The digital divide at an urban community health center: implications for quality improvement and health care access. J Community Health. 2011;36(3):456-460. [CrossRef] [Medline]
  14. Anthony DL, Campos-Castillo C, Lim PS. Who isn't using patient portals and why? Evidence and implications from a national sample of US adults. Health Aff (Millwood). 2018;37(12):1948-1954. [FREE Full text] [CrossRef] [Medline]
  15. Health information technology: HHS should assess the effectiveness of its efforts to enhance patient access to and use of electronic health information. U.S. GAO. URL: https://www.gao.gov/products/gao-17-305 [accessed 2021-10-24]
  16. Grossman LV, Creber RMM, Benda NC, Wright D, Vawdrey DK, Ancker JS. Interventions to increase patient portal use in vulnerable populations: a systematic review. J Am Med Inform Assoc. 2019;26(8-9):855-870. [FREE Full text] [CrossRef] [Medline]
  17. Antonio MG, Petrovskaya O, Lau F. The state of evidence in patient portals: umbrella review. J Med Internet Res. 2020;22(11):e23851. [FREE Full text] [CrossRef] [Medline]


Abbreviations

EHR: electronic health record
UCSF: University of California, San Francisco


Edited by T de Azevedo Cardoso; submitted 21.08.23; peer-reviewed by H Veldandi, A Gangadhara Rao, C Asuzu; comments to author 21.11.23; revised version received 23.12.23; accepted 22.01.24; published 19.03.24.

Copyright

©Smitha Ganeshan, Andrew W Liu, Anne Kroeger, Prerna Anand, Richard Seefeldt, Alexis Regner, Diana Vaughn, Anobel Y Odisho, Michelle Mourad. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 19.03.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.