Published on 14.01.2019 in Vol 21, No 1 (2019): January

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/9076.
Evaluating Digital Maturity and Patient Acceptability of Real-Time Patient Experience Feedback Systems: Systematic Review


Authors of this article:

Mustafa Khanbhai; Kelsey Flott; Ara Darzi; Erik Mayer

Review

Corresponding Author:

Mustafa Khanbhai, MBChB, BScHons, MRCS (Ed)

Centre for Health Policy

Imperial College London

10th Floor QEQM Building

London, W2 1NY

United Kingdom

Phone: 44 02033126428

Fax: 44 207594184

Email: m.khanbhai@imperial.ac.uk


Background: One of the essential elements of a strategic approach to improving patients’ experience is to measure and report on patients’ experiences in real time. Real-time feedback (RTF) is increasingly being collected using digital technology; however, there are several factors that may influence the success of the digital system.

Objective: The aim of this review was to evaluate the digital maturity and patient acceptability of real-time patient experience feedback systems.

Methods: We systematically searched the following databases to identify papers that used digital systems to collect RTF: The Cochrane Library, Global Health, Health Management Information Consortium, Medical Literature Analysis and Retrieval System Online, EMBASE, PsycINFO, Web of Science, and CINAHL. In addition, Google Scholar and gray literature were utilized. Studies were assessed on their digital maturity using a Digital Maturity Framework on the basis of the following 4 domains: capacity/resource, usage, interoperability, and impact. A total score of 4 indicated the highest level of digital maturity.

Results: RTF was collected primarily using touchscreens, tablets, and Web-based platforms. Implementation of digital systems showed acceptable response rates and generally positive views from patients and staff. Patient demographics according to RTF responses varied; females, white patients, and patients aged ≥65 years were overrepresented. Of 13 eligible studies, none had digital systems that were deemed to be of the highest level of maturity. Three studies each scored 3 points, 3 scored 2 points, and 3 scored 1 point, while 4 studies scored 0 points. While 7 studies demonstrated capacity/resource, 8 demonstrated impact. None of the studies demonstrated interoperability in their digital systems.

Conclusions: Patients and staff alike are willing to engage in RTF delivered using digital technology, thereby disrupting previous paper-based feedback. However, a lack of emphasis on digital maturity may lead to ineffective RTF, thwarting improvement efforts. Therefore, given the potential benefits of RTF, health care services should ensure that their digital systems deliver across the digital maturity continuum.

J Med Internet Res 2019;21(1):e9076

doi:10.2196/jmir.9076




Introduction

Background of Patient Experience Feedback

Alongside measures of clinical effectiveness and safety outcomes, patient experience is increasingly recognized as an important indicator of the quality of health care provision and is frequently cited in health policies [1-3]. Yet, in practice and research, the concept of patient experience has had varied uses and is often discussed with little more explanation than the term itself [4]. In 2011, the English National Health Service (NHS) [5] outlined 8 domains that help define patient experience and are critical to patients’ experience of health care services. This has been used as an agreed working definition of patient experience to guide the measurement of patient experience across the NHS.

Patient Experience Feedback in Real Time

One of the essential elements of a strategic approach to improving patients’ experience is to measure and report on patients’ experiences to assess progress, strengthen accountability, and identify new opportunities for improving performance [6]. Evidence suggests that this can be achieved using real-time feedback (RTF) [7]. RTF involves the systematic collection, analysis, and reporting of information from individuals at the point of care [8]. Previous studies have found that RTF has the potential to enable health care organizations to respond promptly to patients’ concerns and make timely improvements to services [7,9-11].

Collecting Real-Time Feedback Using Digital Technology

With the ability to maximize the scalability and speed of data collection while reducing cost [12], digital tools (tablets, kiosks, emailed surveys, and websites) are increasingly being utilized to gather patient experience feedback [13,14], allowing summary results to be reported on an ongoing, real-time basis [15]. In a recent Cochrane review [16], self-administered survey questionnaire responses collected using mobile apps were compared with those collected using other methods, and it was concluded that delivering survey questionnaires through apps does not affect data equivalence and can improve data completeness. However, none of the questionnaires evaluated patient experience.

There is growing pressure on health care services to embrace digital technologies to significantly improve the patient experience [17]. With the increasing adoption of digital systems pertaining to RTF, health care services must recognize and overcome the barriers that may hinder successful integration and uptake of such technologies. One way of achieving this is through systematic evaluation and monitoring of digital systems to ensure they operate in the way they are intended and cultivate a better patient experience [17]. Digital maturity—the extent to which digital technologies are used as enablers to deliver a high-quality health service—is an emerging concept across developed health care systems [17]. Evidence suggests that digital maturity is linked to better outcomes and is indicative of a well-performing organization [18]. Evaluating digital systems for digital maturity highlights where gaps in maturity exist, which presents an opportunity to address the specific shortcomings [17]. The digital maturity of the systems being used to collect RTF has not been previously evaluated. A deeper understanding of the limitations of individual digital systems may help health care services pinpoint areas in need of improvement and organizational change. Without this, digital systems used for RTF may not be successful, and patients’ voices may go unheard.

Therefore, this systematic review aims to evaluate the digital maturity of digital RTF systems. Specific objectives were to (1) describe the digital modes utilized; (2) provide insights into patients’ and staff views of these digital systems; and (3) demonstrate digital maturity by systematically scoring individual digital systems.


Methods

Search Strategy

The following databases were searched: Medical Literature Analysis and Retrieval System Online, EMBASE, PsycINFO, The Cochrane Library (Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, Cochrane Methodology Register), Global Health, Health Management Information Consortium, CINAHL, and Web of Science. In addition, gray literature and Google Scholar were utilized to extract papers that were not retrieved in the databases searched. Publications from January 2000 to February 2017 were included. We limited our electronic searches to studies published in or after 2000 because it was around this time that the digital technology revolution in health care began to emerge [19]. Owing to the diversity of terms used to refer to patient experience, combinations of search terms were used. Multimedia Appendix 1 provides the complete list of subheadings (Medical Subject Headings) and keywords.

Inclusion Criteria

Studies were deemed eligible for inclusion if (1) digital modes of administration were employed; (2) collection was in real time (at the point of care) or near real time (immediately after discharge); and (3) they were conducted in a primary or secondary care setting. There was no restriction on the age of the patient population of interest. Both quantitative and qualitative studies were included. Multimedia Appendix 1 provides further details of the inclusion criteria.

Search Flow

The research adhered to the guideline presented in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2009 checklist [20]. Data analysis involved the comparison of included studies and extracted data. Due to the heterogeneous nature of the studies, a narrative synthesis was deemed most appropriate. This was followed by scoring the digital systems reported in the studies to determine the digital maturity.

Digital Maturity

An existing framework [17] was utilized to determine the digital maturity of individual digital systems. This framework is embedded in the literature and has the benefit of systematically assessing the effectiveness of any digital system in health care. Each included study was scored across 4 key domains—capacity or resource, usage, interoperability, and impact. The framework highlights key questions for each domain and is based on a “yes” or “no” response to each question. We assigned a maximum of 1 point if the digital system in each study demonstrated appropriate evidence for that particular domain. The maximum overall score a study could achieve was 4 points, indicating the overall success of the digital platform [17]. Two reviewers (MK and KF) independently scored each study; disagreements in scoring were resolved by discussion between the two reviewers. Interrater agreement (Cohen kappa) was calculated.
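For illustration only, a minimal sketch (in Python, with invented yes/no judgments; the domain names follow the framework described above, but the example data are not from any included study) of how the per-study tally works:

```python
# Sketch of the Digital Maturity Framework tally described above (illustrative only).
# One point is awarded per domain for which the study shows appropriate evidence.

DOMAINS = ["capacity_resource", "usage", "interoperability", "impact"]

def maturity_score(evidence):
    """Return the total score (0-4): 1 point per domain with a 'yes' judgment."""
    return sum(1 for domain in DOMAINS if evidence.get(domain, False))

# Hypothetical study demonstrating capacity/resource and impact only.
example_study = {"capacity_resource": True, "usage": False,
                 "interoperability": False, "impact": True}

print(maturity_score(example_study))  # prints 2 (out of a maximum of 4)
```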


Results

Overall Description

The initial search returned 3456 papers; after removing duplicates, 3438 papers were retained. Titles and abstracts were screened, and 112 papers were identified as potentially eligible for inclusion. Full-text papers were retrieved and assessed for inclusion, of which 13 were retained for the final review. RTF was defined as feedback collected while patients were in hospital, at the point of care, while receiving care, or immediately after discharge. The main reasons for exclusion were that papers did not report patient experience or reported user experience of other digital technologies. Figure 1 illustrates the PRISMA flowchart representing the study selection process and reasons for exclusion.

The reasons for exclusion of records (n=3326) and full-text papers (n=99) were the following: nondigital mode of administration (n=241), not real time or near real time (n=130), assessment or evaluation of other parameters (n=1489), feasibility or usability testing of digital systems not related to patient experience data collection (n=655), experience of other digital technology (n=848), and reviews (n=62).
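As a simple arithmetic check of the selection flow reported above (all counts are taken from the text; the tally itself is only an illustration, not part of the review methodology):

```python
# Reconciling the study selection counts reported in the text (illustrative check only).
initial_records = 3456
after_duplicates = 3438           # 18 duplicates removed
excluded_at_screening = 3326      # records excluded on title/abstract
excluded_at_full_text = 99        # full-text papers excluded

potentially_eligible = after_duplicates - excluded_at_screening   # 112
final_included = potentially_eligible - excluded_at_full_text     # 13

# The itemized exclusion reasons sum to the total number of exclusions.
reason_counts = [241, 130, 1489, 655, 848, 62]
assert sum(reason_counts) == excluded_at_screening + excluded_at_full_text  # 3425

print(initial_records - after_duplicates, potentially_eligible, final_included)  # 18 112 13
```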

Systematic Review of Studies

Study Characteristics

Of the 13 studies included in the final review (Table 1), 5 were based in general practice [10,21-24] and the rest were based in an acute care hospital setting [9,25-31]. Of all the studies, 5 were from the United Kingdom [21,22,24,25,28], 7 from the United States [9,10,26,27,29-31], and 1 from Canada [23]. All but one study [28] were based on adult populations; in the study conducted on the neonatal ward [28], experience was reported by adults (parents or relatives). Five studies were qualitative, examining patients’ or staff views [21,24] and barriers to and facilitators of RTF [25,27,28]. The remaining studies used a quantitative approach with varied outcome measures, of which the recurring measures were response rates of RTF [9,10,22,23,27,29] and association with patient demographics, that is, age, gender, ethnicity, and literacy [9,22,23,31].

Figure 1. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram of study selection.
Table 1. Study characteristics of the 13 studies included in the systematic review.
Publication title; author(s), year | Study design | Duration of study | Types of survey questionnaire(s) | Mode of administration
Capturing patient experience: a qualitative study implementing real-time feedback in primary care; Carter M et al, 2016 [21] | Qualitative | 3 months | Modified; staff opinions (semistructured interviews and focus groups) using Normalization Process Theory | Kiosks
Patients’ use and views of real-time feedback technology in general practice; Wright C et al, 2017 [22] | Exploratory randomized trial | 3 months | Modified (amalgamated Friends and Family Test, 6 items focusing on access, communication, and satisfaction [derived from general practitioner patient survey], 2 practice-tailored questions^a) | 2 touch screens (1 kiosk and 1 desktop device)
Measuring the patient experience in primary care: comparing e-mail and waiting room survey delivery in a family health team; Slater M et al, 2016 [23] | Cross-sectional comparative analysis | 1 month | Modified (amalgamated Commonwealth Fund International Health Policy Survey, patient demographics, self-rated health) | Tablet
Barriers and facilitators of a near real-time feedback approach for measuring patient experiences of hospital care; Kasbauer et al, 2017 [25] | Qualitative | 10 months | Novel and validated (Compassionate Care Toolkit) | Tablet and kiosks
Real-time patient survey data during routine clinical activities for rapid-cycle quality improvement; Wofford et al, 2015 [10] | Feasibility study | 1 month | Modified (dental care, waiting room experience, continuity, and internet access) | Tablet
Real-time patient experience surveys of hospitalized medical patients; Indovina et al, 2016 [26] | Prospective randomized | 5 months | Previously validated (US Department of Health and Human Services and the Hospital Consumer Assessment of Healthcare Providers and Systems [HCAHPS]) | Web-based platform
Evaluating patient-centred care (PCC): pilot study testing feasibility of electronic data collection in hospitalised older adults; Duffy et al, 2012 [27] | Cross-sectional feasibility | 3 months | Previously validated (e-Caring Assessment Tool) | Tablet
Patient experience tracker (PET) survey as measure of quality in the neonatal unit; Aladangady et al, 2011 [28] | Qualitative | N/A^b | Novel | Tablet
Development and validation of the tool to assess inpatient satisfaction with care from hospitalists; Torok et al, 2014 [29] | Cross-sectional | 3 months | Novel and validated (Tool to Assess Inpatient Satisfaction with Care from Hospitalists) | Tablet and paper
Incentivized digital outcomes collection; Isenberg S et al, 2001 [30] | Feasibility study | 3 months | Previously validated (Visit Rating Questionnaire) | Telephone (electronic voice response technology)
Exploring patients’ views toward giving Web-based feedback and ratings to general practitioners in England: a qualitative descriptive study; Patel et al, 2016 [24] | Descriptive exploratory qualitative approach | N/A | Previously validated (Friends and Family Test) | Web-based platform (National Health Service Choices) and paper
Obtaining patient feedback at point of service using electronic kiosks; Dirocco et al, 2011 [9] | Feasibility study | 1.5 months | Modified (National Committee for Quality Assurance’s Healthcare Effectiveness Data and Information Set and Quality Measurement standards) | Kiosk
Improving patient satisfaction through physician education, feedback, and incentives; Banka et al, 2015 [31] | Nonrandomized comparative study | 14 months (7 months each year) | Previously validated (Assessing Residents’ Connect with patients, Introduce yourself and role, Communicate, Ask and anticipate, Respond, and Exit courteously and HCAHPS) | Paper and Web-based platform (results sent via email)

^a Filter questions (tailored to the patient’s visit).

^b N/A: not applicable.

Modes of Feedback

RTF was collected using touchscreens (kiosks) [9,21,22,25], tablets [10,23,25,27-29], and Web-based platforms [24,26,30,31] in the included studies. With regard to the timing of feedback, 11 studies collected feedback in real time [9,10,21-23,25,27-31] and 2 studies in near real time [24,30] (ie, within 48 hours of discharge). The patient experience questionnaires used in each study varied; 5 studies modified an existing questionnaire [9,10,21-23], 5 used previously validated questionnaires [24,26,27,30,31], 2 used novel and validated questionnaires [25,29], and 1 used a novel, nonvalidated questionnaire [28]. The majority of the questionnaires were in English only.

Response Rates of Real-Time Feedback

Of the 13 studies, 6 studies [9,10,22,23,27,29] evaluated response rates (percentage) as part of their outcome measures. The response rates were 55.9% [23], 43.4% [10], and 2.5% [22] for studies conducted in primary care and 54.9% [9], 59.2% [27], and 61.5% [29] for those conducted in secondary care. Multimedia Appendix 2 demonstrates the absolute numbers of responses in each study identified above. Only 1 study evaluated response rates of RTF compared with non-RTF and showed that RTF improved response rates (55.9% vs 19.8%) [23].

Completion Time

Of all studies, 4 evaluated time to completion, which ranged from 40.4 seconds [10] and <2 minutes [22] to 3 minutes [9] and 31 minutes [27]. However, the questionnaires differed in length and number of questions, so completion times are not directly comparable.

Patient Demographics According to Real-Time Feedback Responses

The patient demographics collected varied, and 7 [9,22,23,25,27,29,31] of the 13 studies evaluated responses by patient demographics. Only Slater et al [23] compared RTF and non-RTF responses by patient demographics; compared with non-RTF, RTF showed a higher percentage response among males (43.5% vs 36.6%), in the 18-24 and 24-34 years age brackets (6.7% vs 3.6% and 23.8% vs 18.5%, respectively), and among those with lower literacy (34.4% vs 21.3%). Banka et al [31] revealed a higher percentage of male respondents (55.3% vs 41.4%) with a white predominance (62.5% vs 60.9%) using RTF compared with non-RTF. However, Wright et al [22] revealed different findings: there was a higher percentage response from females in the 46-65 years (26.2%) and >65 years (33.8%) age ranges and a higher response from white patients than from ethnic minority patients. Dirocco et al [9] showed mixed results in that, although the response was higher from females, there was a higher percentage response from African American individuals than from white American individuals (48.5% vs 38.6%) and from those aged 18-49 years (45.2%). Torok et al [29] and Duffy et al [27] showed that response rates from elderly patients were adequate; however, among these elderly patients, there was an overrepresentation of females [27,29], white individuals [27,29], and educated patients [27] aged ≥60 years. Kasbauer et al [25] showed that elderly patients responded to the survey but with the help of volunteers. Multimedia Appendix 3 summarizes the findings from the included studies.

Patient Views of Real-Time Feedback

Patients’ views of RTF were collected in 6 studies [9,10,22,24,27,30]. The main pros of RTF were ease of completion [10,22], ease of use [9,27], and willingness to use the data collection tool [24,27]. Isenberg et al [30] showed that patients were motivated to use RTF; however, this was incentivized with the reward of free long-distance minutes for patients and a practice improvement program for the staff. Patel et al [24] identified that younger patients (aged <50 years) found digital platforms more accessible than did older patients (aged ≥60 years). However, Duffy et al [27], who further evaluated the opinions of older adults, found a preference toward digital platforms; for example, 70% of patients preferred to answer questions using an iPad. Four studies found that patients thought RTF completion was quick, with a fast turnaround time [9,10,22,27]. The main cons of RTF were lack of awareness of the opportunity to leave feedback [22,24], lack of time [22], concerns over technology [22,24], concerns over anonymity [22,24], and age- or disease-related exclusion [24,27]. Interestingly, Wright et al [22] showed that those who did not use RTF were still positive about the idea of providing RTF.

Staff Views of Real-Time Feedback

Staff views were collected in 4 studies [9,21,25,28]. Positive staff views toward RTF included its immediacy compared with traditional surveys, which helped offset “feedback fatigue,” and the way it complemented other forms of feedback, with the potential of integration with other data sources [21]. Staff felt the RTF data were useful when summarized, highlighting areas of improvement at a glance on a dashboard [25], and when there was coworking with senior clinical staff [25], which increased staff morale and awareness of RTF. In some studies, free text was found to be more useful than quantitative questions [21,25] as it brought the experiences to life for frontline staff and added a “sense of urgency” to address them in improvement efforts [25]. Concerns included duplication with other forms of feedback, lack of time for patients to reflect on their experience, extreme views, exclusion of certain patient groups, staff not feeling included in decision making [21], initial reluctance [28], limited time to review results, and lack of access to the results [25].

Evaluating Digital Maturity of Real-Time Feedback Systems

Three studies each received a score of 3 [10,25,30], three a score of 2 [9,21,22], and three a score of 1 [27,28,31], while 4 studies [23,24,26,29] were attributed 0 points. While 7 studies demonstrated capacity or resource, 8 demonstrated impact. None of the studies demonstrated a digital system that was deemed fully mature, that is, none achieved a full score across all 4 domains. We describe in detail how the digital system in each study demonstrated evidence that determined whether a point was given in each of the 4 domains. Following independent scoring by MK and KF, the Cohen kappa was 0.98, suggesting almost perfect agreement. There was only one domain where the scoring differed (usage) [26], and through discussion, a final score of 0 was assigned. Multimedia Appendix 4 details the individual scoring with a description of each domain.
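For reference, a minimal sketch of how interrater agreement of this kind can be computed from two reviewers’ domain-level yes/no judgments (the 0/1 values below are invented for illustration, and scikit-learn’s cohen_kappa_score is an assumed tool, not one named by the authors):

```python
# Illustrative Cohen kappa for two reviewers' domain-level scores (invented data).
# Each entry is one study-domain judgment: 1 = evidence demonstrated, 0 = not demonstrated.
from sklearn.metrics import cohen_kappa_score

reviewer_mk = [1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1]
reviewer_kf = [1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0]  # one disagreement

print(round(cohen_kappa_score(reviewer_mk, reviewer_kf), 2))  # 0.83 for this toy example
```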


Discussion

Principal Findings

This review highlights that digital modes of administration of RTF are well accepted by patients and staff, with response rates equivalent to, and in some studies better than, those of non-RTF. From a patient’s perspective, it has been reported that the influx of digital technology, alongside demographic shifts such as an increasingly aging population and changing ethnicity [32,33], can lead to patient disengagement and poor uptake of these technologies [34]. On the contrary, we have shown that patients are in fact willing to engage with this technology for experience reporting, thereby disrupting previous nonreal-time, paper-based feedback. However, digital technology may not be preferred by all patients; therefore, health care organizations need to be mindful of this and consider other means of inclusivity when developing digital health care systems. From a staff perspective, although RTF is well received, problems arising from the lack of robust digital infrastructure [25] can thwart improvement efforts. From a health care organizational perspective, most digital systems were unable to demonstrate interoperability and very few demonstrated impact; therefore, they were not deemed digitally mature, compromising their success within the organization.

Digital Maturity of Existing Real-Time Feedback

By nature, maturity frameworks not only identify the components of a successful system but also capture the evolution of a digital system from conception to implementation to impact. Using evidence where possible from the included studies, we highlight how each of the 4 domains contributes to ensuring digital maturity.

Capacity or Resource

The success of digital health is contingent on establishing the necessary capacity and resources to build, use, and support access to high-quality health services and to harvest useful information in the health system. There was a general lack of analytical support in most digital systems to extract valuable information, such as the ability to examine user-specific interactions [10] or adjust survey responses [23]. A King’s Fund report explains that gleaning information from experience data requires the same analytical capability as interpreting clinical data; however, this is often unavailable [35]. The resource capacity requirements stretch beyond health professionals and technology specialists across the continuum of care to include health information managers and information security professionals [36]. Staff time was an important barrier to data collection, and this is in keeping with other studies that reported a lack of time or resources to collect, analyze, or act on data and a need for staff training in data analysis and statistics to facilitate full understanding and use of results [37-39]. To circumvent this, some of the studies used volunteers [25,31] and incentives [30,31] to gather data. This generates concerns such as response bias and competition among clinicians, and there may be an element of the Hawthorne effect, whereby clinicians modify their behavior because they are being observed, which may explain the positive outcomes in those studies. Going forward, a strategic approach should take into consideration building human and institutional capacity, nurturing clinical and community champions, and developing the base of knowledgeable users to drive appropriate adoption of digital systems [36].

Usage

Patient needs and experiences differ greatly across demographic groups, so a flexible or responsive data collection mode is needed to aid patients during the data collection process. While most patients in the included studies were comfortable using the digital system and required little prompting or help once they engaged with the survey, some differences were noted. Concerns exist that older patients are less comfortable with technology [40]; however, there was an overrepresentation in responses from patients aged >65 years in the included studies [22,25,27,29,31]. Moreover, some of these studies were conducted in elderly populations [25,27]. Despite this, the response rate in this patient group was adequate, suggesting that their movement into digital life is evolving. Furthermore, when certain conditions are met, such as providing a supportive, nonhurried environment [41]; bold, plain, and large fonts with fewer graphics; and avoidance of certain colors [42], older adults can be successful users. The use of videos in the software can enhance accuracy and acceptability [10]. Furthermore, trained volunteers can provide a responsive approach to real-time data collection from seldom-heard groups [14], increasing patient engagement and, subsequently, improving response rates. In addition, they can help reduce the data collection burden, which may otherwise fall on clinical staff and may even account for the false positives seen in some surveys [43] due to the presence of staff during survey completion.

If data collection is obtrusive, unrealistic, or inaccurate, it can undermine enthusiasm for assessing and improving practice quality [44,45]. However, the digital systems in this review demonstrated quick turnaround of data collection, collation, and dissemination of results. This was key to the successful implementation of digital systems as it promoted “buy-in” from staff. Furthermore, patients are fatigued by requests for feedback in health care and in daily life; hence, data collection should be quick, focused, and part of routine care to encourage participation from patients and clinicians and to be sustainable in a busy setting [10].

Interoperability

From a health care perspective, interoperability is needed to reform the chaotic and, at times, dysfunctional way information is shared within and among health care services. There was a ubiquitous shortfall in achieving interoperability among the digital systems in this review. Data that are not interoperable cannot be analyzed alongside other data indicative of care quality. This perpetuates the siloed approach to data interpretation and creates a chasm between data and information for improvement. Some of the challenges in achieving interoperability include resistance from some vendors, prohibitively high data exchange fees, a lack of incentives to develop interoperability, and technical variation among systems. Without tackling the interoperability of RTF systems, it is likely that health care organizations will fail to deliver impactful quality improvement. This must now be a major focus for health care services, but it should be done in an organized way that prioritizes interoperability so that patient feedback data can flow seamlessly and generate information that will benefit health care services and patients alike.

Impact

To genuinely demonstrate impact, the digital system should not only be able to generate quality improvement activities but also demonstrate sustainability and cost-effectiveness. Three studies [10,25,30] achieved a score for impact as they demonstrated a change in practice following the implementation of RTF. However, the majority of the digital systems were not able to demonstrate any impact. The duration of studies in this review was short (between 1 and 12 months), and the sustainability of the digital system to continue to deliver quality improvement could not be evaluated. Without considerations of sustainability, digital programs are of limited value.

Limitations

Owing to publication bias, there may be evidence of nonsignificant or negative findings, or findings held locally, that is not published or otherwise publicly available. We ascribed a level of digital maturity to individual systems based on the evidence provided in the studies; if this was not discernible while reviewing the studies, we assumed it to be lacking. As the remit of the included papers differed greatly, their digital systems may, in fact, have existing procedures for ensuring capacity, usage, interoperability, and impact, but the authors did not specify this information in the published paper. Of note, the studies were not evaluated for research quality before being included in the synthesis owing to the limited number of studies identified after exclusion.

Conclusions

There was not a large body of published literature on digital modes of RTF collection. However, the evidence provided by the studies in this review demonstrates the potential of digital modes of administration of RTF as an agent for improving service delivery. Patients and staff alike are willing to engage in RTF, as demonstrated by acceptable response rates. However, for RTF to be impactful, health care organizations must ensure that they have strategies in place to deliver across all levels of digital maturity. Health care services have the capacity to introduce digital solutions for RTF; however, the lack of interoperability is slowing progress. In addition, some health care services may be wasting effort and resources when they invest in digital technologies for RTF. On balance, the direction of the health care ecosystem toward embracing digital technology looks promising, and as health care shifts toward a patient-centered model, digital technology will be an important partner in this transformation.

Acknowledgments

The research was supported by the National Institute for Health Research (NIHR) Imperial Patient Safety Translational Research Centre, NIHR Imperial Biomedical Research Centre, and The Health Foundation. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR, or the Department of Health.

Authors' Contributions

MK, KF, and EM contributed to the design of the study. MK and KF managed the review process. MK led on screening, data extraction, and scoring. KF assisted in data synthesis and scoring. MK drafted the manuscript, and KF, AD, and EM reviewed the manuscript.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Subheadings (MeSH) and keywords.

PDF File (Adobe PDF File), 35KB

Multimedia Appendix 2

Percentage response rates for RTF completion, including the absolute number of patients in each study.

PDF File (Adobe PDF File), 16KB

Multimedia Appendix 3

Response rate in percentage and representation of responses according to patients’ demographics documented from the studies in the systematic review.

PDF File (Adobe PDF File), 50KB

Multimedia Appendix 4

Individual scores in relation to the digital maturity framework domains. A score of 1 indicates that the study demonstrates evidence in that particular domain; otherwise, a score of 0 is ascribed.

PDF File (Adobe PDF File), 50KB

  1. Darzi A. High quality care for all: NHS Next Stage Review final report. Department of Health. 2008.   URL: https://www.gov.uk/government/publications/high-quality-care-for-all-nhs-next-stage-review-final-report [accessed 2018-11-15] [WebCite Cache]
  2. Department of Health and Social Care. Gov.UK. 2010 Jul 12. Liberating the NHS white paper   URL: https://www.gov.uk/government/publications/liberating-the-nhs-white-paper [accessed 2018-11-15] [WebCite Cache]
  3. Foot C, Fitzsimons B. The policy and practice of measuring patient experience. Nurs Manag (Harrow) 2011 Jun;18(3):18-19. [CrossRef] [Medline]
  4. Niederhauser V, Wolf J. Patient Experience: A Call to Action for Nurse Leadership. Nurs Adm Q 2018;42(3):211-216. [CrossRef] [Medline]
  5. NHS National Quality Board. Department of Health. 2011. NHS Patient Experience Framework   URL: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/215159/dh_132788.pdf [accessed 2018-11-15] [WebCite Cache]
  6. Coulter A. The King's Fund. 2012. Leadership for patient engagement   URL: https://www.kingsfund.org.uk/sites/default/files/leadership-patient-engagement-angela-coulter-leadership-review2012-paper.pdf [accessed 2018-11-15] [WebCite Cache]
  7. Northumbria Healthcare NHS Foundation Trust. 2017 Apr 13. Northumbria supports patient experience feedback across the NHS   URL: https://www.northumbria.nhs.uk/northumbria-supports-patient-experience-feedback-across-nhs/ [accessed 2017-12-01]
  8. Garrisson M, Wolf J. The Beryl Institute. 2016 Apr. The Role of the Volunteer in Improving Patient Experience   URL: http://www.theberylinstitute.org/news/284517/The-Role-of-the-Volunteer-in-improving-Patient-Experience-Explored-by-The-Beryl-Institute-.htm [accessed 2017-11-12]
  9. Dirocco DN, Day SC. Obtaining patient feedback at point of service using electronic kiosks. Am J Manag Care 2011 Jul 01;17(7):e270-e276 [FREE Full text] [Medline]
  10. Wofford JL, Campos CL, Jones RE, Stevens SF. Real-time patient survey data during routine clinical activities for rapid-cycle quality improvement. JMIR Med Inform 2015 Mar 12;3(1):e13 [FREE Full text] [CrossRef] [Medline]
  11. Larsen D, Peters H, Keast J, Devon R. Using real time patient feedback to introduce safety changes. Nurs Manag (Harrow) 2011 Oct;18(6):27-31. [CrossRef] [Medline]
  12. Greenlaw C, Brown-Welty S. A comparison of web-based and paper-based survey methods: testing assumptions of survey mode and response cost. Eval Rev 2009 Oct;33(5):464-480. [CrossRef] [Medline]
  13. Lane SJ, Heddle NM, Arnold E, Walker I. A review of randomized controlled trials comparing the effectiveness of hand held computers with paper methods for data collection. BMC Med Inform Decis Mak 2006 May 31;6:23 [FREE Full text] [CrossRef] [Medline]
  14. Shih T, Fan X. Comparing response rates in e-mail and paper surveys: A meta-analysis. Educational Research Review 2009 Jan;4(1):26-40. [CrossRef]
  15. Kelsey T. NHS England. 2014 Jul. The Friends and Family Test   URL: https://www.england.nhs.uk/wp-content/uploads/2014/07/fft-imp-guid-14.pdf [accessed 2018-11-15] [WebCite Cache]
  16. Marcano Belisario JS, Jamsek J, Huckvale K, O'Donoghue J, Morrison CP, Car J. Comparison of self-administered survey questionnaire responses collected using mobile apps versus other methods. Cochrane Database Syst Rev 2015 Jul 27(7):MR000042. [CrossRef] [Medline]
  17. Flott K, Callahan R, Darzi A, Mayer E. A Patient-Centered Framework for Evaluating Digital Maturity of Health Services: A Systematic Review. J Med Internet Res 2016 Apr 14;18(4):e75 [FREE Full text] [CrossRef] [Medline]
  18. NHS England Data Catalogue. Digital Maturity Assessment 2015/2016   URL: https://data.england.nhs.uk/dataset/digital-maturity-assessment-2015-2016/resource/06f03a99-6451-4c15-a795-99aafc840cf0 [accessed 2018-11-15] [WebCite Cache]
  19. Macnaughtan L. Innovatemedtec. 2015 May 25. The Curious Case of Digital Health   URL: https://innovatemedtec.com/content/the-curious-case-of-digital-health [accessed 2017-11-13]
  20. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009 Jul 21;6(7):e1000097 [FREE Full text] [CrossRef] [Medline]
  21. Carter M, Davey A, Wright C, Elmore N, Newbould J, Roland M, et al. Capturing patient experience: a qualitative study of implementing real-time feedback in primary care. Br J Gen Pract 2016 Nov;66(652):e786-e793 [FREE Full text] [CrossRef] [Medline]
  22. Wright C, Davey A, Elmore N, Carter M, Mounce L, Wilson E, et al. Patients' use and views of real-time feedback technology in general practice. Health Expect 2017 Dec;20(3):419-433 [FREE Full text] [CrossRef] [Medline]
  23. Slater M, Kiran T. Measuring the patient experience in primary care: Comparing e-mail and waiting room survey delivery in a family health team. Can Fam Physician 2016 Dec;62(12):e740-e748 [FREE Full text] [Medline]
  24. Patel S, Cain R, Neailey K, Hooberman L. Exploring Patients' Views Toward Giving Web-Based Feedback and Ratings to General Practitioners in England: A Qualitative Descriptive Study. J Med Internet Res 2016 Dec 05;18(8):e217 [FREE Full text] [CrossRef] [Medline]
  25. Käsbauer S, Cooper R, Kelly L, King J. Barriers and facilitators of a near real-time feedback approach for measuring patient experiences of hospital care. Health Policy Technol 2017 Mar;6(1):51-58 [FREE Full text] [CrossRef] [Medline]
  26. Indovina K, Keniston A, Reid M, Sachs K, Zheng C, Tong A, et al. Real-time patient experience surveys of hospitalized medical patients. J Hosp Med 2016 Apr;11(4):251-256. [CrossRef] [Medline]
  27. Duffy JR, Kooken WC, Wolverton CL, Weaver MT. Evaluating patient-centered care: feasibility of electronic data collection in hospitalized older adults. J Nurs Care Qual 2012;27(4):307-315 [FREE Full text] [CrossRef] [Medline]
  28. Aladangady N, Negus J. Patient Experience Tracker (PET) survey as measure of quality in the Neonatal Unit. Clinical Risk 2011 May 24;17(3):88-91. [CrossRef]
  29. Torok H, Ghazarian SR, Kotwal S, Landis R, Wright S, Howell E. Development and validation of the tool to assess inpatient satisfaction with care from hospitalists. J Hosp Med 2014 Sep;9(9):553-558. [CrossRef] [Medline]
  30. Isenberg SF, Davis CL, Adams CE, Isenberg JS, Isenberg MA. Incentivized digital outcomes collection. Am J Med Qual 2001;16(6):202-211. [CrossRef] [Medline]
  31. Banka G, Edgington S, Kyulo N, Padilla T, Mosley V, Afsarmanesh N, et al. Improving patient satisfaction through physician education, feedback, and incentives. J Hosp Med 2015 Aug;10(8):497-502. [CrossRef] [Medline]
  32. Goldzweig C, Orshansky G, Paige NM, Towfigh AA, Haggstrom DA, Miake-Lye I, et al. Electronic patient portals: evidence on health outcomes, satisfaction, efficiency, and attitudes: a systematic review. Ann Intern Med 2013 Nov 19;159(10):677-687. [CrossRef] [Medline]
  33. Irizarry T, DeVito Dabbs A, Curran CR. Patient Portals and Patient Engagement: A State of the Science Review. J Med Internet Res 2015 Jun 23;17(6):e148 [FREE Full text] [CrossRef] [Medline]
  34. Krane D, Vella A, Bechtel C, Lederer L, Padget J, Latham B, et al. National Partnership for Women & Families. 2014 Dec. Engaging Patients and Families: How Consumers Value and Use Health IT   URL: http://www.nationalpartnership.org/research-library/health-care/HIT/engaging-patients-and-families.pdf [accessed 2018-11-15] [WebCite Cache]
  35. Robert GC, Cornwell J, Brearley S, Foot C, Goodrich J, Joule N, et al. Developing the evidence base for measuring and improving patient experience. London: NHS Institute for Innovation and Improvement; 2011 Nov. p. 1-200 [FREE Full text]
  36. Ho K, Al-Shorjabji N, Brown E, Zelmer J, Gabor N, Maeder A, et al. Applying the Resilient Health System Framework for Universal Health Coverage. Stud Health Technol Inform 2016;231:54-62. [Medline]
  37. Barr JK, Giannotti TE, Sofaer S, Duquette CE, Waters WJ, Petrillo MK. Using public reports of patient satisfaction for hospital quality improvement. Health Serv Res 2006 Jun;41(3 Pt 1):663-682 [FREE Full text] [CrossRef] [Medline]
  38. Reeves R, Seccombe I. Do patient surveys work? The influence of a national survey programme on local quality-improvement initiatives. Qual Saf Health Care 2008 Dec;17(6):437-441 [FREE Full text] [CrossRef] [Medline]
  39. Davies E, Shaller D, Edgman-Levitan S, Safran DG, Oftedahl G, Sakowski J, et al. Evaluating the use of a modified CAHPS survey to support improvements in patient-centred care: lessons from a quality improvement collaborative. Health Expect 2008 Jun;11(2):160-176 [FREE Full text] [CrossRef] [Medline]
  40. Huntington P, Williams P, Nicholas D. Age and gender differences of a touch-screen kiosk: a study of kiosk transaction log files. Inform Prim Care 2002 Mar;10(1):3-9 [FREE Full text]
  41. Murata A, Iwase H. Usability of touch-panel interfaces for older adults. Hum Factors 2005;47(4):767-776. [CrossRef] [Medline]
  42. Czaja SJ, Charness N, Fisk AD, Hertzog C, Nair SN, Rogers WA, et al. Factors predicting the use of technology: findings from the Center for Research and Education on Aging and Technology Enhancement (CREATE). Psychol Aging 2006 Jun;21(2):333-352 [FREE Full text] [CrossRef] [Medline]
  43. Insight Team, NHS England. NHS England. 2014 Jul 21. Review of the Friends and Family Test   URL: https://www.england.nhs.uk/wp-content/uploads/2014/07/fft-rev1.pdf [accessed 2018-11-15] [WebCite Cache]
  44. Baxley EG, Bennett KJ, Pumkan C, Crutcher S, Helms MG. “PDSA-ADHD”: a newly reported syndrome. J Am Board Fam Med 2011;24(6):752-757.
  45. Price EL, Mackenzie TD, Metlay JP, Camargo CA, Gonzales R. A computerized education module improves patient knowledge and attitudes about appropriate antibiotic use for acute respiratory tract infections. Patient Educ Couns 2011 Dec;85(3):493-498. [CrossRef] [Medline]


NHS: National Health Service
NIHR: National Institute for Health Research
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
RTF: real-time feedback


Edited by G Eysenbach; submitted 28.09.17; peer-reviewed by H Brown, L Kayser, L Wintner, N Ribeiro, K Agarwal, D Nault; comments to author 09.12.17; revised version received 24.04.18; accepted 24.09.18; published 14.01.19

Copyright

©Mustafa Khanbhai, Kelsey Flott, Ara Darzi, Erik Mayer. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 14.01.2019.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.