Viewpoint
Abstract
Digital health interventions have gained prominence in recent years, offering innovative solutions to improve health care delivery and patient outcomes. Researchers are increasingly using qualitative approaches to explore patient experiences of using digital health interventions. Yet, the qualitative methods used in these studies can vary widely, and some methods are frequently misapplied. We highlight the methods we find most fit for purpose to explore user experiences of digital tools and propose 5 questions to help researchers select a qualitative method that best suits their research aims.
J Med Internet Res 2024;26:e62761. doi: 10.2196/62761
Introduction
Digital health interventions encompass a wide range of technologies, such as mobile apps, websites, wearable devices, and telemedicine platforms. Their use in behavioral change and mental health interventions is common; such interventions are increasingly used both as an adjunct to face-to-face care and as standalone interventions [ - ]. Although digital health interventions have ostensible benefits in terms of scalability and potential to improve equity of access, the high dropout rates of digital health interventions, which can reach up to 80% [ , ], raise questions of acceptability and usability. Understanding participants’ experiences in these interventions is therefore crucial for developing effective digital tools, improving and tailoring existing digital tools, and optimizing health outcomes.
Qualitative research exploring user experience of digital health tools has surged alongside the exponential increase in digital health interventions. Researchers frequently use qualitative approaches to explore engagement, usability, and uptake of digital health interventions. In this type of “applied research,” researchers often start with predefined research questions, such as how well an intervention works and under what circumstances [
]. Yet, despite these focused research questions, the qualitative methods (ie, the qualitative approaches used) and methodology (ie, the theoretical rationale and perspective that guide the research) used in these studies can vary widely and be misapplied. For example, a systematic review of 16 studies that used qualitative methods to assess user experiences of digital interventions for pediatric patients [ ] found an eclectic range of data analysis methods used, including thematic analysis (37%), content analysis (31%), hermeneutic research analysis (6%), and the generically described “deductive” analysis (6%). Nearly 20% of the articles did not describe their qualitative analysis methodology (ie, their theoretically informed approach). These vague, incomplete, or absent descriptions of qualitative methodology are common across the health intervention literature [ ].
As health psychology researchers who have conducted a range of studies exploring patient experiences using digital health interventions [
, ], we understand the challenge of selecting a qualitative approach for this type of research, as there are no existing guidelines to inform the selection of the right qualitative method (ie, the approach that is used) and/or methodology (ie, the overarching philosophical framework or lens). Although several guidelines exist for improving the reporting of qualitative research [ - ], these generally do not offer guidance on selecting appropriate qualitative approaches. Furthermore, qualitative approaches are more flexible and interpretative than quantitative methods, which can add to selection confusion. Therefore, in this article, we summarize the most common qualitative approaches used in digital health research when exploring user engagement and make recommendations based on our experiences. We also provide 5 questions and a decision tree that digital health intervention researchers can use to select a qualitative approach that best meets their research goals.
First Things First: Grounded in Theory, or Not?
As qualitative research has grown in popularity in behavioral research, so too has confusion over the range of approaches available and the theoretical or methodological frameworks involved. For example, frequent misapplication of thematic analysis, one of the most frequently used qualitative methods, has led Braun and Clarke [
], authors of the seminal paper on the method, to publish 4 recent papers clarifying its methodology and “flexible” theoretical approach [ - ].
Starting with theory and its underlying epistemology (ie, underlying philosophy) can help you choose the qualitative approach that best fits your research question. Qualitative researchers exploring participant experiences in health-related research often take either a constructionist or realist approach. A constructionist approach, also referred to as interpretivist, critical, or “artfully interpretive” [
], assumes that reality is socially constructed and shaped through interactions, language, and meaning-making processes. It emphasizes that individuals and groups actively create their own realities through their interpretations and interactions with the world. Social constructionist qualitative approaches in health- and mental health–related research often analyze patterns in language, discourse, and narratives to explore the underlying meaning and the social constructs of reality [ ].
In contrast, a realist approach, or a “scientifically descriptive” [
] approach, posits that there is a degree of objectivity that exists independently of human perception. It suggests that social phenomena have inherent structures and properties that exist regardless of human interpretation. Realist approaches are often used in mixed methods research and tend to analyze qualitative data with a focus on identifying a “consistency of meaning,” often through triangulation and the use of several co-coders [ ].
Researchers also need to consider whether they are taking a deductive (top-down), inductive (data-driven), or abductive (combination of both) approach in their analysis and whether they will focus on semantic (explicit) or latent (implicit) meanings in the data. It is increasingly acknowledged that the coding and analysis process is rarely completely inductive or deductive [
] and is usually a combination of both [ , ]. For example, even if you choose a mainly deductive approach, coding with a framework in mind or approaching the data with predetermined deductive codes, unexpected and largely inductive (data-driven) codes may still become apparent during the qualitative coding process. This “abductive” approach also allows for a more nuanced and contextual story to be told. If you are coding participants’ responses using a digital usability framework, for instance, you can still code inductively when participants raise important concepts, experiences, and thoughts that relate to the overall research question and aims.
Untangling Qualitative Methodologies
Overview
Some qualitative approaches are inherently linked to their broader methodological frameworks (eg, narrative analysis, interpretative phenomenological analysis, and grounded theory), while others are more flexible in terms of the methodological framework they belong to (eg, thematic analysis [
] and content analysis [ ]). Thematic analysis, qualitative content analysis, grounded theory, and interpretative phenomenological analysis are some of the many forms of pattern-seeking qualitative approaches. In our opinion, grounded theory and interpretative phenomenological analysis, which are highly interpretive and explorative, are less likely to fit the needs of most digital health research exploring participant usability of and engagement with digital tools, so we do not discuss them here. (Good guidance on these methods can be found via Strauss and Corbin [ ] and McLeod [ ].) Instead, we have highlighted the 2 approaches most commonly used in digital health research exploring participant experience and user engagement.
Qualitative Content Analysis
Qualitative content analysis, including conventional, directed, and summative approaches [
], uses systematic methods to analyze patterns in text and explore meaning. Sometimes considered an “intermediary” approach between qualitative and quantitative methods [ ], some types of content analysis, such as the summative approach, borrow methods from quantitative research, such as counting the frequency of words or phrases [ ]. Researchers using content analysis may take a deductive, inductive, or abductive (combined) approach and often use it deductively given its utility for exploring predefined categories and/or incorporating an existing behavioral or theoretical framework [ , ]. Conventional content analysis can also be used more interpretively and shares some similarities with certain types of thematic analysis (such as the framework method, discussed below) [ ].
We find qualitative content analysis particularly useful in digital health intervention evaluations that build on existing frameworks or previous research (as most of our work does). We have also found directed content analysis to be an efficient and effective method to analyze qualitative data within mixed methods randomized controlled trials of digital health interventions (such as in Brenton-Peters et al [
, ] and Serlachius et al [ ]).
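To make the counting element of summative and directed content analysis more concrete, the brief sketch below (in Python) tallies how often predefined phrases appear in open-ended survey responses. The responses and phrase list are invented for illustration only; in practice, the categories would be derived from the research question, a framework, or the data themselves, and counting is only one step alongside the interpretive work described above.
```python
# A minimal, illustrative sketch of the counting step used in summative
# content analysis. The responses and phrase list below are hypothetical;
# in a real study, the phrases would come from the research question,
# a framework, or the data themselves.
from collections import Counter

responses = [
    "The reminders were helpful but the app crashed twice.",
    "Logging my meals every day felt like too much effort.",
    "The reminders kept me on track, but I wish the app loaded faster.",
]

phrases = ["reminder", "crash", "effort", "helpful"]

counts = Counter()
for response in responses:
    text = response.lower()
    for phrase in phrases:
        counts[phrase] += text.count(phrase)

# Report the phrases from most to least frequent.
for phrase, n in counts.most_common():
    print(f"{phrase}: {n}")
```
Counts such as these can flag which aspects of an intervention participants mention most often, but they do not replace reading and coding the responses in context.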
Thematic Analysis
Similar in some aspects to content analysis, thematic analysis is a “family” of related methods [
] that involve labeling (“coding”) data and then organizing them into themes (patterns of meaning). Reflexive thematic analysis, which directly addresses the researcher’s role and process in the analysis, is Braun and Clarke’s [ ] updated approach and seeks in part to differentiate itself from older styles of thematic analysis, such as codebook and framework analysis, which take a more structured, and often combined inductive/deductive, approach.
Due in part to Braun and Clarke’s [
] detailed, step-by-step description of thematic analysis published in 2006, this approach has become widely used. They have written at length on how this approach is frequently misunderstood and misapplied, as demonstrated by a recent study in which they reviewed 20 health-related studies that used thematic analysis and found that the most common problem was in the creation of themes [ ]. In reflexive thematic analysis, a theme should highlight a broader meaning across the dataset, telling an interpretive story. However, many studies confuse themes with “topic summaries,” such as “helpfulness of the intervention” or “ease of use.” Braun and Clarke [ ] note that if the theme could have been created before conducting the data analysis (ie, it could be mapped directly to an interview question), then it is a topic summary.
We feel that if your research question and aims align best with topic summaries and deductive analysis, qualitative content analysis or a codebook or framework approach to thematic analysis may be a better fit than reflexive thematic analysis or other more interpretative approaches.
Selecting a Qualitative Approach: 5 Questions to Consider
To use qualitative approaches more effectively in digital health user engagement research, we believe there needs to be more coherence and consistency in choosing qualitative methods. Therefore, we suggest researchers use the following 5 questions to help select the qualitative approach most appropriate for their research aims (the figure shows a decision tree).
Question 1: What Is Your Theoretical/Methodological Position: Are You a Realist or an Interpreter?
Most digital health intervention evaluation studies have pragmatic goals grounded in a realist (ie, “scientifically oriented”) view, building on previous research findings, for example, as part of mixed methods research. If your study fits within this orientation, strongly consider using a method like content analysis or thematic framework analysis, which can flexibly accommodate existing frameworks and build on previous quantitative findings.
While more inductively aligned forms of thematic analysis and other more interpretative approaches, such as narrative analysis or grounded theory, can be excellent choices at the development stage of a digital health intervention, when there is more emphasis on exploring in-depth patient experiences, they are frequently less aligned with the aims of most digital health intervention evaluation research, which typically explores questions regarding user experience and user engagement through a realist epistemology. However, there are exceptions; for example, a study by Knox et al [
] explored stakeholder views on virtual pulmonary rehabilitation using a critical epistemology, and a paper by Bleyel et al [ ] explored patient perceptions of mental health care through video consultations from a critical realist perspective [ ].
These examples demonstrate the importance of articulating your research aims, methods, and methodological perspective to justify your chosen approach, which is not always easily achievable within the tight word-count limits of medical journals.
Also carefully consider whether you want to take a deductive, inductive, or abductive approach in your analysis. While you can use any of these approaches across the different types of content and thematic analysis methods [
, ], you should select an analysis method that works within your research aims and state this orientation in your reported methodology.
Question 2: Are You Using an Existing Health or Behavioral Change Framework?
Many digital health interventions are built on existing intervention or behavior change frameworks, such as the capability, opportunity, motivation-behavior (COM-B) model [
] and the reach, effectiveness, adoption, implementation, and maintenance (RE-AIM) framework [ ]. If you want to use an existing framework in your qualitative analysis, as many digital intervention studies do, choose an approach designed for this, such as directed content analysis or thematic framework analysis. Do not choose a highly interpretive approach, such as reflexive thematic analysis or interpretative phenomenological analysis, if you intend to map your qualitative findings onto an existing framework. Szinay and colleagues [ ], for example, used the framework method to explore engagement with well-being apps, informed by the COM-B model. For further details regarding the framework method, refer to Gale and colleagues [ ], who provide a comprehensive outline of the 7 stages of analysis using this method in health research.
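As a rough illustration of how an existing framework can structure a deductive analysis, the sketch below groups coded excerpts under COM-B components using a small codebook. The component labels follow COM-B, but the codes, excerpts, and participant IDs are hypothetical; in a directed content analysis or framework analysis, codes are defined from the framework and applied by the research team rather than generated automatically.
```python
# A minimal, illustrative sketch of charting deductively coded excerpts
# against an existing framework (here, COM-B components). The codebook,
# excerpts, and participant IDs are hypothetical.
from collections import defaultdict

# Codebook: each predefined code is mapped to a COM-B component.
codebook = {
    "knew how to log symptoms": "Capability",
    "app fit into daily routine": "Opportunity",
    "notifications prompted use": "Opportunity",
    "felt motivated by progress charts": "Motivation",
}

# Excerpts that have already been labeled with a code: (participant ID, code).
coded_excerpts = [
    ("P01", "knew how to log symptoms"),
    ("P01", "felt motivated by progress charts"),
    ("P02", "app fit into daily routine"),
    ("P03", "notifications prompted use"),
]

# Group the coded excerpts by framework component, as in a framework-style chart.
by_component = defaultdict(list)
for participant, code in coded_excerpts:
    by_component[codebook[code]].append((participant, code))

for component, rows in by_component.items():
    print(component)
    for participant, code in rows:
        print(f"  {participant}: {code}")
```
Charting excerpts against framework categories in this way loosely mirrors the matrix stage of the framework method described by Gale and colleagues, although the interpretive summarizing of each cell remains the researchers’ work.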
Question 3: What Type of Research Question Are You Trying to Answer?
Qualitative research is well suited to answering “how” and “why” questions, such as why individuals did (or did not) use a digital health intervention and how they went about it. However, different research questions may be better suited to one type of qualitative analysis than another.
For example, let us suppose your digital health intervention for weight loss had a high dropout rate, and you want to use qualitative methods to explore the reasons why. The most suitable qualitative approach depends on your underlying rationale and research strategy (including your methodological/theoretical position) and what you plan to do in response to your findings (ie, changes or iterations to the intervention or exploring a different approach altogether). If you want to explore topics such as participants’ beliefs about weight loss and/or perceived barriers to weight loss, narrative analysis or another more inductive approach may be more appropriate. However, if, as is more common in digital health intervention evaluation research, you have specific questions related to participants’ opinions about the design, content, and activities within the intervention itself that caused them to stop using it, then a more focused and deductive or abductive qualitative approach (eg, content analysis) may be more useful.
Question 4: How Are You Collecting Data?
According to a 2019 systematic review, qualitative data collection tools frequently used to evaluate and test digital health interventions and to explore users’ perspectives include “think-aloud” protocols, focus groups, and interviews [
]. We often also use open-ended online survey questions in our mixed methods evaluation studies. It is important to consider the type of data you are collecting alongside your analysis plan.
Stemming from cognitive psychology, the think-aloud method asks participants to share what they are thinking while performing a task [
], which can lead to a focused dataset. Because of this, the think-aloud method is popular in usability research and has been found to successfully generate intervention improvements [ ]. Think-aloud data are most commonly analyzed using a deductive approach and with methods similar to content analysis, such as counting positive or negative sentiment [ ], but these data can also be coded and organized into topics or themes [ ].
Focus groups and interviews can generate large datasets that may cover a broad range of topics and experiences. While exploratory questions may be useful in digital health intervention development research, a more focused approach is usually necessary to meet the goals of evaluation research or to explore whether a digital tool met participants’ needs. Therefore, it is critical to develop an interview guide that considers your research questions, theoretical framework, and desired result type (ie, themes or frequencies). Otherwise, you may end up with a wide-ranging and meandering dataset that does not answer your research questions and leads to what Braun and Clarke [
] call a “mish-mash” of ineffective qualitative analysis.
Question 5: How Much Time Do You Have?
Digital health interventions often require rapid evolution to stay relevant, and researchers may have limited time, resources, and funding [
]. Therefore, it is important to consider how quickly you need to generate your results, as well as the resources needed to complete the analysis.
In terms of the approaches discussed here, reflexive thematic analysis, interpretative phenomenological analysis, grounded theory, and narrative analysis often require significant time to conduct, with multiple rounds of detailed, in-depth coding and thematic review, as well as ongoing reflection on how the researchers’ reflexivity influences the process and outcomes of the analysis. Content analysis or framework thematic analysis can be done in a more efficient and timely manner, with codebooks often developed in advance of analysis and the frequent use of co-coders with a focus on accuracy.
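Where codebooks are developed in advance and co-coders are used, teams often check coder consistency numerically before resolving disagreements. The sketch below computes simple percent agreement and Cohen’s kappa for two coders who applied the same codebook to the same excerpts; the code labels are invented, and kappa is only one common option rather than a requirement of content or framework analysis.
```python
# A minimal, illustrative check of agreement between two co-coders who applied
# the same codebook to the same excerpts. The code labels are invented, and
# Cohen's kappa is only one common agreement statistic.
from collections import Counter

coder_a = ["usability", "content", "usability", "engagement", "content", "usability"]
coder_b = ["usability", "content", "engagement", "engagement", "content", "usability"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n  # percent agreement

# Expected chance agreement, from each coder's marginal code frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
categories = set(coder_a) | set(coder_b)
expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement: {observed:.2f}")
print(f"Cohen's kappa: {kappa:.2f}")
```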
Increasingly, so-called rapid approaches have been developed and compared against more traditional qualitative approaches due to this need to reduce time and improve efficiency in health care research [
, ]. For example, Holdsworth et al [ ] used a rapid form of framework analysis to summarize evaluation data from intensive care unit site visits while still on site. They found this approach delivered significant savings in analysis and transcription costs.
Conclusion
After more than a decade of conducting qualitative research exploring patient experiences of and user engagement with digital health tools, we too have grappled with choosing appropriate qualitative methods. In this article, we draw on our experience, which has been largely in health psychology and digital health–related research, but acknowledge that similar qualitative design challenges exist in health informatics, health services research, and the broader usability research literature [
, ]. Ultimately, we believe that many of these ongoing interdisciplinary challenges and discussions enhance overall understanding of the benefits of qualitative and mixed methods research and lead to improvements not just in the conducting and reporting of qualitative research but also in the ongoing development and clarification of the methods themselves. This paper did not have the scope to explore questions regarding what constitutes “good” or rigorous qualitative research in digital health, but there are several articles providing useful summaries of key criteria in digital health research, health services research, and digital health assessment [ , ].
Digital health interventions offer innovative solutions to improve health care delivery and patient outcomes. Researchers now frequently use qualitative approaches to explore engagement, usability, and uptake of digital health interventions. However, some of the most popular qualitative approaches are frequently misapplied and may not be the most appropriate for this type of research. Therefore, we suggest researchers consider their research aims, theoretical orientation, and the time and resources available before selecting a qualitative approach to use when exploring participants’ experiences of digital tools.
Conflicts of Interest
None declared.
References
- Li J, Theng Y, Foo S. Game-based digital interventions for depression therapy: a systematic review and meta-analysis. Cyberpsychol Behav Soc Netw. Aug 2014;17(8):519-527. [CrossRef] [Medline]
- McLean G, Band R, Saunderson K, Hanlon P, Murray E, Little P, et al. DIPSS co-investigators. Digital interventions to promote self-management in adults with hypertension systematic review and meta-analysis. J Hypertens. Apr 2016;34(4):600-612. [FREE Full text] [CrossRef] [Medline]
- Philippe TJ, Sikder N, Jackson A, Koblanski ME, Liow E, Pilarinos A, et al. Digital health interventions for delivery of mental health care: systematic and comprehensive meta-review. JMIR Ment Health. May 12, 2022;9(5):e35159. [FREE Full text] [CrossRef] [Medline]
- Rose T, Barker M, Maria Jacob C, Morrison L, Lawrence W, Strömmer S, et al. A systematic review of digital interventions for improving the diet and physical activity behaviors of adolescents. J Adolesc Health. Dec 2017;61(6):669-677. [FREE Full text] [CrossRef] [Medline]
- Melville KM, Casey LM, Kavanagh DJ. Dropout from Internet-based treatment for psychological disorders. Br J Clin Psychol. Nov 2010;49(Pt 4):455-471. [CrossRef] [Medline]
- Pedersen DH, Mansourvar M, Sortsø C, Schmidt T. Predicting dropouts from an electronic health platform for lifestyle interventions: analysis of methods and predictors. J Med Internet Res. Sep 04, 2019;21(9):e13617. [FREE Full text] [CrossRef] [Medline]
- Murray E, Hekler EB, Andersson G, Collins LM, Doherty A, Hollis C, et al. Evaluating digital health interventions: key questions and approaches. Am J Prev Med. Nov 2016;51(5):843-851. [FREE Full text] [CrossRef] [Medline]
- Cheng L, Liu F, Mao X, Peng W, Wang Y, Huang H, et al. The pediatric cancer survivors' user experiences with digital health interventions: a systematic review of qualitative data. Cancer Nurs. 2022;45(1):E68-E82. [CrossRef] [Medline]
- O'Cathain A, Hoddinott P, Lewin S, Thomas KJ, Young B, Adamson J, et al. Maximising the impact of qualitative research in feasibility studies for randomised controlled trials: guidance for researchers. Pilot Feasibility Stud. 2015;1(1):32. [FREE Full text] [CrossRef] [Medline]
- Garner K, Thabrew H, Lim D, Hofman P, Jefferies C, Serlachius A. Exploring the Usability and acceptability of a well-being app for adolescents living with type 1 diabetes: qualitative study. JMIR Pediatr Parent. Dec 22, 2023;6:e52364. [FREE Full text] [CrossRef] [Medline]
- Wallace-Boyd K, Boggiss AL, Ellett S, Booth R, Slykerman R, Serlachius AS. ACT2COPE: A pilot randomised trial of a brief online acceptance and commitment therapy intervention for people living with chronic health conditions during the COVID-19 pandemic. Cogent Psychology. May 11, 2023;10(1):2208916. [CrossRef]
- Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. Dec 2007;19(6):349-357. [CrossRef] [Medline]
- Ancker J, Benda N, Reddy M, Unertl K, Veinot T. Guidance for publishing qualitative research in informatics. J Am Med Inform Assoc. Nov 25, 2021;28(12):2743-2748. [FREE Full text] [CrossRef] [Medline]
- Mann DM, Chokshi SK, Kushniruk A. Bridging the gap between academic research and pragmatic needs in usability: a hybrid approach to usability evaluation of health care information systems. JMIR Hum Factors. Nov 28, 2018;5(4):e10721. [FREE Full text] [CrossRef] [Medline]
- Murphy E, Dingwall R, Greatbatch D, Parker S, Watson P. Qualitative research methods in health technology assessment: a review of the literature. Health Technol Assess. 1998;2(16):iii-ix, 1. [FREE Full text] [Medline]
- Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. Jan 2006;3(2):77-101. [CrossRef]
- Braun V, Clarke V, Hayfield N, Terry G. Answers to frequently asked questions about thematic analysis. University of Auckland. URL: https://cdn.auckland.ac.nz/assets/psych/about/our-research/documents/Answers%20to%20frequently%20asked%20questions%20about%20thematic%20analysis%20April%202019.pdf [accessed 2024-11-21]
- Braun V, Clarke V. Reflecting on reflexive thematic analysis. Qual Res Sport Exerc Health. Jun 13, 2019;11(4):589-597. [CrossRef]
- Braun V, Clarke V. Can I use TA? Should I use TA? Should I not use TA? Comparing reflexive thematic analysis and other pattern‐based qualitative analytic approaches. Couns Psychother Res. Oct 18, 2020;21(1):37-47. [CrossRef]
- Braun V, Clarke V. Toward good practice in thematic analysis: avoiding common problems and be(com)ing a researcher. Int J Transgend Health. 2023;24(1):1-6. [FREE Full text] [CrossRef] [Medline]
- Finlay L. Thematic analysis: the ‘good’, the ‘bad’ and the ‘ugly’. Eur J Qual Res Psychother. 2021;11:103-116. [FREE Full text] [CrossRef]
- Harper D, Thompson AR, editors. Qualitative Research Methods in Mental Health and Psychotherapy: A Guide for Students and Practitioners. 1st ed. New York, NY. Wiley; 2011.
- Madill A, Jordan A, Shirley C. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies. Br J Psychol. Feb 2000;91(Pt 1):1-20. [CrossRef] [Medline]
- Bingham AJ. From data management to actionable findings: a five-phase process of qualitative data analysis. Int J Qual Methods. Aug 10, 2023;22:3. [CrossRef]
- Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods. Mar 01, 2006;5(1):80-92. [CrossRef]
- Proudfoot K. Inductive/deductive hybrid thematic analysis in mixed methods research. J Mix Methods Res. Sep 20, 2022;17(3):308-326. [CrossRef]
- Graneheim UH, Lindgren B, Lundman B. Methodological challenges in qualitative content analysis: a discussion paper. Nurse Educ Today. Sep 2017;56:29-34. [CrossRef] [Medline]
- Strauss A, Corbin JC. Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Thousand Oaks, CA. Sage Publications; 1990.
- McLeod J. Qualitative Research in Counselling and Psychotherapy. Thousand Oaks, CA. Sage Publications; 2012.
- Hsieh H, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. Nov 2005;15(9):1277-1288. [CrossRef] [Medline]
- Mayring PAE. Qualitative content analysis. In: International Encyclopedia of Education (Fourth Edition). Amsterdam, Netherlands. Elsevier; 2023:314-322.
- Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. Sep 18, 2013;13:117. [FREE Full text] [CrossRef] [Medline]
- Brenton-Peters JM, Consedine NS, Cavadino A, Roy R, Ginsberg KH, Serlachius A. Finding kindness: a randomized controlled trial of an online self-compassion intervention for weight management (SC4WM). Br J Health Psychol. Feb 2024;29(1):37-58. [CrossRef] [Medline]
- Serlachius A, Schache K, Kieser A, Arroll B, Petrie K, Dalbeth N. Association between user engagement of a mobile health app for gout and improvements in self-care behaviors: randomized controlled trial. JMIR Mhealth Uhealth. Aug 13, 2019;7(8):e15021. [FREE Full text] [CrossRef] [Medline]
- Knox L, Gemine R, Dunning M, Lewis K. Reflexive thematic analysis exploring stakeholder experiences of virtual pulmonary rehabilitation (VIPAR). BMJ Open Respir Res. Jul 2021;8(1):e000800. [FREE Full text] [CrossRef] [Medline]
- Bleyel C, Hoffmann M, Wensing M, Hartmann M, Friederich H, Haun MW. Patients' perspective on mental health specialist video consultations in primary care: qualitative preimplementation study of anticipated benefits and barriers. J Med Internet Res. Apr 20, 2020;22(4):e17330. [FREE Full text] [CrossRef] [Medline]
- Roseveare C. Thematic Analysis: A Practical Guide, by Virginia Braun and Victoria Clarke. Can J Program Eval. Jun 01, 2023;38(1):143-145. [CrossRef]
- Keyworth C, Epton T, Goldthorpe J, Calam R, Armitage CJ. Acceptability, reliability, and validity of a brief measure of capabilities, opportunities, and motivations ("COM-B"). Br J Health Psychol. Sep 2020;25(3):474-501. [FREE Full text] [CrossRef] [Medline]
- Forman J, Heisler M, Damschroder LJ, Kaselitz E, Kerr EA. Development and application of the RE-AIM QuEST mixed methods framework for program evaluation. Prev Med Rep. Jun 2017;6:322-328. [FREE Full text] [CrossRef] [Medline]
- Szinay D, Perski O, Jones A, Chadborn T, Brown J, Naughton F. Perceptions of factors influencing engagement with health and well-being apps in the United Kingdom: qualitative interview study. JMIR Mhealth Uhealth. Dec 16, 2021;9(12):e29098. [FREE Full text] [CrossRef] [Medline]
- Maramba I, Chatterjee A, Newman C. Methods of usability testing in the development of eHealth applications: A scoping review. Int J Med Inform. Jun 2019;126:95-104. [CrossRef] [Medline]
- Ericsson KA, Simon HA. Verbal reports as data. Psychol Rev. May 1980;87(3):215-251. [CrossRef]
- Yardley L, Spring BJ, Riper H, Morrison LG, Crane DH, Curtis K, et al. Understanding and promoting effective engagement with digital behavior change interventions. Am J Prev Med. Nov 2016;51(5):833-842. [CrossRef] [Medline]
- Yardley L, Bradbury K, Morrison L. Using qualitative research for intervention development and evaluation. In: Qualitative Research in Psychology: Expanding Perspectives in Methodology and Design (2nd Ed). Washington, DC. American Psychological Association; 2021.
- Taylor B, Henshall C, Kenyon S, Litchfield I, Greenfield S. Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis. BMJ Open. Oct 08, 2018;8(10):e019993. [FREE Full text] [CrossRef] [Medline]
- Vindrola-Padros C, Johnson GA. Rapid techniques in qualitative research: a critical review of the literature. Qual Health Res. Aug 2020;30(10):1596-1604. [CrossRef] [Medline]
- Holdsworth LM, Safaeinili N, Winget M, Lorenz KA, Lough M, Asch S, et al. Adapting rapid assessment procedures for implementation research using a team-based approach to analysis: a case example of patient quality and safety interventions in the ICU. Implement Sci. Feb 22, 2020;15(1):12. [FREE Full text] [CrossRef] [Medline]
- Mays N, Pope C. Qualitative research in health care. Assessing quality in qualitative research. BMJ. Jan 01, 2000;320(7226):50-52. [CrossRef] [Medline]
- Tracy SJ. Qualitative quality: eight “big-tent” criteria for excellent qualitative research. Qual Inq. Oct 01, 2010;16(10):837-851. [CrossRef]
Abbreviations
COM-B: capability, opportunity, motivation-behavior
RE-AIM: reach, effectiveness, adoption, implementation, and maintenance
Edited by T de Azevedo Cardoso; submitted 30.05.24; peer-reviewed by D Singh, Y Sha, L Weinert; comments to author 20.09.24; revised version received 03.10.24; accepted 28.10.24; published 28.11.24.
Copyright©Kristin Harrison Ginsberg, Katie Babbott, Anna Serlachius. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 28.11.2024.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.