Review
Abstract
Background: Increasing adoption of sensor-based digital health technologies (sDHTs) in recent years has cast light on the many challenges in implementing these tools into clinical trials and patient care at scale across diverse patient populations; however, the methodological approaches taken toward sDHT usability evaluation have varied markedly.
Objective: This review aims to explore the current landscape of studies reporting data related to sDHT human factors, human-centered design, and usability, to inform our concurrent work on developing an evaluation framework for sDHT usability.
Methods: We conducted a scoping review of studies published between 2013 and 2023 and indexed in PubMed, in which data related to sDHT human factors, human-centered design, and usability were reported. Following a systematic screening process, we extracted the study design, participant sample, the sDHT or sDHTs used, the methods of data capture, and the types of usability-related data captured.
Results: Our literature search returned 442 papers, of which 85 papers were found to be eligible and 83 papers were available for data extraction and not under embargo. In total, 164 sDHTs were evaluated; 141 (86%) sDHTs were wearable tools while the remaining 23 (14%) sDHTs were ambient tools. The majority of studies (55/83, 66%) reported summative evaluations of final-design sDHTs. Almost all studies (82/83, 99%) captured data from targeted end users, but only 18 (22%) out of 83 studies captured data from additional users such as care partners or clinicians. User satisfaction and ease of use were evaluated for 83% (136/164) and 91% (150/164) of sDHTs, respectively; however, learnability, efficiency, and memorability were reported for only 11 (7%), 4 (2%), and 2 (1%) out of 164 sDHTs, respectively. A total of 14 (9%) out of 164 sDHTs were evaluated according to the extent to which users were able to understand the clinical data or other information presented to them (understandability) or the actions or tasks they should complete in response (actionability). Notable gaps in reporting included the absence of a sample size rationale (reported for 21/83, 25% of all studies and 17/55, 31% of summative studies) and incomplete sociodemographic descriptive data (complete age, sex/gender, and race/ethnicity reported for 14/83, 17% of studies).
Conclusions: Based on our findings, we suggest four actionable recommendations for future studies that will help to advance the implementation of sDHTs: (1) consider an in-depth assessment of technology usability beyond user satisfaction and ease of use, (2) expand recruitment to include important user groups such as clinicians and care partners, (3) report the rationale for key study design considerations including the sample size, and (4) provide rich descriptive statistics regarding the study sample to allow a complete understanding of generalizability to other patient populations and contexts of use.
doi:10.2196/57628
Introduction
Sensor-based digital health technologies (sDHTs), defined as connected digital medicine products that process data captured by mobile sensors using algorithms to generate measures of behavioral and/or physiological function [
], have been increasingly adopted in both research and health care in recent years [ , ]. sDHTs include products designed to capture data passively (such as continuous glucose monitors and wearables for monitoring sleep) or during active tasks (such as mobile spirometry or smartphone-based cognitive assessments) from wearable, implantable, ingestible, or ambient tools. Implementation of sDHTs requires interactions across the hardware containing the sensor or sensors, the software that is used to convert sensor data to health-related measures, and the users (who could be consumers, patients, clinicians, and more) who interact at one or more stages of data capture. Given this complexity and the increasing use of sDHTs, defining and understanding best practices for human factors, human-centered design, and usability (defined in ) of sDHTs is a critical need. Although regulatory guidance focused on the usability of medical devices is well established, sDHTs require unique consideration because (1) sDHTs used in clinical research studies for data capture may or may not be regulated medical devices [ ], (2) research participants likely have different motivations and needs related to their use of the technology, (3) sDHTs are often used over much longer time periods in research compared with health care settings, and (4) digital measures captured in large studies may be analyzed with limited human oversight or clinical interpretation.

The methodological approaches taken toward sDHT usability evaluation have varied substantially [
, ], casting light on the many challenges in implementing these tools into clinical trials and patient care at scale across diverse patient populations [ , ]. For example, some studies have adopted questionnaires developed for products and systems other than sDHTs [ ], while others have described the approach to participatory design alongside qualitative data capture [ ]. Inadequate attention to human-centered design and usability testing approaches can hinder the evaluation of health care interventions, contribute to insufficient adoption, perpetuate health disparities, increase costs, and potentially introduce safety risks [ - ]. Thus, integrating human factors considerations into the design, development, and evaluation of sDHTs is critical to improving their likelihood of being adopted and properly utilized in a way that is safe, effective, and inclusive and that optimizes the user experience.

While several systematic reviews have focused on understanding and quantifying the usability of digital health products for specific applications [
- ], their focus has primarily been on study outcomes rather than evaluating methodological approaches. Recognizing the urgency of addressing sDHT usability-related challenges, a precompetitive collaboration within the Digital Health Measurement Collaborative Community (DATAcc) hosted by the Digital Medicine Society (DiMe) undertook a scoping review to highlight studies that have performed a usability-related evaluation of sDHTs, outline the dimensions of usability data that were assessed, and highlight the methods of usability evaluation. Our objective was to explore the current landscape and identify gaps, which will inform the development and dissemination of recommendations and an evidence-driven framework for evaluating sDHTs as fit for purpose from a usability perspective.

Human factors
- The application of knowledge about human behavior, abilities, limitations, and other characteristics of users to the design and development of a sensor-based digital health technology (sDHT) to optimize usability within a defined intended use or context of use. This definition incorporates terminology and concepts from the US Food and Drug Administration (FDA) [ ], the UK Medicines and Health Care Products Regulatory Agency (MHRA) [ ], and the National Medical Products Administration (NMPA) of China (translated) [ ].
Human-centered design
- An approach to interactive systems that aims to make systems usable and useful by focusing on the users, their needs and requirements, and by applying human factors and usability knowledge and techniques, as defined in the International Organization for Standardization (ISO) 9241-210:2019 standard [ ].
Usability
- The extent to which an sDHT can be used to achieve specified goals with ease, efficiency, and user satisfaction within a defined intended use or context of use. This definition incorporates terminology and concepts from the FDA [ ], the MHRA [ ], the NMPA (translated) [ ], and ISO 9241-210:2019 [ ].
Methods
Overview
We followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines for scoping reviews (
) [ ]. As a scoping review, this work did not meet the criteria for registration on PROSPERO [ ]. The protocol is available from the corresponding author.
Literature Search
We completed our literature search in PubMed using search terms designed in 6 layers as follows (terms within each layer were separated by the Boolean operator “OR”, while the layers themselves were separated using “AND” or “NOT”): (1) Medical Subject Heading (MeSH; [
]) term for human participants; (2) MeSH terms related to sDHTs, such as wearable electronic devices and digital technology; (3) keywords related to sDHTs such as wear* (asterisk indicates truncation), remote, and connected; (4) keywords related to human-centered design, usability, human factors, and ergonomics; (5) exclusion of out-of-scope publication types such as editorials and case reports; and (6) a publication date between January 1, 2013, and May 30, 2023. The complete search string is provided in Table S1 in .

To avoid potentially overlooking novel or emerging technologies, the search terms did not include descriptions of specific sensor types (such as accelerometer), form factors (such as watch), methodology (such as actigraphy), wear location (such as wrist), or technology make or model.
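The layered structure described above can be illustrated programmatically. The sketch below assembles a Boolean query of the same shape using abbreviated example terms only; these terms are illustrative placeholders, not the complete search string (which is provided in Table S1).

```python
# Illustrative sketch only: assembles a layered PubMed-style Boolean query
# of the shape described in the text. Terms are abbreviated examples,
# not the actual search string used in this review (see Table S1).

def or_group(terms):
    """Join the terms within one layer with OR, wrapped in parentheses."""
    return "(" + " OR ".join(terms) + ")"

inclusion_layers = [
    ['"humans"[MeSH]'],                                                     # (1) human participants
    ['"wearable electronic devices"[MeSH]', '"digital technology"[MeSH]'],  # (2) sDHT MeSH terms
    ["wear*", "remote", "connected"],                                       # (3) sDHT keywords
    ["usability", '"human factors"', "ergonomics"],                         # (4) usability keywords
]
excluded_types = ["editorial[pt]", '"case reports"[pt]']                    # (5) out-of-scope types
date_layer = '("2013/01/01"[dp] : "2023/05/30"[dp])'                        # (6) publication dates

# Layers are combined with AND; the exclusion layer is attached with NOT.
query = (
    " AND ".join(or_group(layer) for layer in inclusion_layers)
    + " NOT " + or_group(excluded_types)
    + " AND " + date_layer
)
print(query)
```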
Study Selection
We systematically screened publications identified in the literature search based on the PICO (patients/participants; intervention; comparator; outcomes) eligibility criteria outlined in
, designed to identify studies describing the incorporation of knowledge about human behavior, abilities, limitations, and other characteristics of users to the design and development process; human-centered design; and ease of use, efficiency, or user satisfaction of sDHTs. Studies reporting sDHT adherence (eg, average wear time) or measurement success metrics (eg, percentage of in-range measurements obtained) were considered out of scope unless they reported one of the aforementioned concepts.

Two independent investigators (JC and JPB) began by screening a random selection of 20% of publications; disagreements were resolved by consensus, and clarifications were made to the wording of the eligibility criteria to reduce ambiguity. The same two investigators then reviewed another random selection of 20% of publications; it was determined a priori that if the reviewers were in agreement for ≥90% of these publications, the remaining 60% would be reviewed by a single investigator (JC) as described elsewhere [
, ].
PICOa frameworkb | Eligibility criteria |
Patient or participant |
|
Intervention |
|
Comparator |
|
Outcome or outcomes |
|
aPICO: patients/participants; intervention; comparator; outcomes.
bThe PICO framework is described by Eriksen and Frandsen [
].
csDHT: sensor-based digital health technology.
dThroughout this review, we refer to “sensor-based digital health technology” (sDHT); however, this was operationalized according to the definition of “biometric monitoring technology” (BioMeT) as described in Goldsack et al [
].
eNot applicable.
Data Extraction and Analysis
Data extraction fields included study design and sample characteristics; the type, maturity, make or model, form factor, and wear location (if applicable) of each sDHT evaluated along with the health concept or concepts generated by each sDHT; the methodological approaches; and the types of usability-related data reported in each study. Most fields for data capture were categorical, with categories created in advance to minimize error. Extraction from each publication was undertaken by one of three investigators with adjudication by an independent investigator as needed.
Categories of usability-related data are described in
, and compiled based on the literature including the International Organization for Standardization (ISO) 9241-210:2019 standard [ ] and Nielsen’s [ ] usability attributes, as well as the studies identified in this review; that is, data not clearly fitting into an existing category were extracted and categorized post hoc. We acknowledge that there are various models for capturing data describing usability and related topics [ ]; however, no single standard has been widely adopted.

Consistent with the goal of a scoping review, all data were analyzed descriptively.
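As an illustration of this descriptive approach, categorical extraction fields can be tabulated as the "n (%)" summaries used throughout the tables in this review. The sketch below is minimal and uses hypothetical records and field names.

```python
# Minimal sketch of a descriptive "n (%)" summary for a categorical
# extraction field; the records and field names here are hypothetical.
from collections import Counter

records = [
    {"study_focus": "summative", "setting": "remote"},
    {"study_focus": "summative", "setting": "on-site"},
    {"study_focus": "formative", "setting": "remote"},
]

def n_percent(records, field):
    """Return {category: 'n (%)'} counts for one categorical field."""
    counts = Counter(r[field] for r in records)
    total = len(records)
    return {cat: f"{n} ({round(100 * n / total)})" for cat, n in counts.items()}

print(n_percent(records, "study_focus"))  # {'summative': '2 (67)', 'formative': '1 (33)'}
```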
Category | Definitiona,b |
User satisfaction | The extent to which a user finds the sDHTc to be pleasant to use, which may reflect trust, comfort, aesthetics, engagement, desirability, emotional response, and other considerations. Always captured through self-report. |
Ease of use | The ease with which a user is able to perform user tasks. Can be captured through self-report (such as the mental demand or effort required to complete a task) or objective measures (such as the number of actions, number of attempts, or time required to complete a task). |
Efficiency | The ease with which a user is able to perform user tasks after having learned how to use the sDHT. Captured according to the definition of ease of use above. |
Learnabilityd | The ease with which a user is able to perform user tasks during their first encounter with the sDHT. Captured according to the definition of ease of use above. |
Memorability | The ease with which a user is able to perform user tasks after a period of nonuse, assessed in a test-retest paradigm. Captured according to the definition of ease of use above. |
Usefulnesse | The extent to which a user finds the sDHT, or its specific features or functions, to be valuable, productive, or helpful. Always captured through self-report. |
Use errorsf | An action or lack of action that may result in a use-related hazard (a potential source of harm), as well as error recovery defined as the ability of a user to make a correction following a use error in order to complete a task. Can be captured through self-report or objective assessments. |
Technical performance or malfunctions | Technical performance, such as page load times, or the number, type, and severity of errors associated with sDHT malfunction. Can be captured through self-report or objective assessments. |
Readability | The reading skills a user must possess to understand information presented to them through the sDHT itself, or through written materials such as instructions for use, cautions, warnings, or contraindications [ ]. Always captured through objective assessments, and typically reported as a reading grade. |
Understandability or actionability | The extent to which users of diverse backgrounds, languages, and varying levels of health literacy understand (1) the clinical data or other information, such as instructions, cautions, warnings, and contraindications, presented to them; and (2) the actions or tasks they should complete in response, such as an sDHT-derived blood glucose measurement requiring an adjustment to medication [ ]. Always captured through objective assessments. |
aNote that in the definitions, “self-report” includes data captured through surveys, interviews, and focus groups, while “objective” includes data captured through observation (direct or video) or through the sDHT itself (or any related software) such as timestamps, app crash reports, and page load times.
bComfort and trust were extracted separately for the purposes of this review.
csDHT: sensor-based digital health technology.
dLearnability refers to the operation of the sDHT rather than a practice effect associated with a research study outcome or endpoint.
eWe have adopted the term usefulness instead of utility, to avoid confusion with clinical utility, which refers to the extent to which implementing a medical product leads to improved health outcomes or provides useful information about diagnosis, treatment, management, or prevention of disease [
].
fStudy outcomes listed after “use errors” are not typically considered usability data, but are related concepts often captured during usability evaluations.
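As context for the readability category above, reading grades are commonly computed with formulas such as the Flesch-Kincaid grade level, which combines average sentence length with average syllables per word. The sketch below uses a naive vowel-group syllable counter; it is an approximation for illustration only and is not attributed to any study in this review.

```python
# Minimal sketch of a Flesch-Kincaid grade-level calculation with a naive
# vowel-group syllable counter; an approximation for illustration only.
import re

def syllables(word):
    """Estimate syllables as the number of consecutive-vowel groups (min 1)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllable_count = sum(syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllable_count / len(words)) - 15.59

sample = "Place the sensor on your wrist. Press the button to start."
print(round(fk_grade(sample), 1))  # → 1.6, ie, roughly a second-grade reading level
```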
Results
Literature Search and Study Selection
The PubMed search conducted on June 1, 2023, yielded 442 results, including one published only as an abstract. After applying the eligibility criteria described in
, a further 356 publications were excluded. As such, 85 studies were determined to be eligible; however, 2 studies were under embargo, leaving 83 studies for data extraction ( ). A complete list of all included studies is provided in Table S2 in [ , , - ].

As described above, two investigators reached a consensus on 20% (n=88) of the 442 publications before any further publications were screened. The same investigators then screened a further 88 publications independently, which resulted in 100% agreement on eligibility. Per protocol, a single investigator screened the remaining 266 papers.
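The screening agreement described above is simple percent agreement between the two reviewers' include/exclude decisions; a minimal sketch with hypothetical decision lists:

```python
# Minimal sketch of percent agreement between two independent screeners'
# include/exclude decisions; the decision lists below are hypothetical.

def percent_agreement(reviewer_a, reviewer_b):
    """Percentage of publications receiving the same decision from both reviewers."""
    assert len(reviewer_a) == len(reviewer_b), "reviewers must screen the same set"
    matches = sum(a == b for a, b in zip(reviewer_a, reviewer_b))
    return 100 * matches / len(reviewer_a)

a = ["include", "exclude", "exclude", "include", "exclude"]
b = ["include", "exclude", "include", "include", "exclude"]
print(percent_agreement(a, b))  # → 80.0
```

Under the protocol described above, agreement of ≥90% permitted single-reviewer screening of the remaining publications.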
Study Design Considerations
The majority of studies (55/83, 66%;
) reported summative evaluations of products that were marketed or production-equivalent (ie, sample products of final design assembled in a way that differs from—but is equivalent to—the manufacturing processes used for the marketed product [ ]). The remaining 28 (34%) out of 83 studies reported formative evaluations of prototype products; we did not identify any reports focused solely on sDHT design. Most studies (53/83, 64%) were conducted partially or completely off-site. Study sample sizes spanned a wide range (range 1-623; median 27, IQR 13-60); however, only 21 (25%) of the full set of 83 studies, and 17 (31%) of the 55 summative studies, reported a rationale for the sample size (with or without a power calculation).

Therapeutic area of sDHTa end users (number of studies in parentheses) |
Aging (n=19) | Cardiovascular (n=9) | Endocrine (n=3) | Neurology (n=13) | Oncology (n=3) | Respiratory (n=6) | Surgery (n=5) | Healthy (n=15) | Otherb (n=10) | Total (n=83) | ||
Study design, n (%) | |||||||||||
Observational | 17 (89) | 9 (100) | 3 (100) | 13 (100) | 3 (100) | 5 (83) | 4 (80) | 14 (93) | 10 (100) | 78 (94) | |
Interventional | 2 (11) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (17) | 1 (20) | 1 (7) | 0 (0) | 5 (6) | |
Study focusc, n (%) | |||||||||||
Summative; sample size rationale | 3 (16) | 3 (33) | 0 (0) | 4 (31) | 0 (0) | 1 (17) | 2 (40) | 1 (7) | 3 (30) | 17 (20) | |
Summative; no sample size rationale | 6 (32) | 2 (22) | 2 (67) | 7 (54) | 3 (100) | 4 (67) | 1 (20) | 9 (60) | 4 (40) | 38 (46) | |
Formative; sample size rationale | 2 (11) | 1 (11) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (10) | 4 (5) | |
Formative; no sample size rationale | 8 (42) | 3 (33) | 1 (33) | 2 (15) | 0 (0) | 1 (17) | 2 (40) | 5 (33) | 2 (20) | 24 (29) | |
Setting, n (%) | |||||||||||
Remote | 11 (58) | 3 (33) | 1 (33) | 5 (38) | 2 (67) | 6 (100) | 1 (20) | 7 (47) | 6 (60) | 42 (51) | |
On-site | 4 (21) | 6 (67) | 1 (33) | 5 (38) | 0 (0) | 0 (0) | 3 (60) | 8 (53) | 3 (30) | 30 (36) | |
Both remote and on-site | 4 (21) | 0 (0) | 1 (33) | 3 (23) | 1 (33) | 0 (0) | 1 (20) | 0 (0) | 1 (10) | 11 (13) | |
Duration of sDHT data collection, n (%) | |||||||||||
≤1 day | 5 (26) | 3 (33) | 1 (33) | 2 (15) | 0 (0) | 0 (0) | 2 (40) | 7 (47) | 2 (20) | 22 (27) | |
>1 to ≤7 days | 2 (11) | 3 (33) | 0 (0) | 4 (31) | 0 (0) | 2 (33) | 2 (40) | 2 (13) | 2 (20) | 17 (20) | |
>7 to ≤30 days | 6 (32) | 2 (22) | 1 (33) | 3 (23) | 1 (33) | 0 (0) | 1 (20) | 2 (13) | 0 (0) | 16 (19) | |
>31 to ≤90 days | 3 (16) | 0 (0) | 0 (0) | 2 (15) | 1 (33) | 4 (67) | 0 (0) | 3 (20) | 1 (10) | 14 (17) | |
>90 to ≤180 days | 1 (5) | 1 (11) | 0 (0) | 0 (0) | 1 (33) | 0 (0) | 0 (0) | 0 (0) | 3 (30) | 6 (7) | |
>180 days | 1 (5) | 0 (0) | 0 (0) | 2 (15) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (10) | 4 (5) | |
Not reported | 1 (5) | 0 (0) | 1 (33) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 1 (10) | 4 (5) | |
Study sample | |||||||||||
Sample size, median (IQR) | 30 (13.5-52.5) | 24 (10-41) | 35 (20-189) | 40 (22-70) | 30 (22-31.5) | 14.5 (9.25-19.75) | 29 (15-60) | 25 (13-105) | 21 (12-81.25) | 27 (13-60) | |
Sample size, range | 8-125 | 5-156 | 5-343 | 5-623 | 14-33 | 1-314 | 10-77 | 1-243 | 3-407 | 1-623 | |
Users, n (%) | |||||||||||
End usersd | 19 (100) | 9 (100) | 3 (100) | 13 (100) | 3 (100) | 6 (100) | 5 (100) | 15 (100) | 9 (90) | 82 (99) | |
Care partner usersd | 0 (0) | 1 (11) | 1 (33) | 2 (15) | 0 (0) | 0 (0) | 0 (0) | 3 (20) | 1 (10) | 8 (10) | |
Clinician usersd | 2 (11) | 3 (33) | 1 (33) | 2 (15) | 0 (0) | 1 (17) | 1 (20) | 1 (7) | 1 (10) | 12 (14) | |
Expertsd | 1 (5) | 0 (0) | 1 (33) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 0 (0) | 3 (4) | |
Age, n (%) | |||||||||||
Adults only | 19 (100) | 7 (78) | 1 (33) | 10 (77) | 2 (67) | 3 (50) | 5 (100) | 11 (73) | 7 (70) | 65 (78) | |
Children only | 0 (0) | 0 (0) | 1 (33) | 1 (8) | 0 (0) | 2 (33) | 0 (0) | 3 (20) | 3 (30) | 10 (12) | |
Both adults and children | 0 (0) | 1 (11) | 0 (0) | 2 (15) | 1 (33) | 1 (17) | 0 (0) | 1 (7) | 0 (0) | 6 (7) | |
Not reported | 0 (0) | 1 (11) | 1 (33) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 2 (2) | |
Sex/gender, n (%) | |||||||||||
Male or men only | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 2 (20) | 3 (4) | |
Female or women only | 1 (5) | 0 (0) | 0 (0) | 0 (0) | 1 (33) | 1 (17) | 0 (0) | 0 (0) | 2 (20) | 5 (6) | |
Both or all sexes/genders | 18 (95) | 7 (78) | 2 (67) | 13 (100) | 2 (67) | 5 (83) | 5 (100) | 13 (87) | 6 (60) | 71 (86) | |
Not reported | 0 (0) | 2 (22) | 1 (33) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 0 (0) | 4 (5) | |
Race/ethnicity, n (%) | |||||||||||
Race/ethnicity reported | 2 (11) | 1 (11) | 0 (0) | 4 (31) | 0 (0) | 2 (33) | 0 (0) | 3 (20) | 2 (20) | 14 (17) | |
Race/ethnicity not reported | 17 (89) | 8 (89) | 3 (100) | 9 (69) | 3 (100) | 4 (67) | 5 (100) | 12 (80) | 8 (80) | 69 (83) | |
Number of sDHTs assessed | |||||||||||
Range | 1-7 | 1-3 | 1-1 | 1-5 | 1-6 | 1-5 | 1-2 | 1-7 | 1-11 | 1-11 |
asDHT: sensor-based digital health technology.
b“Other” therapeutic area category contains studies with enrollment eligibility focused on anaphylaxis, muscular dystrophy, hemophilia, nocturnal enuresis, blood and marrow transplant, overweight or obesity, pregnancy, and nonspecific hospitalized or chronic illness. One study recruited clinicians only (no end users of the sDHT) and is included in this category.
cStudies reporting both formative and summative evaluations are categorized as summative.
dCategories are not mutually exclusive.
Sample Characteristics
As shown in
, the largest groups of studies focused on aging and healthy participants (19 and 15 studies, respectively; 34/83, 41% of all studies). Among the various diseases studied, neurology and cardiovascular were the most common therapeutic areas (13 and 9 studies, respectively; 22/44, 50% of studies assessing nonhealthy individuals). Table S3 in contains a list of conditions falling into each therapeutic area.

Almost all studies (82/83, 99%) captured data from targeted end users; the remaining study captured data only from clinician users [
]. Several studies captured data from multiple user groups; in total, 8 and 12 studies gathered data from care partner users and clinician users, respectively. Three studies involved experts (not considered to be sDHT users); two of these described a formal heuristic evaluation [ , ] while the other described involving experts in design, biomedical engineering, computer science, and mobile health system production in the sDHT's design and formative testing process [ ]. Finally, we noted substantial missing participant demographic data; age, sex/gender, and race/ethnicity were not reported in 2, 4, and 69 studies, respectively.

sDHTs Assessed in Eligible Studies
Across the 83 studies included in our review, a total of 164 different sDHTs were assessed (141 wearable and 23 ambient tools;
), ranging from 1 to 11 sDHTs within a single study. Ingestible and implantable sDHTs were in scope, but none were identified in our literature search. A wide range of form factors (22 distinct categories) and wear locations (14 anatomical locations presented in 5 categories) were identified. Digital clinical measures of vital signs (n=76 sDHTs), physical activity (n=61 sDHTs), and mobility (n=35 sDHTs) were most prevalent. Table S4 in contains more comprehensive information regarding wear locations and health concepts captured by sDHTs.

Most sDHTs (126/164, 77%) required only passive interaction by users, meaning that data were captured without user input other than basic tasks such as charging or changing batteries. The remaining 38 (23%) sDHTs required active engagement at specific times such as completion of physical therapy [
], exercise [ ], or blood glucose tests [ ].

Therapeutic area of sDHT users (number of sDHTs in parentheses) |
Aging (n=27) | Cardiovascular (n=12) | Endocrine (n=3) | Neurology (n=31) | Oncology (n=8) | Respiratory (n=15) | Surgery (n=7) | Healthy (n=35) | Otherb (n=26) | Total (n=164) | ||
sDHT type, n (%) | |||||||||||
Wearable | 26 (96) | 9 (75) | 1 (33) | 30 (97) | 8 (100) | 11 (73) | 7 (100) | 33 (94) | 16 (62) | 141 (86) | |
Ambient | 1 (4) | 3 (25) | 2 (67) | 1 (3) | 0 (0) | 4 (27) | 0 (0) | 2 (6) | 10 (38) | 23 (14) | |
sDHT maturity, n (%) | |||||||||||
Prototype | 9 (33) | 5 (42) | 1 (33) | 5 (16) | 0 (0) | 0 (0) | 2 (29) | 5 (14) | 11 (42) | 38 (23) | |
Final or marketed | 18 (67) | 6 (50) | 2 (67) | 26 (84) | 8 (100) | 15 (100) | 5 (71) | 27 (77) | 12 (46) | 119 (73) | |
Not reported | 0 (0) | 1 (8) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 3 (9) | 3 (12) | 7 (4) | |
Form factor, n (%) | |||||||||||
Adhesive patch | 2 (7) | 0 (0) | 1 (33) | 5 (16) | 0 (0) | 0 (0) | 1 (14) | 2 (6) | 1 (4) | 12 (7) | |
Balance board | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (4) | 1 (<1) | |
Camera, video, or still | 0 (0) | 1 (8) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 2 (8) | 3 (2) | |
Clip | 4 (15) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 1 (14) | 1 (3) | 0 (0) | 7 (4) | |
Clothing or shoes | 5 (19) | 3 (25) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 4 (11) | 5 (19) | 17 (10) | |
Contact lens | 0 (0) | 0 (0) | 0 (0) | 1 (3) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (<1) | |
Cuff or wrap | 0 (0) | 1 (8) | 0 (0) | 1 (3) | 0 (0) | 0 (0) | 1 (14) | 2 (6) | 0 (0) | 5 (3) | |
Electrode or electrodes | 0 (0) | 0 (0) | 0 (0) | 1 (3) | 0 (0) | 0 (0) | 0 (0) | 2 (6) | 0 (0) | 3 (2) | |
Exercise equipment | 0 (0) | 0 (0) | 1 (33) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 2 (8) | 3 (2) | |
Glasses | 1 (4) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (3) | 0 (0) | 2 (1) | |
Gloves | 2 (7) | 1 (8) | 0 (0) | 1 (3) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 4 (2) | |
Glucometer | 0 (0) | 0 (0) | 1 (33) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (<1) | |
Handheld thermometer | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 0 (0) | 0 (0) | 0 (0) | 1 (<1) | |
Mattress pad | 0 (0) | 1 (8) | 0 (0) | 1 (3) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 2 (1) | |
Medication package | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 0 (0) | 0 (0) | 1 (4) | 2 (1) | |
Phone or tablet | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (3) | 4 (15) | 5 (3) | |
Probe | 0 (0) | 0 (0) | 0 (0) | 1 (3) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (4) | 2 (1) | |
Ring | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 0 (0) | 0 (0) | 0 (0) | 1 (<1) | |
Spirometer | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 2 (13) | 0 (0) | 0 (0) | 0 (0) | 2 (1) | |
Contactless unit | 1 (4) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (3) | 0 (0) | 2 (1) | |
Strap | 12 (44) | 4 (33) | 0 (0) | 20 (65) | 8 (100) | 9 (60) | 4 (57) | 21 (60) | 9 (35) | 87 (53) | |
Weight scale | 0 (0) | 1 (8) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (<1) | |
Wear location, n (%) | |||||||||||
Arms or wrists or hands | 11 (41) | 5 (42) | 0 (0) | 19 (61) | 8 (100) | 8 (53) | 3 (43) | 19 (54) | 9 (35) | 82 (50) | |
Head or face | 1 (4) | 0 (0) | 0 (0) | 4 (13) | 0 (0) | 0 (0) | 0 (0) | 3 (9) | 0 (0) | 8 (5) | |
Legs or ankles or feet | 2 (7) | 2 (17) | 0 (0) | 1 (3) | 0 (0) | 0 (0) | 2 (29) | 3 (9) | 0 (0) | 10 (6) | |
Neck or torso or hips | 10 (37) | 2 (17) | 1 (33) | 3 (10) | 0 (0) | 2 (13) | 2 (29) | 6 (17) | 6 (23) | 32 (20) | |
Multiple locationsc | 2 (7) | 0 (0) | 0 (0) | 3 (10) | 0 (0) | 0 (0) | 0 (0) | 2 (6) | 1 (4) | 8 (5) | |
N/Ad | 1 (4) | 3 (25) | 2 (67) | 1 (3) | 0 (0) | 5 (33) | 0 (0) | 2 (6) | 10 (38) | 24 (15) | |
Interaction typee, n (%) | |||||||||||
Passive | 24 (89) | 7 (58) | 1 (33) | 26 (84) | 8 (100) | 12 (80) | 3 (43) | 30 (86) | 15 (58) | 126 (77) | |
Active | 3 (11) | 5 (42) | 2 (67) | 5 (16) | 0 (0) | 3 (20) | 4 (57) | 5 (14) | 11 (42) | 38 (23) | |
Health conceptsf, n (%) | |||||||||||
Activities of daily living | 6 (22) | 2 (17) | 0 (0) | 2 (6) | 0 (0) | 0 (0) | 0 (0) | 1 (3) | 0 (0) | 11 (7) | |
Physical activity | 16 (59) | 4 (33) | 1 (33) | 6 (19) | 9 (113) | 9 (60) | 1 (14) | 15 (43) | 0 (0) | 61 (37) | |
Adherence | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 1 (7) | 0 (0) | 0 (0) | 0 (0) | 1 (<1) | |
Electrical activity | 0 (0) | 0 (0) | 0 (0) | 15 (48) | 0 (0) | 0 (0) | 0 (0) | 1 (3) | 0 (0) | 16 (10) | |
Mobility | 5 (19) | 4 (33) | 0 (0) | 11 (35) | 0 (0) | 0 (0) | 2 (29) | 13 (37) | 0 (0) | 35 (21) | |
Sleep | 6 (22) | 1 (8) | 0 (0) | 0 (0) | 1 (13) | 1 (7) | 0 (0) | 5 (14) | 0 (0) | 14 (9) | |
Vital signs | 9 (33) | 5 (42) | 2 (67) | 18 (58) | 0 (0) | 5 (33) | 14 (2) | 23 (66) | 0 (0) | 76 (46) | |
Otherg | 4 (15) | 2 (17) | 0 (0) | 4 (13) | 0 (0) | 2 (13) | 0 (0) | 0 (0) | 0 (0) | 12 (7) |
asDHT: sensor-based digital health technology.
b“Other” therapeutic area category contains studies with enrollment eligibility focused on anaphylaxis, muscular dystrophy, hemophilia, nocturnal enuresis, blood and marrow transplant, overweight or obesity, pregnancy, and nonspecific hospitalized or chronic illness. One study recruited clinicians only (no end users of the sDHT) and is included in this category.
cRefers to multisensor sDHTs worn on different parts of the body, or sDHTs that can be positioned in one of many locations.
dWear location is not applicable to ambient sDHTs. Wear locations are presented in greater detail in Table S4 in
.
ePassive: sDHT data are collected over long time periods without user input other than aspects such as charging or changing batteries (such as actigraphy); includes tools for which the absence of data is meaningful (such as smart packaging for adherence monitoring). Active: sDHT data collection requires user engagement at defined timepoints. Categories described previously [
].
fHealth concepts are not mutually exclusive; a single sDHT can capture data in multiple categories. Health concepts are presented in greater detail in Table S4 in
.
g“Other” health concept category includes bladder volume, body habitus, cardiac output, fall detection, gaze or visual movement, intraocular pressure, lung or airway function, and tremor detection.
Methodological Approaches
As described in
, most sDHTs (139/164, 85%) were evaluated in the actual environment in which they were intended to be used, while 25 sDHTs were assessed in a simulated environment only. The vast majority were evaluated during actual use (148/164, 90%) rather than through “look and feel” approaches. Of particular interest, a variety of methods were used to evaluate usability and related concepts, including interviews (49 sDHTs), focus groups (29 sDHTs), direct or video observation (35 sDHTs), think-aloud (15 sDHTs), and heuristic analysis (2 sDHTs). Surveys were the most prevalent method for capturing usability data; 86 sDHTs were evaluated using referenced surveys while 81 sDHTs were evaluated using surveys developed in house by study investigators. Data for 4 sDHTs were captured using the sDHT itself; for example, instances of connectivity loss or data capture drops were recorded as “use errors” or “technical performance or product errors” [ , ]. An illustrative example of a use error that may be addressed through design modification is the report of users turning an sDHT on and off repeatedly because it was not clear whether the product was operating correctly [ ]. Additional examples of product errors, distinct from use errors, included instances of system crash [ ] and software malfunctions requiring computer program patches [ ].

sDHTa type (number of sDHTs in parentheses) |
Ambient (n=23) | Wearable (n=141) | Total (n=164) | |||||
Data collection environment, n (%) | |||||||
Actual environment | 23 (100) | 107 (76) | 130 (79) | ||||
Simulated environment | 0 (0) | 25 (18) | 25 (15) | ||||
Both actual and simulated | 0 (0) | 9 (6) | 9 (5) | ||||
Interactions with sDHT, n (%) | |||||||
Look and feel | 1 (4) | 15 (11) | 16 (10) | ||||
Actual use | 22 (96) | 126 (89) | 148 (90) | ||||
Usability evaluation methodsb | |||||||
Interviews | 5 (22) | 44 (31) | 49 (30) | ||||
Focus groups | 10 (43) | 19 (13) | 29 (18) | ||||
Surveys—referenced | 14 (61) | 72 (51) | 86 (52) | ||||
Surveys—in house | 7 (30) | 74 (52) | 81 (49) | ||||
Think-aloud | 1 (4) | 14 (10) | 15 (9) | ||||
Observation (direct or video) | 1 (4) | 34 (24) | 35 (21) | ||||
Measured by the sDHT | 0 (0) | 5 (4) | 5 (3) | ||||
Heuristic analysis | 1 (4) | 1 (<1) | 2 (<1) | ||||
Type or types of usability data reported, n (%) | |||||||
Mixed methods | 14 (61) | 58 (41) | 72 (44) | ||||
Quantitative only | 6 (26) | 60 (43) | 66 (40) | ||||
Qualitative only | 3 (13) | 23 (16) | 26 (16) | ||||
Categories of usability and related data reported, n (%) | |||||||
User satisfaction | 19 (83) | 117 (83) | 136 (83) | ||||
Comfort | 5 (22) | 107 (76) | 112 (68) | ||||
Ease of use; self-report | 23 (100) | 122 (87) | 145 (88) | ||||
Ease of use; objectively captured | 1 (4) | 4 (3) | 5 (3) | ||||
Learnability | 1 (4) | 10 (7) | 11 (7) | ||||
Efficiency | 0 (0) | 4 (3) | 4 (2) | ||||
Memorability | 0 (0) | 2 (<1) | 2 (<1) | ||||
Usefulness | 16 (70) | 96 (68) | 112 (68) | ||||
Use errors | 6 (26) | 26 (18) | 32 (20) | ||||
User trust | 12 (52) | 53 (38) | 65 (40) | ||||
Readability | 0 (0) | 0 (0) | 0 (0) | ||||
Understandability or actionability | 1 (4) | 13 (9) | 14 (9) | ||||
Technical performance or product errors | 19 (83) | 79 (56) | 98 (60) | ||||
Adherence to sDHT reported, n (%) | |||||||
Objectively measured by the sDHT | 6 (26) | 44 (31) | 50 (30) | ||||
Self or care partner report | 1 (4) | 14 (10) | 15 (9) | ||||
Both objective and self or care partner | 0 (0) | 4 (3) | 4 (2) | ||||
Reported but method not described | 1 (4) | 7 (5) | 8 (5) | ||||
Adherence not reported | 15 (65) | 72 (51) | 87 (53) |
asDHT: sensor-based digital health technology.
bCategories are not mutually exclusive.
Categories of Usability-Related Data Reported
User satisfaction was captured for the majority of sDHTs (136/164, 83%), often as a measure of acceptability or user attitudes. Although overall ease of use was also commonly reported, captured through either self-report or objective methods (n=145 and n=5, respectively), the related concepts of learnability, efficiency, and memorability were reported for only 11, 4, and 2 sDHTs, respectively. Technical performance and product errors associated with malfunction were captured for 98 sDHTs, while use errors were captured for only 32 sDHTs. Finally, although none of the studies in our review reported the readability of information presented to the user, 14 sDHTs were evaluated according to the extent to which users were able to understand the data or information presented to them (understandability) or the actions or tasks they should complete in response (actionability).
Finally, adherence (such as wear or use time) was reported for 77 sDHTs. Of these, adherence was captured objectively for 50 sDHTs and through self-report or care partner report for 19 sDHTs, while the method of capture was not described for 8 sDHTs.
The relationships in our dataset between usability evaluation methods and sDHT form factor, and between usability evaluation methods and the categories of usability-related data reported, are depicted in
and , respectively. For example, the width of each chord in is proportional to the number of sDHTs of the relevant form factor that were assessed using the linked method, demonstrating that surveys (both referenced and in house) were the most common evaluation methods while heuristic analysis was the least common. Similarly, demonstrates that overall satisfaction and self-reported ease of use were captured frequently, in contrast to data related to objective ease of use, efficiency, learnability, and memorability. Both figures contain a large number of linked chords, indicating that specific usability evaluation methods were adopted across diverse sDHT form factors and outcome measures.
Tables S5-S7 in
present the data shown in - for the subset of 55 studies reporting the results of summative evaluations, while Tables S8-S10 in present these data for the subset of 28 studies reporting formative evaluations.
Discussion
Principal Findings
This paper represents the first scoping review reporting the methodological approaches adopted during usability-related studies specifically focused on sDHTs. We identified 83 formative and summative studies published over the decade from 2013 to 2023 that evaluated human factors, human-centered design, or usability for 164 ambient and wearable tools. Most studies (67/83, 81%) recruited nonhealthy individuals, thereby providing informative data regarding sDHT usability across many diseases in addition to other aspects of health such as aging and pregnancy. Most sDHTs were evaluated in the intended use environment, with multiple facets of usability-related data captured via a range of mixed methods approaches including heuristic analysis, surveys, observation, think-aloud, focus groups, interviews, and data captured by the sDHT itself, such as use errors or technical performance errors (eg, instances of connectivity loss).
This review highlights 4 notable gaps that warrant attention as the field advances. First, the breadth and scope of the usability-related data captured were fairly narrow, relying largely on surveys capturing user satisfaction and ease of use (each captured for >80% of sDHTs), with limited reporting of sDHT use errors, learnability, efficiency, or memorability. The extent to which users understood the health- and behavior-related data or other information presented to them (understandability) and the actions or tasks they should complete in response (actionability) was assessed for only 9% (14/164) of sDHTs. Understandability and actionability are particularly important for sDHTs, given that they are often used by patients or participants in out-of-clinic settings without clinical supervision. For the use of sDHTs in clinical care settings, it is imperative that users understand whether and how to react to clinical data [
], and thus the lack of focus on understandability and actionability is concerning and could be due to the early-stage nature of sDHTs in clinical practice. In the context of clinical research, however, sharing sDHT data with participants in real time has the potential to introduce bias and affect user behavior, thereby posing a risk of yielding inaccurate results [ ]. Additional dimensions related to understandability and actionability, such as understanding optimal ways of implementing remote examinations, also warrant further investigation.

Second, only 22% (18/83) of studies considered users other than end users (patients or participants), such as care partners and clinicians, who play crucial roles in sDHT implementation and therefore the quality of data captured [
]. Especially in populations where care partners play a key role in sDHT implementation (eg, children, older people, those with language barriers, and those with disabilities), understanding usability from the care partner perspective is vital. Although existing usability data may be available for some sDHTs that are regulated as medical devices, research participants likely have needs and motivations for using the sDHT that differ from those of patients using the product as part of usual care. Similarly, the needs of investigator users are likely different from the needs of clinician users, requiring further evaluation.

Third, we found that only 31% (17/55) of summative studies (referred to by the US Food and Drug Administration as “human factors validation studies”; [
]) provided a rationale for the sample size, with or without a power calculation. An understanding of key study design considerations, including sample size, is important for evaluating the robustness of study conclusions.

Finally, as has been noted previously [
, ], we observed a deficiency in reporting basic sample demographics, with studies typically providing information on age and sex/gender but neglecting to include details on the race and ethnicity of participants. Inadequate reporting of descriptive data, including sociodemographics, precludes a complete understanding of generalizability, potentially leading to the need to repeat studies while contributing to disparities and biases in clinical research [ ].

As described above, while there are several existing systematic reviews describing the usability of digital health products for specific applications [
- ], few have focused specifically on evaluating methodological approaches. In addition, most prior systematic reviews with similar objectives have focused on digital health technologies that are not sensor-based, such as electronic medical records systems [ ] and mobile clinical decision support tools [ ], which are not used for remote data capture. In 2023, Maqbool and Herold [ ] published a systematic review of usability evaluations describing a broad suite of over 1000 digital health tools, consisting mostly of mobile health apps and including a subset of 20 products approximately aligned to our definition of sDHT, including fitness or activity trackers, digital sphygmomanometers, and wearable fall risk assessment systems. Compared with our review, Maqbool and Herold [ ] found higher rates of clinician and care partner participation and more frequent reporting of learnability, efficiency, and memorability. Such differences emphasize substantial variability in usability study methodology across subcategories of digital health technologies, as well as differences in definitions and terminology of the concepts reported, underscoring the need for a common evaluation framework.

Strengths and Limitations
Strengths of this review include the robust approach taken to testing our search terms, including a careful assessment against a list of target papers identified a priori to ensure that we were capturing appropriate literature. This process was intended to ensure not only the inclusion of relevant literature but also the reliability of our findings, helping to provide a foundation for subsequent reviews and meta-analyses. In-depth data extraction across many domains allowed for a thorough comparative analysis of the identified studies. The decision to focus on studies published within the last decade (2013-2023) was also carefully considered, as it encompasses the recent surge in studies reporting sDHT implementation. While sDHTs have a lengthy history prior to 2013, this temporal scope ensures that our findings reflect contemporary developments and trends, offering insights into the current state of sDHT implementation.
A number of limitations are acknowledged. First, we limited our search to the peer-reviewed literature. We acknowledge that many usability studies undertaken by technology manufacturers may be published in the gray literature; however, our ultimate goal is to use the findings of our review to guide the development of a framework representing best practices, and therefore, the peer review process was used as an indicator of methodological rigor and reporting quality. Second, terminology in the field of digital medicine is still evolving and investigators use many different terms to describe sDHTs; despite incorporating 25 descriptive keywords in Layer C of our search terms (Table S1 in
), we found it necessary to rely on MeSH terms developed by the National Library of Medicine [ ] as a means of limiting our literature search to a feasible number of publications. As a consequence, we were limited to conducting our search in PubMed, as this is the clinical research database for the National Library of Medicine. While MeSH terms are widely accepted and systematically applied, their specificity may have excluded relevant studies using different terminology, potentially resulting in unintended omissions. In addition, the decision to search within one database may have resulted in missed publications. Our hope is that as the field matures, terminology will become harmonized and sDHT-specific indexing will support the identification of studies adopting these technologies. Third, our decision to exclude descriptions of specific sensors, form factors, methodologies, wear location, and technology make or model may have excluded publications that used these types of keywords in the absence of other descriptors and MeSH terms. This approach was taken to reduce the possibility of overlooking novel or emerging technologies in favor of established digital products such as actigraphy tools. Finally, only 40% (176/442) of publications were screened for eligibility by multiple investigators. This approach to study identification, which has been described and adopted previously [ , ], allowed us to screen a greater number of papers, which was necessary given the lack of systematic indexing. The high agreement levels between investigators suggest that our quality-control approach maintained a robust screening process, despite part of the work being conducted by a single investigator.
Conclusions and Future Directions
Based on our findings, we suggest 4 actionable recommendations that will help to advance the implementation of sensor-based digital measurement tools in both clinical and research settings. First, we encourage investigators to adopt in-depth assessment and reporting of usability data beyond user satisfaction and ease of use. In particular, it is valuable to understand use errors alongside technical errors, and it is critical to evaluate the extent to which users understand the clinical data and information presented to them and the appropriate tasks to undertake in response, if applicable. Second, it is essential to embrace the diversity of users in all respects, including engaging diverse stakeholders within the human-centered design process; evaluating usability across multiple user groups, including care partners and clinicians; and ensuring that the participating users are representative of the intended use population in terms of sociodemographics, social determinants of health, and other characteristics. Third, rigorous study design is key. Usability is a heterogeneous concept, and it is often beneficial to evaluate usability alongside other objectives such as analytical or clinical validation; thus, we do not advocate a particular study design or set of study outcome measures. We do, however, believe that careful consideration of usability evaluation criteria, study sample sizes, and predetermined thresholds of success is critical for making go or no-go decisions as to whether a particular sDHT is sufficiently usable for implementation in a particular context of use. Finally, we recommend adhering to reporting and publication checklists such as Annex B in ISO 9241-11:2018 [
] and EVIDENCE [ ], the latter of which describes optimal reporting requirements for studies evaluating several aspects of sDHT quality, including usability assessments. Ensuring consistency in reporting will enable meaningful comparisons between studies, facilitate better assessments of findings, and enhance the accurate interpretation of results and limitations across studies.

Our long-term goal is to develop and disseminate an evidence-driven framework for evaluating sDHTs as being fit for purpose from a usability perspective, informed in part by the findings of this review. By developing such a framework, we endeavor to contribute to the ongoing discourse surrounding sDHTs, ultimately paving the way for the development of safe and effective tools that lead to a more inclusive and patient-centric health care ecosystem poised to improve clinical trials and clinical practice.
Acknowledgments
The authors wish to acknowledge Bethanie McCrary and Danielle Stefko for assistance with project management, Katerina Djambazova for assistance with data extraction, and Jennifer Goldsack for providing feedback on the manuscript. Chord diagrams were created with flourish.studio online software. This work was undertaken within the Digital Health Measurement Collaborative Community (DATAcc), hosted by the Digital Medicine Society (DiMe).
Authors' Contributions
All authors contributed to the study design, data interpretation, and manuscript preparation. The literature search, literature screening, and data extraction were undertaken by JC, SM, and JPB.
Conflicts of Interest
AT is a consultant for Synergen Technology Labs, LLC; Siemens Healthineers; and Gabi SmartCare. BC is an employee of Genentech, a member of the Roche Group and Roche Pharmaceuticals, and owns company stock. EI is an employee of Koneksa Health and may own company stock. NM and SV are employees of Johnson & Johnson Innovative Medicine and hold company stocks or stock options. JPB reports financial interests (consulting income, shares, or stock) in Philips, Signifier Medical Technologies, Koneksa Health, and Apnimed. ES serves on the editorial board as an Associate Editor of JMIR Publications.
PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) checklist.
PDF File (Adobe PDF File), 162 KB
Supplementary tables.
DOCX File, 140 KB
References
- Goldsack JC, Coravos A, Bakker JP, Bent B, Dowling AV, Fitzer-Attas C, et al. Verification, analytical validation, and clinical validation (V3): the foundation of determining fit-for-purpose for biometric monitoring technologies (BioMeTs). NPJ Digital Med. 2020;3:55. [FREE Full text] [CrossRef] [Medline]
- Library of digital endpoints. Digital Medicine Society (DiMe). 2021. URL: https://dimesociety.org/get-involved/library-of-digital-endpoints/ [accessed 2024-01-23]
- Marwaha JS, Landman AB, Brat GA, Dunn T, Gordon WJ. Deploying digital health tools within large, complex health systems: key considerations for adoption and implementation. NPJ Digital Med. 2022;5(1):13. [FREE Full text] [CrossRef] [Medline]
- Center for Drug Evaluation and Research. Digital health technologies for remote data acquisition in clinical investigations. US Food and Drug Administration. URL: https://www.fda.gov/regulatory-information/search-fda-guidance-documents/digital-health-technologies-remote-data-acquisition-clinical-investigations [accessed 2024-01-23]
- Maqbool B, Herold S. Potential effectiveness and efficiency issues in usability evaluation within digital health: a systematic literature review. J Syst Software. 2024;208:111881. [FREE Full text] [CrossRef]
- Mathews SC, McShea MJ, Hanley CL, Ravitz A, Labrique AB, Cohen AB. Digital health: a path to validation. NPJ Digital Med. 2019;2:38. [FREE Full text] [CrossRef] [Medline]
- Whitehead L, Talevski J, Fatehi F, Beauchamp A. Barriers to and facilitators of digital health among culturally and linguistically diverse populations: qualitative systematic review. J Med Internet Res. 2023;25:e42719. [FREE Full text] [CrossRef] [Medline]
- Bent B, Sim I, Dunn JP. Digital medicine community perspectives and challenges: survey study. JMIR Mhealth Uhealth. 2021;9(2):e24570. [FREE Full text] [CrossRef] [Medline]
- Evans J, Papadopoulos A, Silvers CT, Charness N, Boot WR, Schlachta-Fairchild L, et al. Remote health monitoring for older adults and those with heart failure: adherence and system usability. Telemed e-Health. 2016;22(6):480-488. [FREE Full text] [CrossRef] [Medline]
- Ummels D, Braun S, Stevens A, Beekman E, Beurskens A. Measure it super simple (MISS) activity tracker: (re)design of a user-friendly interface and evaluation of experiences in daily life. Disabil Rehabil Assist Technol. 2022;17(7):767-777. [CrossRef] [Medline]
- Chen Y, Clayton EW, Novak LL, Anders S, Malin B. Human-centered design to address biases in artificial intelligence. J Med Internet Res. 2023;25:e43251. [FREE Full text] [CrossRef] [Medline]
- Levander XA, VanDerSchaaf H, Barragán VG, Choxi H, Hoffman A, Morgan E, et al. The role of human-centered design in healthcare innovation: a digital health equity case study. J Gen Intern Med. 2024;39(4):690-695. [CrossRef] [Medline]
- Benishek LE, Kachalia A, Biddison LD. Improving clinician well-being and patient safety through human-centered design. JAMA. 2023;329(14):1149-1150. [CrossRef] [Medline]
- Tase A, Vadhwana B, Buckle P, Hanna GB. Usability challenges in the use of medical devices in the home environment: a systematic review of literature. Appl Ergon. 2022;103:103769. [FREE Full text] [CrossRef] [Medline]
- Ye B, Chu CH, Bayat S, Babineau J, How T, Mihailidis A. Researched apps used in dementia care for people living with dementia and their informal caregivers: systematic review on app features, security, and usability. J Med Internet Res. 2023;25:e46188. [FREE Full text] [CrossRef] [Medline]
- Siette J, Dodds L, Sharifi F, Nguyen A, Baysari M, Seaman K, et al. Usability and acceptability of clinical dashboards in aged care: systematic review. JMIR Aging. 2023;6:e42274. [FREE Full text] [CrossRef] [Medline]
- Kraaijkamp JJM, van Dam van Isselt EF, Persoon A, Versluis A, Chavannes NH, Achterberg WP. eHealth in geriatric rehabilitation: systematic review of effectiveness, feasibility, and usability. J Med Internet Res. 2021;23(8):e24015. [FREE Full text] [CrossRef] [Medline]
- Keogh A, Argent R, Anderson A, Caulfield B, Johnston W. Assessing the usability of wearable devices to measure gait and physical activity in chronic conditions: a systematic review. J Neuroeng Rehabil. 2021;18(1):138. [FREE Full text] [CrossRef] [Medline]
- Butler S, Sculley D, Santos DS, Fellas A, Gironès X, Singh-Grewal D, et al. Usability of eHealth and mobile health interventions by young people living with juvenile idiopathic arthritis: systematic review. JMIR Pediatr Parent. 2020;3(2):e15833. [FREE Full text] [CrossRef] [Medline]
- Center for Devices and Radiological Health. Applying human factors and usability engineering to medical devices. US Food and Drug Administration. URL: https://www.fda.gov/regulatory-information/search-fda-guidance-documents/applying-human-factors-and-usability-engineering-medical-devices [accessed 2024-01-23]
- Department of Health and Social Care. Guidance on applying human factors and usability engineering to medical devices including drug-device combination products in Great Britain. Medicines and Healthcare products Regulatory Agency. URL: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/970563/Human-Factors_Medical-Devices_v2.0.pdf [accessed 2024-01-30]
- Guiding principles for technical review of human factors design of medical devices [DiMe, Trans.]. National Medical Products Administration. URL: https://datacc.dimesociety.org/wp-content/uploads/2023/09/NMPA-Human-Factors-Guidance-English-Translation-FINAL.pdf [accessed 2024-01-30]
- ISO. ISO 9241-210. 2019. URL: https://www.iso.org/standard/77520.html [accessed 2024-01-23]
- Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71. [FREE Full text] [CrossRef] [Medline]
- PROSPERO. URL: https://www.crd.york.ac.uk/prospero/ [accessed 2024-01-23]
- Medical subject headings—home page. U.S. National Library of Medicine. 2020. URL: https://www.nlm.nih.gov/mesh/meshhome.html [accessed 2024-01-30]
- Nussbaumer-Streit B, Sommer I, Hamel C, Devane D, Noel-Storr A, Puljak L, et al. Rapid reviews methods series: guidance on team considerations, study selection, data extraction and risk of bias assessment. BMJ Evidence-Based Med. 2023;28(6):418-423. [FREE Full text] [CrossRef] [Medline]
- Olaye IM, Belovsky MP, Bataille L, Cheng R, Ciger A, Fortuna KL, et al. Recommendations for defining and reporting adherence measured by biometric monitoring technologies: systematic review. J Med Internet Res. 2022;24(4):e33537. [FREE Full text] [CrossRef] [Medline]
- Eriksen MB, Frandsen TF. The impact of patient, intervention, comparison, outcome (PICO) as a search strategy tool on literature search quality: a systematic review. J Med Libr Assoc. 2018;106(4):420-431. [FREE Full text] [CrossRef] [Medline]
- Nielsen J. Usability Engineering. Boston, MA. Academic Press; 1993.
- Shachak A, Kuziemsky C, Petersen C. Beyond TAM and UTAUT: future directions for HIT implementation research. J Biomed Inform. 2019;100:103315. [FREE Full text] [CrossRef] [Medline]
- Shoemaker SJ, Wolf MS, Brach C. Development of the patient education materials assessment tool (PEMAT): a new measure of understandability and actionability for print and audiovisual patient information. Patient Educ Couns. 2014;96(3):395-403. [FREE Full text] [CrossRef] [Medline]
- Bossuyt PMM, Reitsma JB, Linnet K, Moons KGM. Beyond diagnostic accuracy: the clinical utility of diagnostic tests. Clin Chem. 2012;58(12):1636-1643. [CrossRef] [Medline]
- Weenk M, Bredie SJ, Koeneman M, Hesselink G, van Goor H, van de Belt TH. Continuous monitoring of vital signs in the general ward using wearable devices: randomized controlled trial. J Med Internet Res. 2020;22(6):e15471. [FREE Full text] [CrossRef] [Medline]
- Bentley CL, Powell L, Potter S, Parker J, Mountain GA, Bartlett YK, et al. The use of a smartphone app and an activity tracker to promote physical activity in the management of chronic obstructive pulmonary disease: randomized controlled feasibility study. JMIR Mhealth Uhealth. 2020;8(6):e16203. [FREE Full text] [CrossRef] [Medline]
- Liverani M, Ir P, Wiseman V, Perel P. User experiences and perceptions of health wearables: an exploratory study in Cambodia. Glob Health Res Policy. 2021;6(1):33. [FREE Full text] [CrossRef] [Medline]
- Nasseri M, Nurse E, Glasstetter M, Böttcher S, Gregg NM, Nandakumar AL, et al. Signal quality and patient experience with wearable devices for epilepsy management. Epilepsia. 2020;61 Suppl 1:S25-S35. [CrossRef] [Medline]
- Sala-Cunill A, Luengo O, Curran A, Moreno N, Labrador-Horrillo M, Guilarte M, et al. Digital technology for anaphylaxis management impact on patient behaviour: a randomized clinical trial. Allergy. 2021;76(5):1507-1516. [CrossRef] [Medline]
- Vaughn J, Gollarahalli S, Shaw RJ, Docherty S, Yang Q, Malhotra C, et al. Mobile health technology for pediatric symptom monitoring: a feasibility study. Nurs Res. 2020;69(2):142-148. [FREE Full text] [CrossRef] [Medline]
- Ollenschläger M, Kluge F, Müller-Schulz M, Püllen R, Möller C, Klucken J, et al. Wearable gait analysis systems: ready to be used by medical practitioners in geriatric wards? Eur Geriatr Med. 2022;13(4):817-824. [FREE Full text] [CrossRef] [Medline]
- Matcham F, Carr E, White KM, Leightley D, Lamers F, Siddi S, et al. Predictors of engagement with remote sensing technologies for symptom measurement in major depressive disorder. J Affect Disord. 2022;310:106-115. [CrossRef] [Medline]
- Bruno E, Biondi A, Thorpe S, Richardson MP, RADAR-CNS Consortium. Patients self-mastery of wearable devices for seizure detection: a direct user-experience. Seizure. 2020;81:236-240. [FREE Full text] [CrossRef] [Medline]
- Figueiredo J, Carvalho SP, Vilas-Boas JP, Gonçalves LM, Moreno JC, Santos CP. Wearable inertial sensor system towards daily human kinematic gait analysis: benchmarking analysis to MVN BIOMECH. Sensors (Basel). 2020;20(8):2185. [FREE Full text] [CrossRef] [Medline]
- Lunsford-Avery JR, Keller C, Kollins SH, Krystal AD, Jackson L, Engelhard MM. Feasibility and acceptability of wearable sleep electroencephalogram device use in adolescents: observational study. JMIR Mhealth Uhealth. 2020;8(10):e20590. [FREE Full text] [CrossRef] [Medline]
- Standoli CE, Guarneri MR, Perego P, Mazzola M, Mazzola A, Andreoni G. A smart wearable sensor system for counter-fighting overweight in teenagers. Sensors (Basel). 2016;16(8):1220. [FREE Full text] [CrossRef] [Medline]
- Marando CM, Mansouri K, Kahook MY, Seibold LK. Tolerability and functionality of a wireless 24-hour ocular telemetry sensor in African American glaucoma patients. J Glaucoma. 2019;28(2):119-124. [CrossRef] [Medline]
- Pavic M, Klaas V, Theile G, Kraft J, Tröster G, Guckenberger M. Feasibility and usability aspects of continuous remote monitoring of health status in palliative cancer patients using wearables. Oncology. 2020;98(6):386-395. [FREE Full text] [CrossRef] [Medline]
- Yang K, Meadmore K, Freeman C, Grabham N, Hughes A-M, Wei Y, et al. Development of user-friendly wearable electronic textiles for healthcare applications. Sensors (Basel). 2018;18(8):2410. [FREE Full text] [CrossRef] [Medline]
- McGillion MH, Dvirnik N, Yang S, Belley-Côté E, Lamy A, Whitlock R, et al. Continuous noninvasive remote automated blood pressure monitoring with novel wearable technology: a preliminary validation study. JMIR Mhealth Uhealth. 2022;10(2):e24916. [FREE Full text] [CrossRef] [Medline]
- Carrasco JJ, Pérez-Alenda S, Casaña J, Soria-Olivas E, Bonanad S, Querol F. Physical activity monitoring and acceptance of a commercial activity tracker in adult patients with haemophilia. Int J Environ Res Public Health. 2019;16(20):3851. [FREE Full text] [CrossRef] [Medline]
- Argent R, Slevin P, Bevilacqua A, Neligan M, Daly A, Caulfield B. Wearable sensor-based exercise biofeedback for orthopaedic rehabilitation: a mixed methods user evaluation of a prototype system. Sensors (Basel). 2019;19(2):432. [FREE Full text] [CrossRef] [Medline]
- Tsung-Yin O, Chih-Young H, Che-Wei L. A mixed-methods study of users' journey mapping experience and acceptance of telehealthcare technology in Taiwan. Telemed e-Health. 2019;25(11):1057-1070. [CrossRef] [Medline]
- Simblett SK, Biondi A, Bruno E, Ballard D, Stoneman A, Lees S, et al. Patients' experience of wearing multimodal sensor devices intended to detect epileptic seizures: a qualitative analysis. Epilepsy Behav. 2020;102:106717. [CrossRef] [Medline]
- Chevallier T, Buzancais G, Occean B, Rataboul P, Boisson C, Simon N, et al. Feasibility of remote digital monitoring using wireless Bluetooth monitors, the Smart Angel™ app and an original web platform for patients following outpatient surgery: a prospective observational pilot study. BMC Anesthesiol. 2020;20(1):259. [FREE Full text] [CrossRef] [Medline]
- Virbel-Fleischman C, Rétory Y, Hardy S, Huiban C, Corvol JC, Grabli D. Body-worn sensors for Parkinson's disease: a qualitative approach with patients and healthcare professionals. PLoS One. 2022;17(5):e0265438. [FREE Full text] [CrossRef] [Medline]
- Meritam P, Ryvlin P, Beniczky S. User-based evaluation of applicability and usability of a wearable accelerometer device for detecting bilateral tonic-clonic seizures: a field study. Epilepsia. 2018;59 Suppl 1:48-52. [CrossRef] [Medline]
- Esbjörnsson M, Ullberg T. Safety and usability of wearable accelerometers for stroke detection the STROKE ALARM PRO 1 study. J Stroke Cerebrovasc Dis. 2022;31(11):106762. [FREE Full text] [CrossRef] [Medline]
- Hua A, Johnson N, Quinton J, Chaudhary P, Buchner D, Hernandez ME. Design of a low-cost, wearable device for kinematic analysis in physical therapy settings. Methods Inf Med. 2020;59(1):41-47. [CrossRef] [Medline]
- Domingos C, Costa P, Santos NC, Pêgo JM. Usability, acceptability, and satisfaction of a wearable activity tracker in older adults: observational study in a real-life context in Northern Portugal. J Med Internet Res. 2022;24(1):e26652. [FREE Full text] [CrossRef] [Medline]
- Wing D, Godino JG, Baker FC, Yang R, Chevance G, Thompson WK, et al. Recommendations for identifying valid wear for consumer-level wrist-worn activity trackers and acceptability of extended device deployment in children. Sensors (Basel). 2022;22(23):9189. [FREE Full text] [CrossRef] [Medline]
- Cortell-Tormo JM, Garcia-Jaen M, Ruiz-Fernandez D, Fuster-Lloret V. Lumbatex: a wearable monitoring system based on inertial sensors to measure and control the lumbar spine motion. IEEE Trans Neural Syst Rehabil Eng. 2019;27(8):1644-1653. [CrossRef] [Medline]
- Keogh A, Dorn JF, Walsh L, Calvo F, Caulfield B. Comparing the usability and acceptability of wearable sensors among older Irish adults in a real-world context: observational study. JMIR Mhealth Uhealth. 2020;8(4):e15704. [FREE Full text] [CrossRef] [Medline]
- Hamilton C, Lovarini M, van den Berg M, McCluskey A, Hassett L. Usability of affordable feedback-based technologies to improve mobility and physical activity in rehabilitation: a mixed methods study. Disabil Rehabil. 2022;44(15):4029-4038. [CrossRef] [Medline]
- O'Brien J, Mason A, Cassarino M, Chan J, Setti A. Older women's experiences of a community-led walking programme using activity trackers. Int J Environ Res Public Health. 2021;18(18):9818. [FREE Full text] [CrossRef] [Medline]
- Jiménez-Fernández S, de Toledo P, del Pozo F. Usability and interoperability in wireless sensor networks for patient telemonitoring in chronic disease management. IEEE Trans Biomed Eng. 2013;60(12):3331-3339. [CrossRef] [Medline]
- Areia C, Young L, Vollam S, Ede J, Santos M, Tarassenko L, et al. Wearability testing of ambulatory vital sign monitoring devices: prospective observational cohort study. JMIR Mhealth Uhealth. 2020;8(12):e20214. [FREE Full text] [CrossRef] [Medline]
- Bruno E, Biondi A, Böttcher S, Lees S, Schulze-Bonhage A, Richardson MP, et al. RADAR-CNS Consortium. Day and night comfort and stability on the body of four wearable devices for seizure detection: a direct user-experience. Epilepsy Behav. 2020;112:107478. [CrossRef] [Medline]
- Uomoto JM, Skopp N, Jenkins-Guarnieri M, Reini J, Thomas D, Adams RJ, et al. Assessing the clinical utility of a wearable device for physiological monitoring of heart rate variability in military service members with traumatic brain injury. Telemed e-Health. 2022;28(10):1496-1504. [CrossRef] [Medline]
- Mitsopoulos K, Fiska V, Tagaras K, Papias A, Antoniou P, Nizamis K, et al. NeuroSuitUp: system architecture and validation of a motor rehabilitation wearable robotics and serious game platform. Sensors (Basel). 2023;23(6):3281. [FREE Full text] [CrossRef] [Medline]
- Grym K, Niela-Vilén H, Ekholm E, Hamari L, Azimi I, Rahmani A, et al. Feasibility of smart wristbands for continuous monitoring during pregnancy and one month after birth. BMC Pregnancy Childbirth. 2019;19(1):34. [FREE Full text] [CrossRef] [Medline]
- Pradhan S, Kelly VE. Quantifying physical activity in early Parkinson disease using a commercial activity monitor. Parkinsonism Relat Disord. 2019;66:171-175. [FREE Full text] [CrossRef] [Medline]
- Lin WY, Ke HL, Chou WC, Chang PC, Tsai TH, Lee MY. Realization and technology acceptance test of a wearable cardiac health monitoring and early warning system with multi-channel MCGs and ECG. Sensors (Basel). 2018;18(10):3538. [FREE Full text] [CrossRef] [Medline]
- Yurkiewicz IR, Simon P, Liedtke M, Dahl G, Dunn T. Effect of Fitbit and iPad wearable technology in health-related quality of life in adolescent and young adult cancer patients. J Adolesc Young Adult Oncol. 2018;7(5):579-583. [CrossRef] [Medline]
- Mackintosh KA, Chappel SE, Salmon J, Timperio A, Ball K, Brown H, et al. Parental perspectives of a wearable activity tracker for children younger than 13 years: acceptability and usability study. JMIR Mhealth Uhealth. 2019;7(11):e13858. [FREE Full text] [CrossRef] [Medline]
- Hendriks MMS, Vos-van der Hulst M, Keijsers NLW. Feasibility of a sensor-based technological platform in assessing gait and sleep of in-hospital stroke and incomplete spinal cord injury (iSCI) patients. Sensors (Basel). 2020;20(10):2748. [FREE Full text] [CrossRef] [Medline]
- Lin BS, Wong AM, Tseng KC. Community-based ECG monitoring system for patients with cardiovascular diseases. J Med Syst. 2016;40(4):80. [CrossRef] [Medline]
- Radder B, Prange-Lasonder GB, Kottink AIR, Holmberg J, Sletta K, van Dijk M, et al. Home rehabilitation supported by a wearable soft-robotic device for improving hand function in older adults: a pilot randomized controlled trial. PLoS One. 2019;14(8):e0220544. [FREE Full text] [CrossRef] [Medline]
- Bell KM, Onyeukwu C, McClincy MP, Allen M, Bechard L, Mukherjee A, et al. Verification of a portable motion tracking system for remote management of physical rehabilitation of the knee. Sensors (Basel). 2019;19(5):1021. [FREE Full text] [CrossRef] [Medline]
- Pillalamarri SS, Huyett LM, Abdel-Malek A. Novel Bluetooth-enabled tubeless insulin pump: a user experience design approach for a connected digital diabetes management platform. J Diabetes Sci Technol. 2018;12(6):1132-1142. [FREE Full text] [CrossRef] [Medline]
- Burgdorf A, Güthe I, Jovanović M, Kutafina E, Kohlschein C, Bitsch JÁ, et al. The mobile sleep lab app: an open-source framework for mobile sleep assessment based on consumer-grade wearable devices. Comput Biol Med. 2018;103:8-16. [CrossRef] [Medline]
- Del-Valle-Soto C, Valdivia LJ, López-Pimentel JC, Visconti P. Comparison of collaborative and cooperative schemes in sensor networks for non-invasive monitoring of people at home. Int J Environ Res Public Health. 2023;20(7):5268. [FREE Full text] [CrossRef] [Medline]
- Hosseini A, Buonocore CM, Hashemzadeh S, Hojaiji H, Kalantarian H, Sideris C, et al. Feasibility of a secure wireless sensing smartwatch application for the self-management of pediatric asthma. Sensors (Basel). 2017;17(8):1780. [FREE Full text] [CrossRef] [Medline]
- Lin WY, Chou WC, Tsai TH, Lin CC, Lee MY. Development of a wearable instrumented vest for posture monitoring and system usability verification based on the technology acceptance model. Sensors (Basel). 2016;16(12):2172. [FREE Full text] [CrossRef] [Medline]
- Esfahani MIM, Nussbaum MA. Preferred placement and usability of a smart textile system vs. inertial measurement units for activity monitoring. Sensors (Basel). 2018;18(8):2501. [FREE Full text] [CrossRef] [Medline]
- Stubberud A, Tronvik E, Olsen A, Gravdahl G, Linde M. Biofeedback treatment app for pediatric migraine: development and usability study. Headache. 2020;60(5):889-901. [CrossRef] [Medline]
- Chung J, Brakey HR, Reeder B, Myers O, Demiris G. Community-dwelling older adults' acceptance of smartwatches for health and location tracking. Int J Older People Nurs. 2023;18(1):e12490. [FREE Full text] [CrossRef] [Medline]
- Shelley J, Fairclough SJ, Knowles ZR, Southern KW, McCormack P, Dawson EA, et al. A formative study exploring perceptions of physical activity and physical activity monitoring among children and young people with cystic fibrosis and health care professionals. BMC Pediatr. 2018;18(1):335. [FREE Full text] [CrossRef] [Medline]
- Auerswald T, Meyer J, von Holdt K, Voelcker-Rehage C. Application of activity trackers among nursing home residents—a pilot and feasibility study on physical activity behavior, usage behavior, acceptance, usability and motivational impact. Int J Environ Res Public Health. 2020;17(18):6683. [FREE Full text] [CrossRef] [Medline]
- Schuurmans MM, Muszynski M, Li X, Marcinkevičs R, Zimmerli L, Lopez DM, et al. Multimodal remote home monitoring of lung transplant recipients during COVID-19 vaccinations: usability pilot study of the COVIDA desk incorporating wearable devices. Medicina (Kaunas). 2023;59(3):617. [FREE Full text] [CrossRef] [Medline]
- Cesareo A, Nido SA, Biffi E, Gandossini S, D'Angelo MG, Aliverti A. A wearable device for breathing frequency monitoring: a pilot study on patients with muscular dystrophy. Sensors (Basel). 2020;20(18):5346. [FREE Full text] [CrossRef] [Medline]
- Albani G, Ferraris C, Nerino R, Chimienti A, Pettiti G, Parisi F, et al. An integrated multi-sensor approach for the remote monitoring of Parkinson's disease. Sensors (Basel). 2019;19(21):4764. [FREE Full text] [CrossRef] [Medline]
- Demiris G, Chaudhuri S, Thompson HJ. Older adults' experience with a novel fall detection device. Telemed e-Health. 2016;22(9):726-732. [FREE Full text] [CrossRef] [Medline]
- Kaiserman K, Buckingham BA, Prakasam G, Gunville F, Slover RH, Wang Y, et al. Acceptability and utility of the mySentry remote glucose monitoring system. J Diabetes Sci Technol. 2013;7(2):356-361. [FREE Full text] [CrossRef] [Medline]
- Drehlich M, Naraine M, Rowe K, Lai SK, Salmon J, Brown H, et al. Using the technology acceptance model to explore adolescents' perspectives on combining technologies for physical activity promotion within an intervention: usability study. J Med Internet Res. 2020;22(3):e15552. [FREE Full text] [CrossRef] [Medline]
- Nguyen NH, Hadgraft NT, Moore MM, Rosenberg DE, Lynch C, Reeves MM, et al. A qualitative evaluation of breast cancer survivors' acceptance of and preferences for consumer wearable technology activity trackers. Support Care Cancer. 2017;25(11):3375-3384. [CrossRef] [Medline]
- Arends J, Thijs RD, Gutter T, Ungureanu C, Cluitmans P, Van Dijk J, et al. Multimodal nocturnal seizure detection in a residential care setting: a long-term prospective trial. Neurology. 2018;91(21):e2010-e2019. [FREE Full text] [CrossRef] [Medline]
- Finkelstein J, Cisse P, Jeong IC. Feasibility of interactive resistance chair in older adults with diabetes. Stud Health Technol Inform. 2015;213:61-64. [Medline]
- Thilo FJS, Hahn S, Halfens RJG, Schols JMGA. Usability of a wearable fall detection prototype from the perspective of older people—a real field testing approach. J Clin Nurs. 2019;28(1-2):310-320. [CrossRef] [Medline]
- Preusse KC, Mitzner TL, Fausset CB, Rogers WA. Older adults' acceptance of activity trackers. J Appl Gerontol. 2017;36(2):127-155. [FREE Full text] [CrossRef] [Medline]
- Hofbauer LM, Rodriguez FS. How is the usability of commercial activity monitors perceived by older adults and by researchers? A cross-sectional evaluation of community-living individuals. BMJ Open. 2022;12(11):e063135. [FREE Full text] [CrossRef] [Medline]
- Cajamarca G, Rodríguez I, Herskovic V, Campos M, Riofrío JC. StraightenUp+: monitoring of posture during daily activities for older persons using wearable sensors. Sensors (Basel). 2018;18(10):3409. [FREE Full text] [CrossRef] [Medline]
- Caswell N, Kuru K, Ansell D, Jones MJ, Watkinson BJ, Leather P, et al. Patient engagement in medical device design: refining the essential attributes of a wearable, pre-void, ultrasound alarm for nocturnal enuresis. Pharmaceut Med. 2020;34(1):39-48. [CrossRef] [Medline]
- Imbesi S, Corzani M. Multisensory cues for gait rehabilitation with smart glasses: methodology, design, and results of a preliminary pilot. Sensors (Basel). 2023;23(2):874. [FREE Full text] [CrossRef] [Medline]
- Shore L, Power V, Hartigan B, Schülein S, Graf E, de Eyto A, et al. Exoscore: a design tool to evaluate factors associated with technology acceptance of soft lower limb exosuits by older adults. Hum Factors. 2020;62(3):391-410. [CrossRef] [Medline]
- Hart P, Bierwirth R, Fulk G, Sazonov E. The design and evaluation of an activity monitoring user interface for people with stroke. 2014. Presented at: 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society; August 26-30, 2014:5908-5911; Chicago, IL. [CrossRef]
- Rupp MA, Michaelis JR, McConnell DS, Smither JA. The role of individual differences on perceptions of wearable fitness device trust, usability, and motivational impact. Appl Ergon. 2018;70:77-87. [CrossRef] [Medline]
- Drabarek D, Anh NT, Nhung NV, Hoa NB, Fox GJ, Bernays S. Implementation of medication event reminder monitors among patients diagnosed with drug susceptible tuberculosis in rural Viet Nam: a qualitative study. PLoS One. 2019;14(7):e0219891. [FREE Full text] [CrossRef] [Medline]
- Vandelanotte C, Duncan MJ, Maher CA, Schoeppe S, Rebar AL, Power DA, et al. The effectiveness of a web-based computer-tailored physical activity intervention using Fitbit activity trackers: randomized trial. J Med Internet Res. 2018;20(12):e11321. [FREE Full text] [CrossRef] [Medline]
- Matsunaga K, Ogasawara T, Kodate J, Mukaino M, Saitoh E. On-site evaluation of rehabilitation patients monitoring system using distributed wireless gateways. 2019. Presented at: 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); July 23-27, 2019:3195-3198; Berlin, Germany. [CrossRef]
- Van Velthoven MH, Oke J, Kardos A. ChroniSense national early warning score study: comparison study of a wearable wrist device to measure vital signs in patients who are hospitalized. J Med Internet Res. 2023;25:e40226. [FREE Full text] [CrossRef] [Medline]
- de'Sperati C, Dalmasso V, Moretti M, Høeg ER, Baud-Bovy G, Cozzi R, et al. Enhancing visual exploration through augmented gaze: high acceptance of immersive virtual biking by oldest olds. Int J Environ Res Public Health. 2023;20(3):1671. [FREE Full text] [CrossRef] [Medline]
- Radder B, Prange-Lasonder G, Kottink AIR, Melendez-Calderon A, Buurke JH, Rietman JS. Feasibility of a wearable soft-robotic glove to support impaired hand function in stroke patients. J Rehabil Med. 2018;50(7):598-606. [FREE Full text] [CrossRef] [Medline]
- McNamara RJ, Tsai LLY, Wootton SL, Ng LWC, Dale MT, McKeough ZJ, et al. Measurement of daily physical activity using the SenseWear armband: compliance, comfort, adverse side effects and usability. Chron Respir Dis. 2016;13(2):144-154. [FREE Full text] [CrossRef] [Medline]
- Radder B, Prange-Lasonder GB, Kottink AIR, Holmberg J, Sletta K, Van Dijk M, et al. The effect of a wearable soft-robotic glove on motor function and functional performance of older adults. Assist Technol. 2020;32(1):9-15. [CrossRef] [Medline]
- ISO 13485:2016: Medical devices - Quality management systems - Requirements for regulatory purposes. International Organization for Standardization. 2020. URL: https://www.iso.org/standard/59752.html [accessed 2024-02-22]
- Cox CL. Patient understanding: how should it be defined and assessed in clinical practice? J Eval Clin Pract. 2023;29(7):1127-1134. [CrossRef] [Medline]
- Thomas EE, Taylor ML, Banbury A, Snoswell CL, Haydon HM, Rejas VMG, et al. Factors influencing the effectiveness of remote patient monitoring interventions: a realist review. BMJ Open. 2021;11(8):e051844. [FREE Full text] [CrossRef] [Medline]
- Sharma Y, Djambazova K, Marquez C, Lyden K, Goldsack J, Bakker J. A systematic review assessing the state of analytical validation for connected, mobile, sensor-based digital health technologies. medRxiv. Preprint posted online on May 23, 2023:1-21. [CrossRef]
- National Academies of Sciences, Engineering, and Medicine, Policy and Global Affairs, Committee on Women, Bibbins-Domingo K, Helman A. Why diverse representation in clinical research matters and the current state of representation within the clinical research ecosystem. 2022. URL: https://www.ncbi.nlm.nih.gov/books/NBK584396/ [accessed 2024-01-23]
- Wronikowska MW, Malycha J, Morgan LJ, Westgate V, Petrinic T, Young JD, et al. Systematic review of applied usability metrics within usability evaluation methods for hospital electronic healthcare record systems: metrics and evaluation methods for eHealth systems. J Eval Clin Pract. 2021;27(6):1403-1416. [FREE Full text] [CrossRef] [Medline]
- Wohlgemut JM, Pisirir E, Kyrimi E, Stoner RS, Marsh W, Perkins ZB, et al. Methods used to evaluate usability of mobile clinical decision support systems for healthcare emergencies: a systematic review and qualitative synthesis. JAMIA Open. 2023;6(3):ooad051. [FREE Full text] [CrossRef] [Medline]
- ISO 9241-11:2018: Ergonomics of human-system interaction - Part 11: Usability: definitions and concepts. International Organization for Standardization. 2023. URL: https://www.iso.org/standard/63500.html [accessed 2024-01-23]
- Manta C, Mahadevan N, Bakker J, Irmak SO, Izmailova E, Park S, et al. EVIDENCE publication checklist for studies evaluating connected sensor technologies: explanation and elaboration. Digital Biomarkers. 2021;5(2):127-147. [FREE Full text] [CrossRef] [Medline]
Abbreviations
MeSH: Medical Subject Headings
PICO: patients/participants, intervention, comparator, outcomes
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
sDHT: sensor-based digital health technology
Edited by T de Azevedo Cardoso; submitted 22.02.24; peer-reviewed by S Yu, M Nissen; comments to author 11.04.24; revised version received 28.05.24; accepted 11.09.24; published 15.11.24.
Copyright©Animesh Tandon, Bryan Cobb, Jacob Centra, Elena Izmailova, Nikolay V Manyakov, Samantha McClenahan, Smit Patel, Emre Sezgin, Srinivasan Vairavan, Bernard Vrijens, Jessie P Bakker, Digital Health Measurement Collaborative Community (DATAcc) hosted by DiMe. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 15.11.2024.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.