Published in Vol 24, No 8 (2022): August

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/37368.
Usability Evaluation of a Noninvasive Neutropenia Screening Device (PointCheck) for Patients Undergoing Cancer Chemotherapy: Mixed Methods Observational Study


Original Paper

1 Leuko Labs, Inc, Boston, MA, United States

2 Hematology Department, Hospital Universitario 12 de Octubre, Madrid, Spain

3 Section of Hematology & Medical Oncology, Department of Medicine, Boston University School of Medicine and Boston Medical Center, Boston, MA, United States

Corresponding Author:

Ganimete Lamaj, BSc

Leuko Labs, Inc

8 St Marys Street

No 613

Boston, MA, 02215

United States

Phone: 1 781 954 0250

Email: ganimete@leuko.com


Background: Patients with cancer undergoing cytotoxic chemotherapy face an elevated risk of developing serious infection as a consequence of their treatment, which lowers their white blood cell count and, more specifically, their absolute neutrophil count. This condition is known as neutropenia, and neutropenia accompanied by a fever is referred to as febrile neutropenia, a common side effect of chemotherapy with a high mortality rate. The timely detection of severe neutropenia (absolute neutrophil count <500 cells/μL) is critical for detecting and managing febrile neutropenia. Current methods rely on blood draws, which limit them to clinical settings and do not allow frequent or portable monitoring. In this study, we demonstrated the usability of PointCheck, a noninvasive device for neutropenia screening, in a simulated home environment without clinical supervision. PointCheck automatically performs microscopy through the skin of the finger to image the blood flowing through superficial microcapillaries, enabling remote monitoring of neutropenia status without requiring venipuncture.

Objective: This study aimed to evaluate the usability of PointCheck, a noninvasive optical technology for screening severe neutropenia, with the goal of identifying potential user interface, functionality, and design issues from the perspective of untrained users.

Methods: We conducted a multicenter, mixed methods study to evaluate the usability of PointCheck with 154 untrained participants. Usability data were gathered through user testing observations, a short-answer qualitative questionnaire, and the standardized quantitative System Usability Scale (SUS) survey to assess perceived usability and satisfaction.

Results: Across all sites, 108 of the 154 participants (70.1%) scored above 80.8 on the SUS, and the mean SUS score was 86.1. Furthermore, of the 151 users who completed the SUS survey, 145 (96%) reported that they learned how to use PointCheck very quickly, and 141 (93.4%) felt very confident when using the device.

Conclusions: We have shown that PointCheck, a novel technology for noninvasive, home-based neutropenia detection, can be safely and effectively operated by first-time users. In a simulated home environment, these users found it easy to use, with a mean SUS score of 86.1, indicating an excellent perception of usability and placing this device within the top tenth percentile of systems evaluated for usability by the SUS.

Trial Registration: ClinicalTrials.gov NCT04448314; https://clinicaltrials.gov/ct2/show/NCT04448314 (Hospital Universitario 12 de Octubre registration) and NCT04448301; https://clinicaltrials.gov/ct2/show/NCT04448301 (Boston Medical Center registration)

J Med Internet Res 2022;24(8):e37368

doi:10.2196/37368


Background

One of the most serious side effects of cytotoxic chemotherapy and immunotherapy is neutropenia—a decrease in neutrophils, the most common type of white blood cell (WBC) and the most important cell for preventing bacterial infection. The primary clinical consequence of neutropenia is an elevated risk of life-threatening bacterial infection that typically requires immediate admission to the emergency department, hospitalization, and treatment [1-3]. Every year, approximately 850,000 patients with cancer start chemotherapy treatments in the United States [4], and 140,000 (17%) [5] will endure at least one episode of febrile neutropenia (FN), or neutropenia accompanied by a fever. FN typically requires a hospital admission of over 1 week, costing approximately US $30,000 per episode [6,7], with associated mortality rates between 7% and 10% [8]. The timely detection and awareness of severe neutropenia (ie, absolute neutrophil count <500 cells/µL) [9] can be crucial to prevent and manage FN in the outpatient setting [10,11] and the emergency department [12,13].

In the current standard of care, the risk of FN is evaluated by using a priori scores, such as the Multinational Association for Supportive Care in Cancer score [14], to indicate primary prophylaxis with granulocyte colony-stimulating factors, or by patients regularly monitoring their temperature at home and seeking emergency care when fever ensues [15]. Despite these existing methods, FN still has a substantial economic and clinical impact on cancer care. The early detection of neutropenia could be used to prevent FN by triggering an early administration of granulocyte colony-stimulating factor or antibiotics [16-19]. Unfortunately, current neutropenia-monitoring options rely on venipuncture in the clinical setting or finger-prick blood samples at the point of care [20]. These technologies either require laboratory infrastructure limited to the hospital setting or are impractical because they cannot be operated by minimally trained users to achieve accurate and reliable results [21-24]. To address this unmet need, this paper presents a usability evaluation of a novel, noninvasive technology that allows automated and frequent neutropenia monitoring by patients in the home setting with minimal training.

Assessing the usability of this kind of technology is crucial for ensuring the accuracy of the results, driving adoption, and improving patient compliance and adherence [25]. The international standard IEC 62366-1:2015 defines usability as a “characteristic of the user interface that facilitates use and thereby establishes effectiveness, efficiency and user satisfaction in the intended use environment” [26]. These metrics can be measured by gaining insight into patient perspectives regarding user performance, satisfaction, and acceptability while using an intervention [27]. For this study, the standardized System Usability Scale (SUS) survey was chosen as a quantitative method of assessing subjective usability because evidence shows that it can be used to assess any technology [28] and it has been used successfully in the medical domain to assess home medical devices [29-31].

The early detection of FN risk is essential as it can be associated with a higher chance of survival, more successful treatment, and improved quality of life. This heightens the need for such technologies to be user friendly for the majority of the patient population, as usability can affect patients’ perception of the technology and their decision to take the test [32]. Additionally, technology-based solutions such as the one presented in this paper can help strengthen the relationship and communication between patients and their doctors, support patients’ well-being, and help doctors make better and more informed decisions [33].

Study Objectives

We hypothesized that novice users would consider PointCheck (Leuko Labs), the first noninvasive optical technology for screening severe neutropenia, to be easy to use. The primary study objective was to evaluate the usability of PointCheck with the goal of identifying potential user interface (UI), functionality, and design issues from the perspective of untrained, first-time users in a simulated home environment. The primary end point for the study, defined a priori, was a group mean score above 80.8 on the standardized SUS, indicating a favorable perception of usability and a higher likelihood of adoption.


Methods

Device Description

PointCheck is the first noninvasive device (Figure 1) designed to screen for severe neutropenia in the home setting [34]. By imaging the blood flowing through the capillaries in the finger, PointCheck enables real-time remote monitoring of WBC levels based on optical imaging and without a blood draw [35,36].

Figure 1. The PointCheck device and its main components.

The device consists of an optics and illumination system, on-board computing electronics, an 8.9-cm touch screen UI, a power cord, and disposable finger cartridges (Figure 1). It uses a camera microscopy system and LEDs to image capillaries in the nailfold region of the finger—typically the nondominant 4th (ring) finger, which has been shown by previous literature to contain the most intact and visible capillaries when compared to other fingers [37]. The finger cartridge is a disposable component that is prefilled with mineral oil and allows for effective optical refractive index coupling to ensure transdermal imaging quality [38,39]. The finger cartridge is designed for 1-time use. The hardware system design resembles the methods used in standard nailfold video capillaroscopy, which is an established technique used by rheumatologists to evaluate capillary morphology and microcirculation [40].

The UI on the touch screen provides a guided walk-through to facilitate the correct use of the device. It prompts the user to warm their hands; open up a new, unused cartridge; properly place the cartridge into the device; and insert their nondominant 4th (ring) finger all the way into the cartridge while properly supporting their arm on a flat, stable surface (Figure 2). A final checklist ensures that the most critical steps have been completed and the user is able to start the 1-minute measurement. The version of the device used in this study was a beta prototype (version 4).

Figure 2. Screenshots of PointCheck’s user interface depicting the English-language walk-through tutorial for taking a measurement and operating the device via the touch screen interface. Language support was implemented to translate the instructions for Spanish- and Haitian-speaking populations.

Participants and Setting

Usability data were gathered from a cohort including both healthy volunteers and outpatients with cancer receiving chemotherapy. Patients were recruited at the Boston Medical Center and Hospital Universitario 12 de Octubre before their routine chemotherapy administration. Healthy volunteers were recruited at the Massachusetts Institute of Technology’s Center for Clinical and Translational Research via advertisements displayed on the Massachusetts Institute of Technology campus, on social media, and via email lists. The study visits took place in a simulated home environment, and testing was conducted without supervision from a medical professional. No participants had prior experience with the tested device.

A total of 154 participants (85 patients and 69 volunteers) took part in this study. According to standard usability sample size models, this sample size provides a 99% chance of detecting, at least once, any usability error with an occurrence probability of 3% [41].
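As a rough illustration of this sample size model, the probability of observing a given usability problem at least once can be computed with the standard binomial problem-discovery formula, 1 - (1 - p)^n. The minimal sketch below, written in R with illustrative values only, reproduces the figure cited above.

```r
# Minimal sketch of the binomial problem-discovery model assumed here:
# P(detect at least once) = 1 - (1 - p)^n, where p is the per-participant
# probability that a given usability error occurs and n is the sample size.
p <- 0.03   # error occurring in 3% of uses
n <- 154    # participants in this study
prob_detected <- 1 - (1 - p)^n
round(prob_detected, 3)   # approximately 0.99, ie, a ~99% chance of observing the error at least once
```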

To ensure the generalizability of the results, we included younger (aged <65 years) and older (aged ≥65 years) adults, patients with diverse cancer types (lymphoma, leukemia, and myeloma, among other tumor types), men and women, and participants with different education levels (≥8th grade or <8th grade). This allowed us to better understand the links between certain characteristics of the potential patients (ie, age, education, technophilia, and health literacy) and usability [42,43].

Ethics Approval

Institutional review board (IRB) approvals to conduct the study were obtained from the Boston Medical Center IRB (H-39964), the Hospital Universitario 12 de Octubre IRB (20/049), and the New England IRB (1290027) prior to recruitment. Participants provided written informed consent before participating in the study, in accordance with good clinical practice guidelines (ICH E6(R2)) [44].

Study Design

We used a mixed method approach to gather usability data through (1) user observation, (2) a short-answer qualitative e-questionnaire, and (3) a standardized quantitative SUS to assess perceived usability and satisfaction.

Regarding user observation, study coordinators observed participants while they used the device to document any use errors that could potentially lead to imaging failures. For example, an unsupported arm or incorrect hand placement could result in too much movement during a measurement and cause an error in the reading. Study coordinators also observed participants to identify and document any points of confusion during the walk-through steps that could be improved. All documented observations were collated into a list and manually categorized by type and frequency of occurrence (see Qualitative Results).

A subset (n=120) of the participants was given the opportunity to provide feedback and document their thoughts, feelings, and experience of using the device through an e-questionnaire containing 4 questions (Multimedia Appendix 1). We used this questionnaire to assess any potential confusion or difficulties participants may have had using the device or the UI, their attitude toward the product, and any features they would like to see added to improve user friendliness. The feedback from the questionnaires was collated into a spreadsheet and manually categorized into themes (see Qualitative Results).

Finally, the SUS survey was used as a method of assessing subjective usability. The SUS is a Likert-type questionnaire comprising 10 questions with 5 response options ranging from “strongly disagree” to “strongly agree,” allowing for a subjective assessment of usability [45]. Scores range from 0 to 100 with higher scores indicating favorable user perceptions of the device and lower scores indicating low usability. The success criterion for a favorable evaluation in the SUS was defined by a mean group score of greater than 80.8 (see Quantitative Results). This threshold was selected based on previously published cutoffs to define the promoters of a technology [46].
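For readers unfamiliar with how the 10 Likert responses map onto the 0-100 range, the sketch below illustrates the standard SUS scoring scheme (Brooke [47]); the R function and the column names (q1-q10, coded 1-5) are hypothetical and not part of the study software.

```r
# Minimal sketch of standard SUS scoring: odd-numbered (positively worded) items
# contribute (response - 1), even-numbered (negatively worded) items contribute
# (5 - response), and the summed contributions are rescaled to 0-100 (x 2.5).
sus_score <- function(responses) {
  odd  <- responses[, c("q1", "q3", "q5", "q7", "q9")] - 1
  even <- 5 - responses[, c("q2", "q4", "q6", "q8", "q10")]
  rowSums(cbind(odd, even)) * 2.5
}

# Example: strongly agreeing (5) with every positive item and strongly
# disagreeing (1) with every negative item yields the maximum score of 100.
sus_score(data.frame(q1 = 5, q2 = 1, q3 = 5, q4 = 1, q5 = 5,
                     q6 = 1, q7 = 5, q8 = 1, q9 = 5, q10 = 1))
```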

Usability Testing Procedure

Baseline assessments conducted by research staff included a brief physical examination and collection of demographic information. Study coordinators read a short script that provided the participants with information about how to use the device and emphasized that the aim of the study was to test the user friendliness of the device and not the participants’ ability to use the device correctly (Multimedia Appendix 2). In addition, the participants were provided with a 1-page guide containing device instructions before attempting to take a measurement on their own (Figure 3).

Figure 3. One-page quick start guide provided to participants before attempting to take a measurement on their own.

Participants were then asked to follow the instructions presented to them on the device screen, which guided them through the critical steps required to obtain high-quality measurements. To reproduce the conditions of unsupervised home use, the study coordinators did not intervene or answer questions related to device use. A second observer monitored and recorded a subset of visits either in person or through the Zoom teleconferencing platform (Zoom Video Communications). Observers documented participant errors, feedback, and tendencies. After completing the initial measurement, participants were immediately asked to complete the SUS and the questionnaire to capture their first impressions of the user friendliness of the system and to prevent any bias introduced by repeating measurements and becoming familiar with the measurement process. Participants then performed additional trials, each lasting about 1.5 minutes (a 1-minute measurement plus a 30-second setup and walk-through), for a total of 2 to 6 repeat measurements to evaluate device precision. These subsequent trials were not used for the perceived usability evaluation and are not reported here.

Data Analysis

Basic demographic characteristics were summarized using descriptive statistics. A final SUS score was computed in accordance with Brooke [47], and responses to the e-questionnaire were coded for their content and categorized into themes for qualitative analysis. Statistical comparisons were made between group categories stratified by age and educational level to evaluate usability differences using nonparametric techniques (Mann-Whitney U test). All quantitative data were processed using RStudio (version 1.3.1093; RStudio Team) [48].
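As an illustration of these stratified comparisons, the sketch below shows how such nonparametric tests could be run in R; the data frame and its values are synthetic placeholders, not the study dataset.

```r
# Synthetic example data: one row per participant with a computed SUS score and
# the stratification variables used in Tables 4 and 5 (values are illustrative only).
set.seed(1)
scores <- data.frame(
  sus       = pmin(100, round(rnorm(154, mean = 86, sd = 12) / 2.5) * 2.5),
  age_group = sample(c("<65", ">=65"), 154, replace = TRUE, prob = c(0.77, 0.23)),
  education = sample(c("<8th", ">=8th"), 154, replace = TRUE, prob = c(0.28, 0.72))
)

# Mann-Whitney U test (wilcox.test in base R) comparing SUS scores between strata.
wilcox.test(sus ~ age_group, data = scores)
wilcox.test(sus ~ education, data = scores)

# Descriptive summary by stratum (mean and SD of SUS scores per age group).
aggregate(sus ~ age_group, data = scores, FUN = function(x) c(mean = mean(x), sd = sd(x)))
```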


Results

User Statistics

Table 1 presents the breakdown of study participant characteristics by age, education level, gender, and race. Of the 154 participants, 118 (76.6%) were aged <65 years, and the average age was 44.8 (range 18-88) years. A majority (n=102, 66.2%) of the participants had an education level of 8th grade or higher, whereas 43 (27.9%) participants had an education level below 8th grade, and 9 (5.8%) did not provide educational level information.

Table 1. Basic demographics of participants. Educational level, race, and ethnicity data were missing for 9 (5.8%), 11 (7.1%), and 10 (6.5%) of the 154 participants, respectively.

Demographic                              Participants (N=154)
Age (years)
  Mean (SD)                              44.8 (20.5)
  Median (range)                         38.3 (18.0-88.5)
Gender, n (%)
  Male                                   67 (43.5)
  Female                                 87 (56.5)
Educational level, n (%)
  <8th grade                             43 (27.9)
  ≥8th grade                             102 (66.2)
  Missing                                9 (5.8)
Race, n (%)
  American Indian or Alaska Native       1 (0.6)
  Asian                                  28 (18.2)
  Black or African American              28 (18.2)
  More than 1 race                       5 (3.2)
  Unknown                                11 (7.1)
  White                                  81 (52.6)
Ethnicity, n (%)
  Hispanic or Latino                     49 (31.8)
  Not Hispanic or Latino                 95 (61.7)
  Unknown                                10 (6.5)

Quantitative Results

The average SUS score across all participants was 86.1. In total, 70.1% (108/154) of the participants scored above the goal of 80.8 (Tables 2 and 3), indicating that they would be early promoters and would be more likely to recommend the device to a friend [49]. When stratifying the SUS results by education level, we found that participants with an education level of 8th grade or higher scored slightly higher than those below the 8th grade level, but only by 2.5 points, a difference that was not statistically significant (P=.27; Table 4). When stratifying the SUS results by age category, we found that participants aged <65 years scored higher than participants aged ≥65 years by a margin of 3.4 points, a trend that was not statistically significant (P=.06). Both groups had a mean score above the predefined threshold of 80.8 (Table 5). When evaluating the SUS results by individual survey question, we found that 96% (145/151) of the participants who completed the survey found PointCheck easy to use and felt that they could learn to use it very quickly (Figure 4).

Table 2. Quantitative System Usability Scale (SUS) results across all participants. SUS surveys were incomplete for 3 participants, whose scores could not be computed.

SUS score (range 0-100)      Overall (N=154)
Mean (SD)                    86.1 (12.2)
Median (range)               87.5 (20.0-100)
Missing, n (%)               3 (1.9)

Table 3. Total percentage of promoters (defined as participants scoring >80.8 points on the System Usability Scale). System Usability Scale surveys were incomplete for 3 participants, whose scores could not be computed.

Cutoff used (points)         Overall (N=154), n (%)
≤80.8                        43 (27.9)
>80.8                        108 (70.1)
Missing                      3 (1.9)

Table 4. System Usability Scale (SUS) results stratified by educational level (<8th grade and ≥8th grade). The stratified results were generated using the data available from a total of 145 participants; educational level data were missing for 9 of the 154 participants (5.8%).

SUS score (range 0-100)      <8th grade (N=43)      ≥8th grade (N=102)     Overall (N=154)
Mean (SD)                    84.0 (12.7)            86.5 (12.2)            86.1 (12.2)
Median (range)               85.0 (57.5-100)        87.5 (20.0-100)        87.5 (20.0-100)

Table 5. System Usability Scale (SUS) results stratified by age category (<65 years and ≥65 years).

SUS score (range 0-100)      <65 years (N=118)      ≥65 years (N=36)       Overall (N=154)
Mean (SD)                    86.9 (12.5)            83.5 (11.0)            86.1 (12.2)
Median (range)               90.0 (20.0-100)        82.5 (57.5-100)        87.5 (20.0-100)
Figure 4. SUS survey responses assessed individually. Percentage values are calculated using available data from a total of 151 participants. SUS surveys were incomplete for 3 participants. SUS: System Usability Scale.

Qualitative Results

User Observation

Error observation notes were collated into a list and then manually categorized by type and frequency of occurrence. The primary error sources were skipping or misreading instructions and limited accessibility of the on-screen instructions. The majority (70/86, 81%) of these errors occurred only on first use and were shown to be correctable by interventional guidance. Such guidance was given after the SUS survey had been completed, and improvement was demonstrated in most cases in subsequent trials. This shows that, although the device performs well in independent use, the monitoring of first use by an experienced operator may provide further benefit by catching usability errors.

e-Questionnaire Feedback

The feedback from the questionnaires was collated into a spreadsheet and then manually categorized into the following themes: UI/user experience, aesthetic design/logical design, hand rest, cartridge, cleaning/sanitation, and software/bugs. The themes were then broken down into the following subthemes: confidence in use, training effectiveness, UI design/clarity of UI instructions, ergonomic design, foreseeable home use issues, and accessibility. Instances of feedback falling within these subthemes were counted and consolidated into 3 overarching themes: pretraining effectiveness, user friendliness of PointCheck (related to ease of use, accessibility, and clarity of UI elements), and ergonomic design.

Pretraining Effectiveness

A portion of participants initially expressed some uncertainty when using the device for the first time (Participants #38 and #18; Table 6). With repeated use, however, most participants felt that they could catch on quickly (Participant #40; Table 6).

Table 6. Illustrative quotes for the 3 overarching themes.
Theme/category and illustrative quotes
Pretraining effectiveness
  • “Would like a YouTube channel/clip to watch in advance that will explain the device.” (Participant #38)
  • “The most difficult step was probably removing the cartridge from the box. I was not sure if I had to keep the cartridge clean for measurements and the instructions did not tell how I should be holding the cartridge or if I should even be careful or not about touching it too much and getting it dirty” (Participant #18)
  • “The device was cumbersome (awkward) to use because it was the first time. After the first time, it would be easier to use.” (Participant #40)
User friendliness of PointCheck
  • “Others can’t see as well, may need others to help them if they have dementia or are forgetful.” (Participant #53)
  • “Could not read font of the three step instructions.” (Participant #65)
  • “If I were to use the device on a daily basis, I would be relatively annoyed by the fact the three repeat questions are timed lag to press yes.” (Participant #15)
  • “One thing I was confused about was checking off ‘was my finger in all the way.’ It was just a circle and I was confused what I was supposed to do on this step.” (Participant #21)
  • “The cartridge lid is shown to be peeled from the flat side (facing down) but then needs to be rotated to be put in with the flat side up. If the peel could be opened in the right orientation, that would have helped.” (Participant #24)
  • “Very crisp, and clear how to start using it...the screen has good contrast, good font choice given resolution and size, and the purple/grey color scheme is also calming.” (Participant #30)
  • “Even though I knew beforehand that I should rest my elbow, I forgot to do so after inserting my finger because that was my main focus, so I really appreciated the reminder to have my elbow rested right after the step where I inserted my ring finger.” (Participant #34)
Ergonomic design
  • “The display seems angled a little high, considering that the machine needs to be placed a considerable distance to rest my arm. The steps were fairly intuitive.” (Participant #3)
  • “I was expecting the device to be a bit smaller. I think the device is designed for a very big hand, I think it would be probably better to try to make it the size closer to a computer mouse.” (Participant #16)
  • “My fingers are pretty small and narrow but the soft spiked insides of the capture cylinder still left marks on my finger afterwards. This was completely non-painful but just noting this here for other users who might have thicker fingers! It was a bit bigger than I expected (the size of the device) but it doesn’t impact usability. It also produced quite a loud hum but again, doesn’t impact usability.” (Participant #11)
  • “I liked the brush-like texture inside the tube. It helped me feel that I had my finger in the right position.” (Participant #33)
  • “My first impression was that it looked pretty compact and that it’s very straight to the point in its features--nothing fancy, just functional.” (Participant #34)
User Friendliness of UI Design

Older participants (aged ≥65 years) discussed the need for improved screen readability, mentioning increasing the font size or needing additional assistance (Participants #53 and #65; Table 6).

There were mixed opinions on the overall design of the UI, but the majority (145/151, 96%) of the participants found the overall system easy to use and felt that they could learn it quickly. Some participants did comment on elements of the UI design, such as buttons, on-screen instructions, or color choices, that made them feel frustrated, confused, or uncertain about whether they were performing the measurement correctly (Participants #15, #21, and #24; Table 6). Other participants expressed satisfaction with the UI design (Participants #30 and #34; Table 6).

Ergonomic Design

Finally, participants also addressed the changes they wished to see in the ergonomic design to better meet the needs of end users (Participants #3, #16, and #11; Table 6). Other participants expressed satisfaction with the ergonomic design (Participants #33 and #34; Table 6).


Discussion

In this study, we aimed to evaluate the usability and design of PointCheck, a novel technology for noninvasive, home-based neutropenia detection. Through a mixed method approach of user observation, questionnaires, and the SUS survey, we validated the hypothesis that PointCheck is easy for first-time users to use in a simulated home environment, with a mean SUS score of 86.1 (Table 2). This score is classified as an A (ie, excellent; net promoter score: promoter level) [50,51] and falls within the top tenth percentile of systems evaluated with the SUS [49].

Although the majority of first-time users expressed high satisfaction with the overall design and user friendliness of PointCheck (Table 3), a number of areas for improvement were identified through user feedback and observation and will be addressed in future designs. The main changes to be implemented to enhance the usability of the device and reduce instances of error include the addition of a tutorial video and walk-through image animations as additional training methods, improved screen readability, improved button design to make buttons easily identifiable to users, and a modification of the cartridge to make it more size inclusive.

Observing the use of the device in context suggested that correct positioning during device use may be more difficult for nonambulatory patients. Ideally, patients would have a training session with their health care professional prior to bringing the device home for routine use. This training would allow patients to familiarize themselves with the device beforehand, ask any questions related to its use, and receive the support needed to ensure confidence in using the device alone for weeks at a time.

It should be noted that a majority of study participants were aged <65 years and had an educational level of ≥8th grade, both of which are factors that increase the likelihood of technological proficiency and willingness to adopt new technology [52]. Emerging technologies such as smartphones and tablets have raised concerns about ease of use in older and untrained populations [53]; however, we found that this did not affect the perceived usability of PointCheck, as there were no significant differences in SUS scores across educational levels and age categories (Tables 4 and 5). This is consistent with prior literature demonstrating that older populations are interested in and capable of using modern technologies for managing health and can learn how to use a touch screen after a few tries [53,54]. Although no participant had previous training or experience using PointCheck, all of them were able to become proficient by the end of the study visit after 1 or 2 repeat measurements guided by experienced clinicians. This indicates that training, while necessary for building intuition and confidence when using the device [55], does not need to be extensive and that the on-screen walk-through is effective in guiding users through a measurement on their own.

Although this study aimed to evaluate a variety of usability factors in a simulated home environment, a single study cannot claim to assess these factors in all use cases and situations. The perceived usability of PointCheck should be tested further in real-world home environments with users who receive prior training to identify context-related issues in the future.

Overall, this study demonstrated that PointCheck, a novel digital device for noninvasive WBC monitoring, can be easy to use for unsupervised patients in the home setting. By enabling continuous home-monitoring for severe neutropenia, PointCheck has the potential to change the standard of care for patients with cancer and substantially improve their clinical outcomes.

Acknowledgments

Funding for the study was supported through the National Cancer Institute Small Business Innovation Research program (grant R44CA228920), the National Institutes of Health (grant U54HL143541), the Massachusetts Technology Collaborative Digital Health Sandbox Program, and the Centro de Desarrollo Tecnológico e Industrial in Spain (grant SNEO-20191188). We are also grateful to Catherine Ricciardi, DNP, ANP-BC (Nurse Director and Clinical Operations); and Tatiana Urman, RN, MSN (Clinical Research Nurse Coordinator) at the Massachusetts Institute of Technology Center for Clinical and Translational Research. We extend a special thank you to all volunteers and patients who participated in this study.

Conflicts of Interest

GL, APT, IB, NB, RB, AB, ASF, and CCG are current employees and holders of stock options in a privately held company, patents, and royalties of Leuko Labs, Inc. IB and CCG have membership on the board of directors or advisory committees of Leuko Labs, Inc. JML received research funding from Roche, Novartis, Incyte, Astellas, and BMS and consulted for Janssen, BMS, Novartis, Incyte, Roche, GSK, and Pfizer. None of these items are related to this work. JMS has membership on the board of directors or advisory committees of Pharmacosmos and Astra Zeneca and received honoraria from Abbvie and Stemline. None of these items are related to this work. All other authors declared no other conflicts of interest.

Multimedia Appendix 1

Short-answer qualitative e-questionnaire.

PDF File (Adobe PDF File), 14 KB

Multimedia Appendix 2

Study coordinator training script.

PDF File (Adobe PDF File), 17 KB

  1. Lyman GH, Abella E, Pettengell R. Risk factors for febrile neutropenia among patients with cancer receiving chemotherapy: a systematic review. Crit Rev Oncol Hematol 2014 Jun;90(3):190-199. [CrossRef] [Medline]
  2. Lyman GH. Impact of chemotherapy dose intensity on cancer patient outcomes. J Natl Compr Canc Netw 2009 Jan;7(1):99-108. [CrossRef] [Medline]
  3. Isidori A, Cerchione C, Daver N, DiNardo C, Garcia-Manero G, Konopleva M, et al. Immunotherapy in acute myeloid leukemia: where we stand. Front Oncol 2021;11:656218 [FREE Full text] [CrossRef] [Medline]
  4. Wilson BE, Jacob S, Yap ML, Ferlay J, Bray F, Barton MB. Estimates of global chemotherapy demands and corresponding physician workforce requirements for 2018 and 2040: a population-based study. Lancet Oncol 2019 Jun;20(6):769-780. [CrossRef] [Medline]
  5. Tai E, Guy GP, Dunbar A, Richardson LC. Cost of cancer-related neutropenia or fever hospitalizations, United States, 2012. J Oncol Pract 2017 Jun;13(6):e552-e561 [FREE Full text] [CrossRef] [Medline]
  6. Wang W, Li E, Campbell K, McBride A, D'Amato S. Economic analysis on adoption of biosimilar granulocyte colony-stimulating factors in patients with nonmyeloid cancer at risk of febrile neutropenia within the oncology care model framework. JCO Oncol Pract 2021 Aug;17(8):e1139-e1149. [CrossRef] [Medline]
  7. Lyman GH, Poniewierski MS, Crawford J, Dale DC, Culakova E. Cost of hospitalization in patients with cancer and febrile neutropenia and impact of comorbid conditions. Blood 2015 Dec 03;126(23):2089. [CrossRef]
  8. Kuderer NM, Dale DC, Crawford J, Cosler LE, Lyman GH. Mortality, morbidity, and cost associated with febrile neutropenia in adult cancer patients. Cancer 2006 May 15;106(10):2258-2266 [FREE Full text] [CrossRef] [Medline]
  9. Trotti A, Colevas AD, Setser A, Rusch V, Jaques D, Budach V, et al. CTCAE v3.0: development of a comprehensive grading system for the adverse effects of cancer treatment. Semin Radiat Oncol 2003 Jul;13(3):176-181. [CrossRef] [Medline]
  10. Lyman GH, Lyman CH, Agboola O. Risk models for predicting chemotherapy-induced neutropenia. Oncologist 2005;10(6):427-437 [FREE Full text] [CrossRef] [Medline]
  11. de Naurois J, Novitzky-Basso I, Gill MJ, Marti FM, Cullen MH, Roila F, ESMO Guidelines Working Group. Management of febrile neutropenia: ESMO Clinical Practice Guidelines. Ann Oncol 2010 May;21 Suppl 5:v252-v256 [FREE Full text] [CrossRef] [Medline]
  12. Kyriacou DN, Jovanovic B, Frankfurt O. Timing of initial antibiotic treatment for febrile neutropenia in the emergency department: the need for evidence-based guidelines. J Natl Compr Canc Netw 2014 Nov;12(11):1569-1573. [CrossRef] [Medline]
  13. Seltzer JA, Frankfurt O, Kyriacou DN. Association of an emergency department febrile neutropenia intervention protocol with time to initial antibiotic treatment. Acad Emerg Med 2022 Jan;29(1):73-82. [CrossRef] [Medline]
  14. Wijeratne DT, Wright K, Gyawali B. Risk-stratifying treatment strategies for febrile neutropenia-tools, tools everywhere, and not a single one that works? JCO Oncol Pract 2021 Nov;17(11):651-654. [CrossRef] [Medline]
  15. Zimmer AJ, Freifeld AG. Optimal management of neutropenic fever in patients with cancer. J Oncol Pract 2019 Jan;15(1):19-24. [CrossRef] [Medline]
  16. Meza L, Baselga J, Holmes F, Liang B, Breddy J, Pegfilgrastim Study Group. Incidence of febrile neutropenia (FN) is directly related to duration of severe neutropenia (DSN) after myelosuppressive chemotherapy. Proc Am Soc Clin Oncol 2002;21:255b.
  17. Hartmann LC, Tschetter LK, Habermann TM, Ebbert LP, Johnson PS, Mailliard JA, et al. Granulocyte colony-stimulating factor in severe chemotherapy-induced afebrile neutropenia. N Engl J Med 1997 Jun 19;336(25):1776-1780. [CrossRef] [Medline]
  18. Gafter-Gvili A, Fraser A, Paul M, Vidal L, Lawrie TA, van de Wetering MD, et al. Antibiotic prophylaxis for bacterial infections in afebrile neutropenic patients following chemotherapy. Cochrane Database Syst Rev 2012 Jan 18;1:CD004386 [FREE Full text] [CrossRef] [Medline]
  19. Egan G, Robinson PD, Martinez JPD, Alexander S, Ammann RA, Dupuis LL, et al. Efficacy of antibiotic prophylaxis in patients with cancer and hematopoietic stem cell transplantation recipients: a systematic review of randomized trials. Cancer Med 2019 Aug;8(10):4536-4546 [FREE Full text] [CrossRef] [Medline]
  20. Bachar N, Benbassat D, Brailovsky D, Eshel Y, Glück D, Levner D, et al. An artificial intelligence-assisted diagnostic platform for rapid near-patient hematology. Am J Hematol 2021 Oct 01;96(10):1264-1274. [CrossRef] [Medline]
  21. Bond MM, Richards-Kortum RR. Drop-to-drop variation in the cellular components of fingerprick blood: implications for point-of-care diagnostic development. Am J Clin Pathol 2015 Dec;144(6):885-894. [CrossRef] [Medline]
  22. Yang ZW, Yang SH, Chen L, Qu J, Zhu J, Tang Z. Comparison of blood counts in venous, fingertip and arterial blood and their measurement variation. Clin Lab Haematol 2001 Jun;23(3):155-159. [CrossRef] [Medline]
  23. Daae LN, Halvorsen S, Mathisen PM, Mironska K. A comparison between haematological parameters in 'capillary' and venous blood from healthy adults. Scand J Clin Lab Invest 1988 Nov;48(7):723-726. [CrossRef] [Medline]
  24. Hollis VS, Holloway JA, Harris S, Spencer D, van Berkel C, Morgan H. Comparison of venous and capillary differential leukocyte counts using a standard hematology analyzer and a novel microfluidic impedance cytometer. PLoS One 2012;7(9):e43702 [FREE Full text] [CrossRef] [Medline]
  25. Holden RJ, Karsh B. The technology acceptance model: its past and its future in health care. J Biomed Inform 2010 Feb;43(1):159-172 [FREE Full text] [CrossRef] [Medline]
  26. IEC 62366-1:2015(en) medical devices — part 1: application of usability engineering to medical devices. International Electrotechnical Commission. 2015.   URL: https://www.iso.org/obp/ui/#iso:std:iec:62366:-1:ed-1:v1:en [accessed 2022-05-31]
  27. DE Bleser L, DE Geest S, Vincke B, Ruppar T, Vanhaecke J, Dobbels F. How to test electronic adherence monitoring devices for use in daily life: a conceptual framework. Comput Inform Nurs 2011 Sep;29(9):489-495. [CrossRef] [Medline]
  28. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. Int J Hum Comput Interact 2008 Jul 30;24(6):574-594. [CrossRef]
  29. Kortum P, Peres SC. Evaluation of home health care devices: remote usability assessment. JMIR Hum Factors 2015 Jun 05;2(1):e10 [FREE Full text] [CrossRef] [Medline]
  30. Chamberlain JJ, Gilgen E. Do perceptions of insulin pump usability impact attitudes toward insulin pump therapy? a pilot study of individuals with type 1 and insulin-treated type 2 diabetes. J Diabetes Sci Technol 2015 Jan;9(1):105-110 [FREE Full text] [CrossRef] [Medline]
  31. Arnet I, Rothen JP, Hersberger KE. Validation of a novel electronic device for medication adherence monitoring of ambulatory patients. Pharmacy (Basel) 2019 Nov 20;7(4):155 [FREE Full text] [CrossRef] [Medline]
  32. Elkefi S, Choudhury A, Strachna O, Asan O. Impact of health perception and knowledge on genetic testing decisions using the health belief model. JCO Clin Cancer Inform 2022 Jan;6:e2100117 [FREE Full text] [CrossRef] [Medline]
  33. ElKefi S, Asan O. How technology impacts communication between cancer patients and their health care providers: a systematic literature review. Int J Med Inform 2021 May;149:104430 [FREE Full text] [CrossRef] [Medline]
  34. Pablo-Trinidad A, Butterworth I, Ledesma-Carbayo MJ, Vettenburg T, Sánchez-Ferro Á, Soenksen L, et al. Automated detection of neutropenia using noninvasive video microscopy of superficial capillaries. Am J Hematol 2019 Aug;94(8):E219-E222 [FREE Full text] [CrossRef] [Medline]
  35. Bourquard A, Butterworth I, Sanchez-Ferro A, Giancardo L, Soenksen L, Cerrato C, et al. Analysis of white blood cell dynamics in nailfold capillaries. Annu Int Conf IEEE Eng Med Biol Soc 2015;2015:7470-7473 [FREE Full text] [CrossRef] [Medline]
  36. Bourquard A, Pablo-Trinidad A, Butterworth I, Sánchez-Ferro Á, Cerrato C, Humala K, et al. Non-invasive detection of severe neutropenia in chemotherapy patients by optical imaging of nailfold microcirculation. Sci Rep 2018 Mar 28;8(1):5301 [FREE Full text] [CrossRef] [Medline]
  37. Dinsdale G, Roberts C, Moore T, Manning J, Berks M, Allen J, et al. Nailfold capillaroscopy-how many fingers should be examined to detect abnormality? Rheumatology (Oxford) 2019 Feb 01;58(2):284-288. [CrossRef] [Medline]
  38. Smith V, Thevissen K, Trombetta AC, Pizzorni C, Ruaro B, Piette Y, EULAR Study Group on Microcirculation in Rheumatic Diseases. Nailfold capillaroscopy and clinical applications in systemic sclerosis. Microcirculation 2016 Jul;23(5):364-372. [CrossRef] [Medline]
  39. McKay GN, Mohan N, Butterworth I, Bourquard A, Sánchez-Ferro Á, Castro-González C, et al. Visualization of blood cell contrast in nailfold capillaries with high-speed reverse lens mobile phone microscopy. Biomed Opt Express 2020 Apr 01;11(4):2268-2276 [FREE Full text] [CrossRef] [Medline]
  40. Chojnowski MM, Felis-Giemza A, Olesińska M. Capillaroscopy - a role in modern rheumatology. Reumatologia 2016;54(2):67-72 [FREE Full text] [CrossRef] [Medline]
  41. Sauro J, Lewis J. Chapter 4 - did we meet or exceed our goal? In: Sauro J, Lewis JR, editors. Quantifying the User Experience. 2nd ed. Boston, MA: Morgan Kaufmann; 2016:39-60.
  42. Toonders SAJ, van Westrienen PE, Konings S, Nieboer ME, Veenhof C, Pisters MF. Patients' perspectives on the usability of a blended approach to an integrated intervention for patients with medically unexplained physical symptoms: mixed methods study. J Med Internet Res 2021 Sep 28;23(9):e19794 [FREE Full text] [CrossRef] [Medline]
  43. Chaniaud N, Megalakaki O, Capo S, Loup-Escande E. Effects of user characteristics on the usability of a home-connected medical device (Smart Angel) for ambulatory monitoring: usability study. JMIR Hum Factors 2021 Mar 17;8(1):e24846 [FREE Full text] [CrossRef] [Medline]
  44. E6(R2) Good Clinical Practice: Integrated Addendum to ICH E6(R1) Guidance for Industry. United States Food and Drug Administration. 2018.   URL: https:/​/www.​fda.gov/​files/​drugs/​published/​E6%28R2%29-Good-Clinical-Practice--Integrated-Addendum-to-ICH-E6%28R1%29.​pdf [accessed 2022-07-29]
  45. Lewis J, Sauro J. The Factor Structure of the System Usability Scale. In: Lecture Notes in Computer Science. 2009 Jul Presented at: HCD 2009: Human Centered Design; July 19-24, 2009; San Diego, CA p. 94-103. [CrossRef]
  46. Sauro J, Lewis JR. Chapter 8 - standardized usability questionnaires. In: Sauro J, Lewis JR, editors. Quantifying the User Experience. 2nd ed. Boston, MA: Morgan Kaufmann; 2016:185-248.
  47. Brooke J. SUS: a 'quick and dirty' usability scale. In: Usability Evaluation In Industry. London, UK: Taylor and Francis; 1996:1-22.
  48. RStudio Team. RStudio: integrated development environment for R. RStudio. 2021.   URL: http://www.rstudio.com/ [accessed 2022-05-31]
  49. Sauro J. 5 ways to interpret a SUS score. Measuring U. 2018 Sep 19.   URL: https://measuringu.com/interpret-sus-score/ [accessed 2022-05-31]
  50. Lewis JR, Sauro J. Item benchmarks for the system usability scale. J Usability Stud 2018 May 01;13(3):158-167 [FREE Full text] [CrossRef]
  51. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: adding an adjective rating scale. J Usability Stud 2009 May 01;4(3):114-123 [FREE Full text] [CrossRef]
  52. Older Americans and the Internet. Pew Research Center. 2004 Mar 28.   URL: https://www.pewresearch.org/internet/2004/03/28/older-americans-and-the-internet/ [accessed 2022-05-23]
  53. Zapata BC, Fernández-Alemán JL, Idri A, Toval A. Empirical studies on usability of mHealth apps: a systematic literature review. J Med Syst 2015 Feb;39(2):1. [CrossRef] [Medline]
  54. Yang HH, Yu C, Huang CH, Kuo LH, Yang HJ. Teaching information technology for elder participation: a qualitative analysis of Taiwan retirees. WSEAS Transactions on Information Science and Applications 2010 Sep 01;7(9):1190-1199 [FREE Full text] [CrossRef]
  55. Keller SC, Gurses AP, Werner N, Hohl D, Hughes A, Leff B, et al. Older adults and management of medical devices in the home: five requirements for appropriate use. Popul Health Manag 2017 Aug;20(4):278-286 [FREE Full text] [CrossRef] [Medline]


FN: febrile neutropenia
IRB: institutional review board
SUS: System Usability Scale
UI: user interface
WBC: white blood cell


Edited by G Eysenbach; submitted 18.02.22; peer-reviewed by N Chaniaud, S El kefi; comments to author 11.04.22; revised version received 03.06.22; accepted 18.07.22; published 09.08.22

Copyright

©Ganimete Lamaj, Alberto Pablo-Trinidad, Ian Butterworth, Nolan Bell, Ryan Benasutti, Aurelien Bourquard, Alvaro Sanchez-Ferro, Carlos Castro-Gonzalez, Ana Jiménez-Ubieto, Tycho Baumann, Antonia Rodriguez-Izquierdo, Elizabeth Pottier, Anthony Shelton, Joaquin Martinez-Lopez, John Mark Sloan. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 09.08.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.