Published on 29.03.2021 in Vol 23, No 3 (2021): March

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/22099.
Design Guidelines of a Computer-Based Intervention for Computer Vision Syndrome: Focus Group Study and Real-World Deployment

Original Paper

1Human Computer Interaction and Design Lab, Seoul National University, Seoul, Republic of Korea

2Seoul National University, Seoul, Republic of Korea

Corresponding Author:

Joonhwan Lee, PhD

Human Computer Interaction and Design Lab

Seoul National University

64-406, Gwanak ro 1, Gwanak gu

Seoul, 08826

Republic of Korea

Phone: 82 02 880 6450

Email: joonhwan@snu.ac.kr


Background: Prolonged computer use increases the prevalence of ocular problems, including eye strain, tired eyes, irritation, redness, blurred vision, and double vision, which are collectively referred to as computer vision syndrome (CVS). Approximately 70% of computer users have vision-related problems. For these reasons, properly designed interventions for users with CVS are required. To design an effective screen intervention for preventing or alleviating CVS, we must understand the effective interfaces of computer-based interventions.

Objective: In this study, we aimed to explore the interface elements of computer-based interventions for CVS to set design guidelines based on the pros and cons of each interface element.

Methods: We conducted an iterative user study to achieve our research objective. First, we conducted a workshop to evaluate the overall interface elements that were included in previous systems for CVS (n=7). Through the workshop, participants evaluated existing interface elements. Based on the evaluation results, we eliminated the elements that negatively affect intervention outcomes. Second, we designed our prototype system LiquidEye that includes multiple interface options (n=11). Interface options included interface elements that were positively evaluated in the workshop study. Lastly, we deployed LiquidEye in the real world to see how the included elements affected the intervention outcomes. Participants used LiquidEye for 14 days, and during this period, we collected participants’ daily logs (n=680). Additionally, we conducted prestudy and poststudy surveys, and poststudy interviews to explore how each interface element affects participation in the system.

Results: User data logs collected from the 14 days of deployment were analyzed with multiple regression analysis to explore the interface elements affecting user participation in the intervention (LiquidEye). Statistically significant elements were the instruction page of the eye resting strategy (P=.01), goal setting of the resting period (P=.009), compliment feedback after completing resting (P<.001), a mid-size popup window (P=.02), and CVS symptom-like effects (P=.004).

Conclusions: Based on the study results, we suggested design implications to consider when designing computer-based interventions for CVS. A carefully designed customization interface can enable users to interact with the system more actively, which can result in higher engagement in managing their eye condition. Important technical challenges still need to be addressed, but because this study clarified various factors related to computer-based interventions, the findings are expected to contribute substantially to the design of future computer-based interventions.

J Med Internet Res 2021;23(3):e22099

doi:10.2196/22099




Background

As computer technologies advance rapidly, an increasing number of people spend their time in front of computer screens and mobile phones. According to the current population survey, 89% of US households have a computer (including smartphones) and 81% have a broadband internet subscription [1]. Computer use has also increased in the workplace: it is estimated that more than 75% of all jobs involve computer use [2]. Before personal computers revolutionized the workplace, office work involved a range of activities, including typing, filing, reading, and writing. Each activity varied sufficiently in its demands on posture and vision, providing a natural “break” from the previous activity.

However, the introduction of personal computers has combined these tasks to the point where most can be performed without moving from the desktop, thereby improving quality, production, and efficiency, but also increasing computer-related health issues [2].

Computer vision syndrome (CVS) is one of the typical computer-related health issues [3-6]. Approximately 70% of computer users have CVS-related problems. The American Optometric Association defines CVS as the combination of eye and vision problems associated with the use of computers. The ocular complaints made by computer users typically include eye strain, eye fatigue, burning sensation, irritation, redness, blurred vision, and dry eyes, among others. The condition of a person experiencing one or more of these ocular complaints as a result of operating a computer and looking at a computer monitor is generally referred to as CVS. Symptoms of CVS also include extraocular symptoms, such as neck pain, back pain, and shoulder pain [7,8]. All these symptoms negatively affect the performance of everyday tasks, such as reading, driving, and computer use, which lowers quality of life [9].

Designing an appropriate intervention involving eye rest is one of the technology-based solutions to reduce the prevalence of CVS [10-13]. Among the various forms of interventions, a computer-based intervention is an appropriate form of intervention for CVS [9]. A computer-based intervention offers a great variety of options for assessing individuals, creating and delivering customized health messages, and providing individuals with the methods necessary to maintain or change their health-related behaviors [14]. Maximizing benefits and minimizing costs are important when designing health interventions, including digital health interventions such as computer-based interventions [15-19]. Inadequate design is one of the reasons for increasing costs in the process of modifying and re-evaluating interventions [19,20]. Therefore, a design study for effective computer-based interventions should be performed before introducing a prototype to users [21].

Objectives

In this study, we aimed to explore interface elements that affect participation in a computer-based intervention supporting the eye resting behavior of users with CVS. To this end, we first investigated effective interface elements in existing computer-based interventions for users with CVS through a focus group study. Second, to further investigate the effectiveness of interface elements, we conducted a deployment study with our prototype LiquidEye, which offers multiple interface options. These options included the interface elements that were evaluated higher than average in the focus group study. To examine how the included elements affect user participation in eye resting behavior, we deployed LiquidEye in the real world with 12 participants.


Study Procedure

This study included a focus group study and a deployment study. To research existing computer-based interventions for vision protection, screening was conducted before the focus group. In this phase, researchers screened and listed the interface elements from existing systems. In the focus group consisting of an evaluation session and a redesign session, participants with CVS (n=7) evaluated each element by discussing its pros and cons. The evaluation session was conducted with a focus group interview. In the redesign session, participants discussed additional interface elements that could affect system participation and all participants evaluated the elements. With interface elements rated higher than the average, we developed LiquidEye and conducted a deployment study with 12 participants.

Screening Existing Systems

During the screening phase, researchers aimed to list feasible interface elements from existing systems. Our system selection criteria were designed through a four-step procedure. In step 1, we collected all previously studied systems regarding CVS in the human-computer interaction community or other related fields, such as EyeGuardian [22], EyePhone [23], DualBlink [9], LiDAR [5], BlinkBlink [24], and EyeProtector [25]. In step 2, apps from the app store or web-based interventions were collected, such as ProtectYourVision, RestOnTime, and EyeBreak. In step 3, all systems with intervening effects were collected and preanalyzed by the researchers. Finally, in step 4, three systems were selected to include as many elements as possible and minimize overlap between systems. Step 4 was conducted because showing too many systems in one place could confuse participants. In the end, three systems were included in the focus group session: a prior academic prototype (Eye Protector [25]) and two obtained from a commercial app store (Protect Your Vision [26] and Rest on Time [27]).

Phase 1: Focus Group Study

To evaluate and discuss interface elements in existing computer-based interventions for CVS, we conducted a focus group discussion. We recruited seven participants (three male and four female participants; P1-P7) aged from 21 to 37 years (Table 1). All participants reported that they had frequently experienced CVS symptoms, such as blurred vision, dry eyes, eye strain, headache, neck pain, and back pain [4,7]. Recruited participants experienced at least three of these symptoms. Additionally, they were using a computer for at least 3 hours a day. We recruited participants through the university’s online community and clinical recruiting sites. Each participant was given a US $30 voucher after completing the final session. The focus group discussion consisted of two major sessions (evaluation session and redesign session).

Table 1. Information of the participants in the focus group study.
Participant number | Age (years) | Gender | Average computer use per day | Related symptoms
P1 | 29 | Female | ≥4 h | Blurred vision, dry eyes, eye irritation, and neck and back pain
P2 | 28 | Male | ≥4 h | Blurred vision, dry eyes, headache, and neck and back pain
P3 | 24 | Female | ≥4 h | Blurred vision, double vision, dry eyes, eye irritation, headache, and neck and back pain
P4 | 22 | Female | ≥2 h | Blurred vision, dry eyes, eye irritation, headache, and neck and back pain
P5 | 26 | Male | ≥3 h | Blurred vision, dry eyes, eye irritation, and neck and back pain
P6 | 21 | Female | ≥4 h | Blurred vision, dry eyes, and neck and back pain
P7 | 37 | Male | ≥4 h | Blurred vision, double vision, and dry eyes

Evaluation Session

In this session, participants were asked to evaluate the interface elements in existing systems with other participants. We introduced the three intervention systems for CVS. For each system, the included interface elements and their functions were described to participants in detail with a simulation. Thereafter, participants discussed each interface element in detail and mainly discussed its acceptability, which is an important consideration for health technologies and interventions [28]. Acceptable interventions make users more likely to engage and adhere to the system. Participants evaluated the acceptability of each interface element on a 7-point Likert scale.

Redesign Session

The objective of the redesign session was to explore additional interface elements that were not included in previous systems. Participants were instructed to draw their ideal intervention system on a paper. They were told that they could take some of the factors they evaluated in the previous session or add new ones if needed. This enabled us to further discover and evaluate new important elements that could not be considered in the previous session. The participants drew what they thought was a desirable system on a given blank sheet of paper. In this session, participants were also instructed to focus on the acceptability of the system. To consider as many factors as possible, a researcher did not give participants a preannounced time and waited until all participants had finished their drawings. It took a total of 20 minutes. Thereafter, the participants explained their desirable system and interface elements to other participants and two of the authors (YJ and DH). Newly suggested elements were listed by the authors, and participants evaluated these elements as they did in the evaluation session.

Phase 2: Deployment Study

LiquidEye: Computer-Based Intervention for Users With CVS

LiquidEye is a computer-based intervention system that helps users with CVS obtain an adequate amount of eye rest. It supports users’ eye resting behavior by providing an intervening screen with a black/white full-screen window. The goal of LiquidEye is to minimize existing vision-related symptoms and prevent new ones from occurring. To this end, LiquidEye attempts to manage a user’s prolonged computer use, which is one of the critical causes of CVS [7].

LiquidEye consists of multiple interface options. Interface options include interface elements that were evaluated higher than the average in the focus group study. As shown in Figure 1, users can select these options on the settings menu or can adjust the degree of each interface element (frequency, size, etc). Based on the user’s settings, LiquidEye intervenes in prolonged computer use at the scheduled time with selected interface elements.
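To illustrate the customization described above, the options can be represented as a small settings model. The following Swift sketch is illustrative only; the type names, property names, and default values are assumptions and do not reproduce the actual LiquidEye source.

```swift
import Foundation

// Illustrative sketch of LiquidEye-style customizable settings (names are assumed).
enum NotificationWindow {
    case fullScreen, midSize, smallPopup, noPopup
}

enum GoalSetting {
    case defaultRule                                        // 20/20/20 rule as the predefined goal
    case adjusted(intervalMinutes: Int, restSeconds: Int)   // user-adjusted frequency and duration
}

struct InterventionSettings {
    var showInstructionPage = true          // education: why and how to rest the eyes
    var showHealthInformation = false       // information delivery: CVS-related tips
    var goal: GoalSetting = .defaultRule    // goal setting: resting frequency and duration
    var showParticipationReport = true      // monitoring: daily accomplishment report
    var complimentFeedback = true           // feedback shown after a completed rest
    var notificationWindow: NotificationWindow = .midSize
    var soundAlarm = false                  // medium: optional sound when the rest ends
    var characterAgent: String? = nil       // aesthetics: "expert", "robot", or nil for none
    var symptomLikeEffect = true            // aesthetics: CVS symptom-like visual cue
}
```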

Figure 2 shows example scenarios of LiquidEye. At the scheduled time, a notification interface pops up and asks the user to participate in eye rest (Figure 2A). Depending on the user’s customized settings, this element can accompany symptom-like visual effects. Users can choose among “start,” “5 min later,” and “skip.” If the user clicks “start,” LiquidEye records the user’s behavior as “1 (participated),” and if the user clicks “skip,” it records the user’s behavior as “0 (did not participate).” When the user clicks “5 min later,” the notification window interface pops up 5 minutes later. If the user turns off the notification window option, the eye resting scenario starts without a notification interface. Before starting eye rest, LiquidEye shows an instruction page that explains the need for eye rest and how to use LiquidEye (Figure 2B) and provides health information related to CVS (eg, less eye blinking can cause CVS, foods with beta-carotene can help improve eye conditions, etc) (Figure 2C). The instruction page and health information are optional depending on the user’s customized settings. During eye rest, the word “break” appears and the word “look away” appears next and remains on the screen (Figure 2D). There is a timer in the middle that shows the remaining time for the user’s eye resting behavior. Characters could be presented on the screen depending on the option settings. After the resting time, the user can receive a sound-based alarm (optional). If the user quits the LiquidEye window (button on the top right) before the eye resting time is over, it records the user’s behavior as “0 (did not participate).” If the user finishes eye resting without quitting, LiquidEye shows the user’s eye resting accomplishment report for the day (Figure 2E). A rotated feedback message (compliment) can be sent to the user depending on the settings (Figure 2F).
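The prompt-and-rest flow in this scenario, including how participation is recorded as 1 or 0, can be sketched as a simple handler. The Swift below is a hedged illustration of the flow as described in the text; the names are assumptions, and the real implementation may differ.

```swift
import Foundation

// Minimal sketch of the prompt-and-rest flow (assumed names, not the actual LiquidEye code).
enum PromptResponse {
    case start, fiveMinutesLater, skip
}

final class RestSessionController {
    private(set) var participationLog: [(date: Date, participated: Int)] = []

    func handle(_ response: PromptResponse) {
        switch response {
        case .start:
            participationLog.append((Date(), 1))   // recorded as "1 (participated)"
            beginRest()
        case .skip:
            participationLog.append((Date(), 0))   // recorded as "0 (did not participate)"
        case .fiveMinutesLater:
            schedulePrompt(after: 5 * 60)          // show the notification again in 5 minutes
        }
    }

    func userQuitBeforeRestEnded() {
        // Quitting the rest window early rewrites the entry as "did not participate".
        if !participationLog.isEmpty {
            participationLog[participationLog.count - 1].participated = 0
        }
    }

    private func beginRest() {
        // Optional instruction page (Figure 2B) and health information (Figure 2C) precede
        // the rest screen; the full-screen view then shows "break"/"look away" with a
        // countdown timer (Figure 2D) before the accomplishment report and feedback.
    }

    private func schedulePrompt(after seconds: TimeInterval) {
        Timer.scheduledTimer(withTimeInterval: seconds, repeats: false) { _ in
            // present the notification window again
        }
    }
}
```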

Figure 1. Settings menu of LiquidEye. In the settings menu, users can customize their interface options for LiquidEye.
Figure 2. Example scenarios of LiquidEye. The interface elements (A-F) in the example scenarios are customizable in the settings menu.

LiquidEye was implemented on macOS and developed with Swift in the Xcode 10.3 IDE. After the development of LiquidEye, we conducted a 14-day real-world study with 12 recruited participants.

Participants and Procedure

A total of 12 participants (seven male and five female participants; A1-A12) aged from 22 to 40 years with CVS were recruited (Table 2). They all had at least three CVS-related symptoms and used a computer for at least 3 hours a day. Participants were recruited through an online community and did not overlap with participants in the focus group study. Participants used LiquidEye for 2 weeks. Before they started to use LiquidEye, they visited our lab for a prestudy interview. Additionally, we helped participants download LiquidEye on their personal computers and checked that it worked properly on each device. To encourage participants to use as many elements as possible, we instructed them to change the settings at least twice a day while using the system during their everyday tasks. Moreover, each time they changed the settings, they were asked to enter feedback on the previously used interface elements on an automatically appearing feedback page. Participants also visited the authors after the 14 days of participation for a postuse interview.
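A per-prompt log of this kind can be represented as a small record that is later exported for analysis. The Swift sketch below is illustrative; the field names and comma-separated export format are assumptions, as the exact logging format is not described here.

```swift
import Foundation

// Illustrative sketch of a per-prompt log record of the kind analyzed later
// (field names are assumed; the real logging format is not described in the paper).
struct PromptLogRecord {
    let participantID: String        // e.g., "A3"
    let timestamp: Date
    let activeElements: Set<String>  // interface elements enabled when the prompt appeared
    let participated: Bool           // true = completed the rest, false = skipped or quit early
    let feedbackText: String?        // free-text feedback entered after a settings change
}

// One indicator column per interface element makes the exported rows directly usable
// as independent variables in the regression analysis described in the next section.
func csvRow(for record: PromptLogRecord, elementOrder: [String]) -> String {
    let indicators = elementOrder.map { record.activeElements.contains($0) ? "1" : "0" }
    let fields = [record.participantID,
                  ISO8601DateFormatter().string(from: record.timestamp),
                  record.participated ? "1" : "0"] + indicators
    return fields.joined(separator: ",")
}
```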

Table 2. Information of the participants in the 14-day deployment study.
Participant number | Age (years) | Gender | Average computer use per day | Number of reported CVSa symptoms | Occupation
A1 | 28 | Female | ≥4 h | 4 | Student
A2 | 31 | Male | ≥7 h | 4 | Data scientist
A3 | 22 | Female | ≥5 h | 4 | Student
A4 | 24 | Female | ≥3 h | 3 | Student
A5 | 23 | Male | ≥3 h | 3 | Office worker
A6 | 28 | Female | ≥5 h | 3 | Student
A7 | 28 | Male | ≥3 h | 3 | Data scientist
A8 | 24 | Male | ≥4 h | 4 | Student
A9 | 30 | Male | ≥7 h | 3 | Programmer
A10 | 33 | Male | ≥5 h | 3 | Office worker
A11 | 24 | Female | ≥4 h | 3 | Student
A12 | 40 | Male | ≥3 h | 4 | Office worker

aCVS: computer vision syndrome.

Thematic Analysis of Qualitative Data

Qualitative data that were collected through semistructured interviews and users’ real-time feedback during LiquidEye use were integrated and analyzed. With these data, we conducted a thematic analysis to increase understanding. To build up themes from the data, we referred to the seven-element constructs of the Theoretical Framework for Acceptability developed by Sekhon et al [28]. These constructs consist of (1) affective attitude (how an individual feels about the intervention), (2) burden (perceived amount of effort that is required to participate in the intervention), (3) ethicality (extent to which the intervention has a good fit with an individual’s value system), (4) intervention coherence (extent to which the participant understands the intervention and how it works), (5) opportunity cost (extent to which benefits, profits, or values must be given up to engage in the intervention), (6) perceived effectiveness (extent to which the intervention is perceived as likely to achieve its purpose), and (7) self-efficacy (participants’ confidence that they can perform the behaviors required to participate in the intervention).

Statistical Analysis of Participation Depending on Interface Elements

With the user data logs collected during LiquidEye use, we conducted quantitative analysis. To investigate if each interface element significantly affects user participation in eye resting behavior suggested by the LiquidEye system (1: participated, 0: did not participate), multiple regression analysis was conducted. Interface elements were analyzed as independent variables, and participation was analyzed as a dependent variable. We conducted statistical analyses using R software (R Foundation for Statistical Computing).
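Because participation is recorded as a binary outcome (1: participated, 0: did not participate) and the Results report Z values, one plausible way to write the model is as a generalized linear (logistic) regression of participation on indicators for the interface elements. The form below is a sketch of this general specification rather than a reproduction of the exact model fitted in R:

\[
\operatorname{logit}\bigl(\Pr(y_i = 1)\bigr) = \beta_0 + \sum_{k=1}^{14} \beta_k x_{ik}
\]

where \(y_i\) indicates whether the i-th prompted eye rest was completed, \(x_{ik}\) indicates whether interface element k was active for that prompt, and each \(\beta_k\) corresponds to an estimate reported in Table 4.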


Results From Phase 1 (Focus Group Study)

Table 3 shows the list of interface elements evaluated in the focus group study and their ratings, including the interface elements newly suggested in the redesign session during the focus group discussion. We classified these elements into subthemes and themes. During the classification process, we referred to the behavioral intervention technology model, which integrates a conceptual framework with a technological framework [29].

Table 3. List and scores of interface elements that resulted from the focus group study.
Theme | Subthemes | Interface elements (score)a | Related systems
Behavior change strategies | Education | Instruction (4.0) | Rest on Time
Behavior change strategies | Goal setting | Goal selection (5.7) | Protect Your Vision
Behavior change strategies | Monitoring | Participation (4.8) | Rest on Time
Behavior change strategies | Feedback | Descriptive message (4.0), comparative message (+) (4.2), evaluative message (+) (4.2), compliment message (4.8) | Rest on Time, Protect Your Vision
Behavior change strategies | Reward | Monetary reward (+) (5.0), score reward (+) (4.2) | N/Ab
Elements | Information delivery | Health information (+) (6.0) | N/A
Elements | Notification | Popup (4.8), full screen (4.3) | Rest on Time
Characteristics | Medium | Physical signal (−) (2.2), sound (4.0), screen based (6.2) | Protect Your Vision, Eye Protector, Rest on Time
Characteristics | Complexity | Rotated message (+) (6.0) | N/A
Characteristics | Aesthetics | An agent with robot appearance (4.0), an agent with expert appearance (+) (5.2), spot effect (−) (2.8), flashing effect (−) (2.8), blurred effect (4.8), symptom-like effect (+) (5.7) | Protect Your Vision, Eye Protector
Workflow | User defined | Customization (+) (6.0) | N/A
Workflow | Conditions | 20-20-20 (4.8), 60-5 (4.0), just-in-time (+) (4.2) | Protect Your Vision, Eye Protector, Rest on Time

aInterface elements with low effectiveness are marked with “−,” and interface elements newly added during the focus group discussion are marked with “+.”

bN/A: not applicable.

Interface Elements With Low Effectiveness

Elements with low effectiveness (score below 4.0) were not included in LiquidEye. The interface elements with low effectiveness were the spot and flashing visual effects and physical signals at the scheduled resting time (marked with “−” in Table 3). The spot is a colored dot icon intended to minimize interruption depending on the user’s condition. Participant P6 made the following statement:

I don't think it's going to be noticeable. It only takes up a small part of the screen.

The flashing effect was regarded as ineffective for the same reason as the spot effect. Physical signals, such as blowing wind toward the user’s eyes, were considered effective for grabbing the user’s attention, but most participants rated them low owing to the annoying interruption.

Newly Added Interface Elements

Additional interface elements were discussed in the focus group discussion (marked as “+” in Table 3). Symptom-like effects were suggested by participant P6. The participant made the following statement:

If the visual effect in the screen-based intervention come up with the CVS symptom like effects such as blurred vision or black spot, it will increase susceptibility to CVS, thus increase participation.

The reward element was suggested by participant P2, participant P4, and participant P5. Participant P4 made the following statement:

Like playing the game, the rewards of making virtual money or getting high scores will affect not only early acceptability but also motivation for long-term use.

For health information, participant P1, participant P5, and participant P6 indicated the need for this element. Participant P5 made the following statement:

Medical center does not usually give detailed eye-resting instructions. If we can get health information through this system, we can eventually make more efforts to improve CVS related symptoms.

Participant P2 and participant P6 suggested a character with an expert-like appearance in the stage of eye resting instructions. Participant P6 made the following statement:

Expert-like character will increase the credibility of the information follows.

A just-in-time function was suggested from the paper prototype of participant P4. Participant P4 made the following statement:

It would be more acceptable if the system has the function of avoiding important time such as meeting time.

A customization option was suggested by most of the participants. Participant P1, participant P2, participant P4, participant P5, and participant P6 added customization options to their paper prototypes. Participant P6 made the following statement:

Different people have different demands for designs and functions, so it would be better if we could select the elements at the beginning of the system use.

Rotation of messages in the system was suggested by participant P1, participant P2, participant P3, participant P6, and participant P7. Participant P1 made the following statement:

Rotated messages will make the system more useful.
Additional Comments

After listing all interface elements in the focus group discussion, additional comments were collected to design LiquidEye. We conducted a focus group interview to discuss how the final elements (score above 4.0; to be implemented in LiquidEye) should be customized for users. There were several comments about varying the frequency, varying the interface size or design, and adding customization options. We present these results in line with the themes in Table 3.

For the education element (instruction on the system and how to use it), participants anticipated that the presence of this element matters more than how the element itself is organized. Some participants said they did not need it at all, while others wanted it only for a specific period of time. Thus, two options, one with and one without the element, were implemented as customizable in LiquidEye.

For goal setting, which is setting resting frequency and the time of the day, most participants insisted that it should be customizable. In our case, reducing symptoms of CVS was our major clinical aim. To prevent CVS caused by prolonged computer use, clinical optometrists suggest users follow the 20/20/20 rule [30], which is that one should look at something 20 feet away for at least 20 seconds after 20 minutes of computer use [31]. However, since it is not easy to follow these guidelines, participants mentioned that they need flexibility with eye resting frequency and time, depending on their context. Therefore, we added an adjustable goal-setting element in our system.
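To illustrate how the default and adjusted goal settings map onto the intervention schedule, the sketch below uses the 20/20/20 defaults. The names are assumptions, and the actual LiquidEye scheduler may work differently.

```swift
import Foundation

// Minimal sketch of how an adjustable eye-resting goal could drive the schedule
// (assumed names; defaults follow the 20/20/20 rule cited above).
struct RestingGoal {
    var workIntervalMinutes = 20   // prompt after this many minutes of computer use
    var restDurationSeconds = 20   // keep the rest screen up for this long
}

final class RestScheduler {
    private var timer: Timer?

    func start(goal: RestingGoal, onPrompt: @escaping (RestingGoal) -> Void) {
        timer?.invalidate()
        let interval = TimeInterval(goal.workIntervalMinutes * 60)
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
            onPrompt(goal)   // e.g., show the notification window, then the rest screen
        }
    }

    func stop() { timer?.invalidate() }
}

// Usage: the default goal prompts every 20 minutes; an adjusted goal only changes
// the interval and rest duration.
let scheduler = RestScheduler()
scheduler.start(goal: RestingGoal()) { goal in
    print("Rest your eyes for \(goal.restDurationSeconds) seconds")
}
```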

Monitoring of participation was required or not depending on the individual. Thus, two options, one with and one without this element, were implemented as customizable in LiquidEye.

With regard to feedback, participants did not strongly distinguish among the kinds of feedback. However, according to most of the participants, there was a difference between compliment feedback and the others (descriptive, comparative, and evaluative messages). Thus, we separated these two categories in the setting options (Figure 1) and rotated the descriptive, comparative, and evaluative messages.

Some participants felt that there was no need to adjust the reward element depending on the context; if this element appears in the system, it should keep appearing. All participants agreed that this element does not need to be customizable. Thus, it was kept as a basic setting.

The information delivery element (delivering health information) was required or not depending on the individual’s preference. Thus, two options, one with and one without this element, were implemented as customizable in LiquidEye.

For the notification window, the size of the window can influence acceptability. Preference regarding the notification element (popup and full screen) varied. Participants commented as follows:

If my previous work environment is paused by the system anyway, I rather prefer full-screen.
[Participant #P1 and participant #P6]
Interruption has to be as small as possible.
[Participant #P7]

On the settings page of the system (Figure 1), the following four options were provided: full-screen notification window, mid-size window, small message popup on the top right of the screen, and none.

Physical signals, such as blowing wind, were eliminated from our final list since they were rated below our cutoff (score of 4.0). In the end, only the sound element was implemented in LiquidEye; users can turn the sound option on or off in the settings menu. Regarding message rotation, all participants insisted that it is a necessary function at all times, so it was set as a basic function. For the aesthetic element (presence of the character, color, etc), preferences varied among the participants. For this reason, we made it customizable: users can select the color of the screen and the kind of character they like, or eliminate the character altogether.

Statistical Analysis With User Data Logs From Phase 2 (Real-World Deployment)

With LiquidEye, which was developed based on the focus group results, we collected user data logs during the 14 days of the experiment (n=680). To investigate interface elements that greatly affected the participation rate in the deployment, a multiple regression analysis was conducted with the users’ overall data logs. Each interface element (total 14 elements) was analyzed as an independent variable, and participation (1: participated, 0: did not participate) was analyzed as a dependent variable. Table 4 shows the results from the multiple regression analysis. We present results in line with our themes and subthemes defined above. The relevant elements included the instruction page of the eye resting strategy, goal setting for eye resting, compliment feedback after completing eye resting, mid-size popup window, and symptom-like visual effects that provide an alarm for the eye resting time.

Table 4. Results of multiple regression analysis.
Interface element (themes) | Interface element (subthemes) | Estimate | Standard deviation | Z value | P value
Intercept |  | −1.1114 | 0.6428 | −1.73 | .08
Education | Introduction page | 0.6072 | 0.2469 | 2.46 | .01
Goal setting | Default setting | −0.6647 | 0.2550 | −2.61 | .009
Goal setting | Adjusted setting | −0.0677 | 0.3730 | −0.18 | .86
Monitoring | Participation report | −0.1785 | 0.3697 | −0.48 | .63
Feedback | Default message | 0.2460 | 0.2857 | 0.86 | .39
Feedback | Compliment after eye resting | 1.2977 | 0.3443 | 3.77 | <.001
Information delivery | Health information | −0.6490 | 0.2824 | −2.30 | .02
Notification | Large-size window | −0.2572 | 0.3571 | −0.72 | .47
Notification | Mid-size window | −0.8873 | 0.3766 | −2.36 | .02
Notification | Small-size window | −0.6731 | 0.3798 | −1.77 | .08
Medium | Sound | 0.2580 | 0.2297 | 1.12 | .26
Aesthetic | Expert agent | −0.1277 | 0.3529 | −0.36 | .72
Aesthetic | Robot agent | −0.1270 | 0.3437 | −0.37 | .71
Aesthetic | Symptom-like effects with a notification window | 0.7817 | 0.2714 | 2.88 | .004

Overview

Through two studies (ie, focus group study and deployment study), we explored the interface elements of computer-based interventions for CVS. Additionally, we collected real-world user data by deploying LiquidEye with customizable interface elements. With results from the deployment study, we could analyze how interface elements included in LiquidEye affected user participation with eye resting behavior. We will discuss the results while suggesting design guidelines for computer-based interventions for CVS.

Guidelines for Important Interface Elements

A summary of design guidelines for interface elements is presented in Table 5.

Based on our results, we will discuss the effect of each interface element on user participation with LiquidEye. We will also share the user feedback from the 14-day experiment with LiquidEye to discuss the results. We will first discuss the interface elements that greatly affected participation in the eye resting behavior, including the instruction page of the eye resting strategy, goal setting for eye resting, compliment feedback after completing eye resting, mid-size popup window, and symptom-like visual effects that provide an alarm for the eye resting time.

Regarding the instruction page, most participants agreed that it helped a lot at the beginning of the experiment but was no longer needed once they got used to the system. As participants mentioned, the level of adaptation determines whether the instruction page increases intervention coherence or merely adds to the user’s burden. System designers should consider how fast users adapt to the system and, at the same time, how easy or difficult the system is to use, since these factors influence a user’s need for the instruction page.

Table 5. Summary of design guidelines for interface elements.
Interface element (theme) | Example of interface element (subtheme) | Summary of design guidelines
Education | Introduction page | System designers should consider how fast users adapt to the system and, at the same time, how easy or hard the system was designed since these factors influence the user’s need for an instruction page.
Goal setting | Default setting, adjusted setting (customizable) | System designers should consider the user’s willingness to manage the eye condition since it decides a need for customization of goal settings. Default setting is the predefined setting regardless of the user’s autonomy.
Monitoring | Participation report | This element can be a double-edged sword for the motivation of the user. It can increase or decrease the self-efficacy of the user depending on the level of participation.
Feedback | Default message, compliment after eye resting | Depending on the context of the user, it can be either effective or ineffective. However, the preference for this element was high among users.
Information delivery | Health information | System designers should consider the user’s intention to manage the symptoms. If the user intention is high, the need for health information is also high at most times. However, low user intention can make users feel that this element is a burden.
Notification | Size of the window | The size of the popup influenced the forcefulness of the computer-based intervention. The full-screen notification with the high forcefulness was evaluated as most effective, but, at the same time, a high burden. Mid-size notifications positively affected user participation among other options.
Medium | Sound | The social context largely affected the user experience. Most of the participants insisted that it does not need to be in the system.
Aesthetic | Presence of characters (expert agent or robot agent) or visual effects (symptom-like effects) | Most of the time aesthetic elements rarely affect user participation, except when they strengthen the intervention effects by accompanying other intervention elements, such as the notification window in our case.

Goal setting for eye resting is another element that greatly affects user participation in eye resting behavior. It was interesting that the default setting for goal setting was relevant, while the adjusted setting was not. The default setting is a predefined setting based on the 20/20/20 rule (one should look at something 20 feet away for at least 20 seconds after 20 minutes of computer use) [30] for preventing or reducing CVS symptoms. Since the 20/20/20 rule is strict for long-time computer users, who have to rest three times per hour, we expected that customized settings (adjusted by users) would be more effective at increasing the participation rate. However, the customizable setting did not affect the user’s participation rate according to our statistical data. From user feedback, we found that customizable goal setting (“adjusted” in Table 4) can result in increased effectiveness or increased burden depending on the attitude of the user. Users who were willing to manage their symptoms evaluated the goal setting element as more effective than those who were not. For example, participant P7 with a low attitude level showed a negative opinion. This participant made the following statement:

It is too annoying to set goals since I feel no need to manage my symptoms.

Compliment feedback after completing eye resting greatly affected user participation, but the qualitative results implied that it can sometimes be a burden for users. In particular, what users were doing right before the intervention affected the consequences of the feedback element. Participant A6 made the following statement:

I was working hard and then they told me to take a rest. I want to go back to my working environment as soon as the break is over. I don't feel like the extra things which are annoying and unnecessary.

However, most participants said that this element plays a positive role when they are not busy. Participant A5 made the following statement:

Compliment feedback was really helpful. I always turn this element on as my basic setting. It makes me feel good!

Regarding the interface element notification window, the mid-size popup window was related to user participation. The element was implemented in LiquidEye with four options (small-size popup, mid-size popup, full-screen popup, and no popup). Most of the participants insisted that the size of the popup influenced the forcefulness of LiquidEye. The full-screen popup with high forcefulness was evaluated as the most effective, but, at the same time, as having a high burden. Participant A10 made the following statement:

If I make up my mind to take a break anyway, I'd rather be forced to do it on time.

On the other hand, participant A6 made the following statement:

Small pop-up is barely noticeable, which makes me miss the participation.

Based on user feedback during and after the deployment study, we could infer that a mid-size popup window may be a good alternative to the full-screen window, which imposes a high burden, and the small-size window, which has low effectiveness.

Another relevant interface element was symptom-like visual effects that provide an alarm for the eye resting time. An interesting opinion about this element was that its effectiveness depends on the size of the window and how often the effect is being rotated. Participant A9 made the following statement:

When it comes to this element, how much it grabs my attention matters. When it accompanies a full-screen popup window, it does not grab additional attention, because the popup window already fills my whole screen. However, when it accompanies small or middle size popup window, it strengthens the system to grab additional attention.

Guidelines on Other Interface Elements

We now discuss additional findings on the other interface elements, even though they were not found to be statistically significant. Monitoring the user’s participation and producing a report on daily progress (“monitoring” in Table 4) can increase or decrease the self-efficacy of the user depending on the level of participation. Participant A3 made the following statement:

When I participated a lot, it was helpful for motivation but when I participated less, it was a burden to see.

Additionally, participant A2 made the following statement:

I just want it to show me the number of times I participated, not the rate of participation. It only gets lower if I do not participate in 100 percent.

The effectiveness of health information seems to depend on the intention to manage the symptoms. It could be effective if users are highly willing to manage their symptoms. Participant A11 made the following statement:

Getting this information makes me feel like I'm taking good care of my eyes. I spent more time thinking about my eyes.

However, for those who have a low intention of participating in eye resting behavior, health information could be a bothersome interface element. Participant A2 expressed the following negative opinion:

Whether it is health information or anything else, a lot of text could be the burden to use the system.

When it comes to a sound-based alarm (“sound” in Table 4), the social context largely affected the user experience. Most of the participants insisted that it does not need to be in the system. Participant A5 made the following statement:

I did not use it at least once since I always use my computer in my workplace.

A few participants mentioned that it can be helpful but that it must be optional.

For the character-like agent (“expert agent” and “robot agent” in Table 4), there were few comments from users. Participant A1 made the following comment:

It is barely noticeable. It does not affect my participation.

Additionally, users could customize their interfaces in the LiquidEye settings menu by themselves. This function of customization can increase effectiveness, but can be a burden depending on the clinical goal of the user. Most of the participants were satisfied with the customization options. Participant A4 made the following statement:

Depending on whether it is night or day, the desired setting is different since we are usually doing important things during the daytime and less important things during the nighttime.

On the other hand, participant A7 made the following statement:

It is a burden to change the options frequently. I want it to just recommend me the best option which is not very disturbing.

Avoiding work interruption was one of the major issues regarding user context. On the other hand, there is a need for a “right-on-time” intervention when it comes to clinical management of eye health. Participant A5 made the following statement:

I want it to show up right on time which is a most effective way for my eye health.

However, most participants agreed with the idea that LiquidEye needs to avoid critical moments (eg, sharing the monitor with colleagues in the middle of a conference). Participant A4 made the following statement:

Adding the do-not-disturb function to the LiquidEye will make the system more acceptable.

Application to Other Clinical Symptoms

For developing our computer-based intervention, CVS was chosen as our condition of interest. Before extending our results to other clinical domains that require computer-based interventions, designers or system developers should consider the following steps.

First, when choosing interface elements for computer-based interventions, designers must define the clinical aim and the usage aim [29,32]. System designers need to decide on these aims and the intervention medium before they choose the interface elements. Depending on the clinical aim and target behavior, the appropriate intervention medium can differ, which means that a computer-based intervention is not the best medium for all cases.

Second, understanding the target clinical group is crucial [33]. Even if the same interface element is being used, implementation strategies have to differ depending on users’ unique features. Elements should be applied depending on the users’ personal and health behavior–related factors, such as attitude, behavior intention, and ultimate health goals. If the target user group is too heterogeneous, a computer-based intervention can be an option since it offers a great variety of options for assessing individuals, creating and delivering customized health messages, and providing individuals with the methods necessary to maintain or change their health-related behaviors [14].

Third, the evaluation of a computer-based intervention has to be completed before final implementation in a large population. Even when two computer-based interventions use the same framework, the consequences can be different. In our study, we evaluated the interface elements in LiquidEye with statistical analyses to better understand the consequences of choosing the interface elements.

If designers take all of the above points into consideration, our work is expected to decrease the cost of choosing interface elements in computer-based interventions by minimizing trial and error, even when implemented in other clinical domains.

Conclusions

To reduce the prevalence of CVS among computer users, designing appropriate interventions that induce eye rest is one technology-based solution. In this study, we suggested design implications to consider when designing a computer-based intervention for CVS. A carefully designed customizable interface can enable users to interact with the system more actively, which can result in higher engagement. Among the various interface elements implemented in computer-based interventions for CVS, we found that the instruction page of the eye resting strategy, goal setting for eye resting, compliment feedback after completing eye resting, a mid-size popup window, and symptom-like visual effects that provide an alarm for the eye resting time greatly affected user participation in eye resting behavior. We manually characterized how these elements affected user participation based on the framework of acceptability. In a further study, we will explore the opportunities of automated technologies, such as facial expression recognition [34], deep sentiment analysis [35], and gaze-tracking algorithms [36], to detect positive or negative user experiences with the computer-based intervention. Important technical challenges still need to be addressed, but because this study clarified various factors related to computer-based interventions, the findings are expected to contribute substantially to the design of future computer-based interventions.

Acknowledgments

This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2018S1A3A2075114).

Conflicts of Interest

None declared.

  1. Ryan C, Lewis J. Computer and Internet Use in the United States: 2016. United States Census Bureau. 2018.   URL: https://www.census.gov/content/dam/Census/library/publications/2018/acs/ACS-39.pdf [accessed 2021-03-13]
  2. Neumann PG. Computer-Related Risks. Boston, MA, USA: Addison Wesley; 1994.
  3. Widyantara P, Puspasari M. Breakpoint of Attention Media Evaluation as Countermeasure for Computer Vision Syndrome. In: Proceedings of the 2019 5th International Conference on Industrial and Business Engineering. 2019 Presented at: 5th International Conference on Industrial and Business Engineering; September 2019; Hong Kong p. 188-192. [CrossRef]
  4. Yan Z, Hu L, Chen H, Lu F. Computer Vision Syndrome: A widely spreading but largely unknown epidemic among computer users. Computers in Human Behavior 2008 Sep 01;24(5):2026-2042. [CrossRef]
  5. Comeau J, Godnig E. Computer use and vision. In: Proceedings of the 27th annual ACM SIGUCCS conference on User services: Mile high expectations. 1999 Presented at: 27th annual ACM SIGUCCS conference on User services: Mile high expectations; November 1999; Denver, CO, USA p. 31-34. [CrossRef]
  6. Sheedy JE. Vision problems at video display terminals: a survey of optometrists. J Am Optom Assoc 1992 Oct;63(10):687-692. [Medline]
  7. Blehm C, Vishnu S, Khattak A, Mitra S, Yee RW. Computer vision syndrome: a review. Surv Ophthalmol 2005;50(3):253-262. [CrossRef] [Medline]
  8. Chawla A, Lim TC, Shikhare SN, Munk PL, Peh WCG. Computer Vision Syndrome: Darkness Under the Shadow of Light. Can Assoc Radiol J 2019 Feb;70(1):5-9. [CrossRef] [Medline]
  9. Dementyev A, Holz C. DualBlink: A Wearable Device to Continuously Detect, Track, and Actuate Blinking For Alleviating Dry Eyes and Computer Vision Syndrome. In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2017 Presented at: ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies; March 2017; USA. [CrossRef]
  10. Kawashima M, Uchino M, Kawazoe T, Kamiyashiki M, Sano K, Tsubota K. A field test of Web-based screening for dry eye disease to enhance awareness of eye problems among general Internet users: a latent strategy to promote health. J Med Internet Res 2013 Sep 27;15(9):e209 [FREE Full text] [CrossRef] [Medline]
  11. Inomata T, Nakamura M, Iwagami M, Midorikawa-Inomata A, Sung J, Fujimoto K, et al. Stratification of Individual Symptoms of Contact Lens-Associated Dry Eye Using the iPhone App DryEyeRhythm: Crowdsourced Cross-Sectional Study. J Med Internet Res 2020 Jun 26;22(6):e18996 [FREE Full text] [CrossRef] [Medline]
  12. Rodin A, Shachak A, Miller A, Akopyan V, Semenova N. Mobile Apps for Eye Care in Canada: An Analysis of the iTunes Store. JMIR Mhealth Uhealth 2017 Jun 14;5(6):e84 [FREE Full text] [CrossRef] [Medline]
  13. Rono H, Bastawrous A, Macleod D, Bunywera C, Mamboleo R, Wanjala E, et al. Smartphone-Guided Algorithms for Use by Community Volunteers to Screen and Refer People With Eye Problems in Trans Nzoia County, Kenya: Development and Validation Study. JMIR Mhealth Uhealth 2020 Jun 19;8(6):e16345 [FREE Full text] [CrossRef] [Medline]
  14. Lustria MLA, Cortese J, Noar SM, Glueckauf RL. Computer-tailored health interventions delivered over the Web: review and analysis of key components. Patient Educ Couns 2009 Feb;74(2):156-173. [CrossRef] [Medline]
  15. Ritterband LM, Andersson G, Christensen HM, Carlbring P, Cuijpers P. Directions for the International Society for Research on Internet Interventions (ISRII). J Med Internet Res 2006 Sep 29;8(3):e23 [FREE Full text] [CrossRef] [Medline]
  16. Griffiths F, Lindenmeyer A, Powell J, Lowe P, Thorogood M. Why are health care interventions delivered over the internet? A systematic review of the published literature. J Med Internet Res 2006 Jun 23;8(2):e10 [FREE Full text] [CrossRef] [Medline]
  17. Beintner I, Vollert B, Zarski A, Bolinski F, Musiat P, Görlich D, et al. Adherence Reporting in Randomized Controlled Trials Examining Manualized Multisession Online Interventions: Systematic Review of Practices and Proposal for Reporting Standards. J Med Internet Res 2019 Aug 15;21(8):e14181 [FREE Full text] [CrossRef] [Medline]
  18. Wantland DJ, Portillo CJ, Holzemer WL, Slaughter R, McGhee EM. The effectiveness of Web-based vs. non-Web-based interventions: a meta-analysis of behavioral change outcomes. J Med Internet Res 2004 Nov 10;6(4):e40 [FREE Full text] [CrossRef] [Medline]
  19. Brouwer W, Kroeze W, Crutzen R, de Nooijer J, de Vries NK, Brug J, et al. Which intervention characteristics are related to more exposure to internet-delivered healthy lifestyle promotion interventions? A systematic review. J Med Internet Res 2011 Jan 06;13(1):e2 [FREE Full text] [CrossRef] [Medline]
  20. Etter J. Comparing the efficacy of two Internet-based, computer-tailored smoking cessation programs: a randomized trial. J Med Internet Res 2005 Mar 08;7(1):e2 [FREE Full text] [CrossRef] [Medline]
  21. Eysenbach G. What is e-health? J Med Internet Res 2001 Jun;3(2):E20 [FREE Full text] [CrossRef] [Medline]
  22. Han S, Yang S, Kim J, Gerla M. EyeGuardian: a framework of eye tracking and blink detection for mobile device users. In: HotMobile '12: Proceedings of the Twelfth Workshop on Mobile Computing Systems & Applications. 2012 Presented at: Twelfth Workshop on Mobile Computing Systems & Applications; February 2012; San Diego, CA, USA p. 1-6. [CrossRef]
  23. Miluzzo E, Wang T, Campbell A. EyePhone: activating mobile phones with your eyes. In: Proceedings of the second ACM SIGCOMM workshop on Networking, systems, and applications on mobile handhelds. 2010 Presented at: Second ACM SIGCOMM workshop on Networking, systems, and applications on mobile handhelds; August 2010; New Delhi, India p. 15-20. [CrossRef]
  24. Trutoiu LC, Carter EJ, Matthews I, Hodgins JK. Modeling and animating eye blinks. ACM Trans. Appl. Percept 2011 Aug;8(3):1-17. [CrossRef]
  25. Ho J, Pointner R, Shih H, Lin Y, Chen H, Tseng W, et al. EyeProtector: Encouraging a Healthy Viewing Distance when Using Smartphones. In: Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services. 2015 Presented at: 17th International Conference on Human-Computer Interaction with Mobile Devices and Services; August 2015; Copenhagen, Denmark p. 77-85. [CrossRef]
  26. Protect Your Vision. Breaks For Eyes.   URL: https://breaksforeyes.app [accessed 2021-03-13]
  27. Rest on Time. Breaks For Eyes.   URL: https://breaksforeyes.app [accessed 2021-03-13]
  28. Sekhon M, Cartwright M, Francis JJ. Acceptability of healthcare interventions: an overview of reviews and development of a theoretical framework. BMC Health Serv Res 2017 Jan 26;17(1):88 [FREE Full text] [CrossRef] [Medline]
  29. Mohr DC, Schueller SM, Montague E, Burns MN, Rashidi P. The behavioral intervention technology model: an integrated conceptual and technological framework for eHealth and mHealth interventions. J Med Internet Res 2014 Jun 05;16(6):e146 [FREE Full text] [CrossRef] [Medline]
  30. Anshel J. Visual Ergonomics Handbook. Boca Raton, FL, USA: CRC Press; 2005.
  31. Reddy SC, Low CK, Lim YP, Low LL, Mardina F, Nursaleha MP. Computer vision syndrome: a study of knowledge and practices in university students. Nepal J Ophthalmol 2013;5(2):161-168. [CrossRef] [Medline]
  32. Taylor TK, Webster-Stratton C, Feil EG, Broadbent B, Widdop CS, Severson HH. Computer-based intervention with coaching: an example using the Incredible Years program. Cogn Behav Ther 2008;37(4):233-246 [FREE Full text] [CrossRef] [Medline]
  33. Hetzroni OE, Tannous J. Effects of a Computer-Based Intervention Program on the Communicative Functions of Children with Autism. J Autism Dev Disord 2004 Apr;34(2):95-113. [CrossRef] [Medline]
  34. Leo M, Carcagnì P, Mazzeo PL, Spagnolo P, Cazzato D, Distante C. Analysis of Facial Information for Healthcare Applications: A Survey on Computer Vision-Based Approaches. Information 2020 Feb 26;11(3):128. [CrossRef]
  35. Zhang L, Wang S, Liu B. Deep learning for sentiment analysis: A survey. WIREs Data Mining Knowl Discov 2018 Mar 30;8(4):e1253. [CrossRef]
  36. Cazzato D, Leo M, Distante C, Voos H. When I Look into Your Eyes: A Survey on Computer Vision Contributions for Human Gaze Estimation and Tracking. Sensors (Basel) 2020 Jul 03;20(13):3739 [FREE Full text] [CrossRef] [Medline]


CVS: computer vision syndrome


Edited by R Kukafka; submitted 05.10.20; peer-reviewed by S Guness, M Leo; comments to author 18.11.20; revised version received 05.01.21; accepted 25.02.21; published 29.03.21

Copyright

©Youjin Hwang, Donghoon Shin, Jinsu Eun, Bongwon Suh, Joonhwan Lee. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.03.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.