Original Paper
Abstract
Background: Consumer eHealth interventions are of growing importance in the individual management of health and health behaviors. However, a range of access, resource, and skill barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies and the ability to communicate health concepts effectively.
Objective: We propose a theoretical and methodological framework for characterizing the complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies.
Methods: We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method.
Results: The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions.
Conclusions: The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.
doi:10.2196/jmir.1750
Introduction
eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools. The objective of this study was to explore how eHealth literacy can be systematically analyzed, measured, and quantified. We proposed a methodological and theoretical framework that systematically maps skill sets to successful performance of eHealth tasks. We employed a microanalytic strategy in which complex competencies can be broken down into constituent skills or local task demands. In our view, systematic understanding of the necessary competencies can inform development of targeted solution strategies to overcome skill- and knowledge-related barriers.
Background
The term eHealth refers to “health services and information delivered or enhanced through the Internet and related technologies” [
]. Consumer-oriented eHealth tools engage consumers in managing their own health care, communicating with providers and support networks, meeting their information needs, making health decisions, using patient education resources, and promoting healthy lifestyles [ - ]. Unfortunately, most of these eHealth tools have not been designed with consideration of the needs and characteristics of diverse user groups. These tools may even increase the complexity of health care engagement for those lacking the prerequisite abilities [ ].

Many different factors can inhibit consumers' meaningful use of eHealth tools, including environmental barriers [
], physical access barriers [ ], resource-related barriers [ - ], and individual-level barriers [ , , , ]. Underserved and vulnerable populations face additional challenges that exacerbate these obstacles [ ]. Different types of tools offer varied resources and functionalities, enabling performance on a wide range of eHealth tasks. Hence, different types of challenges arise depending on the tool. Specifically, interaction with different eHealth tools and tasks makes different kinds of demands on skills and knowledge. The table below [ , - ] lists some examples of documented skill-related challenges that may lead to barriers to the use of different eHealth tools.

eHealth tool | Example of tasks | Examples of skill-related challenges in completing eHealth tasks
Health information portals | Looking up information about treatment options for a health condition |
Personal health records | Entering personal information into medical record |
Telemedicine or teleconsultation applications | Communicating with health care providers |
Decision-support tools | Evaluating and weighing evidence to inform a decision |
Online support or chat groups | Participating in discussion forum |
There is a divide between what consumers can reasonably be expected to do and the demands and available resources of different tools. Various research efforts, in areas such as educational media, health literacy, and numeracy research, have tried to bridge this gulf by addressing user knowledge and competence and by improving resources. Addressing access and skill barriers has helped underserved and vulnerable populations use technology to manage their health concerns [
, ]. Therefore, it is important to identify barriers and devise solution strategies to eliminate obstacles that reinforce eHealth disparities.

Our approach is an effort to make this a more tractable problem by identifying candidate explanatory constructs and employing cognitive task analysis (CTA) methods in new applications. To the best of our knowledge, this is a unique effort to introduce systematicity to this complex and ill-defined research space. In our view, the success of consumer health informatics initiatives is partially predicated on an understanding of eHealth literacy demands and competencies.
Theoretical Framework
In this research, we endeavored to develop a systematic approach to analyzing competencies across eHealth interventions. The objective of this research was to understand the core skills and knowledge needed to productively use eHealth tools and to develop a set of methods for analyzing eHealth literacy. Previously, we presented a preliminary sketch of our framework for characterizing eHealth literacy task demands [
]. In this study, we explored further application of the framework to characterize human task performance.

The approach draws on two established models: the eHealth literacy model and Bloom's taxonomy of educational objectives. eHealth literacy is defined as "the ability to seek, find, understand, and appraise health information from electronic sources and apply the knowledge gained to addressing or solving a health problem" [
]. The eHealth literacy model describes the set of "fundamental skills consumers require to derive direct benefits from eHealth" [ ]. This set of skills establishes an important starting point but does not provide us with a means to discern how different cognitive functions or processes are engaged by different tasks. In addition, eHealth skills may be acquired in different stages, and thus individuals may display different degrees of competence in these skills. We therefore incorporated a second model that is designed to discriminate between various kinds of cognitive processes and to describe dimensions of cognitive complexity. Bloom's taxonomy describes the increasing progression in complexity of cognitive aspects of learning, skill acquisition, and performance, and it has been applied to a range of different topic domains [ - ]. Incorporating this model allows us to characterize the central cognitive processes that constitute each literacy type. The eHealth literacy model defines a literacy type or content domain, while the cognitive taxonomy provides a means of characterizing how that literacy can be expressed in the context of task performance.

In our framework, we adapted the eHealth literacy model proposed by Norman and Skinner in 2006 as a point of departure. Their model describes six components of eHealth literacy [
]:

- Computer literacy describes a wide range of skills from basic knowledge of using a computer, such as opening a browser window, to participating in social networking activities.
- Information literacy encompasses the skills to articulate information needs, to locate, evaluate, and use information, and to apply information to create and communicate knowledge [ ].
- Media literacy is the ability to select, interpret, evaluate, contextualize, and create meaning from resources presented in a variety of visual or audio forms [ ].
- Traditional literacy and numeracy encompasses reading and understanding written passages, communicating and writing a language coherently, quantitative skills, and the ability to interpret information artifacts such as graphs, scales, and forms [ , ].
- Science literacy includes familiarity with basic biological concepts and the scientific method, as well as the ability to understand, evaluate, and interpret health research findings using appropriate scientific reasoning [ ].
- Health literacy is the acquisition, evaluation, and appropriate application of relevant health information that allows consumers to communicate about health, make health decisions, and use health services [ , ].
Although this model of eHealth literacy is not inclusive of all factors that may influence the use of eHealth (e.g., knowledge of the social and cultural norms involved in participating in a support forum), it is our contention that these six literacy types constitute the set of core skills and knowledge domains.
We selected a second model that explains variation in task performance along an increasing continuum of cognitive demands. Bloom’s taxonomy is a well-known taxonomy developed in 1956 and was revised and updated in 2002 [
]. The taxonomy classifies levels of intellectual behavior in learning and has been applied to develop educational objectives and curriculum, assess learning, and create test items [ ]. The cumulative hierarchy structure requires achievement of a prior skill before acquiring the next dimension of complexity, but the boundaries between these levels are not rigid. These six cognitive process dimensions are defined [ ] as follows:

- Remembering is retrieving, recognizing, and recalling relevant knowledge from long-term memory.
- Understanding includes constructing meaning from oral, written, and graphic messages through interpreting, classifying, summarizing, inferring, comparing, and explaining.
- Applying involves using knowledge to execute a procedure.
- Analyzing comprises breaking material into constituent parts, and determining how the parts relate to one another and to an overall structure or purpose.
- Evaluating involves making judgments based on criteria and standards.
- Creating consists of putting elements together to form a coherent or functional whole in a new pattern or structure.
An overlay of Bloom’s taxonomy across the six eHealth literacies provides a framework to characterize and describe the different levels of cognitive demands within each of the six facets of eHealth literacy. It provides a structure to the analysis of human performance on eHealth tasks, allowing a differentiation of cognitive processes as well as of level of knowledge and skill.
The aim of this study was to characterize the constituent elements of eHealth literacy in performing tasks. The hypothesis was that this method can be used to elucidate the barriers to effective task performance.
Methods
Overview of the Framework and Method
The framework can be expressed as a matrix with the six facets of eHealth literacy along one axis and the six levels of complexity along the other axis, resulting in 36 combined categories. In our framework, we further separated the category of traditional literacy and numeracy into reading, writing, and numeracy and analyzed each separately, as shown in the matrix below
, such that there are a total of eight different literacy types. In our preliminary application of the framework, it was evident that this revision was necessary to achieve a sufficient level of detail for analysis. We defined the criteria for each of the cells through an iterative process of review and adaptation, drawing on evidence from peer-reviewed articles discussing eHealth and each type of literacy. This matrix of eHealth literacy and complexity definitions constituted the framework and codebook, providing the foundation for analysis. The framework coding can be used in two complementary ways: we employed a CTA and used it to characterize the demands of eHealth tasks with reference to specific tools, and we used the same categorical scheme to describe human performance on these tasks. The basis of the methodological framework involved coordinating task analysis and analysis of human performance. (A brief code sketch after the matrix illustrates one possible machine-readable representation of this codebook.)

Literacy type | Increasing levels of cognitive complexity (Bloom's taxonomy)
 | Remembering | Understanding | Applying | Analyzing | Evaluating | Creating
Computer | | | | | |
Information | | | | | |
Media | | | | | |
Reading | | | | | |
Writing | | | | | |
Numeracy | | | | | |
Science | | | | | |
Health | | | | | |

The contents of this table are intentionally left blank. This table illustrates the structure of the framework coding tool, which can be used by researchers to map skill demands to the corresponding framework code in each cell of the table.
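To make the codebook concrete, the following minimal sketch shows one possible machine-readable representation of the matrix cells; it is ours, not part of the original study instrument, and the names LITERACIES, COMPLEXITY_LEVELS, and FrameworkCode are illustrative assumptions.

```python
from dataclasses import dataclass

# The eight literacy types used in the framework (traditional literacy and
# numeracy split into reading, writing, and numeracy).
LITERACIES = [
    "computer", "information", "media", "reading",
    "writing", "numeracy", "science", "health",
]

# Bloom's cognitive process dimensions, in increasing order of complexity.
COMPLEXITY_LEVELS = {
    1: "remembering", 2: "understanding", 3: "applying",
    4: "analyzing", 5: "evaluating", 6: "creating",
}

@dataclass(frozen=True)
class FrameworkCode:
    """One cell of the framework matrix, such as 'information 4 (analyzing)'."""
    literacy: str
    level: int

    def __post_init__(self):
        assert self.literacy in LITERACIES, f"unknown literacy: {self.literacy}"
        assert self.level in COMPLEXITY_LEVELS, f"unknown level: {self.level}"

    def label(self) -> str:
        return f"{self.literacy} {self.level} ({COMPLEXITY_LEVELS[self.level]})"

print(FrameworkCode("information", 4).label())  # information 4 (analyzing)
```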
Application 1: Cognitive Task Analysis
To characterize eHealth literacy demands, we employed CTA, a cognitive engineering method that decomposes a task to uncover knowledge, goal structures, thought processes, and strategies underlying task completion [
, ]. Expert analysts carried out the CTA by performing the task themselves, eliciting both the information-processing demands of a task and the kinds of domain-specific knowledge required [ ]. In the study of health technologies, CTA is most commonly used to study system usability, devise training protocols, or analyze technology-mediated work [ ]. We applied CTA in a novel way: to characterize the actions, either behavioral or cognitive, and the knowledge needed to execute an eHealth task. For each task, CTA was used to enumerate the action and knowledge steps used to complete the specified task and to identify the constituent skills required to complete each step. Next, the codebook was used to select the types of literacy that describe the knowledge used in each step. We then determined the kinds of cognitive operations involved in the task, which provided us with a complexity level. For example, a step may require reading a text passage in order to follow the directions in the passage. To apply the framework code, we first identified that this step requires reading literacy, and then determined that reading is required at the applying level of complexity to use the information in the passage appropriately. The step also required information literacy at the understanding level of complexity to meet the appropriate information need while reading the passage. Most steps require more than one type of literacy.

In prior work, we illustrated the application of the framework analysis with CTA of three information-seeking tasks [
]. When applied to eHealth tasks, the framework provided illuminating representations of a task, displaying the configurations of eHealth literacy and cognitive demands for each task. The preliminary findings suggested that the approach enabled deeper exploration of the complex relationships and interactions of the different types of literacy. Our current research explored further applicability of the framework by applying the approach to a new task category, decision making, and to a wider range of health domains. We also applied the method to analysis of human performance and explored how the framework elucidates and diagnoses barriers encountered.
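As a concrete illustration of the step-coding procedure described in this subsection, the sketch below records the reading-a-passage example as a step with its framework codes. The CTAStep structure is a hypothetical representation introduced for illustration, not the authors' actual coding instrument.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class CTAStep:
    """One action or knowledge step identified by the cognitive task analysis."""
    number: int
    description: str
    # Framework codes for the step: literacy type -> Bloom complexity level (1-6).
    codes: Dict[str, int] = field(default_factory=dict)

# The example from the text: reading literacy at the applying level (3) and
# information literacy at the understanding level (2).
step = CTAStep(
    number=1,
    description="Read a text passage in order to follow the directions in it.",
    codes={"reading": 3, "information": 2},
)
print(step.codes)  # {'reading': 3, 'information': 2}
```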
Application 2: Analysis of Human Performance in Task Completion

In the second step of our method, we recruited 20 users to perform the same tasks and observed their performance. These individuals were active computer users but had no previous experience using the website employed in the study, the Consumer Reports Health website (http://www.consumerreports.org/health), a resource that helps consumers make evidence-based decisions related to health issues.
Participants were asked to verbalize their thoughts (a think-aloud protocol) while completing the task and to explain their answers. The think-aloud protocol can reveal any hesitation, confusion, or misunderstanding while completing the task [
]. It can also reveal insights into reasoning and decision-making processes [ ]. While the participants were completing the task, guidance was provided when necessary to help them complete a task or to reroute them from a potentially fruitless path. Each session was audio recorded, and we used Morae 3.1 video-analytic software (TechSmith Corporation, Okemos, MI, USA) to capture all actions on the computer screen.

A step-by-step analysis of the participants' performance was conducted based on the audio recording, video capture, and notes taken during observation of the session. The measures employed were (1) accuracy of response to each question in the task, and (2) any barriers encountered at each step toward completing the task. Task responses were scored according to a scoring scheme comprising specific criteria defining scores of 0 (incorrect), 1 (partially correct), or 2 (correct). In our analysis, barriers were defined as events where participants struggled and may have been unable to make progress in the task or may have required some problem-solving steps before moving forward in the task. Barriers may be indicated when participants required prompts, asked questions, or made errors. A prompt was noted if the researcher provided some verbal assistance to participants, such as directing them to appropriate information or reminding them about the next step of the task. A question was noted when participants asked a question, expressed confusion, or requested guidance from the researcher. An error was documented if there was a misstep or misinterpretation of information or system response, such as misunderstanding search results. For each barrier event, the framework coding could be applied to categorize the nature of the participant's problem in terms of a type of literacy. For example, difficulty with scrolling would be categorized as difficulty with a computer literacy skill, whereas struggling with text passages would be categorized as difficulty with reading skills. We also matched each event with the corresponding step in the task completion process in which it occurred.
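The following minimal sketch shows one way a scored response and an observed barrier event could be recorded for this analysis; the BarrierEvent record, its field names, and the example values are assumptions for illustration rather than the study's actual data capture format.

```python
from dataclasses import dataclass
from typing import Dict

# Scores for task responses, per the scoring scheme described above.
SCORES = {0: "incorrect", 1: "partially correct", 2: "correct"}

@dataclass
class BarrierEvent:
    """One observed barrier: a prompt, a question, or an error."""
    step: int               # task step during which the event occurred
    kind: str               # "prompt", "question", or "error"
    note: str               # what was observed (from audio, video, and notes)
    codes: Dict[str, int]   # framework coding: literacy type -> complexity level

# Hypothetical event: a researcher prompt during one step, coded as
# computer literacy (applying) and information literacy (remembering).
event = BarrierEvent(
    step=11,
    kind="prompt",
    note="Researcher directs the participant to the compare feature.",
    codes={"computer": 3, "information": 1},
)
response_score = 1  # a partially correct answer to one task question
print(SCORES[response_score], event.kind, event.codes)
```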
Example of Applying the Framework and Method
We applied these methods to the analysis of a particular task that required a series of information-seeking and decision-making steps. We selected the Consumer Reports Health website because, in our judgment, it is a high-quality and well-designed site that reflects a genuine understanding of consumers' needs. The task question (shown in the box below) asked users to consider criteria comparing three different hospitals, demonstrate understanding of the information, and interpret the evidence presented.

Task Question Requiring a Series of Information-Seeking and Decision-Making Steps
In the Doctors & Hospitals page, read the article “How-to guide to choosing a hospital” which can be found at the bottom of the page.
Look up the hospital ratings for all hospitals in the New York, NY region.
Next, on the ratings page, use the Compare feature to compare New York Presbyterian Hospital, Lenox Hill Hospital, and Bellevue Hospital Center.
- Identify the hospital that is least aggressive on the “Aggressive or Conservative” scale. What do these ratings of “Aggressive or Conservative” tell you about the hospital?
- Identify the hospital with the highest “Average Cost to Patient”.
- Of these 3 hospitals, select the hospital that you would want to go to for a surgical procedure, and discuss what criteria are most important in your decision.
The aggressive/conservative scale needed to interpret the "aggressive or conservative" continuum was represented by rows extracted from a table on the pertinent Consumer Reports Health webpage. The aggressive-to-conservative continuum is one way in which the Consumers Union rates hospitals. Hospitals that keep people with chronic diseases hospitalized for more days during the last 2 years of their lives are rated as aggressive. Hospitals that provide the fewest doctor visits and shortest hospitalizations in those final years of life are considered conservative.

We used example tasks to explore the reliability of the framework coding scheme. Two different raters used the codebook to classify task demands on two different tasks: an information-seeking task and a decision-making task. The raters later used the codebook to also classify the barriers encountered by a subset of three different participants. For each type of coding, interrater reliability was assessed on two different dimensions of the coding: (1) type of literacy, using Cohen's kappa, and (2) level of cognitive complexity, using the Spearman correlation coefficient. A cognitive complexity code cannot be assigned independently of literacy type; therefore, cognitive complexity agreement was calculated on the subset of codes in which both raters reached agreement on the literacy code.
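A minimal sketch of the two reliability computations described above, assuming each rater's codes are stored as parallel lists; it uses scikit-learn's cohen_kappa_score and SciPy's spearmanr as standard implementations, although the paper does not state which software was used, and the toy data below are not the study's codes.

```python
from sklearn.metrics import cohen_kappa_score
from scipy.stats import spearmanr

# Toy parallel codes assigned by two raters to the same coding units.
rater1_literacy = ["information", "computer", "numeracy", "reading", "health"]
rater2_literacy = ["information", "computer", "numeracy", "writing", "health"]
rater1_level = [4, 3, 4, 2, 4]
rater2_level = [4, 3, 3, 2, 4]

# (1) Cohen's kappa on the categorical literacy codes.
kappa = cohen_kappa_score(rater1_literacy, rater2_literacy)

# (2) Spearman correlation on complexity, restricted to the units where
# both raters agreed on the literacy code.
agreed = [i for i, (a, b) in enumerate(zip(rater1_literacy, rater2_literacy)) if a == b]
rho, _ = spearmanr([rater1_level[i] for i in agreed],
                   [rater2_level[i] for i in agreed])

print(f"kappa = {kappa:.2f}, Spearman rho = {rho:.2f}")
```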
Results
To illustrate the application of the framework and method, we present the results from the analysis of the task question asking about hospital ratings.
Application 1: Cognitive Task Analysis
Interrater reliability was calculated for coding of the CTA. Cohen's kappa for literacy was .91 and Spearman correlation coefficient for cognitive complexity was .92, suggesting high levels of agreement for both dimensions.
The table below shows the CTA of an excerpt of this task (steps 10–16 of the entire task), from the steps for selecting the three specific hospitals for comparison to interpreting the aggressive or conservative scale.

Step | Skills and knowledge required to complete step | Framework code from CTAa
10 | Recognize the results page as a table of hospitals and their ratings. Scroll to see whole table. | Computer 3, information 4, numeracy 4, reading 1 |
11 | Recognize the “compare” feature, and that checkboxes for the desired hospitals are required to use this feature. Select the appropriate checkboxes for the three hospitals. | Computer 3, information 3, reading 2 |
12 | Recognize results as a table of the three selected hospitals with their detailed ratings. Scroll to see whole table. | Computer 3, information 4, numeracy 4, reading 2 |
13 | Scroll to locate the “aggressive or conservative” row in the table. Interpret and understand the labels for the aggressive/conservative scale. | Computer 3, information 4, numeracy 4, reading 2 |
14 | Identify the least aggressive rating and answer the information need. | Information 5, numeracy 4, reading 2, writing 2 |
15 | Click on the “learn more” link. Find the newly opened window. Scroll down to find the text about aggressive/conservative hospitals. Read and understand text. | Computer 3, information 4, health 4, reading 3 |
16 | Articulate understanding of what aggressive/conservative means. | Health 4, writing 3 |
a Cognitive task analysis, by increasing complexity: 1 = remembering, 2 = understanding, 3 = applying, 4 = analyzing, 5 = evaluating, 6 = creating.
Completing this series of steps required the participant to navigate to the table, locate the relevant information, and interpret the data in the table. Interpreting the aggressive/conservative scale corresponds to step 13 in the table above. Each step was coded with the corresponding framework codes that describe the eHealth literacy and complexity level used to complete that step. For example, step 10 required a combination of four types of eHealth literacy: (1) information literacy at the analyzing level of complexity (information 4) was required to interpret and evaluate the results page, (2) numeracy at the analyzing level of complexity (numeracy 4) was required to interpret the results table, (3) computer literacy at the applying level of complexity (computer 3) was required to navigate and interact with the table, and (4) reading was required at the remembering level (reading 1) to make sense of the information in the table. The steps required different combinations of literacy types, ranging from two to four types of literacy per step. The highest complexity level of any eHealth literacy required was level 5, evaluating. Reading and information literacy were required for most of the steps in this excerpt and appeared more frequently than the other literacies.

The next table summarizes the results of coding the CTA for the entire task; for each question and for the whole task, it reports the proportion of steps requiring each literacy and the highest complexity level at which that literacy was required (a brief code sketch after the table illustrates this aggregation). For the whole task, reading was used most often, in 18 of the 20 steps (90%). Information literacy (17 of 20 steps) and computer literacy (15 of 20 steps) were also used often. The frequent use of these skills suggests that they are essential to completing the task and are useful skills to promote among health care consumers. Media literacy was not required for any steps in this task given that the website was already selected as part of the task. Information literacy was required at level 5 (evaluating) for two of three questions, suggesting that high levels of information literacy are necessary to carry out all components of this task. As this task was primarily an information-seeking task, it is not surprising that information literacy was required frequently and at high levels of cognitive demand. Numeracy was required at level 4 (analyzing) for all three questions, to understand the information in different representational formats and to interpret the data in the table of hospital ratings. Question C was the only question to require science literacy, which was used to weigh evidence in making a decision about selecting a hospital based on the criteria presented. Question C also required the most skills at level 5 (evaluating), for two different types of literacy, suggesting that it had the highest complexity demands across the whole task. Question B was the only question to require any skills at level 2 (understanding) and had the lowest complexity demands relative to the other questions. The highest complexity level required across the whole task was level 5 (evaluating).
Literacy type | Question A | Question B | Question C | Whole task
Media | 0%,a no complexity | 0%, no complexity | 0%, no complexity | 0%, no complexity
Computer | 50%, applying (3) | 50%, applying (3) | 50%, applying (3) | 75%, applying (3)
Health | 50%, analyzing (4) | 0%, no complexity | 100%, analyzing (4) | 35%, analyzing (4)
Information | 75%, analyzing (4) | 100%, evaluating (5) | 50%, evaluating (5) | 85%, evaluating (5)
Reading | 75%, applying (3) | 100%, understanding (2) | 50%, applying (3) | 90%, applying (3)
Writing | 50%, analyzing (4) | 50%, understanding (2) | 50%, evaluating (5) | 20%, evaluating (5)
Numeracy | 50%, analyzing (4) | 100%, analyzing (4) | 50%, analyzing (4) | 30%, analyzing (4)
Science | 0%, no complexity | 0%, no complexity | 100%, applying (3) | 10%, applying (3)
Total number of steps | 4 | 2 | 2 | 20b

a Each cell displays the proportion (percentage) of steps that use that eHealth literacy and the highest level of cognitive complexity used in that literacy (complexity level and number).
b Total number of steps for the whole task includes a series of 12 navigational steps leading up to questions A, B, and C.
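The preceding table aggregates the step-level codes; the sketch below shows that aggregation under the assumption, carried over from the earlier sketches, that each step's framework codes are stored as a literacy-to-level mapping. The step data are toy values from the excerpt above, not the study's full 20-step coding.

```python
from typing import Dict, List

def summarize(steps: List[Dict[str, int]]) -> Dict[str, Dict[str, int]]:
    """For each literacy: percentage of steps using it and highest complexity level."""
    tally: Dict[str, Dict[str, int]] = {}
    for codes in steps:
        for literacy, level in codes.items():
            entry = tally.setdefault(literacy, {"steps": 0, "max_level": 0})
            entry["steps"] += 1
            entry["max_level"] = max(entry["max_level"], level)
    total = len(steps)
    return {lit: {"percent": round(100 * e["steps"] / total), "max_level": e["max_level"]}
            for lit, e in tally.items()}

# Toy data: framework codes for steps 14-16 of the excerpt analyzed above.
steps = [
    {"information": 5, "numeracy": 4, "reading": 2, "writing": 2},  # step 14
    {"computer": 3, "information": 4, "health": 4, "reading": 3},   # step 15
    {"health": 4, "writing": 3},                                    # step 16
]
print(summarize(steps))
# e.g. {'information': {'percent': 67, 'max_level': 5}, ...}
```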
Application 2: Analysis of Human Performance in Task Completion
The framework coding was then applied to the task performance. Interrater reliability was calculated for the coding of task performance. The Spearman correlation coefficient for cognitive complexity was .88, suggesting high agreement. Cohen's kappa for literacy was .68, suggesting lower but sufficient agreement to meet the minimum standard. The results from a single user are displayed in the table below.

Step | Skills and knowledge required to complete step | Framework code from CTAa | Events that indicate barriers | Framework code for barrier
10 | Recognize the results page as a table of hospitals and their ratings. Scroll to see whole table. | Computer 3, information 4, numeracy 4, reading 1 | Participant asks: “Aggressive or conservative scale—where’s that?” Participant is not on the correct page yet, needs to navigate to the next page first. | Computer 2, information 2 |
11 | Recognize the “compare” feature, and that checkboxes for the desired hospitals are required to use this feature. Select the appropriate checkboxes for the three hospitals. | Computer 3, information 3, reading 2 | Researcher prompts: “Use the ‘compare’ feature.” | Computer 3, information 1 |
| | | Error: participant clicks on “compare” without having selected the hospitals to compare. | Computer 3
| | | Researcher prompts: “In order to compare the three, you want to select all three together.” | Computer 3, information 2
12 | Recognize results as a table of the three selected hospitals with their detailed ratings. Scroll to see whole table. | Computer 3, information 4, numeracy 4, reading 2 | No barrier encountered during this step. | None |
13 | Scroll to locate the “aggressive or conservative” row in the table. Interpret and understand the labels for the aggressive/conservative scale. | Computer 3, information 4, numeracy 4, reading 2 | Participant confused by the multiple parts of the task question. Researcher prompts: “Look at this part of the question first.” | Information 1 |
| | | Participant scrolls up and down, and finds the aggressive/conservative scale. Starts to read ahead to the next question. Researcher prompts again: “Try this question first—the hospital that is least aggressive.” | Information 1
| | | Participant asks: “Where does it tell you which is least or most aggressive/conservative? In this area here?” (pointing to the scale). | Information 2, numeracy 4
14 | Identify the least aggressive rating and answer the information need. | Information 5, numeracy 4, reading 2, writing 2 | Participant stares at scale, confused. Researcher prompts: “What do you think the scale is telling you; how are you reading the scale?” | Numeracy 4 |
| | | Participant is very confused by the scale, and answers: “The one that is more conservative is 32%, Bellevue. Least aggressive, Lenox Hill? I’m trying to understand this.” (incorrect) | Numeracy 4
15 | Click on the “learn more” link. Find the newly opened window. Scroll down to find the text about aggressive/conservative hospitals. Read and understand text. | Computer 3, information 4, health 4, reading 3 | Participant is unsure how to approach the next question. Researcher rewords the question and explains what the question is asking. | Information 1, reading 2, information 2, reading 2 |
| | | Participant clicks on the “learn more” link and scrolls down the page, but cannot find the relevant text. Participant scrolls past the relevant passage. Researcher prompts: “You just missed the description on the page.” |
16 | Articulate understanding of what aggressive/conservative means. | Health 4, writing 3 | Participant reads the text passage, then answers: “More doctors visit overall for aggressive/conservative care...fewer days in the hospital.” (incorrect) | Health 3 |
a Cognitive task analysis, by increasing complexity: 1 = remembering, 2 = understanding, 3 = applying, 4 = analyzing, 5 = evaluating, 6 = creating.
This participant scored low on this task, earning 2 out of a total of 6 possible points. The participant encountered 18 barriers while completing this task. In step 10, the participant was looking for a specific piece of information but was on the wrong page; this barrier can be attributed to problems or deficiencies associated with information and computer literacies. In step 14, the participant was confused by the scale and provided an incorrect answer due to misinterpretation of the information presented in the aggressive/conservative scale. This barrier reflects a struggle with numeracy because the participant demonstrated an understanding of numbers as evidenced by the ability to draw inferences about the scale, but was unable to apply the knowledge and analyze it in different representational formats. Then in step 16, the participant provided an incorrect answer. The participant was unable to read, interpret, and analyze the health text to extract an accurate description of the terms aggressive and conservative as used in this context; this barrier reflects a struggle with health literacy. The participant required several reminders or explanations of task questions, in steps 11, 13, and 15. These reminders and explanations indicated information literacy barriers, reflecting a lack of recognition and understanding of the nature of the information need.
Summary Results From 20 Participants
A summary of 20 users' task performance results is presented to illustrate the aggregate measures obtained and potential analyses that can be performed using our approach. The users were recruited from the Union Settlement Association and the Columbia Community Partnership for Health Center in New York, NY.
Participants recruited were adults between 18 and 65 years of age; all had basic proficiency with computers and the Internet. A total of 14 of 20 participants (70%) were female, most reported annual incomes below US $30,000, and a majority of participants reported their race as African American or Hispanic. Participants had a range of educational backgrounds: 7 reported a high school education, 7 had a college degree, and 6 had a graduate degree.
As described above, question B had the lowest literacy and complexity demands relative to the other questions, and participants scored highest on this question, with 16 out of 20 correct answers. Participants struggled most with question A, with only 2 correct answers and 11 partial answers. Although question C had the highest complexity levels of cognitive demands, 10 out of 20 participants (50%) answered this question correctly. Each question varied in terms of domain knowledge, complexity, and types of demands. Scores merely provide a snapshot of user task performance. Although we can use the scores to compare and contrast task performance across the different task questions, analysis of the barriers impeding task performance can yield additional insight into the resulting participant scores.

Counting the barriers encountered by all participants at each step, the most barriers were encountered in step 11, with a total of 51. This step required users to make the appropriate selections in order to compare the different hospitals selected. Most of the barriers on this step stemmed from unfamiliarity with making the appropriate selections using checkboxes, reflecting inadequate computer literacy. Users also encountered a high number of barriers at steps 13 and 15. These steps are both constituents of question A, on which participants scored the lowest of the three questions. This aggregate analysis revealed the steps in which users experienced the most difficulty and exemplifies the patterns of barriers encountered in carrying out those problem steps.
We aggregated the types of barriers encountered by users in a manner similar to the analysis in Clark et al [
], which provided cumulative descriptors of the component barriers encountered across a set of steps and tasks. We classified the literacy type and cognitive demands of the barriers encountered in task performance for the same excerpt of steps (steps 10–16) analyzed above. Most of the barrier classifications in these steps were due to barriers with information and computer literacy. Step 13, which required understanding question A, caused many barriers at levels 1 (remembering) and 2 (understanding) within information literacy; these barriers primarily involved struggling to identify and interpret the information need. Step 14, which required locating and interpreting the aggressive/conservative scale, led to many numeracy level 4 barriers (analyzing based on representation). Step 15 asked users to describe the meaning of aggressive/conservative in the context of hospitals and health care, and users struggled to find resources to meet this information need; most of these barriers were classified as information literacy and computer literacy barriers. Step 16 reflected many health literacy as well as some writing barriers; users struggled with understanding, interpreting, and articulating aggressive/conservative in their own words. The majority of barriers fell in the lower ranges of cognitive demands (levels 1–4), and the task demands also mainly required literacies at these lower levels. The patterns of barrier types as revealed by the coding reflected the nature of the task demands and provided insight into the types of barriers that participants encountered.

Overall, within the hospital ratings task, users scored highest on question B and encountered the most barriers in question A. The barriers identified reflected that users struggled primarily with information literacy, computer literacy, and numeracy skills in answering the questions and completing the tasks.
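A minimal sketch of the aggregate barrier analysis described above: counting barrier events per step and tallying their framework classifications by literacy type and complexity level. The event tuples are illustrative values in the spirit of the single-user excerpt, not the study dataset.

```python
from collections import Counter
from typing import Dict, List, Tuple

# Each barrier event: (step number, framework codes as literacy -> level).
events: List[Tuple[int, Dict[str, int]]] = [
    (11, {"computer": 3, "information": 1}),  # prompt about the compare feature
    (11, {"computer": 3}),                    # error using the checkboxes
    (13, {"information": 1}),                 # confusion about the question
    (14, {"numeracy": 4}),                    # misreading the scale
]

# Number of barriers encountered at each step (aggregated over participants).
barriers_per_step = Counter(step for step, _ in events)

# Frequency of each literacy-by-complexity classification among the barriers.
barrier_types = Counter(
    (literacy, level) for _, codes in events for literacy, level in codes.items()
)

print(barriers_per_step)  # e.g. Counter({11: 2, 13: 1, 14: 1})
print(barrier_types)      # e.g. Counter({('computer', 3): 2, ...})
```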
Discussion
In this research, we adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework was applied to (1) code task demands using a CTA, and (2) code user performance on tasks. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the CTA can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase the accuracy of predictions. In this study, we used these methods to document a range of literacy-related barriers that affected performance on eHealth tasks.
The study found that the 20 participants experienced some difficulty completing most tasks on a website designed for consumers and often needed some assistance to do so. The most frequent barriers encountered by our sample were challenges with information literacy and computer literacy skills. Frequent barriers included struggling to understand and successfully act on information needs, to interpret a graphical representation of a severity scale, and to use checkboxes effectively to make selections. Conversely, some activities in which we had predicted barriers were discovered to be easier than anticipated. Evaluating health information to inform decisions can be complex and challenging, but users scored well on the question with a decision point.
There is little existing research that systematically analyzes the combined set of eHealth skills needed to attain proficient performance. Other investigators have expanded the scope of health literacy to describe the combinations of skills needed to interact effectively with health information [
] but did not consider technology-related skills, such as computer literacy, that are a core part of eHealth literacy. Our results largely echo findings in prior health numeracy research that users often struggle with interpreting graphical representations of numerical information, which may constitute significant consumer barriers [ ]. Our findings also support recommendations to develop tools that aid health care consumers in understanding complex health concepts and in using the information to inform a decision [ ]. Usability studies take a similar approach in breaking down task demands to analyze user task performance. Our method is consistent with usability findings that a granular approach to task analysis is essential to reveal potential barriers and inform design improvements, particularly for novice users [ ].

Limitations
We view the framework as provisional and subject to more comprehensive validation and elaboration. This will necessitate a larger-scale study with a greater sample size, a more diverse population, and a wider range of tasks. In addition, the participants in the study were not familiar with the Consumer Reports Health website and this may have influenced our findings. Familiarity with content, style, and affordances common to this site would have likely reduced some of the barriers that participants experienced. Further studies should include participants with varying degrees of experience with a particular website or technology.
The analyses in this paper focused on user competencies and did not take into consideration a range of issues, such as usability, or affordances and resources available within specific technology tools. In addition, the methods employed did not take into account individual motivation or attitudes toward technology. Similarly, this cognitive rational framework does not capture emotional and social factors that also play a significant role in decision making. It is well known that health literacy is a major public health issue in the United States affecting a substantial segment of the population [
]. In general, a multitude of environmental and societal factors, such as differential access to eHealth tools, influence the productive use of technology in health-related contexts. Although these individual and social factors significantly influence task performance, our working hypothesis is that eHealth literacy is a distinct construct and an important one in consumer health informatics.

As previously described in the Background section, there are many different types of eHealth tools and eHealth tasks. The framework was illustrated using an example task on the Consumer Reports Health website. This website aims to present information simply and comparatively, and Consumer Reports has been presenting unbiased and evidence-based comparisons in print form for many years. However, evidence in health is often complex, and there may be alternative ways of rendering such information comprehensible to individuals with lower eHealth literacy. The effective presentation of health evidence is a challenge that continues to plague most health communication and decision aid materials [ ]. The website selection was sufficient for the purpose of illustrating the framework. It should be noted that the aggressive/conservative continuum scale is no longer used on the Consumers Union's health site. Further exploration will apply the framework to a wider array of tasks, tools, and health domains.

Further Development of the Framework and Analytic Method
Further studies are needed to determine whether the types of literacy described in this paper sufficiently cover the range of knowledge types that characterize eHealth competency. In addition, although Bloom’s taxonomy has an established history of characterizing cognitive dimensions of tasks in educational contexts, we cannot presuppose that the gradations of complexity will seamlessly transfer to eHealth. The results of this analysis suggest that it can be used meaningfully to differentiate and categorize cognitive demands for different literacy skills and can be used to approximate complexity in a range of eHealth tasks.
As discussed, the tasks used in the study did not delve deeply into media literacy, science literacy, or, to some extent, health literacy. As health consumers choose which resources to use, media literacy will loom large. We anticipate that our methods will be adequate to model the skills and knowledge needed to demonstrate media literacy competency. The problems associated with low health literacy are well documented [
]. Science literacy is a multifaceted construct, and there is ample evidence to suggest that problems associated with science literacy are equally profound. The general public in the United States and other countries has an impoverished understanding of science [ ]. Norman and Skinner [ ] situate scientific literacy in a broader context, defining it as "understanding of the nature, aims, methods, application, limitations, and politics of creating knowledge in a systematic manner." The framework employs a CTA approach that places a strong emphasis on skills and action. This may not capture other dimensions of science literacy, such as understanding biological mechanisms of disease and critically appraising the scientific process. These aspects come into play in situations such as when an individual must understand the consequences of a therapeutic regimen or decide whether to enroll in a randomized controlled clinical trial. Clearly, we would need a broader array of concepts and a richer set of representations than those offered by the CTA stepwise analytic method to model such knowledge and the causal inferences associated with its application in the context of health.

The proposed framework provides a basis for the development of an eHealth competence model. Such a model would yield insight into the specific skills and knowledge needed to perform at a proficient or higher level on system-specific instances of eHealth tasks, such as seeking information about hypertensive therapies on the WebMD site. The current set of framework-based methodological tools lends greater utility to the consumer health research community than to communities of practitioners and designers. Applying this method is time-intensive and requires moderate expertise in the areas of cognition and human–computer interaction. We anticipate that the framework would give rise to simpler, more specific instruments (for example, in the form of a set of questions or heuristics) that could measure eHealth demands for a particular task and population as realized in a particular system or device. An analogy would be Nielsen's heuristic evaluation method [
], which has made it possible for teams of developers to conduct basic usability evaluations without extensive training or prohibitive time commitments.

With further investigation, we envision that the framework and analytic approach can be a potentially powerful generative research tool for the development of design guidelines for computer-based tools, evaluation heuristics, task-based eHealth literacy assessment, and educational objectives to increase consumer eHealth skills. For example, the framework could form the basis for the development of a matching algorithm to identify appropriate tools for users with different skill sets. In particular, this framework and analysis method can be used with health care consumers with low eHealth skills to better understand barriers and to develop educational media or other mediating tools to facilitate engagement with and benefit from eHealth. Barriers fall on a continuum ranging from routine abilities (recognizing how to use widgets) to complex conceptual challenges (deriving inferences from health text). The proposed framework systematically characterizes eHealth barriers, which in turn enables more precise definition, within the solution space, of methods to overcome those barriers.
Conclusions
In our view, this framework provides a systematic and potentially rigorous approach for analyzing eHealth competencies, which is a challenge of considerable complexity and great significance. Advances in technologies, such as Web 2.0 and social networking functionalities, offer new and ever-changing modes for consumers to interact with and manage health information. In the current environment, where eHealth interventions are being developed without a thorough understanding of consumers, efforts and resources can be better focused to improve adoption and use rates as well as the benefit derived from use. Unfortunately, such barriers disproportionately affect those who are most vulnerable and may actually serve to exacerbate disparities rather than bridge them. There is no doubt that consumers will be expected to assume a greater role in their health management in coming years, and low eHealth literacy will continue to be a barrier to productive participation. Progress in eHealth research will be integral to the success of consumer health applications and to reducing barriers to the use of those applications.
Acknowledgments
This work is supported by grants from the National Institute of Nursing Research (1R21NR010710) and the National Library of Medicine (1R21LM01068801) awarded to David Kaufman. We thank Suzanne Bakken, Jessica Ancker, Sara Czaja, Noemie Elhadad, Lisa Matthews, and Alisabeth Shine for their help in preparing and reviewing this manuscript. We also thank David Nocenti and the Union Settlement Association, and Bernadette Boden-Albala and the Columbia Community Partnership for Health, for their help in recruiting participants. We appreciate Beth Nash and Consumers Union's support in gaining access to their site. We thank the Department of Biomedical Informatics at Columbia University for their support. Connie Chan was supported by NLM pre-doctoral fellowship T 15-LM007079.
Conflicts of Interest
None declared
Authors' Contributions
This work originated with CC's dissertation work. DK was her thesis adviser. All facets of this work from its inception to the completion of this manuscript represent a close collaboration.
References
- Eysenbach G. What is e-health? J Med Internet Res 2001;3(2):E20 [FREE Full text] [CrossRef] [Medline]
- Jimison H, Gorman P, Woods S, Nygren P, Walker M, Norris S, et al. Barriers and drivers of health information technology use for the elderly, chronically ill, and underserved. Evid Rep Technol Assess (Full Rep) 2008 Nov(175):1-1422. [Medline]
- Demiris G, Afrin LB, Speedie S, Courtney KL, Sondhi M, Vimarlund V, et al. Patient-centered applications: use of information technology to promote disease management and wellness. A white paper by the AMIA knowledge in motion working group. J Am Med Inform Assoc 2008;15(1):8-13 [FREE Full text] [CrossRef] [Medline]
- Karkalis GI, Koutsouris DD. E-health and the Web 2.0. In: Proceedings of the ITAB. 2006 Presented at: The International Special Topic Conference on Information Technology in Biomedicine; Oct 26-28, 2006; Ioannina, Greece.
- Giustini D. How Web 2.0 is changing medicine. BMJ 2006 Dec 23;333(7582):1283-1284. [CrossRef] [Medline]
- Viswanath K, Kreuter MW. Health disparities, communication inequalities, and eHealth. Am J Prev Med 2007 May;32(5 Suppl):S131-S133. [CrossRef] [Medline]
- Fox S. The engaged e-patient population. Washington, DC: Pew Internet & American Life Project; 2008 Aug 26. URL: http://www.pewinternet.org/Reports/2008/The-Engaged-Epatient-Population/The-Engaged-E-patient-Population.aspx [accessed 2011-01-29] [WebCite Cache]
- Chapman G, Elstein A. Cognitive processes and biases in medical decision making. In: Chapman G, Sonnenberg F, editors. Decision Making in Health Care: Theory, Psychology, and Applications. New York, NY: Cambridge University Press; 2000:183.
- Eysenbach G, Powell J, Kuss O, Sa ER. Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review. JAMA 2002;287(20):2691-2700 [FREE Full text] [Medline]
- Jaffery JB, Becker BN. Evaluation of eHealth web sites for patients with chronic kidney disease. Am J Kidney Dis 2004 Jul;44(1):71-76. [Medline]
- McCray AT. Promoting health literacy. J Am Med Inform Assoc 2005;12(2):152-163 [FREE Full text] [CrossRef] [Medline]
- McNeill LH, Puleo E, Bennett GG, Emmons KM. Exploring social contextual correlates of computer ownership and frequency of use among urban, low-income, public housing adult residents. J Med Internet Res 2007;9(4):e35 [FREE Full text] [CrossRef] [Medline]
- Cashen MS, Dykes P, Gerber B. eHealth technology and Internet resources: barriers for vulnerable populations. J Cardiovasc Nurs 2004;19(3):209-14; quiz 215. [Medline]
- Fox S. Digital divisions. Washington, DC: Pew Internet & American Life Project; 2005 Oct 05. URL: http://www.pewinternet.org/~/media//Files/Reports/2005/PIP_Digital_Divisions_Oct_5_2005.pdf.pdf [accessed 2011-09-22] [WebCite Cache]
- Zeng QT, Kogan S, Plovnick RM, Crowell J, Lacroix EM, Greenes RA. Positive attitudes and failed queries: an exploration of the conundrums of consumer health information retrieval. Int J Med Inform 2004 Feb;73(1):45-55. [CrossRef] [Medline]
- Kim EH, Stolyar A, Lober WB, Herbaugh AL, Shinstrom SE, Zierler BK, et al. Challenges to using an electronic personal health record by a low-income elderly population. J Med Internet Res 2009;11(4):e44 [FREE Full text] [CrossRef] [Medline]
- Kaufman DR, Patel VL, Hilliman C, Morin PC, Pevzner J, Weinstock RS, et al. Usability in the real world: assessing medical information technologies in patients' homes. J Biomed Inform 2003 Apr;36(1-2):45-60. [Medline]
- McCaffery KJ, Smith SK, Wolf M. The challenge of shared decision making among patients with lower literacy: a framework for research and development. Med Decis Making 2010;30(1):35-44. [CrossRef] [Medline]
- Kamel Boulos MN, Wheeler S. The emerging Web 2.0 social software: an enabling suite of sociable technologies in health and health care education. Health Info Libr J 2007 Mar;24(1):2-23. [CrossRef] [Medline]
- Eng TR, Maxfield A, Patrick K, Deering MJ, Ratzan SC, Gustafson DH. Access to health information and support: a public highway or a private road? JAMA 1998 Oct 21;280(15):1371-1375. [Medline]
- Botts N, Horan TA. Bridging care communication and health management within diverse and underserved populations. In: Proceedings. 2008 Presented at: 14th Americas Conference on Information Systems; Aug 14-17, 2008; Toronto, ON, Canada.
- Chan CV, Matthews LA, Kaufman DR. A taxonomy characterizing complexity of consumer eHealth Literacy. AMIA Annu Symp Proc 2009;2009:86-90. [Medline]
- Norman CD, Skinner HA. eHealth Literacy: Essential Skills for Consumer Health in a Networked World. J Med Internet Res 2006;8(2):e9 [FREE Full text] [CrossRef] [Medline]
- Larkin BG, Burton KJ. Evaluating a case study using Bloom's Taxonomy of Education. AORN J 2008 Sep;88(3):390-402. [CrossRef] [Medline]
- Pickard M. The new Bloom's taxonomy: an overview for family and consumer sciences. J Fam Consum Sci Educ 2007;25(1):45-55.
- Reeves M. An application of Bloom's taxonomy to the teaching of business ethics. J Bus Ethics 1990;9(7):609-616. [CrossRef]
- Catts R, Lau J. UNESCO. 2008. Towards Information Literacy Indicators URL: http://www.uis.unesco.org/template/pdf/cscl/InfoLit.pdf [accessed 2011-01-29] [WebCite Cache]
- Thoman E. Skills and strategies for media education. Educ Leadership 1999;56(5):50-54.
- Ancker JS, Kaufman D. Rethinking health numeracy: a multidisciplinary literature review. J Am Med Inform Assoc 2007;14(6):713-721 [FREE Full text] [CrossRef] [Medline]
- Rudd RE, Moeykens BA, Colton TC. Health and literacy: a review of medical and public health literature. In: Comings J, Garners B, Smith C, editors. Annual Review of Adult Learning and Literacy. San Francisco, CA: Jossey-Bass; 2000:158-199.
- Laugksch RC. Scientific literacy: a conceptual overview. Sci Educ 2000;84(1):71-94. [CrossRef]
- Rudd R, Kirsch I, Yamamoto K. Policy Information Center, Educational Testing Service. 2004. Literacy and Health in America: Policy Information Report URL: http://www.ets.org/Media/Research/pdf/PICHEATH.pdf [accessed 2011-01-29] [WebCite Cache]
- Krathwohl DR. A revision of Bloom's taxonomy: an overview. Theory Into Pract 2002;41(4):212-218.
- Amer A. Reflections on Bloom's revised taxonomy. Electronic J Res Educ Psychol 2006(1):213-230 [FREE Full text] [WebCite Cache]
- Roth E, Patterson E, Mumaw R. Cognitive engineering: issues in user-centered system design. In: Marciniak JJ, editor. Encyclopedia of Software Engineering. 2nd edition. New York, NY: John Wiley; 2002:163-179.
- Patel VL, Arocha JF, Kaufman DR. A primer on aspects of cognition for medical informatics. J Am Med Inform Assoc 2001;8(4):324-343 [FREE Full text] [Medline]
- Patel V, Kaufman D. Cognitive science and biomedical informatics. In: Shortliffe E, Cimino J, editors. Biomedical Informatics: Computer Applications in Health Care and Biomedicine. 3rd edition. New York, NY: Springer; 2006:133-185.
- Schraagen J, Chipman S, Shalin V. Cognitive Task Analysis. Mahwah, NJ: Erlbaum; 2000.
- Ericsson KA, Simon HA. Protocol Analysis: Verbal Reports as Data. Revised edition. Cambridge, MA: MIT Press; 1993.
- Cotton D, Gresty K. Reflecting on the think-aloud method for evaluating e-learning. Br J Educ Technol 2006;37(1):45-54. [CrossRef]
- Clark MC, Czaja SJ, Weber RA. Older adults and daily living task profiles. Hum Factors 1990 Oct;32(5):537-549. [Medline]
- Zarcadoolas C, Pleasant A, Greer DS. Understanding health literacy: an expanded model. Health Promot Int 2005 Jun;20(2):195-203 [FREE Full text] [CrossRef] [Medline]
- Nielsen-Bohlman L, Panzer AM, Kindig DA. Health Literacy: A Prescription to End Confusion. Washington, DC: National Academies Press; 2004.
- Britto MT, Jimison HB, Munafo JK, Wissman J, Rogers ML, Hersh W. Usability testing finds problems for novice users of pediatric portals. J Am Med Inform Assoc 2009;16(5):660-669 [FREE Full text] [CrossRef] [Medline]
- Epstein RM, Alper BS, Quill TE. Communicating evidence for participatory decision making. JAMA 2004 May 19;291(19):2359-2366 [FREE Full text] [CrossRef] [Medline]
- National Science Board. Science and Engineering Indicators 2006. Arlington, VA: National Science Foundation; 2006. URL: http://www.nsf.gov/statistics/seind06/pdf/volume1.pdf [accessed 2011-05-31] [WebCite Cache]
- Nielsen J. Heuristic evaluation. In: Nielsen J, Mack RL, editors. Usability Inspection Methods. New York, NY: Wiley; 1994:25-62.
Edited by G Eysenbach; submitted 02.02.11; peer-reviewed by C Norman, P Dykes; comments to author 25.04.11; revised version received 05.07.11; accepted 06.07.11; published 17.11.11
Copyright©Connie V Chan, David R Kaufman. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 17.11.2011.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.