Published in Vol 10, No 1 (2008): Jan-Mar

What Do Evaluation Instruments Tell Us About the Quality of Complementary Medicine Information on the Internet?

Original Paper

School of Nursing and Community Studies, University of Plymouth, Plymouth, UK

Corresponding Author:

Ray Jones, PhD

School of Nursing and Community Studies

University of Plymouth

Faculty of Health and Social Work

Plymouth PL4 8AA

United Kingdom

Phone: +44 1752 233886

Fax: +44 1752 586748

Email: ray.jones@plymouth.ac.uk


Background: Developers of health information websites aimed at consumers need methods to assess whether their website is of “high quality.” Due to the nature of complementary medicine, website information is diverse and may be of poor quality. Various methods have been used to assess the quality of websites, the two main approaches being (1) to compare the content against some gold standard, and (2) to rate various aspects of the site using an assessment tool.

Objective: We aimed to review available evaluation instruments to assess their performance when used by a researcher to evaluate websites containing information on complementary medicine and breast cancer. In particular, we wanted to see if instruments used the same criteria, agreed on the ranking of websites, were easy to use by a researcher, and if use of a single tool was sufficient to assess website quality.

Methods: Bibliographic databases, search engines, and citation searches were used to identify evaluation instruments. Instruments were included that enabled users with no subject knowledge to make an objective assessment of a website containing health information. The elements of each instrument were compared to nine main criteria defined by a previous study. Google was used to search for complementary medicine and breast cancer sites. The first six results and a purposive six from different origins (charities, sponsored, commercial) were chosen. Each website was assessed using each tool, and the percentage of criteria successfully met was recorded. The ranking of the websites by each tool was compared. The use of the instruments by others was estimated by citation analysis and Google searching.

Results: A total of 39 instruments were identified, 12 of which met the inclusion criteria; the instruments contained between 4 and 43 questions. When applied to 12 websites, 10 of the 12 instruments agreed on the rank order of the sites. Instruments varied in the range of criteria they assessed and in their ease of use.

Conclusions: Comparing the content of websites against a gold standard is time consuming and only feasible for very specific advice. Evaluation instruments offer gateway providers a method to assess websites. The checklist approach has face validity when results are compared to the actual content of “good” and “bad” websites. Although instruments differed in the range of items assessed, there was fair agreement between most available instruments. Some were easier to use than others, but these were not necessarily the instruments most widely used to date. Combining some of the better features of instruments to provide fewer, easy-to-use methods would be beneficial to gateway providers.

J Med Internet Res 2008;10(1):e3

doi:10.2196/jmir.961


While the ever-expanding supply of online health information might be seen as a positive step in consumer empowerment (between 36% and 55% of Internet users access online health information [1-4]), several studies have highlighted problems with the quality of the information [5-8]. Searching for relevant and reputable complementary medicine information is particularly difficult [9], in part because of methodological challenges. Schmidt and Ernst [10] report that claims are made that can put consumers at risk; in some cases, adherence to advice obtained from the Internet has had serious consequences [8,11]. This is of particular concern for people who may be vulnerable, such as those affected by cancer [12]. In the case of hydrazine sulfate poisoning [8], inaccurate and exaggerated claims of effectiveness and a lack of information on side effects were blamed for misleading a consumer who assumed the substance was safe.

A 2002 study estimated that 5 million adults in England lack basic literacy [13]. Furthermore, understanding health information may be a complex process requiring more than basic literacy skills as even well-educated people can have difficulties making sense of it [14]. A US study of over 350 health sciences students showed that while many rated themselves as possessing good research skills, only a small proportion were able to demonstrate that they could identify reliable information [15]. While consumers may regularly make judgments of the quality of information received through traditional media such as newspapers, books, or leaflets, quality indicators for Internet content may not be as evident to users [16].

Consumers looking for health information are likely to select the first few links returned by search engines and tend not to look for information about site authors or for disclaimers that sites may make [17]. Studies have found that when consumers evaluate the quality of health information on the Internet, they tend to rely on endorsement by government agencies or professional organizations, their own perception of the reliability of the website source, and the understandability of the information [18,19].

Several strategies have been designed to help health information seekers access high-quality information, including codes of conduct, gateway sites (portals), and evaluation instruments. The development of instruments has received the greatest attention, and some suggest that their use by consumers can educate the user as to the characteristics of a good quality website [16], but the behavior of the majority of consumers suggests that gateways may be the best approach. Evaluation instruments can still provide a method for researchers to help choose links for gateways. Evaluation instruments work on the premise that sites conforming to indicators of quality are likely to contain accurate information, accuracy being defined against a gold standard of information in the field. While it is not possible for someone with no domain knowledge to assess accuracy, instruments may help such a user make judgments on quality and hence predict accuracy.

Gateways are collections of sites that have been prescreened and deemed of high enough quality to be approved by a governing organization. Examples of these are Healthfinder [20] and Intute [21]. Although maintaining portals can be labor intensive, organizations providing services such as complementary medicine for cancer need to be able to recommend sites to their patients. There are many instruments that have been designed, ranging from simple checklists to long and complex documents providing detailed accounts of assessment methodologies, and organizations running a portal need to choose which to use. Three characteristics that would seem important are (1) agreement with other instruments when rating a website, (2) ease of use, and (3) longevity. On the latter, many instruments seem to have a very limited life span. In 1998, for example, Jadad and Gagliardi identified 47 instruments used to rate the quality of health information on the Internet [22], but 4 years later [23], only six of these instruments still existed.

Our study was conducted in a center providing complementary care for people affected by cancer. The aim of this study was to identify website evaluation instruments and to assess their performance when used by a researcher to evaluate a sample of 12 websites on complementary medicine for people with breast cancer. In particular, we asked the following:

  • Do the instruments use the same criteria, and do they agree on the ranking of websites?
  • How easy are the different instruments for a researcher to use?
  • Are these instruments likely to remain in use such that future readers will appreciate the assessment method used?
  • Could we identify a pragmatic approach to identify good quality complementary medicine websites using existing instruments?

Literature Search for Evaluation Instruments

We defined an evaluation instrument as something that an Internet user could use to assess the quality of a website containing health information. To identify evaluation instruments, search terms were based on previous papers that had attempted to identify instruments for evaluating the quality of Internet health information [22,24,25]. The databases Medline, AMED, BNI, CINAHL, EMBASE, and PsycINFO were searched in February 2007 using the following terms: “evaluat* OR assess* OR rating OR rat* OR ranking OR rank* OR quality OR criteria AND website* OR world wide web OR Internet.” This yielded 29,622, 233, 123, 14,859, 8,678, and 10,593 results, respectively. Where a database returned more than 1000 results, only the most recent were examined.

In addition to the above databases, the search engines Google, MSN, Yahoo, and WebCrawler were searched using the following terms: “evaluate OR assess OR rating OR criteria OR quality AND websites OR Internet.” This yielded 212,000,000, 25,410,704, 38,400,000, and 28 results, respectively. With the exception of WebCrawler (28 results), the first 100 results from each engine were examined. Research papers and their bibliographies were also examined for relevant references. Several large studies that had systematically searched for and identified instruments for assessing Internet health information were found [22-25].

Instruments were selected if they provided the user with explicit instructions for evaluating the quality of a website containing health information. Although the HON Code of Conduct (HONcode) is primarily a code of conduct rather than a conventional rating instrument, its evaluation criteria were included in this report.

Internet Search for Complementary Medicine Websites

To search for websites to be assessed, a search for “complementary (medicine OR therapies) AND breast cancer” was performed in February 2007 using the Google search engine. This resulted in 1,170,000 hits. The first six results were selected on the basis that people are most likely to look at only the first few results produced by a search engine [17]. Another six results were chosen purposively to obtain a selection of sites with different purposes and origins: sites belonging to charities, sponsored sites, and sites selling products.

Assessment of Websites

The 12 websites were evaluated using each of the 12 evaluation instruments (ie, 144 assessments). Each site was given a mark using the individual scoring system for each instrument, which was then converted to a percentage score. Some instruments gave negative scores for failing to meet criteria; therefore, it was possible for a negative score to be obtained. Sites were then ranked from 1 (best) to 12 (worst) based on these scores.
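
To make the scoring step concrete, here is a minimal sketch of the conversion and ranking procedure described above; the instrument range, site names, and raw scores are hypothetical, and the tie-handling (competition ranking) is inferred from the shared ranks visible in Table 4 rather than stated explicitly in the text.

```python
# A minimal sketch of the scoring step described above: convert each site's raw
# instrument score to a percentage of the maximum possible score and rank the
# sites from 1 (best) downward. The instrument range and raw scores below are
# hypothetical; ties share the best available rank, as in Table 4.

def to_percentage(raw_score, max_score):
    """Express a raw score as a percentage of the instrument's maximum.
    Instruments that deduct marks can therefore produce negative percentages."""
    return 100.0 * raw_score / max_score

def rank_sites(percentages):
    """Competition ranking: the best percentage gets rank 1; tied sites share a rank."""
    ordered = sorted(percentages.values(), reverse=True)
    return {site: 1 + ordered.index(pct) for site, pct in percentages.items()}

# Hypothetical raw scores for four sites on an instrument scored out of 60.
raw = {"Site A": 58, "Site B": 53, "Site C": 53, "Site D": 38}
pct = {site: to_percentage(score, 60) for site, score in raw.items()}
print({site: round(p) for site, p in pct.items()})  # {'Site A': 97, 'Site B': 88, ...}
print(rank_sites(pct))                              # {'Site A': 1, 'Site B': 2, 'Site C': 2, 'Site D': 4}
```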

Comparison of Evaluation Instruments

The range of criteria used by the identified instruments was compared to the nine main criteria that the Health Improvement Institute and Consumer Reports WebWatch (HIICRW) [26] identified from a review of 22 health information rating instruments. Agreement between instruments was assessed by computing Spearman rank correlations between the instruments’ rankings of the websites and presenting them as a correlation matrix.
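
A minimal sketch of this comparison is shown below, assuming the rankings are held in a pandas DataFrame with one column per instrument; the three example columns reproduce the WEB FEET, HONcode, and Emory rankings from Table 4. The paper does not state which software was used, so the pandas call here is simply one convenient way to obtain the Spearman matrix.

```python
# Sketch: pairwise Spearman rank correlations between instruments. Each column
# holds one instrument's ranking of the 12 websites (1 = best); the three
# example columns reproduce the WEB FEET, HONcode, and Emory rows of Table 4.
import pandas as pd

rankings = pd.DataFrame({
    "WEB FEET": [6, 8, 1, 2, 4, 10, 7, 2, 10, 12, 9, 4],
    "HONcode":  [9, 2, 9, 2, 1, 5, 5, 2, 9, 9, 5, 5],
    "Emory":    [6, 7, 2, 3, 3, 8, 1, 8, 11, 11, 3, 10],
})

# Spearman is appropriate because the data are already ranks; tied ranks are
# handled by average ranking inside pandas.
print(rankings.corr(method="spearman").round(2))
```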

Illustrative Comparison of Best and Worst Sites

The range of content on each site made comparison against a gold standard impossible. Nevertheless, we sought some “face validity” by checking whether sites ranked as “good” or “poor” by these evaluation instruments matched a common-sense reading of their content. Statements made on the site ranked best by the sum of the 12 instruments’ rankings were compared with those on the site ranked worst.

Citation Search for Use of Evaluation Instruments

A citation search on Web of Science was carried out using the original papers describing the instruments. A sample of papers citing each original paper was reviewed, and the number of papers that had used the tool was estimated. A citation search on Google was also carried out using each instrument’s Web address, and a sample of websites citing that address was checked to confirm that the citation was correct. The number of citations on Web of Science or Google was classified as low (less than 10), medium (11-100), or high (greater than 100).
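
As a small illustration of the banding step, the helper below maps a citation count onto the three classes used above; note that the published bands (less than 10, 11-100, greater than 100) leave a count of exactly 10 unassigned, so grouping it with “low” here is an assumption.

```python
def classify_citations(count: int) -> str:
    """Band a citation count as 'low', 'medium', or 'high'.

    The paper uses low (<10), medium (11-100), and high (>100); a count of
    exactly 10 is treated as 'low' here by assumption.
    """
    if count <= 10:
        return "low"
    if count <= 100:
        return "medium"
    return "high"

# Example: a tool cited 7 times on Web of Science and 240 times on Google.
print(classify_citations(7), classify_citations(240))  # low high
```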

Longevity of Instruments

The URLs of instruments that had been identified in four previous studies were checked (as part of the literature search) to see if they still existed. Instruments were reported as unavailable if the original URL was not found and searching the original site or Google for the instrument did not locate it.
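
As an illustration only, a present-day re-implementation of the availability check might look like the sketch below, which sends an HTTP HEAD request to each stored URL; this is not the authors’ procedure, which also involved searching the original site and Google before declaring an instrument unavailable.

```python
# Sketch: check whether previously identified instrument URLs still resolve.
# A failed request would only be the first step; the study also searched the
# host site and Google before marking an instrument as unavailable.
import urllib.error
import urllib.request

def url_available(url: str, timeout: float = 10.0) -> bool:
    """Return True if a HEAD request to the URL succeeds (2xx/3xx status)."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (urllib.error.URLError, TimeoutError, ValueError):
        return False

# Example URLs taken from the reference list of this paper.
for url in ["http://www.discern.org.uk/discern_instrument.php",
            "http://www.hon.ch/HONcode/Conduct.html"]:
    print(url, "->", "available" if url_available(url) else "not found")
```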


Evaluation Instruments Available

A total of 39 instruments that disclosed their criteria and aimed to help users identify good quality information online were identified. Of these, 12 met our inclusion criteria (Table 1); the other 27 were excluded (Table 2). Instruments were selected if they provided the user with a set of objective, closed questions that could be applied to a website containing health information by someone with no prior subject knowledge and without having to consult sources other than the website being assessed. Reasons for exclusion of the 27 instruments included the following:

  • A consumer could not apply the instrument without further knowledge (eg, “Is the information written by reputable authors?”).
  • Scoring details were unavailable (eg, instructions stated to score each criterion on a scale of 1-5, but no further information was given on how to allocate a value).
  • Questions were not objective (eg, “Are the graphics attractive?”).
  • Instrument was not designed specifically for health information.
  • Questions were open ended (eg, “What are the author’s qualifications?”).
  • Instrument took the form of a tutorial giving tips on how to find reliable health information on the Internet but could not be applied as an assessment instrument.

Websites Sampled

Table 3 shows the websites that were rated using the instruments: four of the sites were run by UK charities, two were selling products, and three were US sites offering cancer treatment. One site was run by a network of health professionals, one was funded by on-site advertising, and one was funded by sponsors.

Table 1. Evaluation instruments used in the study
Evaluation Instrument | Method of Assessment | Ease of Use (Researcher Assessment) | Comprehensiveness (HIICRW Criteria Met, out of 9)
WEB FEET HEALTH Collection: Criteria for Site Selection (WEB FEET) [27] | 24 statements: agree or disagree | + Straightforward questions; − Time consuming to apply | 7
HONcode [28] | 8 desirable properties | + Short tool, quick to apply; + Each element includes guidelines | 4
Emory University Rollins School of Public Health, Health-Related Web Site Evaluation Form (Emory) [29] | 36 statements: +1 disagree, +2 agree, 0 N/A; Score: 0-60 | + Interpretation of score; − Time consuming | 7
University of Michigan Web Site Evaluation Checklist (Michigan) [30] | 43 questions, with variety of positive and negative scores; Score: −80 to +80 | + Printable rating form; + Interpretation of score; − Time consuming; − Complex scoring system | 6
Kellogg Library (University of Dalhousie), Evaluation of Health Information on the Internet (Kellogg) [31] | 31 questions: agree or disagree | + Simple questions, straightforward to use; − Time consuming | 8
DISCERN Quality Criteria for Consumer Health Information (DISCERN) [32] | 16 questions on 5-point analogue scale from “No” to “Yes”; Overall score: 1-5 | + Explanation of criteria; + Interpretation of score; − Answers on visual analogue scale more difficult / time consuming | 5
National Center for Complementary and Alternative Medicine (NCCAM), 10 Things to Know About Evaluating Medical Resources on the Web (NCCAM) [33] | 10 questions, each with explanation | + Clear guidance of how to use criteria; − No scoring system | 6
US Pharmacist tool (Pharm) [34] | 15 questions with yes/no answers | + Easy to apply; − No scoring system | 6
Minervation Validation Instrument for Health Care Web Sites (Minervation) [35] | Semi-automated tool; requires URL of site being assessed; drop-down menus to answer questions of content and usability; rating automatically calculated; Score: 0-100% | + Automated usability check; + Drop-down menus, fast; + Interpretation of score | 5
Nicoll LH, author’s guidelines (Nicoll) [36] | Mnemonic (PLEASED) with yes/no questions, each with author justification of importance | + Quick to use; + Explanation of criteria; − No scoring system | 5
Silberg et al, authors’ guidelines (Silberg) [37] | 4 items that should be met | + Quick to apply; − Does not assess aspects unique to Internet information | 3
Sandvik score (Sandvik) [38] | 7 questions, each with 3 options scored 0-2; Score: 0-14 | + Quick to apply; + Simple scoring system | 3
Table 2. Evaluation instruments excluded
Evaluation Instrument | Requires Further Knowledge | Scoring Details Unavailable | Questions Not Objective | Not Health Specific | Open-Ended Questions | Tutorial
Quality Criteria for Health Related Websites [39]x
Net Scoring Criteria to Assess the Quality of Health Internet Information [40]x
Criteria for Evaluating the Quality of Health Information on the Internet [41]xx
Administration Design Quality Web Site Evaluation Method [42]x
Evaluating Websites [43]xx
Navigating the Health Care System: How to Evaluate Health Information on the Internet [44]xx
Rating Criteria and Excellence Awards [45]x
Clean Bill of Health Award [46]x
Health Website Rating (HWR) Project: HII Health Website Rating Instrument (HWRI) [47]x
Clearing House*x
Best of the Web in Mental Health: Rating Guidelines [48]x
Commentary: Measuring Quality and Impact of the World Wide Web [49]x
Evaluating Internet Health Information: A Tutorial From the National Library of Medicine [50]x
MedlinePlus Guide to Healthy Web Surfing [51]x
Taking Charge of Health Information [52]x
How to Evaluate Health Information on the Internet: Questions and Answers [53]x
How to Find the Most Trustworthy Health Information on the Internet [54]x
Internet Detective [55]xx
Internet for Health and Well-Being [56]xx
Suggestions for Using the Internet to Find New Cancer Treatments [57]x
Internet Health Coalition*x
How to Judge the Quality of a Web Site [58]xx
Intute: Health and Life Sciences Evaluation Guidelines [59]xx
Best Practice Web Assessments: Evaluation Criteria [60]xx
Evaluation Form Used for LASIK Websites [61]x
Quality Standards for Medical Publishing on the Web [62]x
Evaluating Internet Resources in Complementary and Alternative Medicine [63]xx

*Instruments became unavailable between initial search (February 2007) and final submission of paper (November 2007).

Table 3. Twelve websites on complementary medicine and breast cancer
Website | Purpose of Site | Reason for Inclusion
Breast Cancer Care [64] | A UK charity aimed at providing information and support for people affected by breast cancer; National Health Service (NHS) information partner. | 1st on Google
Breast Cancer Haven [65] | A UK charity that runs day centers offering support, information, and complementary therapies to people affected by breast cancer. | 2nd on Google
CancerHelp UK [66] | A UK information service for people with cancer and their families run by the Cancer Research UK charity for cancer and cancer care. | 3rd on Google
Imaginis [67] | An independent resource for information and news on breast cancer and related women’s health topics. | 4th on Google
MD Anderson Cancer Center [68] | An information service run by the University of Texas MD Anderson Cancer Center, which offers medical services to people with cancer. | 5th on Google
Cancer Treatment Centers of America [69] | An information service run by Cancer Treatment Centers of America, a network of cancer treatment hospitals and facilities offering conventional and complementary therapies. | 6th on Google
Cancerbackup [70] | A cancer information charity offering information, practical advice, and support for cancer patients, their families, and caregivers. | Charity
Heart Spring [71] | A resource for alternative and complementary health information funded by advertising and product sales. | Sponsored: product advertisements
Issels Treatment [72] | Information produced by Issels Medical Center, a private organization offering alternative treatment for cancer. | Private cancer center
Alternative Cancer [73] | A site run by an individual selling a guide to complementary and alternative cancer treatments. | Commercial
MedicineNet [74] | Medical information written by a network of medical professionals. | Sponsored: product advertisements
Elbee Global [75] | A site selling herbal medicines for people with cancer. | Commercial

Assessment of Evaluation Instruments

Comprehensiveness

The HIICRW [26] defined nine criteria that an evaluation instrument should cover. Assessment of each evaluation instrument against the HIICRW criteria showed considerable variation, implying little consensus on quality markers for websites. Although assessing more criteria does not necessarily make an evaluation instrument superior, it is notable that two of the better-known instruments (HONcode and DISCERN) assessed relatively few of the items described by the HIICRW (see Table 1).

Ease of Use

Table 1 shows the researcher’s subjective view of the evaluation instruments’ ease of use. Time taken is an important component of ease of use; answering the University of Michigan checklist’s 43 questions was extremely time consuming, in contrast to the automated Minervation instrument, which could be applied very quickly. Some instruments were not designed to provide numerical scores. It was useful to have some interpretation of how many criteria a website should meet to be considered of good or poor quality. Instruments varied in how fully they explained their criteria; it was helpful to have further guidance available for answering questions, as provided by HONcode and DISCERN.

Ranking of Websites

Table 4 shows the percentage score for each of the 12 websites and the ranking from best (1) to worst (12) by each instrument and overall. It was notable that the well-known UK charity site Cancerbackup came only 4th in the overall ranking and that the WEB FEET tool ranked it 7th, well behind the Elbee Global website; the HONcode ranked it 5th, on a par with the Elbee Global website. Overall, the best site was Imaginis and the worst was Alternative Cancer.

Comparison of Evaluation Instruments

Table 5 shows the agreement (rank correlations) among instruments on the ranking of the 12 websites from best to worst. Where there is a significant correlation (eg, between Michigan and Kellogg), using either tool would give similar results. The matrix shows that WEB FEET and HONcode seemed to assess different characteristics from the other instruments.

Recognition and Use of Instruments

Recognition, citation, and use of instruments are necessary if they are to survive. Table 6 shows the level of citation in Web of Science of the papers describing the instruments and the number of citations of the instruments’ website addresses on Google.

Comparison of Best and Worst Sites

Table 7 shows illustrative extracts of statements made in the best and worst ranked sites. As would be expected, the best site (Imaginis) took a balanced and cautious approach to all claims. The Alternative Cancer site, rated the worst, made claims that were exaggerated or difficult to prove or disprove.

Longevity of Evaluation Instruments

Table 8 shows four studies that previously searched for and identified evaluation instruments and how many of those instruments were still available in November 2007.

Table 4. Ranking and percentage score of websites (rank, with percentage score in parentheses)
Evaluation Instrument | Breast Cancer Care | Breast Cancer Haven | Cancer Help | Imaginis | MD Anderson | Cancer Treatment Centers | Cancerbackup | Heart Spring | Issels | Alternative Cancer | MedicineNet | Elbee Global
WEB FEET | 6 (75%) | 8 (67%) | 1 (92%) | 2 (83%) | 4 (79%) | 10 (58%) | 7 (71%) | 2 (83%) | 10 (58%) | 12 (46%) | 9 (63%) | 4 (79%)
HONcode | 9 (50%) | 2 (88%) | 9 (50%) | 2 (88%) | 1 (100%) | 5 (63%) | 5 (63%) | 2 (88%) | 9 (50%) | 9 (50%) | 5 (63%) | 5 (63%)
Emory | 6 (89%) | 7 (86%) | 2 (95%) | 3 (92%) | 3 (92%) | 8 (84%) | 1 (97%) | 8 (84%) | 11 (63%) | 11 (63%) | 3 (92%) | 10 (68%)
Michigan | 7 (28%) | 6 (45%) | 2 (50%) | 1 (53%) | 2 (50%) | 8 (21%) | 5 (48%) | 9 (16%) | 10 (14%) | 11 (1%) | 4 (49%) | 12 (−14%)
Kellogg | 5 (70%) | 5 (70%) | 1 (90%) | 1 (90%) | 3 (77%) | 9 (50%) | 4 (73%) | 5 (70%) | 10 (33%) | 11 (27%) | 8 (63%) | 11 (27%)
DISCERN | 8 (66%) | 4 (76%) | 2 (80%) | 5 (74%) | 2 (80%) | 9 (52%) | 1 (89%) | 5 (74%) | 10 (46%) | 12 (31%) | 7 (69%) | 11 (39%)
NCCAM | 5 (60%) | 5 (60%) | 3 (70%) | 1 (90%) | 1 (90%) | 10 (50%) | 3 (70%) | 5 (60%) | 5 (60%) | 11 (40%) | 5 (60%) | 12 (20%)
Pharm | 6 (80%) | 7 (73%) | 2 (93%) | 2 (93%) | 4 (87%) | 9 (60%) | 1 (100%) | 8 (67%) | 12 (33%) | 10 (47%) | 4 (87%) | 10 (47%)
Minervation | 4 (73%) | 7 (62%) | 2 (79%) | 3 (74%) | 6 (71%) | 5 (64%) | 1 (82%) | 11 (46%) | 9 (60%) | 10 (48%) | 7 (62%) | 12 (34%)
Nicoll | 3 (71%) | 7 (57%) | 1 (86%) | 3 (71%) | 3 (71%) | 9 (29%) | 1 (86%) | 8 (43%) | 12 (14%) | 9 (29%) | 3 (71%) | 9 (29%)
Silberg | 7 (25%) | 7 (25%) | 3 (75%) | 3 (75%) | 1 (100%) | 7 (25%) | 3 (75%) | 1 (100%) | 7 (25%) | 7 (25%) | 3 (75%) | 12 (0%)
Sandvik | 7 (64%) | 7 (64%) | 2 (79%) | 2 (79%) | 1 (86%) | 10 (29%) | 6 (71%) | 2 (79%) | 9 (36%) | 11 (21%) | 2 (79%) | 12 (7%)
Overall rank | 8 | 7 | 2 | 1 | 3 | 9 | 4 | 6 | 10 | 12 | 5 | 11
Table 5. Spearman nonparametric correlation coefficients between evaluation instruments, based on assessment of the websites
Instrument | WEB FEET | HONcode | Emory | Michigan | Kellogg | DISCERN | NCCAM | Pharm | Minervation | Nicoll | Silberg | Sandvik
WEB FEET | 1.00
HONcode | .35 | 1.00
Emory | .51 | .25 | 1.00
Michigan | .48 | .38 | .87* | 1.00
Kellogg | .71* | .39 | .84* | .89* | 1.00
DISCERN | .55 | .47 | .86* | .77* | .87* | 1.00
NCCAM | .55 | .39 | .78* | .89* | .92* | .82* | 1.00
Pharm | .53 | .28 | .98* | .88* | .87* | .84* | .80* | 1.00
Minervation | .30 | −.02 | .85* | .79* | .79* | .70 | .75* | .85* | 1.00
Nicoll | .55 | .14 | .97* | .82* | .83* | .82* | .74* | .97* | .82* | 1.00
Silberg | .51 | .51 | .61 | .66 | .71 | .71* | .75* | .64 | .37 | .59 | 1.00
Sandvik | .59 | .51 | .72* | .83* | .82* | .77* | .85* | .73* | .48 | .70 | .93* | 1.00

*Correlation is significant at the 0.01 level (2-tailed).

Correlation is significant at the 0.05 level (2-tailed).

Table 6. Number of citations in Web of Science and Google (classified as low, medium, and high) suggesting use of instruments (NPI: no paper identified)
Evaluation Instrument | Web of Science | Google
WEB FEET | NPI | Low
HONcode | Medium | High
Emory | NPI | Medium
Michigan | NPI | Low
Kellogg | NPI | Low
DISCERN | Medium | High
NCCAM | NPI | High
Pharm | NPI | Low
Minervation | NPI | Low
Nicoll | Not cited | Low
Silberg | High | Low
Sandvik | Medium | Medium
Table 7. Comparison of statements from websites rated best and worst

Best site (Imaginis): “...anecdotal evidence reveals that many alternative or complementary medicines may be beneficial to patients, extensive research is still needed to determine whether non-traditional medicines are truly effective.”
Worst site (Alternative Cancer): “Proven Therapies. Includes a list of successful, long-standing alternative treatments from around the world going unused by the conventional medical system. There is one reason they are the oldest - in the hands of experienced practitioner they work! For example: the very successful nutritional based Gerson therapy. It has been used by untold thousands of people worldwide for over 50 years.”

Best site (Imaginis): “Chinese herbs have been shown to lessen the side effects of chemotherapy and acupuncture has been shown to reduce nausea (a possible side effect of chemotherapy and other drug therapies).”
Worst site (Alternative Cancer): “Every day worldwide, quietly behind the scenes, there are over 100 proven alternative therapies used successfully against cancer. (Get a FREE list of the 78 most popular below) The problem is, nobody bothers to tell the public. Plus, conventional cancer doctors (MD Oncologists) are not taught anything about them in medical schools. This must change!”

Best site (Imaginis): “Not all alternative or complementary medicines are safe.”
Worst site (Alternative Cancer): “The one true secret to success: There are six basic types of proven alternative cancer treatments, and you must use them all together.”

Best site (Imaginis): “In a recent study published in the Journal of the National Cancer Institute, researchers found that advanced breast cancer patients with high stress levels were less likely to live as long as patients who coped well with stress.”
Worst site (Alternative Cancer): “Anvirzel® A new weapon against cancer and AIDS from Ozelle Pharmaceuticals - a herbal extract which is nontoxic and causes no adverse side effects. Closed clinical trials are showing that the drug is especially effective against prostate and breast cancer. The materials of the company promoting Anvirzel say that Dr Ozel treated 494 cancer patients with the extract, resulting in a high rate of success. The company has organized phase I and II trials in Ireland, and states that the trials confirmed the efficacy of the extract in cancer. They say the patients were improved in their quality of life as well as regression of cancer, reporting no notable side effects. Best results were said to be in prostate, lung and brain cancers. Sarcomas showed stabilization.”

Best site (Imaginis): “Some preliminary studies have shown that vitamins may help reduce risk of breast cancer or treat the disease.”
Worst site (Alternative Cancer): “Artemisinin. A Chinese herb, sweet wormwood (qinghao in Chinese). In test tube studies, breast cancer cell research resulted in a 28% reduction of breast cancer cells treated only with artemisinin, and an amazing 98% decrease in breast cancer cells within 16 hours that were treated with artemisinin and an iron-enhancing molecule, transferrin. These treatments had no significant effect on normal human breast cells. This research pointed to the involvement of free iron in the toxic effect of artemisinin toward cancer cells, basically sparing healthy cells. (‘Selective toxicity of dihydroartemisinin and holotransferrin toward human breast cancer cells,’ Life Sciences 70 {2001) 49-56.”
Table 8. Instruments identified in previous studies still available in 2007
Study | Year of Study | No. of Instruments Identified | No. of Instruments Available in November 2007
Jadad and Gagliardi [22] | 1998 | 14 | 3
Kim et al [25] | 1999 | 27 | 7
Gagliardi and Jadad [23] | 2002 | 5 | 1
Bernstam et al [24] | 2005 | 17 | 3

Limitations of This Study

Our study has some limitations. Selection of instruments, website ratings, and comparison against the HIICRW criteria were performed by only one researcher. Interobserver variation may mean that some instruments eligible for inclusion were missed and that some we excluded might have been included by other reviewers. The nature of the instruments being sought does not lend itself to very specific search terms, so our searches produced many results; we may have found more tools by examining a greater number of search results or by searching other databases. Two instruments were excluded only because they were not health specific, and, in retrospect, that exclusion criterion may not have been warranted.

Application of the evaluation tools to particular websites may also have produced different results with other researchers. Bernstam et al, in a recent study [76], suggested that some quality criteria may have poor interobserver reliability. However, variation (both intraobserver and interobserver) is likely to be greater in the values attributed to individual items of an assessment tool than in overall results; when items are combined to give an overall rank, as we have done in this study, tools are more likely to give consistent results.

What This Study Offers

Although our study has limitations, our experience has a useful message for several groups of people:

  • For those assessing or developing gateways who may wish to use an evaluation instrument, this study provides information that may help select an instrument.
  • For authors of evaluation instruments, we identified those features that may be desirable to ensure their instrument is useable and useful.
  • For information seekers, we show which properties to look for when selecting an instrument and suggest which instruments may be preferable to others.
  • For developers of complementary medicine websites, we show the need to use “technical markers of quality” to ensure that their site achieves high scores when assessed by instruments.

Our study also suggests that the popular HONcode may assess quality in a different way than other instruments.

The Quality of Websites

Developers of websites or gateways on complementary medicine need some method of checking the quality of what they are presenting, and users of their websites need to be able to assess for themselves, and to believe, the claim that a site is of high quality. What does quality mean? Provost et al [77] define quality as the levels of excellence that characterize the content of a site based on accepted standards. At the very least, it should mean that the information presented is evidence based and that the evidence is available to be checked.

The Gold Standard Approach

Impicciatore et al [78] were among the first to assess the reliability of Web page information by comparing it against a gold standard. Others have followed this approach [7,79,80], but in every case they were able to focus on specific pieces of information or advice for which a gold standard was available. For example, Pandolfini et al [81] compared information on the management of cough in children against a gold standard. Assessing quality in this way is time consuming and, where websites present information on a broader range of topics, is not a feasible option. An evaluation method that allows a quicker test of quality is therefore attractive, and for this reason numerous evaluation instruments have been devised.

Does the Evaluation Instrument Approach Act as Good Proxy for Quality of Information?

Pandolfini et al [81] examined 19 Web pages and noted that no relationship was found between technical aspects, content completeness, and quality of information as compared to a gold standard. However, only one page received a high score on comparison against the gold standard, and this page also scored highly on the other two measures. In our study, we did not assess against a gold standard, but a simple comparison of the content of the best and worst sites identified by the evaluation instruments suggests our approach has face validity. However, we should remain cautious. While instruments are designed to assess the quality of information, they are concerned with quality indicators and therefore cannot take into account the accuracy of an individual piece of information. Eysenbach et al [5] consider it unlikely that a universal set of criteria could be developed that would predict the quality of health information websites, as there are complex relationships between quality indicators and the actual quality of information. The results of our study suggest that websites rated higher by the evaluation instruments seem less likely to contain exaggerated claims; however, Walji et al [82] analyzed 150 websites dealing with the use of ginseng, ginkgo, and St. John’s wort and concluded that domain-independent criteria may not be appropriate for assessing complementary and alternative medicine websites, suggesting that consumers should instead rely on authoritative providers of information. There may be specific challenges in accessing high-quality information on complementary medicine, but there are several initiatives aimed at providing high-quality, evidence-based information, including the Cancer Specialist Library [83], the National Center for Complementary and Alternative Medicine (NCCAM) [84], and Complementary and Alternative Medicine Evidence OnLine (CAMEOL) [85].

Validity, Reliability, and Agreement of Evaluation Instruments

The majority of available instruments have not been tested for reliability [24,86] or validity [86], and few include information describing their development process [82]. DISCERN and the Minervation tool appear to be the only instruments reported to have been tested for reliability and validity. Even among researchers, there is likely to be observer variation on various criteria. Bernstam et al [76] examined the degree to which two raters could reliably assess 22 popularly cited quality criteria on a sample of 42 complementary and alternative medicine websites and found poor agreement on 8 of the 22. Good definition of the quality criteria should improve agreement, but the level of agreement between most of the instruments used in this study suggests that complete “accuracy” may not be that important. Two of the instruments, HONcode and WEB FEET, did not agree well with the other 10 in ranking the sites from best to worst; it is not clear why. So although HONcode is used frequently, we felt it safer to use the instruments that agreed with one another, as most of them seemed to address most aspects identified by the HIICRW.
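
To make the idea of interobserver agreement concrete, the sketch below computes Cohen’s kappa for two hypothetical raters judging a single binary criterion across 10 sites; kappa is used here purely as a familiar example, since the agreement statistic used in the cited study is not restated in this paper.

```python
# Illustrative only: quantifying two-rater agreement on one binary quality
# criterion (met / not met) across a set of websites, using Cohen's kappa.
# The judgments below are hypothetical.

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters giving binary (0/1) judgments."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_a = sum(rater_a) / n                 # proportion of 'yes' from rater A
    p_b = sum(rater_b) / n                 # proportion of 'yes' from rater B
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return 1.0 if expected == 1.0 else (observed - expected) / (1 - expected)

# Hypothetical judgments on whether 10 sites disclose authorship.
rater_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
rater_2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 0]
print(round(cohens_kappa(rater_1, rater_2), 2))  # moderate agreement
```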

Ease of Use

Five of the 12 instruments were time consuming to apply. Bernstam et al [24] took the view that any tool containing more than 10 criteria is too long for routine use and that the majority of available instruments are not user friendly. While it may be useful to ask a wide range of questions about a site, it is important that applying an instrument remains practical. Our study suggests that greater coverage of criteria is not necessarily achieved by asking a large number of questions, although a very short tool is unlikely to cover a wide range of criteria. There was a great deal of variation in the usability of the instruments. The Minervation tool contains an automated feature that allows entry of a URL: it produces an accessibility rating, leaves the user to select answers to questions of reliability and usability from drop-down menus, then allocates scores for each section and overall and rates the site as “poor,” “fair,” or “good.” These automated features are in contrast to an instrument such as the one developed by Emory University, which was very time consuming to apply. Some instruments provide further guidance to help the user answer the questions, which we considered a useful attribute.

Range of Criteria

Eight of the instruments contained criteria concerning accessibility; although the details differed between instruments, this element asked questions about website design and layout and whether the page included a search engine or appropriate links for navigation. While accessibility might not seem directly related to the quality of the information contained in the pages, it is extremely important to the usefulness of a site.

Many websites “lost marks” because they did not display information about authorship. Eysenbach et al [5] suggest that this may be more a matter of convention than of quality, as it is not usual for organizations to display the names of individual authors. The way an instrument’s question is phrased may be crucial in informing users of the quality of a site. Concerning authorship, some instruments ask “Is the name of the author disclosed?”, which may show that the site has a good transparency policy but does not clarify quality, as it is still not known whether the author is suitably qualified to write on the topic. Similarly, regarding currency of the information, “Does the site display the date on which it was last updated?” is not as valuable as “Has the site been updated in the last 6 months?” Hence, an instrument covering the same criteria as another may produce a different rating because its questions are worded differently.
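
The practical difference between the two phrasings can be illustrated with the snippet below, which contrasts “a date is displayed” with “the date falls within the last 6 months”; the date handling, the 30-day month approximation, and the example dates are assumptions for illustration.

```python
# Sketch contrasting the two currency questions discussed above.
from datetime import date, timedelta
from typing import Optional

def displays_update_date(last_updated: Optional[date]) -> bool:
    """Weaker question: does the site display any 'last updated' date at all?"""
    return last_updated is not None

def updated_recently(last_updated: Optional[date], today: date, months: int = 6) -> bool:
    """Stronger question: was the site updated within the last `months` months
    (approximated here as 30-day months)?"""
    return last_updated is not None and (today - last_updated) <= timedelta(days=30 * months)

today = date(2007, 2, 1)                             # month of the website searches
print(displays_update_date(date(2004, 5, 12)))       # True  - a date is shown...
print(updated_recently(date(2004, 5, 12), today))    # False - ...but it is stale
print(updated_recently(date(2006, 11, 20), today))   # True
```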

Number and “Shelf Life” of Evaluation Instruments

Bernstam et al [24] apparently identified 273 instruments; however, they included tools such as “top traffic” that could not be utilized by an Internet user. They identified only seven instruments that could be applied by Internet users. We did not attempt to identify instruments that could not be applied to individual sites by an information seeker.

One problem with any technology assessment method is that if the method is no longer supported or in use, citation of its results by a gateway developer becomes obsolete. Previous studies [22,23] and our examination of earlier reviews show that many previously developed tools are no longer in use, and our own checks found that the number of available instruments has fallen. It may be that people have begun to use instruments already in existence rather than developing new ones. We examined citation of papers and Web addresses to estimate the current popularity of instruments, on the basis that more popular technologies are more likely to survive. (In another field, the story of the VHS tape outliving the apparently technically better Betamax illustrates the importance of “being popular.”) Some of the instruments that we reviewed (eg, Kellogg), although they agreed with other instruments and were easy to use, may not survive because they have no critical mass of use.

The Ultimate Evaluation Instrument

We aimed to identify the best method for assessing websites for inclusion in a gateway on complementary medicine for breast cancer. No single tool seemed to be the answer. The three most-cited instruments on Google appeared to be DISCERN, HONcode, and NCCAM. HONcode did not agree with the rankings produced by the other instruments and showed some quirks in its rankings. DISCERN seemed more difficult to apply than NCCAM, so if we chose one tool, it would be NCCAM. (This supports Walji’s assertion that complementary medicine requires domain-specific criteria.) However, we think that the authors of instruments might benefit from merging their methods to produce one tool. This has recently been argued by Provost et al [77] in reporting the development of the WebMedQual scale. They argued that harmonization of Internet-based health information evaluation efforts would benefit all users and international researchers. They reviewed the literature on rating scales and identified 384 different items used by 26 scales. Four expert reviewers rated items, eliminated duplicates, and reworded or deleted items that were not clear, meaningful, or measurable; that were thought unimportant, too general, or vague; or that could not feasibly be ascertained by an experienced but nonmedical Internet user. They ended up with the following constructs: content (19 items), authority of source (18 items), design (19 items), accessibility and availability (6 items), links (4 items), user support (9 items), confidentiality and privacy (17 items), and e-commerce (6 items). They claimed that their scale, consisting of 8 categories, 8 subcategories, 95 items, and 3 supplemental items to assess website quality, was the first step toward a standard tool that would be easy to use. However, from our experience of using NCCAM and other instruments, we question whether an instrument requiring 98 items would be quick and easy to use.
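
As a quick arithmetic check of the figure questioned above, the reported category item counts do sum to 98: 19 (content) + 18 (authority of source) + 19 (design) + 6 (accessibility and availability) + 4 (links) + 9 (user support) + 17 (confidentiality and privacy) + 6 (e-commerce) = 98, which matches the 95 core items plus 3 supplemental items.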

A recently developed method of assessing websites containing health information, CLUE W (personal communication, Philippe Desjardins, Laval University, 2007), is designed to assess the clinical usefulness of information to a health professional. Interestingly, this instrument calculates the usefulness of a site from a formula that incorporates validity and relevance of the information on the site as well as the work required to use this information. This instrument has undergone an extensive development process involving many health professionals. With many instruments already in existence, it will be interesting to see how much attention this new assessment method will attract.

Another new method, FA4CT [87], published after our search, differs from the checklist approach by asking users to compare the information they find with information on other sites; a checklist (the CREDIBLE checklist) is used only if discordant information is found. The authors refer to this as a second-generation educational model. Although this approach does not guarantee that information will be compared against a gold standard, the authors claim that it resembles the process experts go through when searching for, and checking, the accuracy of information on the Internet. New methods such as FA4CT may make the checklist approach obsolete, but in the meantime this study gives those developing gateways a practical guide to which assessment instruments may be useful.
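
As a purely illustrative sketch of the cross-checking idea described above, the function below accepts a claim only when several sources agree and falls back to a credibility checklist when they do not; the function names, the agreement rule, and the toy checklist are placeholders and do not reproduce the published FA4CT algorithm or the CREDIBLE checklist items.

```python
# Simplified illustration of the second-generation approach described above:
# compare a claim across several sources and apply a checklist only when the
# sources disagree. Names and rules here are placeholders, not FA4CT itself.

def cross_check(claim, answers_by_site, checklist_score):
    """Return a verdict for a claim by cross-checking answers from several sites."""
    distinct = set(answers_by_site.values())
    if len(distinct) == 1:
        # Concordant information across sources: accept without the checklist.
        return f"Sources agree on '{claim}': {distinct.pop()}"
    # Discordant information: score each source with a credibility checklist.
    scores = {site: checklist_score(site) for site in answers_by_site}
    best = max(scores, key=scores.get)
    return f"Sources disagree on '{claim}'; most credible source: {best}"

def toy_checklist(site):
    """Toy stand-in for a credibility checklist: counts simple markers per site."""
    markers = {"site_a": 3, "site_b": 1, "site_c": 2}  # eg, authorship, date, references
    return markers.get(site, 0)

print(cross_check(
    "Herb X shrinks breast tumours",
    {"site_a": "no evidence", "site_b": "proven cure", "site_c": "no evidence"},
    toy_checklist,
))
```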

Acknowledgments

We thank Helen Cooke, Helen Seers, Karen Pilkington, and other colleagues at Penny Brohn Cancer Care for helpful comments. MB is funded as a Knowledge Transfer Partnership Associate by the Economic & Social Research Council (ESRC).

Conflicts of Interest

The Penny Brohn Centre is a charity providing complementary care to people affected by cancer.

Authors' Contributions

Matthew Breckons contributed to the study design, carried out all data collection, most data analysis, and co-wrote the paper. Ray Jones contributed to the study design, carried out some of the data analysis, and co-wrote the paper. Jenny Morris contributed to the study design and edited the paper. Janet Richardson contributed to the study design, edited the paper, secured the KTP partner, designed the KTP project, and is responsible for the academic management of the KTP project.

  1. Satterlund MJ, McCaul KD, Sandgren AK. Information gathering over time by breast cancer patients. J Med Internet Res 2003 Aug 27;5(3):e15 [FREE Full text] [Medline] [CrossRef]
  2. Cole JI, Suman M, Schramm P, van Bel D, Lunn B, Maguire P, et al. The UCLA Internet Report: Surveying the Digital Future. Los Angeles, CA: UCLA Center for Communication Policy.   URL: http://www.digitalcenter.org/pdf/InternetReportYearOne.pdf [accessed 2008 Jan 11] [WebCite Cache]
  3. Fox S, Rainie L. The online health care revolution: how the Web helps Americans take better care of themselves. Washington, DC: Pew Internet & American Life Project; 2003.   URL: http://www.pewinternet.org/pdfs/PIP_Health_Report.pdf [accessed 2008 Jan 11] [WebCite Cache]
  4. Basch EM, Thaler HT, Shi W, Yakren S, Schrag D. Use of information resources by patients with cancer and their companions. Cancer 2004 Jun 1;100(11):2476-2483 [FREE Full text] [Medline] [CrossRef]
  5. Eysenbach G, Powell J, Kuss O, Sa ER. Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review. JAMA 2002;287(20):2691-2700 [FREE Full text] [Medline] [CrossRef]
  6. Pérez-López FR, Pérez Roncero GR. Assessing the content and quality of information on the treatment of postmenopausal osteoporosis on the World Wide Web. Gynecol Endocrinol 2006 Dec;22(12):669-675. [Medline] [CrossRef]
  7. Butler L, Foster NE. Back pain online: a cross-sectional survey of the quality of web-based information on low back pain. Spine 2003 Feb 15;28(4):395-401. [Medline] [CrossRef]
  8. Hainer MI, Tsai N, Komura ST, Chiu CL. Fatal hepatorenal failure associated with hydrazine sulfate. Ann Intern Med 2000 Dec 5;133(11):877-880 [FREE Full text] [Medline]
  9. Pilkington K, Richardson J. Evidence-based complementary (and alternative) medicine on the Internet. He@lth Information on the Internet 2003;34(1):7-9 [FREE Full text] [WebCite Cache]
  10. Schmidt K, Ernst E. Assessing websites on complementary and alternative medicine for cancer. Ann Oncol 2004 May;15(5):733-742 [FREE Full text] [Medline] [CrossRef]
  11. Weisbord SD, Soule JB, Kimmel PL. Poison on line--acute renal failure caused by oil of wormwood purchased through the Internet. N Engl J Med 1997 Sep 18;337(12):825-827. [Medline] [CrossRef]
  12. Richardson J, Pilkington K. Complementary and alternative medicine evidence online for cancer. Br J Cancer Manage 2005;2(2):9-11.
  13. Williams J, Clemens S, Oleinikova K, Tarvin K. The Skills for Life Survey: A National Needs and Impact Survey of Literacy, Numeracy and ICT Skills. London, UK: Department for Education and Skills; 2003.   URL: http://www.dfes.gov.uk/research/data/uploadfiles/RB490.pdf [accessed 2008 Jan 11] [WebCite Cache]
  14. Nielsen-Bohlman L, Panzer AM, Kindig DA. Health Literacy: A Prescription to End Confusion. Washington, DC: The National Academies Press; 2004.
  15. Ivanitskaya L, O'Boyle I, Casey AM. Health information literacy and competencies of information age students: results from the interactive online Research Readiness Self-Assessment (RRSA). J Med Internet Res 2006;8(2):e6 [FREE Full text] [Medline] [CrossRef]
  16. Commission of the European Communities, Brussels. eEurope 2002: Quality Criteria for Health Related Websites. J Med Internet Res 2002 Nov 29;4(3):E15 [FREE Full text] [Medline] [CrossRef]
  17. Eysenbach G, Köhler C. How do consumers search for and appraise health information on the world wide web? Qualitative study using focus groups, usability tests, and in-depth interviews. BMJ 2002 Mar 9;324(7337):573-577 [FREE Full text] [Medline] [CrossRef]
  18. Marshall LA, Williams D. Health information: does quality count for the consumer? How consumers evaluate the quality of health information materials across a variety of media. J Libr Inf Sci 2006;38(3):141-156. [CrossRef]
  19. Schwartz KL, Roe T, Northrup J, Meza J, Seifeldin R, Neale AV. Family medicine patients' use of the Internet for health information: a MetroNet study. J Am Board Fam Med 2006;19(1):39-45 [FREE Full text] [Medline]
  20. Healthfinder.   URL: http://www.healthfinder.gov/ [accessed 2008 Jan 15] [WebCite Cache]
  21. Intute: Health and Life Sciences.   URL: http://www.intute.ac.uk/healthandlifesciences/medicine [accessed 2008 Jan 15] [WebCite Cache]
  22. Jadad AR, Gagliardi A. Rating health information on the Internet: navigating to knowledge or to Babel? JAMA 1998 Feb 25;279(8):611-614 [FREE Full text] [Medline] [CrossRef]
  23. Gagliardi A, Jadad AR. Examination of instruments used to rate quality of health information on the internet: chronicle of a voyage with an unclear destination. BMJ 2002 Mar 9;324(7337):569-573 [FREE Full text] [Medline] [CrossRef]
  24. Bernstam EV, Shelton DM, Walji M, Meric-Bernstam F. Instruments to assess the quality of health information on the World Wide Web: what can our patients actually use? Int J Med Inform 2005 Jan;74(1):13-19. [Medline] [CrossRef]
  25. Kim P, Eng TR, Deering MJ, Maxfield A. Published criteria for evaluating health related web sites: review. BMJ 1999 Mar 6;318(7184):647-649 [FREE Full text] [Medline]
  26. Health Improvement Institute and Consumer Reports WebWatch. A Report on the Evaluation of Criteria Sets for Assessing Health Web Sites. Bethesda, MD: Health Improvement Institute; 2003.   URL: http://www.consumerwebwatch.org/dynamic/health-report-assessing-health-sites-abstract.cfm [accessed 2008 Jan 15] [WebCite Cache]
  27. Web Feet Health Collection Criteria for Site Selection. Web Feet.   URL: http://www.webfeetguides.com/criteria_WFforHealth.html [accessed 2008 Jan 15] [WebCite Cache]
  28. HON Code of Conduct (HONcode) for Medical and Health Web Sites. Health On the Net Foundation.   URL: http://www.hon.ch/HONcode/Conduct.html [accessed 2008 Jan 15] [WebCite Cache]
  29. Health-Related Web Site Evaluation Form. 1998. Emory University Rollins School of Public Health.   URL: http://www.sph.emory.edu/WELLNESS/instrument.html [accessed 2008 Jan 15] [WebCite Cache]
  30. Web Site Evaluation Checklist. 1999. University of Michigan.   URL: http://www-personal.umich.edu/~pfa/pro/courses/WebEvalNew.pdf [accessed 2008 Jan 15] [WebCite Cache]
  31. Evaluation of Health Information on the Internet. 2001. Kellogg Library of Dalhousie University.   URL: http://www.library.dal.ca/kellogg/internet/evaluate.htm [accessed 2008 Jan 15] [WebCite Cache]
  32. The DISCERN Instrument. 1997. DISCERN.   URL: http://www.discern.org.uk/discern_instrument.php [accessed 2008 Jan 15] [WebCite Cache]
  33. 10 Things to Know About Evaluating Medical Resources on the Web. 2006. National Centre for Complementary and Alternative Medicine (NCCAM).   URL: http://nccam.nih.gov/health/webresources/ [accessed 2008 Jan 15] [WebCite Cache]
  34. Anderson-Harper HM, Boley KO. Internet Enhances Patient Education. US Pharmacist.   URL: http:/​/www.​uspharmacist.com/​oldformat.asp?url=newlook/​files/​Feat/​internet.​cfm&pub_id=8&article_id=418 [accessed 2008 Jan 15] [WebCite Cache]
  35. LIDA: Minervation Validation Instrument for Health Care Web Sites. Minervation.   URL: http://www.minervation.com/mod_product/LIDA/validation.aspx?o=1101 [accessed 2008 Jan 15] [WebCite Cache]
  36. Nicoll LH. Quick and effective Website evaluation. Lippincotts Case Manag 2001;6(5):220-221. [Medline] [CrossRef]
  37. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor--Let the reader and viewer beware. JAMA 1997 Apr 16;277(15):1244-1245. [Medline] [CrossRef]
  38. Sandvik H. Health information and interaction on the internet: a survey of female urinary incontinence. BMJ 1999 Jul 3;319(7201):29-32 [FREE Full text] [Medline]
  39. Quality Criteria for Health Related Websites. eEurope.   URL: http://ec.europa.eu/information_society/eeurope/ehealth/quality/draft_guidelines/index_en.htm [accessed 2008 Jan 15] [WebCite Cache]
  40. Net Scoring Criteria to Assess the Quality of Health Internet Information. 2000. chu-rouen.fr.   URL: http://www.chu-rouen.fr/netscoring/netscoringeng.html [accessed 2008 Jan 15] [WebCite Cache]
  41. Criteria for Evaluating the Quality of Health Information on the Internet. Library of the Health Sciences-Urbana, University of Illinois at Chicago.   URL: http://www.uic.edu/depts/lib/lhsu/resources/guides/web-evaluation.shtml [accessed 2008 Jan 15] [WebCite Cache]
  42. Tomita M. Administration Design Quality Web Site Evaluation Method. 1999. steinhardt.nyu.edu.   URL: http://steinhardt.nyu.edu/hepr/resources/online/adq.pdf [accessed 2008 Jan 15] [WebCite Cache]
  43. Evaluating Websites. The University of Auckland.   URL: http://www.library.auckland.ac.nz/instruct/evaluate.htm [accessed 2008 Jan 15] [WebCite Cache]
  44. Navigating the Health Care System: How to Evaluate Health Information on the Internet. 2005. Our Bodies Ourselves Health Resource Center.   URL: http://www.ourbodiesourselves.org/book/excerpt.asp?id=39 [accessed 2008 Jan 15] [WebCite Cache]
  45. Rating Criteria and Excellence Awards. Growthhouse.org.   URL: http://growthhouse.org/award.html [accessed 2008 Jan 15] [WebCite Cache]
  46. Clean Bill of Health Award. University of Iowa.   URL: http://www.lib.uiowa.edu/hardin/md/cbh.html [accessed 2008 Jan 15] [WebCite Cache]
  47. Health Improvement Institute. HII Health Website Rating Instrument (HWRI).   URL: http://www.hii.org/documents/7004.pdf [accessed 2008 Jan 15] [WebCite Cache]
  48. Best of the Web in Mental Health: Rating Guidelines. 2002. PsychCentral.   URL: http://psychcentral.com/rateguid.htm [accessed 2008 Jan 15] [WebCite Cache]
  49. Wyatt JC. Commentary: measuring quality and impact of the World Wide Web. BMJ 1997 Jun 28;314(7098):1879-1881 [FREE Full text] [Medline]
  50. Evaluating Internet Health Information: A Tutorial from the National Library of Medicine. Medline Plus.   URL: http://www.nlm.nih.gov/medlineplus/webeval [accessed 2008 Jan 15] [WebCite Cache]
  51. MedlinePlus Guide to Healthy Web Surfing. Medline Plus.   URL: http://www.nlm.nih.gov/medlineplus/healthywebsurfing.html [accessed 2008 Jan 16] [WebCite Cache]
  52. Taking Charge of Health Information. 2004. Health Insight.   URL: http://www.health-insight.harvard.edu/guide.html [accessed 2008 Jan 16] [WebCite Cache]
  53. How to Evaluate Health Information on the Internet: Questions and Answers. National Cancer Institute.   URL: http://www.cancer.gov/cancertopics/factsheet/Information/internet [accessed 2008 Jan 16] [WebCite Cache]
  54. How to Find the Most Trustworthy Health Information on the Internet. Canadian Health Network.   URL: http:/​/www.​canadian-health-network.ca/​servlet/​ContentServer?cid=1042668266229&pagename=CHN-RCS/​Page/​ShellStaticContentPageTemplate&c=Page&lang=En [accessed 2008 Jan 16] [WebCite Cache]
  55. Internet Detective.   URL: http://www.vts.intute.ac.uk/detective/ [accessed 2008 Jan 16] [WebCite Cache]
  56. Internet for Health and Well-Being. Intute.   URL: http://www.vts.intute.ac.uk/acl/tutorial/wellbeing [accessed 2008 Jan 16] [WebCite Cache]
  57. Suggestions for Using the Internet to Find New Cancer Treatments. McGill Centre for Translational Research in Cancer.   URL: http://www.mctrc.org/en/rp/suggestions.htm [accessed 2008 Jan 16] [WebCite Cache]
  58. How to Judge the Quality of a Web Site. 2006. Judge: Web Sites for Health.   URL: http://www.judgehealth.org.uk/how_to_judge.htm [accessed 2008 Jan 16] [WebCite Cache]
  59. Intute: Health and Life Sciences Evaluation Guidelines. Intute.   URL: http://www.intute.ac.uk/healthandlifesciences/BIOME_Evaluation_Guidelines.doc [accessed 2008 Jan 16] [WebCite Cache]
  60. Best Practice Web Assessments. Evaluation Criteria. 2005. e-Gov Watch.   URL: http://www.e-govwatch.org.nz/criteria/ [accessed 2008 Jan 16] [WebCite Cache]
  61. Fahey DK, Weinberg J. LASIK complications and the Internet: is the public being misled? J Med Internet Res 2003 Mar 31;5(1):e2. [Medline] [CrossRef]
  62. Quality Standards for Medical Publishing on the Web. Bhia.org.   URL: http://www.bhia.org/ [accessed 2008 Jan 16] [WebCite Cache]
  63. Beckner WM, Berman BM. Complementary Therapies on the Internet. London, UK: Churchill Livingstone; 2003:45-53.
  64. Complementary Therapies: A Brief Guide. Breast Cancer Care UK.   URL: http://www.breastcancercare.org.uk/content.php?page_id=1013 [accessed 2008 Jan 17] [WebCite Cache]
  65. Breast Cancer Haven. Links. Breast Cancer Haven UK.   URL: http://www.breastcancerhaven.org.uk/policy.php?parent=87 [accessed 2008 Jan 17] [WebCite Cache]
  66. Research Into Living With Breast Cancer. Cancer Research UK.   URL: http://www.cancerhelp.org.uk/help/default.asp?page=10251 [accessed 2008 Jan 17] [WebCite Cache]
  67. Breast Cancer Treatment. Imaginis.   URL: http://imaginis.com/breasthealth/alternative.asp [accessed 2008 Jan 17] [WebCite Cache]
  68. Complementary/Integrative Medicine. MD Anderson Cancer Center.   URL: http://www.mdanderson.org/departments/cimer/ [accessed 2008 Jan 17] [WebCite Cache]
  69. Complementary Therapies. Cancer Treatment Centers of America.   URL: http://www.cancercenter.com/complementary-alternative-medicine.cfm [accessed 2008 Jan 17] [WebCite Cache]
  70. Complementary Therapies for Dealing With Secondary Breast Cancer. Cancerbackup.   URL: http://www.cancerbackup.org.uk/Cancertype/Breastsecondary/Livingwithcancer/Complementarytherapies [accessed 2008 Jan 17] [WebCite Cache]
  71. Heart Spring.   URL: http://heartspring.net/ [accessed 2008 Jan 17] [WebCite Cache]
  72. Issels Treatment.   URL: http://www.issels.com/?c1=ppc&source=google&kw=breastcancertreatment [accessed 2008 Jan 17] [WebCite Cache]
  73. Alternative Cancer.   URL: http://www.alternative-cancer.net/index.html [accessed 2008 Jan 17] [WebCite Cache]
  74. Cancer Treatment: Complementary and Alternative Medicine. MedicineNet.   URL: http://www.medicinenet.com/script/main/art.asp?articlekey=63338 [accessed 2008 Jan 17] [WebCite Cache]
  75. Focus on Health. Elbee Global.   URL: http://www.elbeeglobal.com/ [accessed 2008 Jan 17] [WebCite Cache]
  76. Bernstam EV, Sagaram S, Walji M, Johnson CW, Meric-Bernstam F. Usability of quality measures for online health information: Can commonly used technical quality criteria be reliably assessed? Int J Med Inform 2005 Aug;74(7-8):675-683. [Medline] [CrossRef]
  77. Provost M, Koompalum D, Dong D, Martin BC. The initial development of the WebMedQual scale: domain assessment of the construct of quality of health web sites. Int J Med Inform 2006 Jan;75(1):42-57. [Medline] [CrossRef]
  78. Impicciatore P, Pandolfini C, Casella N, Bonati M. Reliability of health information for the public on the World Wide Web: systematic survey of advice on managing fever in children at home. BMJ 1997 Jun 28;314(7098):1875-1879 [FREE Full text] [Medline]
  79. Greene DL, Appel AJ, Reinert SE, Palumbo MA. Lumbar disc herniation: evaluation of information on the internet. Spine 2005 Apr 1;30(7):826-829. [Medline] [CrossRef]
  80. Selman TJ, Prakash T, Khan KS. Quality of health information for cervical cancer treatment on the internet. BMC Womens Health 2006;6:9 [FREE Full text] [Medline] [CrossRef]
  81. Pandolfini C, Impicciatore P, Bonati M. Parents on the web: risks for quality management of cough in children. Pediatrics 2000 Jan;105(1):e1 [FREE Full text] [Medline] [CrossRef]
  82. Walji M, Sagaram S, Sagaram D, Meric-Bernstam F, Johnson C, Mirza NQ, et al. Efficacy of quality criteria to identify potentially harmful information: a cross-sectional survey of complementary and alternative medicine web sites. J Med Internet Res 2004 Jun 29;6(2):e21 [FREE Full text] [Medline] [CrossRef]
  83. Cancer Specialist Library. National Library for Health.   URL: http://www.library.nhs.uk/cancer/ [accessed 2008 Jan 16] [WebCite Cache]
  84. National Center for Complementary and Alternative Medicine (NCCAM).   URL: http://nccam.nih.gov/ [accessed 2008 Jan 16] [WebCite Cache]
  85. School of Integrated Health. Complementary and Alternative Medicine Evidence OnLine (CAMEOL). The Research Council for Complementary Medicine.   URL: http://www.rccm.org.uk/cameol/ [accessed 2008 Jan 16] [WebCite Cache]
  86. Ademiluyi G, Rees CE, Sheard CE. Evaluating the reliability and validity of three tools to assess the quality of health information on the Internet. Patient Educ Couns 2003 Jun;50(2):151-155. [Medline]
  87. Eysenbach G, Thomson M. The FA4CT algorithm: a new model and tool for consumers to assess and filter health information on the Internet. Medinfo 2007;12(Pt 1):142-146 [FREE Full text] [Medline]


HIICRW: Health Improvement Institute and Consumer Reports WebWatch
HONcode: HON Code of Conduct


Edited by G Eysenbach; submitted 01.08.07; peer-reviewed by E Bernstam; comments to author 13.11.07; revised version received 03.12.07; accepted 04.01.08; published 22.01.08

Copyright

© Matthew Breckons, Ray Jones, Jenny Morris, Janet Richardson. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 22.01.2008. Except where otherwise noted, articles published in the Journal of Medical Internet Research are distributed under the terms of the Creative Commons Attribution License (http://www.creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited, including full bibliographic details and the URL (see "please cite as" above), and this statement is included.