Introduction

Background and Purpose of the Study

Internet users may tend to underestimate the effort and competence required for obtaining trustworthy health information [1-6,8]. A decade ago, communication researchers who compared print and television media described this paradox [9].

Higher education institutions in the United States provide access to an unprecedented quantity of digital information via library archives, licensed online databases, and the public-access Internet. To distinguish publicly accessible Web documents from password-protected scholarly databases, which paid subscribers also reach via the Web, we refer to the former as the "public-access Internet." Our study explores three basic questions: How proficient are university students at finding and evaluating health-related information? How well do they understand the difference between peer-reviewed scholarly resources and opinion pieces or sales pitches? How aware are they of their own level of health information competencies? The main goal of this project was to identify approaches to building Information Age competencies of young health consumers, specifically a cohort of 18- to 23-year-old students enrolled in higher education programs.

Literature Review: Health Information and the Internet

An Interdisciplinary Research Partnership

Methods

Participants

A sample of 400 college-age students was selected because this cohort is the first Information Age generation, one exposed to the Internet for up to one half of their lives. Students enrolled in three courses in the College of Health Sciences at a Midwestern university were invited to participate in the study. The first class was a high-enrollment introductory course on the determinants of health. Although only undergraduate students (n = 354) participated in this course, they represented all levels of undergraduates: freshmen (59%), sophomores (22%), juniors (9%), and seniors (10%). The second class was an advanced course in health administration that enrolled both undergraduate (n = 19) and graduate (n = 3) students. The third class was a mid-level health education course (n = 25) for undergraduate students. All students enrolled in the advanced health administration course and the mid-level health education course were majoring in health professions. About one third of the introductory course students with declared majors were majoring in a health-related discipline, and 31% of students had not yet chosen a major field of study. Introductory course students completed the assessment for extra credit; the others did so to learn more about their own skills. The instructors emphasized that the purpose of the assessment was to help students become competent consumers of health-related information.

Measures

Health Information Competencies

The instrument assesses both foundational competencies and research competencies [2,31,32]. For example, understanding of the term "article abstract" is measured with an item whose response options are: an annotated list of references used in the article; a summary of the article's content; a summary of other research on this topic; a note or paragraph about the authors of the article; and a glossary of abstract concepts included in the researcher's model [32]. Searching skills are assessed with tasks that require combining keywords, such as stress and medical, with the Boolean operators and, or, and not (the sketch below illustrates this logic). In addition, students evaluate the quality of research publications, make judgments about website trustworthiness, and detect plagiarism.
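To make the Boolean logic behind these search items concrete, the following minimal sketch (ours, not part of the RRSA; the document collection and keyword index are invented for illustration) shows how and, or, and not grow or shrink a result set.

```python
# Illustrative sketch only: a toy keyword index, not RRSA code.
# Each keyword maps to the set of document IDs that contain it.
index = {
    "stress":  {1, 2, 4, 7},
    "medical": {2, 3, 4, 8},
}

# "stress AND medical": both keywords required; the result set shrinks.
both = index["stress"] & index["medical"]         # {2, 4}

# "stress OR medical": either keyword suffices; the result set grows.
either = index["stress"] | index["medical"]       # {1, 2, 3, 4, 7, 8}

# "stress NOT medical": exclude documents that mention "medical".
only_stress = index["stress"] - index["medical"]  # {1, 7}

print(both, either, only_stress)
```

The same logic explains why the advanced book search discussed in the Results section is difficult: each added criterion (author, topic, publication date) intersects away irrelevant records, whereas a single-criterion search leaves an unmanageably large result set.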
The following item, for example, measures evaluation of the trustworthiness of websites: You are looking for information on various nutritional supplements. You found three websites. Click on the links below to examine each site and to evaluate its content. Which of these websites is the most trustworthy? a) cognitogenic aids [a hyperlink]; b) dormitogenic aids [a hyperlink]; c) vescorogenic (gustatogenic) aids [a hyperlink].

Instrument Piloting and Validation

To pilot test an earlier version of the RRSA instrument and to gather initial evidence about its validity and reliability, we administered a 60-item assessment to undergraduates (n = 100), doctoral students (n = 45), professional librarians (n = 5), and health professionals (n = 3). The feedback from librarians and health professionals offered preliminary evidence in support of the instrument's face validity and content validity. Specifically, the librarians confirmed that the items included in the RRSA assessment conformed to the Information Literacy Competency Standards and addressed knowledge and skills important to health information consumers. The wording of several items, both stems and response options, was revised based on librarians' recommendations. In addition, the librarians completed the assessment themselves, and their scores were compared to the scores of students at two academic levels, undergraduate and doctoral. The results indicated that individuals with greater training and experience in managing digital health information performed better than individuals with less experience: undergraduate students' overall scores were the lowest (about 66% correct responses), followed by doctoral students' scores (73%) and librarians' scores (95%). These results offer preliminary evidence of the assessment's criterion-related validity. The pilot test indicated an acceptable internal consistency value (Cronbach alpha > .70) that could approach .80 if four items were removed; therefore, the four RRSA items that reduced the overall internal consistency were deleted. The revised assessment contains 56 items, including 16 multiple-choice questions and 40 true/false questions grouped under 7 stems (Multimedia Appendix 1). For example, knowledge of information sources is measured by a stem that states, "Which of these citations are to journal articles?" The participants then check all that apply from a list of true/false items (3 references to journal articles, 1 book reference, and 1 book chapter reference). Items are scored as +1 if the answer is a correct positive or a correct negative and +0 if the answer is a false positive or a false negative (a scoring sketch appears at the end of this subsection). Further description of the development of the stimulus materials used in website evaluation appears in the Results section, under Proficiency in Evaluating Health Information.

The RRSA assessment was designed to be usable by more than one institution, and its content can be adapted to the needs of various educational programs. Specifically, instructions to participants, the text of individual questions, detailed feedback, links to additional resources, and disclaimers (e.g., about participants' rights and how the information they provide will be used) can be revised, without help from programmers, using a password-protected online control panel. This has been done by three US universities and one Canadian university that adopted the RRSA for use in their academic programs.
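As a concrete illustration of the scoring rule described above (+1 for each correct positive or correct negative, +0 otherwise), here is a minimal sketch. The function and the example item key are ours, invented for illustration; this is not the RRSA's actual code.

```python
# Hypothetical sketch of the check-all-that-apply scoring rule:
# +1 for a correct positive or a correct negative,
# +0 for a false positive or a false negative.

def score_stem(key, response):
    """key and response map each option ID to True (checked) or False."""
    return sum(1 for option, truth in key.items()
               if response.get(option, False) == truth)

# Invented example loosely modeled on the journal-citation stem.
key = {"cite1": True, "cite2": True, "cite3": True,   # journal articles
       "cite4": False, "cite5": False}                # book / book chapter

response = {"cite1": True,
            "cite2": False,   # missed a journal article (false negative)
            "cite3": True,
            "cite4": True,    # checked a book reference (false positive)
            "cite5": False}

print(score_stem(key, response))  # 3 out of a maximum of 5
```

Note that under this rule a respondent who checks nothing still earns a point for every correct negative, which is why the maximum score per stem equals the total number of options rather than the number of true options.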
All four adopting institutions, for example, revised search questions to enable their students to search for documents in their own university's library catalog. The original RRSA designers provide coaching and training to ensure that changes made to the RRSA do not harm its reliability and validity. Ongoing validation studies provide a quality control mechanism and allow the testing of new or revised questions suggested by the partner institutions. The administration of the RRSA at partner institutions is supported through grants, partner donations, and volunteer efforts by the RRSA design team members.

Other Measures

We asked the study participants to share information about their age, gender, and education. Self-reported level of research skills was measured with a single item, "How do you rate your research skills?", with six response options ranging from 1 (nonexistent) to 6 (excellent).

Procedures

The RRSA instrument was administered online, and each student was issued a unique passcode to access the RRSA questions. The students had the option of submitting an incomplete survey and returning later to finish the remaining questions. This feature promoted better information processing and relieved students of the need to rush through the entire assessment on their first attempt. The average estimated RRSA completion time was 26 minutes. Upon answering all questions, the students received an individualized results page that summarized their performance in different areas by providing a score, a maximum possible score, and percent attained. In addition to the numerical RRSA results, the Web page displayed individually tailored feedback composed by an experienced librarian. The Web page was programmed to compare, within each performance category, each individual student's performance to the performance of a norm group. In accordance with the student's competency level, the feedback provided suggestions for skill improvement and an explanation of factors that may have contributed to low, average, or high performance in each area. Finally, students who completed the RRSA were given the option to request additional materials for remedial learning, such as an explanation of the difference between scholarly and nonscholarly resources. The links to these additional materials were delivered to students via email.

Data Analyses

Descriptive statistics were used to examine respondents' performance in four areas: searching for health-related information, understanding plagiarism, evaluating health information, and self-reported skill level. To examine the relationship between self-reported skill level and actual performance, we computed composite scores. A composite overall score, indicative of the health information competency level, was created by summing points across the 56 true/false and multiple-choice items. Composite score calculations were preceded by an internal consistency reliability analysis that determined the appropriateness of combining responses from multiple items. We used a Spearman correlation to assess the relationship between the actual skill level (overall score) and self-reported skill level. A multiple regression analysis was used to examine the relationship between actual performance and perceived skill while holding the amount of education (number of credit hours earned) constant. A sketch of these analyses appears below.
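The following sketch illustrates, with simulated stand-in data, the three analytic steps just described: an internal consistency check (Cronbach alpha), a Spearman correlation between the overall score and self-reported skill, and a regression of perceived skill on actual performance with credit hours held constant. All data and variable names are invented, and the regression is one plausible specification; this is not the study's actual code.

```python
import numpy as np
from scipy.stats import spearmanr
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_students, n_items = 308, 56

# Simulated right/wrong item responses (1 = correct), stand-ins for real data.
items = rng.integers(0, 2, size=(n_students, n_items))

# Cronbach alpha: k/(k-1) * (1 - sum of item variances / variance of total).
k = n_items
item_var = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)

# Composite overall score: sum of points across the 56 items.
overall = items.sum(axis=1)

# Self-reported skill on the 1 (nonexistent) to 6 (excellent) scale.
self_report = rng.integers(1, 7, size=n_students)

# Spearman correlation between actual and self-reported skill.
rho, p = spearmanr(overall, self_report)

# Regression of perceived skill on actual score, holding credit hours constant.
credit_hours = rng.integers(0, 120, size=n_students)
X = sm.add_constant(np.column_stack([overall, credit_hours]))
model = sm.OLS(self_report, X).fit()

print(f"alpha={alpha:.2f}, Spearman rho={rho:.2f} (P={p:.3f})")
print(model.params)  # intercept, overall score, credit hours
```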
Results

Our research questions were the following: How proficient are university students at searching for and evaluating health-related information? How well do they understand the difference between peer-reviewed scholarly resources and opinion pieces or sales pitches? How aware are they of their own level of health information competencies? The results for each question are presented below, preceded by a sample description.

Respondent Characteristics

Of the 400 students invited to participate, 308 completed the assessment (a 77% response rate).

Proficiency in Searching for Health Information

Table 1 summarizes performance in searching for health information. The data indicate that most students recognize common health journal titles and can perform a basic search in a library catalog, for example, by entering an exact book title into the title search. Few students, however, can perform an advanced search for a book when they know only the book's author (who has a very common last name), general topic, and publication date. We call this search advanced because imprecise book specifications make it hard to find the book without performing a search that takes into account all, or nearly all, of the available information. Similarly, few students could construct searches that combine keywords with the Boolean operators and, or, and not (Table 1).

Proficiency in Evaluating Health Information

One of the most important markers of a competent health information consumer, critical judgment of information, is assessed in two ways: (1) the first set of questions calls for a review of three full-text articles from journals, and (2) the second set of questions calls for a comparison of three health-related websites (Table 1). The three Web pages about nutritional supplements are realistic-looking interactive screens that appear to be live websites. The content of these mock websites, developed specifically for the RRSA, includes graphics, hyperlinks, and text about nonexistent classes of nutritional supplements: cognitogenics, dormitogenics, and gustatogenics. Each website is dedicated to one class of supplement and explains its purpose (e.g., cognitogenics help people with learning disabilities), prevalence (e.g., "gustatogenic aids have been available in Germany and Canada for over five years"), and safety. Even though the descriptions of the nutritional supplements were fictitious, all three websites accurately stated that the US Food and Drug Administration had not evaluated the safety or benefits of these nutritional supplements.
Table 1. Searching and evaluating health information: performance on select measures (n = 308). Entries show respondents with correct answers, n (%). RRSA question numbers appear in parentheses; see Multimedia Appendix 1 for exact question wording.

Searching for Health Information
- Recognition of a common health journal title, the Journal of American Medical Association: 293 (95)
- Demonstration of a skill in locating a book in a university library catalogue based on its exact title (16): 286 (93)
- Understanding that a one-keyword generic search may return too many documents, that is, an overwhelmingly large number of resources on a variety of topics (4): 275 (89)
- Use of a proper research strategy: thinking about a broad topic to identify a subarea of interest (2): 268 (87)
- Ability to detect a journal citation that is incomplete because it lacks a year of publication (17): 241 (78)
- Understanding of the term "article abstract" as a summary of the article's content (8): 234 (76)
- Knowledge that a journal is a source of scholarly (analytical) information on a narrowly specialized topic (6): 214 (70)
- Understanding of the term "bibliography" as a list of references or citations (9): 213 (69)
- Identification of a primary source of health information: medical record (14): 195 (63)
- Identification of references to journal articles from a list of references that includes both book references and article references (11): 187 (61)
- Knowledge of a peer-reviewed journal article as an authoritative source of specialized health information (12): 185 (60)
- Identification of a primary source of health information: hospital annual report (14): 173 (56)
- Demonstration of a skill in locating a book in a university library catalogue based on a non-unique author's name and a general topic (15): 111 (36)
- Use of the Boolean operators and, not, or in a search task: 105 (34)
- Use of the Boolean operators and, not, or in a second search task: 98 (32)

Evaluation of Information: Full-Text Journal Articles
- Identification of an article published prior to the year 2000 (22): 248 (80)
- Identification of an article based on opinion rather than well-supported evidence (19): 242 (79)
- Identification of an article based on a review of existing research (20): 166 (54)
- Identification of an article written by an author whose affiliation is unknown (21): 148 (48)

Evaluation of Information: Websites on Nutritional Supplements
- Evidence-based decision making: disagree that "all three websites make a good case for taking nutritional supplements" (25): 187 (61)
- Identification of the most trustworthy website (23): 154 (50)
- Ability to identify the purpose of a website, namely to sell services (24): 142 (46)
- Evidence-based decision making: agree that "none of the websites makes a good case for taking nutritional supplements" (25): 67 (22)

The three mock websites differ in standard markers of credibility, such as indicators of authorship, qualifications, and commercial intent [11]. These standard features, rather than the text content, are intended to differentiate the websites in terms of their credibility. Because all respondents are equally uninformed about the nutritional supplements described in the text, they must attend to features other than content when making quality-related judgments. This purposeful design was motivated by the desire to avoid the confounding influence of pre-existing knowledge about the subject matter described in the document being judged. A good measure of one's ability to critically evaluate Web pages is the ability to disentangle the judgment of a website's features from the judgment of its content.
Study participants may have had preconceived notions about the quality of nutritional supplements depending on their purpose (e.g., dormitogenics are for sleeping disorders and gustatogenics are for appetite suppression). To avoid a possible interaction between the untrustworthy features of a website and a believable description of a nutritional supplement, we asked a group of students (n = 52) to judge the trustworthiness of the supplements' descriptions presented as Microsoft Word documents rather than as websites. Although the level of trustworthiness was about the same for all nutritional supplement descriptions, the least trusted description was placed on the website with the highest number of untrustworthy features [11]. Performance on the website evaluation items is summarized in Table 1.

Understanding the Difference Between Scholarly Resources and Sales Pitches

Fewer than half of the respondents determined the purpose of the least trustworthy website, which was to sell products and services. Visitors to this .com website are charged for reprints of the content, offered discounted products, and presented with multiple prompts (e.g., a running line) to book a consulting appointment with a private nutritionist who has few relevant qualifications. Customer testimonials posted on this site describe fantastic outcomes achieved within an unrealistically short time frame. Fewer than one quarter of the study participants reached the correct conclusion that none of the websites made a good case for taking the nutritional supplements, whereas 39% of respondents thought that all three websites made a good case for taking the supplements.

Understanding Plagiarism

Tables 2 and 3 summarize respondents' understanding of plagiarism.

Table 2. Understanding plagiarism: when references are needed (n = 308). Stem: "Which of the following can be reproduced without proper reference? Check all that apply." Entries show respondents with correct positive or negative answers, n (%).*
- …: 294 (96)
- Hospital board member's point of view: 264 (86)
- My classmate's ideas: 232 (75)
- Unpublished works: 223 (73)
- Spoken word: 209 (68)
- My dad's political opinions: 156 (51)
* Items are scored as +1 if the answer is a correct positive or a correct negative and +0 if the answer is a false positive or a false negative.

Table 3. Defining plagiarism (n = 308). Stem: "Which of the following are plagiarism examples? Check all that apply." Entries show respondents with correct positive or negative answers, n (%).*
- …: 290 (95)
- …: 276 (90)
- Enclosing the word-for-word sentence in quotation marks, accompanied by a citation: 271 (88)
- …: 215 (70)
- …: 201 (65)
- …: 169 (55)
* Items are scored as +1 if the answer is a correct positive or a correct negative and +0 if the answer is a false positive or a false negative.

Awareness of Personal Health Information Competencies

When asked "How do you rate your research skills overall?" most respondents (84%) believed that their skills were good, very good, or excellent. To compare self-reported and actual skill levels, we computed an overall health information competency score for each participant. An acceptable level of internal consistency reliability (Cronbach alpha = .78) across the 56 right/wrong items indicated that it was appropriate to calculate the overall score as the sum of points on these items. The overall scores ranged from 20 to 54 with a mean of 37 (SD = 6.35) and did not significantly depart from a normal distribution. Table 4 shows mean overall scores by self-reported skill level.

Table 4. Means for the health information competency overall score by self-reported skill level ("How do you rate your research skills?")
Rating: n, mean overall score (SD)
- Nonexistent: 0, -
- Poor: 3, 36.33 (4.04)
- Fair: 47, 34.89 (5.52)
- Good: 162, 36.89 (6.29)
- Very good: 83, 37.64 (6.89)
- Excellent: 13, 36.77 (6.10)
- Total: 308, 36.78 (6.35)

Discussion

Interpretation of Findings

The present study represents a systematic effort to measure health information competencies using a standardized and reliable measurement tool, the Research Readiness Self-Assessment (RRSA). The data were obtained from a diverse sample of 308 respondents (77% response rate). Nonrespondents (n = 92) differed from respondents (n = 308) in terms of their academic level: freshmen were slightly more likely not to participate in the RRSA than higher-level students. The most likely explanation for nonparticipation is a lack of interest in extra credit rather than the computer-assisted administration of the RRSA. It is possible, of course, that students with particularly poor computer skills found the online administration a barrier. However, a semester after we collected the data reported in this paper, there was a 100% participation rate among 180 undergraduates in two introductory courses whose instructors required RRSA completion. The two course instructors reported no student complaints about being unable to follow the emailed instructions on how to complete the assessment.

The data indicate that many students lack important competencies, which may limit their ability to make informed health choices. We observed deficiencies in the areas of conducting advanced searches, discriminating among different types of information sources, referencing other people's ideas, and evaluating information from Web pages and journal articles. Our data suggest that undergraduate students are inaccurate judges of their own competencies and hold a very positive view of their ability to do research. This finding may reveal an important barrier to building the health information competencies of college-age students. We found a large competency gap between the average and the best information consumer: an average undergraduate in our sample solved only 68% of the problems solved by the best-performing study participant (an average score of 37 versus a maximum score of 54).

Health information competencies are applied to transform health-related information into knowledge that is consistent with the most current medical practice. High competence variability is a proxy indicator of students' varying ability to make evidence-based decisions. In the past, limited access to information may have prevented health information consumers from acquiring knowledge and making informed choices. The new generation of health information consumers has, for the most part, easy access to information, yet it may not be able to take full advantage of this convenient access. Our study shows that individuals with limited health information competencies may fail to locate the best available information because they employ poor search strategies. Searches that do not take into account all of the important criteria often produce low-relevancy documents or documents from commercial websites that promote products or services. These sites often present one-sided evidence, which can be detrimental to making a good decision about one's health. Overall, many students are rather unsophisticated information consumers who rely on basic searches and the easiest ways of retrieving information [11,14,17]. Indeed, there is no substitute for good judgment when it comes to navigating health information.
Because this good judgment is a product of both critical thinking and extensive knowledge of the subject matter being researched, we believe that higher education programs are uniquely positioned to develop health information competencies. However, initial work on developing Information Age competencies needs to begin at the K-12 level, when children are first exposed to various sources of information, including the Internet [34]. Students must also learn that proper referencing applies not only to other people's exact words but also to their ideas [9].

Implications for Health Promotion Practice

Health information competencies are related to, but distinct from, constructs discussed elsewhere in the literature, such as computer literacy, informatics awareness, and computer experience [35,36]. Among the limitations of the present study is the narrowly focused sample, which limits our ability to generalize the study's findings to the broader population of health information consumers. The students from a Midwestern university may not be completely representative of the entire population of US Information Age students due to, for example, the relatively homogeneous ethnic composition and the possible overrepresentation of individuals raised in rural communities. In our future studies, we intend to broaden the pool of RRSA participants by including multiple educational institutions as well as urban and rural communities located in different geographic regions. In contrast with many health information literacy studies, this research presents results obtained via a direct measure of skills and knowledge rather than via self-reports by health information consumers. While the reliability of the RRSA assessment reaches acceptable levels, it is necessary to further assess its unidimensionality, content validity, and criterion-related validity. A comprehensive validation study of the RRSA instrument is currently under way.

Conclusions

The assessment tool used in the present study is a self-administered instrument that provides a reliable account of competencies related to managing electronic health information [37]. Data acquired through this research can be used to suggest curriculum improvements and to estimate the upper range of skills held by health information consumers. The tool can also be used to educate health information consumers about the skill levels necessary for managing health information from electronic sources. RRSA findings suggest that the health information competencies of undergraduate students, many of whom will soon enter a variety of health professions, are limited. Health literacy educators can utilize RRSA findings to design educational interventions that improve information consumers' skills and prepare them for the challenges of living and working in the Information Age.