Website evaluation is now a vital lifelong learning skill. While libraries have a vast array of resources, and while we do encourage students to use them, we also spend time teaching website evaluation to first-year college students. Learning evaluation is particularly important because most students will no longer have access to library resources after leaving their institution. Additionally, even while students are in high school or college, there are situations where the best information is available not through the library but on the open internet.
Because students use websites for both academic and personal purposes, we want them to think critically about the information they are seeking. Finally, at the collegiate level, the new ACRL Framework addresses website evaluation. The frame Authority Is Constructed and Contextual contains the following knowledge practices and dispositions:
• Use research tools and indicators of authority to determine the credibility of sources, understanding the elements that might temper this credibility
• Recognize that authoritative content may be packaged formally or informally and may include sources of all media types
• Develop awareness of the importance of assessing content with a skeptical stance and with a self-awareness of their own biases and worldview
–“The Framework for Information Literacy for Higher Education,” Association of College and Research Libraries; ala.org/acrl/standards/ilframework, February 2015
The above statements provide insight into website evaluation: the fact that something is on the open web does not make it inherently good or bad; that judgment must come from the evaluation itself; and biases exist even in what appear to be credible sources.
The CRAAP Test
At William Paterson University, the backbone of our website evaluation instruction is the widely used CRAAP test adapted from California State University–Chico (“Evaluating Information—Applying the CRAAP Test,” California State University–Chico, www.csuchico.edu/lins/handouts/eval_websites.pdf; September 2010). The CRAAP test is a website evaluation tool that asks users to examine a website for currency, relevance, authority, accuracy, and purpose (i.e., CRAAP). We found that this was an easy acronym for the students to remember (!) and that it provided a clear overall picture of website evaluation.
How We Brought It to the Students
At our institution, library instruction varies between disciplines and classes, although in a given year, approximately 300 different classes receive library instruction. Library instruction is heavily marketed in some departments but is not mandatory for any class. In addition, classes that receive library instruction traditionally have a research-based assignment for which students need to find sources. For some research assignments, the professors allow students to use websites, but that is not the case for all of them.
To teach students website evaluation, we packaged it as part of the overall library instruction/orientation lesson for first-year seminar (FYS) classes. Most incoming first-year students at our institution need to take FYS as a graduation requirement. The library works closely with the director of FYS to ensure that most classes receive library instruction. We are able to package it and promote it as a library orientation; for most students, this will be the first time they attend a library instruction session. Library instruction for FYS is 75 minutes.
In FYS, the students traditionally do not come with a pre-existing research assignment, so the co-coordinators of user education designed one using themes found in the Common Reader, the book assigned to all incoming first-year students. The assignment was created using Google Forms and was easy to access for both librarians and students. Up through spring 2014, the format of the class was as follows:
Part I: Finding a book using the catalog
Part II: Finding an article using one of the databases
Part III: Website evaluation
In most cases, the website evaluation piece felt rushed; sometimes the librarian was not able to cover it at all. To address the time problem, a new “flipped” classroom method was implemented in fall 2014. Students watched a lecture-based video on using the newly implemented discovery layer, which covered Parts I and II listed above, thereby reducing the time librarians needed to spend on those two sections. Eleven of the classes were flipped.
The flipped method generally gave librarians more time to cover Part III, website evaluation. Even so, some issues remained. In the classes that were not flipped, some librarians still struggled to find time for evaluation. In both cases, the website evaluation piece was a full lecture on the CRAAP test. The in-class assignment asked students to look at a website assigned to them by the librarian and “evaluate it for currency, relevance, authority, accuracy, and purpose.” Students were asked whether they would recommend the website based on the CRAAP test and to explain why or why not.
Problems arose with the assignment. Several students did not understand that they were evaluating the website itself and instead focused on its content. Many students completed an evaluation but left out parts of the CRAAP test. And in many cases, students did not adequately explain what they were evaluating.
Changes in 2015
The instruction of website evaluation went through a complete overhaul for fall 2015. First, to ensure more time would be given to website evaluation, all the classes were “flipped.” With all of the classes flipped, active learning exercises were in place for students to find articles and books. This also freed up time to teach website evaluation; in most cases, librarians finished the overall lesson 5–10 minutes before the end of the 75-minute class time.
As the students were already in an active learning mode of sorts, website evaluation was less lecture-based. Instead of the librarian demonstrating a resource and manually putting it through the CRAAP test, the students were shown a presentation of an article about making pizza in a toaster oven. The presentation, done in PowerPoint, looked at each aspect of the CRAAP test within the pizza article.
Finally, instead of a straight lecture, the content was “chunked.” Librarians would demonstrate currency and relevance, after which the students would complete the corresponding part of their assignment. Then the librarians would demonstrate authority and accuracy, with the students completing that portion of the assignment. Finally, they discussed purpose and an overall evaluation of the website.
The in-class assignment was different as well. Instead of asking two questions about website evaluation, it asked six, with reminders to the students about what they were examining. The currency item, for instance, reads: “Currency: How current is the information? What dates do you see?”
With the changes in place, librarians and students were able to experience a more thorough immersion in website evaluation.
Assessment took two forms, both electronic: the in-class assessment was designed using Google Forms, and the post-test was designed using SurveyMonkey. The in-class assessment was “sent” to the students’ computers within the library’s computer lab using teaching software and was completed before students left the class. The post-test was emailed to students in the FYS classes who had attended a library instruction session, approximately two months after the class.
With the changes to the in-class assessment, the sample data showed student results were significantly stronger. A student web evaluation comment from 2014 was typically very general about evaluating the website, but when we broke it up in 2015, the responses were more in-depth, detailed, and accurate. Instead of the students only mentioning or evaluating one or two facets of the CRAAP test, we listed each facet, ensuring that the students would evaluate the website looking at each aspect of the test. As with all student responses, some were more detailed than others, but on balance, most students were able to correctly identify the different parts of the CRAAP test.
A similar test had been given in 2014, but the response rate was low, with only 24 students responding. In 2015, with a skillful marketing incentive, the response rate more than quadrupled, to 100 students. The students were able to name parts of website evaluation that they understood from the library instruction session, even months after they had attended it.
Student responses were then “graded” on a scale of 0 (incorrect), 1 (partially correct), and 2 (correct). The average score rose from 1.5 in 2014 to 1.69 in 2015. As with the in-class assessments, student responses were overall more thought-out and thorough, mentioning the different aspects of the CRAAP test and evaluating the website against them.
Implications and Future Use
Breaking up the lecture and the assessment piece benefited the students in their overall understanding of website evaluation, as did choosing an example the students could relate to. While the assessment results were positive, we are still thinking of ways to make the content more engaging. We are considering instituting a collaborative learning approach to website evaluation in which the students break into groups, evaluate a website on their own, and report back to the class. However, given the limited amount of time we have in FYS classes, this might not be possible. One time-saving option for the web evaluation piece is to “flip” the chunked lecture into a brief video and then have students come to class for the in-class exercise.
Website evaluation is an essential lifelong learning skill for students. By using a standard tool already developed and in use at several libraries, librarians were able to present it in a unique way. In delivering instruction, employing a “chunked” lecture with a corresponding assignment, as well as applying the CRAAP test to a relatable article, made the content more meaningful to the students, and they were able to retain the information. Going forward, we may look to “flip” the lecture portion and adjust the student evaluation piece into a group exercise in which students collaboratively evaluate a website.
Cara Berg is reference librarian and co-coordinator of user education at William Paterson University. Her email is firstname.lastname@example.org.