4 Web Evaluation Skills: A “Bleak” Track Record

“At present, we worry that democracy is threatened by the ease at which disinformation about civic issues is allowed to spread and flourish” (Wineburg et al., “Evaluating Information”).

How are we doing when it comes to recognizing disinformation and navigating the information disorder landscape?

Web Evaluation Skills: Students

Icon of student with backpack and book

In 2016, a few months before the U.S. Presidential election, an influential study on Web literacy was completed by the Stanford History Education Group. Their report—titled “Evaluating Information: The Cornerstone of Civic Online Reasoning”—was concerned with the spread of disinformation online and how this might threaten our democracy. The study asked nearly 8,000 students (in middle school, high school, and college) to perform five Web evaluation tasks. The results were quite shocking:

  • 80% of students couldn’t distinguish “sponsored content” from news articles on websites;
  • 67% of students failed to recognize potential bias in online information;
  • 65% of students took online images at face value;
  • Almost all students struggled to evaluate information on social media.

In 2019, as yet another U.S. Presidential election approached, the Stanford History Education Group conducted a similar study of civic online reasoning, this time with a sample of 3,446 high school students. The results?

  • 52% of students believed a grainy video shot in Russia constituted “strong evidence” of voter fraud (ballot stuffing) in the U.S.;
  • Two-thirds of students couldn’t tell the difference between news stories and ads (“sponsored content”);
  • 96% of students did not consider how conflicts of interest (for example, ties between a climate change website and the fossil fuel industry) might affect a website’s credibility.

These studies conclude by describing students’ ability to reason about online information as “bleak” and “troubling,” respectively (Wineburg et al., “Evaluating Information”; Breakstone et al.).

However, as we will see below, this trouble is not limited to students.

Web Evaluation Skills: “Experts”

Icon of three people on laptops

In 2017, the Stanford History Education Group conducted a study, “Lateral Reading: Reading Less and Learning More When Evaluating Digital Information.” Here, they assessed the Web evaluation skills of presumed experts: Stanford undergraduates, History professors, and professional fact-checkers. This fascinating study found that even Stanford students and professors with PhDs in History struggled to identify credible sources on the Internet.

For example, in one task, the participants were presented with two websites that provided information on bullying, and they were given up to ten minutes to determine which was the more reliable site. One of the websites (American Academy of Pediatrics) was from the largest professional organization of pediatricians in the world, while the other site (American College of Pediatricians) had been labeled a hate group by the Southern Poverty Law Center because of its virulently anti-gay stance. The results?

  • Only 50% of the historians identified the reliable website;
  • Only 20% of the undergrads identified the reliable website;
  • 100% of the fact-checkers quickly identified the reliable website (Note: For effective strategies, see the chapter on Fact-Checking).


Sources

This section includes material from the source book, Introduction to College Research, as well as the following:

Breakstone, Joel, et al. “Students’ Civic Online Reasoning: A National Portrait.” Stanford Digital Repository, 14 Nov. 2019. Licensed under CC BY-NC-ND 3.0.

Image: “College Student” by Gan Khoon Lay, adapted by Aloha Sargent, is licensed under CC BY 4.0.

Image: “Users” by Wilson Joseph, adapted by Aloha Sargent, is licensed under CC BY 4.0.

Wineburg, Sam, et al. “Evaluating Information: The Cornerstone of Civic Online Reasoning.” Stanford Digital Repository, 22 Nov. 2016. Licensed under CC BY-NC-ND 3.0.

Wineburg, Sam, and Sarah McGrew. “Lateral Reading: Reading Less and Learning More When Evaluating Digital Information.” Stanford History Education Group Working Paper No. 2017-A1, 6 Oct. 2017, dx.doi.org/10.2139/ssrn.3048994.

Original material by book author Calantha Tillotson.

License


The Insiders: Information Literacy for Okies Everywhere Copyright © 2021 by Adam Brennan; Jamie Holmes; Calantha Tillotson; and Sarah Burkhead Whittle is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
