From climate change to genetically modified organisms to vaccine hesitancy, there has been no shortage of scientific topics on which people fall victim to conspiracy theories or simply choose not to believe the data.

But for many of us, the Covid pandemic has been a real eye-opener. In short order, it forced the general public to realize that science deniers weren’t just a rare fringe element — they’re our neighbors, our friends, our relatives. While most of us masked up, others drank bleach or dosed themselves with horse dewormer. While most of us rolled up our sleeves to get vaccinated, others believed that mRNA-based vaccines were the real health threat. No amount of data, it seemed, could change people’s minds.

For the past two years, a team of researchers has been brainstorming how to overcome the dangerously low trust in science that makes it easier for someone to believe a vaccine will make him magnetic than to believe the results of a clinical trial with thousands of participants. Now, nearly a dozen scientists have issued the report that emerged from those discussions.


“Science Education in an Age of Misinformation” details the challenges we face today and recommends a series of steps that may help overcome these problems. The upshot is simple: a more educated public would be better equipped to gauge the trustworthiness of scientific claims and to make well-informed decisions, even in areas of science that weren’t covered in class. “School science cannot anticipate what kind of scientific knowledge will be required to deal with the next science-related, humanitarian crisis,” the authors write.

Today’s scientific education focuses heavily on rote memorization: the parts of a plant, the elements of the periodic table. In their new report, scientists argue that training young students to evaluate scientific concepts is just as — perhaps even more — important. “Answering the key question of ‘Can this scientific claim be trusted?’ requires an understanding of the social structures of science,” the authors write. “Developing this understanding must be a fundamental core component of all science education, from cradle to grave—a feature of formal and informal science education and science communication.”

For example, the scientists say, students should be taught to evaluate someone’s expertise in the field. If someone is making a claim about the effectiveness of a vaccine, does that person have the right credentials to be trusted on that claim? Other key skills include understanding the peer-review process in the scientific literature, identifying whether there’s a general consensus in the research community, and gauging the overall trustworthiness of a specific source of information.


“Most importantly, all this knowledge of how to engage critically with digital information needs to be explicitly taught and acquired as an ingrained habit from grade 2 upwards,” the scientists note. “Digital media and information literacy must be taught and practiced until it becomes as natural as riding a bicycle.”

These critical thinking skills would be helpful for evaluating any information, not just scientific claims. “The world as a result of the internet is different,” says Jonathan Osborne, a professor emeritus at Stanford and one of the authors of the new report. Bad actors have taken advantage of the internet and social media to spread disinformation, often adopting the language of science to add a veneer of credibility to their claims. “There are large numbers of people who are making arguments that don’t seem to attend to the evidence,” he adds. “It undermines trust in science and, at a broader level, it undermines trust in democracy.”

Unfortunately, even if the scientists’ recommendations for a better education process were implemented today, it would take decades before they had a meaningful impact. In the nearer term, Osborne hopes that scientists and science communicators do more to engage the public in conversations about how to weigh scientific claims. “Even if you don’t agree with what we’ve written in this report,” he says, “it should be read and it should be discussed.”