While healthcare workers have been on the front lines of the Covid-19 pandemic, factcheckers and journalists have been battling what the World Health Organization deems an infodemic.
An infodemic, says the WHO, is the rapid and far-reaching spread of an abundance of information – both accurate and inaccurate. During the Covid-19 pandemic, the consequences have included confusion, distrust of public health advice, and effects on the decisions people make about their health.
Healthcare workers and leaders in the Northwest Territories’ Tłı̨chǫ region say the spread of misinformation may be why some people are choosing not to get vaccinated. Vaccination rates in the region have been consistently lower than elsewhere in the territory, particularly among residents aged 18 to 30.
Recent research has suggested the obvious: a link between Covid-19 misinformation and a decreased willingness to get vaccinated.
“If some people end up believing vaccines are not important or the pandemic is not even real – it’s a hoax – this could lead to a public health issue,” said Ahmed Al-Rawi, an assistant professor at Simon Fraser University, where he runs the Disinformation Project.
Disinformation and misinformation aren’t quite the same. Misinformation is when someone shares or creates false information without intending to cause harm, usually without realizing it’s not true. Disinformation is when false information is deliberately created or shared with the intention of causing harm.
To capture both, Al-Rawi has coined the term “disruptive information” as a catch-all for the dissemination of false information.
“I introduced this term to suggest that, whether it’s intentional or not, this kind of information is supposed to disrupt the status quo,” he said.
Covid-19 misinformation has been shown to cause real-life harm.
Several people in Canada and the US have been using Ivermectin, an anti-parasitic drug, to prevent or treat Covid-19 – despite it being an unproven antiviral treatment – after it was promoted by conservative talk show hosts, politicians, and even some physicians.
Some people, unable to get the drug at pharmacies, have turned to feed stores that carry a version meant for horses and cattle, causing them to get sick.
According to Health Canada, the veterinary version of Ivermectin – which has a more concentrated dose – can cause serious health problems ranging from vomiting, diarrhea, and low blood pressure to seizures, coma, and even death.
In the US, several people have been hospitalized after taking veterinary-grade Ivermectin. Global News reported that BC’s Drug and Poison Information Centre has dealt with nine cases of people taking veterinary Ivermectin in connection with Covid-19 in the past six months.
Health Canada says there is no evidence that either the human or animal versions of Ivermectin are safe or effective for treating or preventing Covid-19. It has not received any drug submission nor clinical trial application for the drug in relation to Covid-19.
Why do people share misinformation?
Gordon Pennycook is an assistant professor of behavioural science at the University of Regina who researches decision-making and the psychology behind why people are susceptible to misinformation.
“A lot of the time what happens is people share things without even considering whether they’re true or not,” he said. “That’s partly attributable to the fact that people rely too much on their intuition and their gut feelings and they don’t stop and kind-of reflect about things, which I think is exacerbated on social media.”
According to a July 2020 survey of provincial residents by Statistics Canada, 96 percent of internet users reported seeing information about Covid-19 they suspected was misleading, false, or inaccurate during the first few months of the pandemic, with 40 percent saying they saw it on a daily basis.
In the same study, 40 percent of internet users said they initially believed inaccurate information about Covid-19 before they realized it was false. One in five said they shared Covid-19 related information they found online without checking its accuracy.
Pennycook said people are more likely to share information that grabs their attention or evokes an emotional response, which can cloud their ability to reflect on whether it’s true.
Aengus Bridgman, a PhD student at McGill University who studies political behaviour and misinformation, added that people are more likely to share information that supports their worldview – what’s known as confirmation bias – regardless of whether they’ve verified it. He said information goes viral through peer-to-peer sharing, and the people who spend the most time online share the most content.
“Often it’s because they’re interested in it or they think it’s important,” he said.
“People who spend time on social media platforms genuinely care about politics and care about their communities and they’re trying to share information that they think is useful for others.”
Bridgman said people who distrust mainstream media, politicians, and private enterprise – including social media companies or pharmaceutical companies – are also more likely to share misinformation. Another factor is anti-intellectualism, or disregard of science-based facts, academics, and experts.
Al-Rawi said it can be hard to know for sure why people share inaccurate information, but some reasons include that they may financially benefit, have a political agenda, want to go viral and become famous, or are trying to be funny.
It has often been noted that the Covid-19 pandemic has highlighted systemic inequalities, and the same is true when it comes to Covid-19 misinformation.
People who are older and may be less digitally literate, people who are less educated, and those who have lower incomes tend to share more misinformation, Bridgman said.
In an article in the New Republic, Dr Jessica Jaiswal, an assistant professor in health science at the University of Alabama, said structural factors like poverty and marginalization can lead to what’s known as information inequality. That can include lack of access to reliable internet and news subscriptions, and lower levels of education that can make it harder to evaluate information.
Where is Covid-19 misinformation coming from?
The type of information people are exposed to on social media also varies because of recommendation algorithms.
Pennycook said older adults tend to share more misinformation, not because they are worse at distinguishing between what is true and false, but in part because those algorithms are more likely to expose them to misinformation in the first place. The same is true for people who are politically conservative.
“A lot of that has to do with the cultural narratives that have emerged,” he said, pointing to misinformation spread by former US president Donald Trump that has since been repeated by right-wing news outlets.
A 2020 study from Cornell University concluded that Trump was likely the largest driver of Covid-19 misinformation. Researchers analyzed 38 million articles on the pandemic in English-language media across the globe between January and May 2020, and found mentions of Trump accounted for almost 38 percent of the Covid-19 “misinformation conversation.”
Bridgman said it’s incredibly difficult to track where a single piece of misinformation starts, but research has shown the majority of misinformation that Canadians share comes from the US.
In January 2020, he said, a “massive amount of information” started coming out about Covid-19, and in the US, there were strong political divides on the issue. Bridgman said that’s due in part to the hyperpartisan political culture in the US where the “truth itself has become partisan to some extent.”
“That intense polarization in the United States is going to produce an environment where misinformation can really thrive,” he said. “Canadians are less polarized, but there is still some polarization.”
Initial research on how misinformation is shared by region across Canada, Bridgman said, shows higher levels in the Prairie provinces. He said that’s partly because residents of provinces like Alberta tend to consume more US news and follow more US-based Twitter accounts.
The Centre for Countering Digital Hate, a US and UK non-profit, said 12 people, dubbed the “disinformation dozen,” are responsible for the bulk of anti-vaccine misinformation spread on social media platforms. The organization chastised social media platforms for not doing enough to address misinformation, noting that despite violating the terms of service of Facebook, Twitter, and Instagram, nine of the 12 were still active on all three platforms in March.
In August, Facebook criticized the methodology of the disinformation dozen study and said focusing on just 12 people “misses the forest for the trees.” It did, however, remove more than three-dozen pages, groups, and accounts linked to those 12 people for violation of its policies.
The social media platform said it has removed more than 3,000 accounts, pages, and groups since the beginning of the pandemic for repeated violation of rules against spreading misinformation, along with more than 20 million pieces of content. Bridgman and others say that’s difficult to confirm, however, as social media platforms like Facebook control access to that data.
Vaccine hesitancy: around since smallpox
While the Covid-19 pandemic has coincided with an unprecedented infodemic, vaccine hesitancy and misinformation are not new.
After English doctor Edward Jenner developed the smallpox vaccine in the late 1700s – injecting humans with cowpox, which triggered an immune response against smallpox – his work met religious, scientific, and political objections.
In the 1800s, opposition to the smallpox vaccine resulted in the formation of anti-vaccination leagues and the spread of misinformation about the vaccine’s efficacy and safety, how it was made, and its potential side effects – with some claiming it actually caused disease. (While exaggerated, these claims weren’t entirely without grounds. Poor medical practices and unsanitary conditions did sometimes lead to secondary disease transmission.)
Some who objected to the vaccine claimed smallpox was a minor threat, despite mortality rates of 30 to 40 percent, and lambasted health officials and the media for causing senseless panic. Others pushed back against public health measures as an attack on personal freedom and an overreach of government power.
Bridgman said what’s different now is the Covid-19 pandemic has required an enormous, international, coordinated effort to address the spread of the disease.
“Typically, conspiratorial thinking emerges in instances of profound uncertainty,” Bridgman said. “You have this pandemic that, basically, nobody has lived through anything close to equivalent. There are a lot of questions about its origin and the extent to which different governments are taking different policies or different steps.”
Meanwhile, Bridgman said people are spending more time online since the start of the pandemic.
“With that amount of time comes the ability to read maybe too much, and to do too much online, and get sort-of caught in some of these rabbit holes.”
Misinformation tips and resources
You’ve read, heard, or watched something and you’re not sure if it’s true. What should you do next?
Before you share a link, Bridgman suggests taking a second to think about the accuracy of the headline, something studies have shown can reduce the spread of misinformation by 50 to 60 percent.
Canada’s Centre for Digital and Media Literacy recommends using factchecking tools and finding the source of information by following links to the original story, or looking up information in a search engine. Read multiple sources of information and compare whether most experts on the topic agree.
Another tip is to check the news outlet and author that published the information and other articles they have published. Does the person really exist? Are they who they say they are? Are they trustworthy? Make sure to look at an organization’s “about us” page, its mission statement, and any information about the company or leadership that runs the site.
Scientists from 20 leading universities across the world, including Al-Rawi, have now published an online tip sheet to help.
Al-Rawi said it’s important to understand where people are coming from and find ways to address their concerns through respectful conversation.
“Regarding them as inferior or not worthy of discussion might not really lead to a tangible or positive outcome,” he said.
Online factchecking resources focused on Covid-19 include:
- CoronaCheck, a newsletter fighting Covid-19 misinformation that’s part of a joint factchecking project from the Royal Melbourne Institute of Technology and the Australian Broadcasting Corporation
- Poynter’s CoronaVirusFacts Alliance database, which unites factcheckers in more than 70 countries and includes articles published in at least 40 languages
- Mythbusters from the World Health Organization
- Health Desk’s Covid-19 Vaccine Media Hub, an initiative from the Australian Science Media Centre and technology non-profit Meedan that provides the latest scientific information, expert opinion, and evidence-based research on Covid-19
- Infotagion, an online and social media factchecking service created in part by Iconic Labs
- First Draft News’ Vaccine Insights Hub tracks and provides reporting advice on emerging health and vaccine misinformation
- Politifact’s Coronavirus Fact Check analyzes political statements and provides an accuracy score
Resources you can use to factcheck a wide variety of information include:
- Snopes, a widely used internet factchecking resource
- FactCheck.org from the Annenberg Public Policy Center at the University of Pennsylvania
- Lead Stories, which fact checks trending stories in the US
- Google’s Fact Check Explorer
- Full Fact, an independent factchecking organization in the UK
- FactsCan, an independent and non-partisan factchecker on Canadian federal politics
- Health Feedback, a health-focused factchecking site from a global network of scientists
- NewsGuard, a free browser extension that ranks how trustworthy news sites are
- Media outlets with dedicated factchecking resources include Reuters, the Washington Post, the Associated Press, Agence France-Presse, and Radio-Canada
- Poynter is currently offering a free online course as part of its MediaWise project on how to tell what’s true and false online
A factchecking example
One inaccurate story circulating online, sent to Cabin Radio by several people, claimed a man named Patrick King won a court case in Alberta forcing the provincial government to drop Covid-19 restrictions and admit the virus was a hoax.
Tracing the origins of the story, it appears to have come from a video interview of King on the Stew Peters Show, published by Red Voice Media on August 3. Searching Peters’ name on factchecking websites shows he has been accused of spreading inaccurate information about Covid-19 on multiple occasions and is currently suspended on Twitter.
As for Red Voice Media, factchecking website Media Bias/Fact Check rates the platform as a far right-biased “questionable source” with low credibility based on its promotion of propaganda and conspiracy theories, use of poor sources, and failed factchecks.
On August 5, the Justice Centre for Constitutional Freedoms – a Canadian charity and legal advocacy group that pursues socially conservative cases – issued a statement in response to the video, saying it contained several inaccuracies. The centre felt it was “unclear” whether King understood the legal process in which he was involved. The claims in the video have since been debunked by Snopes, the Associated Press, and Reuters.
According to those sources, King represented himself in court after being fined $1,200 for violating public health orders while protesting Covid-19 restrictions. King sought to challenge the validity of those rules and had attempted to subpoena Dr Deena Hinshaw, Alberta’s chief medical officer of health, requesting she provide evidence that the virus that causes Covid-19 had been isolated.
The court quashed that subpoena, saying the health agency had “no material evidence” related to King’s fine and adding he had not properly filed a Charter challenge to fight the public health laws.
King and others appear to have interpreted the wording of that ruling to mean there is no evidence that Covid-19 exists. It actually means such evidence was not relevant to King’s ticket. The virus that causes Covid-19 was isolated by Canadian scientists in March 2020. King was found guilty of violating provincial public health laws at trial and sentenced to pay the fine.
There is no evidence that King’s court case is connected to Alberta’s decision to relax its Covid-19 restrictions. While many public health measures related to Covid-19 have eased in Alberta, the provincial government put off its plan to lift all public health restrictions until September 27 due to a surge in Covid-19 cases and hospitalizations. Provincial officials recently reintroduced mandatory masking and a curfew on establishments that sell alcohol to address the rising number of cases.
This reporting was supported by Journalists for Human Rights and the Misinformation Project, with funding from the McConnell Foundation, the Rossy Foundation, and the Trottier Foundation.