From vaccine hesitancy to the rise of far-right extremism, COVID-19 has mainstreamed conspiracy theories at an astonishing rate, with devastating impacts — but the handful of reporters and researchers addressing them in Canada say they don’t have the resources to respond to the country’s dis- and misinformation crisis alone

During the first wave of the pandemic, journalist Jeff Yates and his colleagues at Radio-Canada’s disinformation debunking team were so overrun with emails that they had to put a triage system in place.

Before the pandemic, Yates says Décrypteurs, Canada’s only dedicated disinformation reporting team from a French-language media outlet, might receive anywhere from one to seven emails per day. But when COVID-19 was declared a pandemic in mid-March 2020, that number climbed to about 100 emails daily, and left three team members scrambling to address thousands of messages and false health claims circulating at a pace Yates says didn’t dissipate until the summer of 2020. (Décrypteurs received 5,230 emails from 4,077 people between March 1 and Aug. 31 of 2020. Yates says the waves of emails “came back with a vengeance” in September 2020 when Québec announced another lockdown.) 

“We kind of lost control of our inboxes,” Yates says, adding that the sheer volume of emails and disinformation online meant they had to put in place a “worst first” system: his colleagues got up early in the morning to sort through the many emails they’d received overnight. Like they always had, the team decided what to address based on the information’s virality and risk. Now, though, it was no longer just a matter of what they should cover, but what they could.

And it went beyond debunking requests — there were “heartbreaking” messages from people saying QAnon was ruining their family, or asking what they should do about relatives who had gone down the conspiracy theory rabbit hole. Membership in their Facebook group quickly tripled in the first months of the pandemic, and is now at 52,000 members, versus the 10,000 Yates says there were before the pandemic. But the comments on posts became so overwhelmed with toxic false information that Yates says they had to turn them off. He says that during that first wave, there were also death threats directed towards the team that ultimately involved informing newsroom security and four different calls to law enforcement. (Yates adds he never had to get police involved before the pandemic.)

“That was a really tough time for the whole team,” he says, adding that for those first three months, they were all in panic mode. “I think all of us were kind of scarred by it, to be quite frank.”

Although Yates says nothing close to that first three-month influx happened again, email waves tend to correlate with COVID-19 ones. And after seeing the worst of the worst for so long, he still doesn’t know if the Canadian public at large grasps how bad disinformation is and was. Yates says he’s had “a growing sense of panic” about its societal impact since he began covering it in 2014, but he says this global crisis has threatened the information system beyond what he could have ever imagined.

“To have this much disinformation about one specific subject like that, I’ve never seen anything even remotely close. I hope to not see anything like this ever again.”

Canada has not been spared from the impacts of COVID-19’s infodemic, which is the wave of “too much information including false or misleading information in digital and physical environments during a disease outbreak,” according to the World Health Organization. But Yates and other sources say that despite individual journalistic and academic efforts, systematic reporting on the structural and networked nature of disinformation in Canada remains critically under-resourced.

Sources say that the lack of sustained investment is in part due to technical barriers and the often time-consuming, costly nature of online investigations, as well as potential reluctance from news leaders to assign disinformation stories beyond election periods.

The influence and power of U.S.-based social media companies and news organizations also skews the conversation stateside. But sources point to the outsized global impact of a handful of Canadian conspiracy theory super spreaders, as well as the public health imperative to understand false and misleading information’s impacts and report on them without amplifying extremists, as a few of many reasons to put resources into dedicated disinformation reporting and to train journalists on how not to perpetrate harm in this area.

Responses like the Journalists for Human Rights Misinformation Bursary Project indicate that awareness is shifting about the scale of the problem and the necessity of investing in responses to dis- and misinformation. Yates says that Décrypteurs’ viewership and readership since its inception show that there’s a lot of public demand for more reporting on this in Canada.

While distrust of media may run deep for those whose worldviews are entrenched in conspiracy, Yates says that when he wrote an article last year explaining QAnon’s belief system for the family and friends of those lost to the conspiracy, he got messages from people saying they finally understood what was happening to their family members.

He says that distributing resources based on the idea that the disinformation problem is smaller in Canada, compared to places like the United States, is somewhat of a catch-22, because the root causes of the problem (and the people profiting from disinformation) will remain uncovered if no one’s looking for them.

Although he and his colleagues at Radio-Canada have heard many stories detailing the family strain of conspiracies in Quebec (which Yates says has its own semi-distinct disinformation ecosystem because of the language-based conspiracy theory exchange between the province and France), there’s a lot left to learn about how disinformation is affecting other communities across the country.

“Because not a lot of people are looking at this in Canada, we have a very fuzzy idea of what’s circulating, and we have an even fuzzier idea of the impact,” he says.   

Barriers to dedicated disinformation reporting

When reporter and researcher Jane Lytvynenko ventured outside in Toronto in January 2021, one of the first things she saw was a police-escorted car caravan bearing signs proclaiming some of the same conspiracy theories she had been working to debunk. When she first saw it, Lytvynenko says, her “jaw was on the floor.”

“It’s surreal to encounter offline, and to see, right in front of your eyes, the impact that conspiracies [and] disinformation … has on the real world,” she says. “I’ve spent the last year and a bit basically, just thinking and writing about this … but still, it’s very different when you talk about it on the phone, or when you look at it on a screen, versus when it’s physically surrounding you in real life.”

Lytvynenko is a research fellow with the Technology and Social Change Project at Harvard’s Shorenstein Center, and previously worked as a senior reporter for BuzzFeed News covering disinformation and online investigations.

She says that while there are a handful of journalists in Canada who work in this field, consistent mis- and disinformation reporting on a national and local level is still fairly limited, which means “we don’t really have a handle on the scale of the problem.” And it’s a gap that debunking alone can’t solve.

“Disinformation is a structural issue, and it’s a networked issue,” she says. “You can’t only focus on fact-checking without looking at the ecosystem and doing those larger investigations, because you’re cutting yourself off at the knees.”

There have been efforts like a specialized CBC team to “expose false news and disinformation” for the 2019 federal election and the CTV Truth Tracker in 2021, but sources say there is a reluctance in Canadian media to invest in dedicated disinformation units outside of an election.

Lytvynenko notes that reporting on online disinformation didn’t continue at the same scale after the 2020 U.S. election, even as COVID-19 remained a pressing global issue.

The barriers to specialized disinformation reporting in Canada are both technical and financial, Lytvynenko says, and overcoming them will require “buy-in” from reporters as well as editors and managers. 

Disinformation reporting requires both the ability to measure and understand how speech functions online, and the ability to report on disinformation without accidentally amplifying it, she says.

To develop these skills, there needs to be training and a “conscious investment from news leaders to both understand this problem and to train their reporters in investigating the problem properly.” 

This leads to another major barrier, according to Lytvynenko: already thinly stretched newsrooms might not prioritize disinformation training or investigation projects because they are technical and time-consuming, and can sometimes yield few results.

But in the meantime, sources say reporters can do more harm than good by accidentally glorifying or promoting disinformation channels — for example, by linking to the pages of COVID-19 conspiracy theorists and inadvertently sending them more followers. This has the potential to bring hateful ideologies in from the fringes and normalize them in the mainstream.

A 2018 Data &amp; Society report detailing better practices for covering extremists and other online antagonists found that conventional journalistic practices and profit models have proven vulnerable to exploitation by manipulators seeking to further hateful narratives, and were used to hijack news media coverage from 2016 to 2018.

The report, which is based on in-depth interviews with journalists, found that click-driven editorial strategies and excessive reporting demands (especially on interns, early-career reporters and freelancers) are among the things that contribute to an environment ripe for manipulation, as information travels “too far, too fast, with too much emotional urgency.”

“Both-sidism on steroids” also creates false equivalencies that normalize untrue, extremist and dehumanizing positions, according to the report, giving previously fringe ideas an equal platform with those that are factual and unquestionably newsworthy.

Another journalistic gap detailed in the report is the overwhelming whiteness of newsrooms, and the homogeneity of perceived audiences, which bears “not just on how these stories are told, but what stories are deemed worthy of telling in the first place.”

Amarnath Amarasingam, assistant professor at Queen’s University’s School of Religion and senior research fellow at the Institute for Strategic Dialogue, says there’s a broader danger in reporting on extremism and disinformation without context or background knowledge, in particular when reporting on marginalized communities.

“When you get rid of dedicated ‘I’m going to stick with one topic for a long time to really figure out what’s going on,’ there’s a potential there for sloppiness,” he says, adding that mistakes can contribute to community-wide harm and distrust of media. “And in particular, the misinformation space, when you’re dealing with ethnic communities [and] refugee communities, those mistakes are sometimes harmful and they linger for a while.”

Amarasingam adds that someone who’s embedded in a particular beat and trusted by community members because they’ve been around long enough “could stave off a lot of misunderstanding.” 

“The language barrier, the cultural barrier, religious barriers are all there and mediate how misinformation works and has an impact,” he says. “There’s an assumption, I think, in the mainstream media that misinformation is not a racialized issue … but it has a differential impact in different communities.”

Amarasingam also points to a gap between academic and journalistic focus on the racialized dynamics of misinformation during the pandemic, like the “mountains” of public health research dedicated to addressing vaccine uptake among South Asian and Black communities that isn’t covered broadly in the media.

“People of colour are very aware of the misinformation in their own families and in their own communities, but it rarely makes it into mainstream coverage, because I think journalists are quite worried that they’re going to be called racist or something. This is why I think it’ll probably be better done by journalists of colour, and they can have a more substantive conversation along those lines.”

Making pandemic science more accessible

Unclear public health communication means that the burden of correcting misinformation has largely fallen to individuals during the pandemic, says Lauren Salim, project lead for the Canadian Coalition for Combating Online Disinformation at the Montreal Institute for Genocide and Human Rights Studies. 

Salim says dedicated disinformation reporting could be a “crucial step to the news environment in Canada,” and could serve a dual purpose: not just as a helpful tool for people who are already interested in the topic, but also as an indirect way of reaching people who believe or perpetuate misinformation.

“I think having that disinformation beat is really crucial for the people that it will reach anyway, but also for people that are confronting their loved ones to have resources that they can use … over a period of time, [when they] very gently approach their loved ones and present them with information.” 

Salim also says the coalition’s research and discussions with experts point to a “lost opportunity” for clear health communications during the pandemic. She says Canada was slow to recognize that people were getting their news from health authorities and news organizations on social media platforms alone, without clicking through to the organizations’ own websites. (For example, a tweet linked to an article about new guidelines would not be as effective as a thread breaking down said guidelines, and then linking to the article for additional context.)

“If the good information online isn’t accessible, and the misinformation or disinformation online is accessible, not everyone is going to take the time to read the full article or engage with the material deeply,” she says, adding that based on the coalition’s research, she suggests small graphics, with bullet points that are accessible and easy to read. Without that, she says, people turn to easy and relevant information on their Facebook accounts, regardless of whether it’s true.

Another important consideration for news organizations, Salim says, is the decision of what to put behind paywalls, especially to make sure that crucial information that could inform someone’s pivotal health decisions is available in a timely manner.

“A lot of public health communication tends to be top down, and not meeting people where they are at,” says Sabina Vohra-Miller, founder of health education and misinformation debunking page Unambiguous Science, as well as co-founder of the South Asian Health Network. “So there’s a lot of gaps that exist, [and] people like myself have to then bridge that gap.”

Vohra-Miller is a pharmacologist who started Unambiguous Science after being harassed and threatened for correcting misinformation on social media pages using her personal account. She says a health policy is only as good as the way it is communicated to the public. 

She describes a system where individual science communicators are trying to undo the damage left in the wake of almost every new public health announcement or report released with updated COVID-19 science, because the reports’ technical, lengthy and inaccessible formats are often rife with opportunity for “chaos and confusion.” 

For example, Vohra-Miller says, there was AstraZeneca in the spring of 2021: “even though the science behind pausing the vaccine made sense, it’s just the way it was communicated that did not build confidence.”

She adds that science communication itself is also under-resourced, so Vohra-Miller and others are left doing the vast majority of their work unpaid and “off the side of their desk,” which means the work sometimes comes at the expense of their own jobs, wellbeing, paid work, or ability to spend time with their loved ones. Vohra-Miller says she puts in anywhere from four to six hours of her own time per day on Unambiguous Science alone, and that she hasn’t had a weekend off in a year.

A more sustainable approach, she says, would be to include a paid science communication professional at the public health body level who is primarily focused on making information more accessible. 

Vohra-Miller also does education sessions, and she echoes Salim’s point that conversations addressing people’s fears often take time and care. One pregnant person Vohra-Miller spoke to only decided to get vaccinated after a 45-minute conversation with her, even though they had already spoken to a physician.

“Stories like this give me goosebumps because it just makes me realize how impactful the work is,” she says, adding that one non-profit community health centre reported to Vohra-Miller that its vaccine uptake went from 50 per cent to 90 per cent after one of her sessions.

Vohra-Miller says another important conversation is that “the people who are most disproportionately impacted by COVID-19 are typically racialized people from many marginalized communities, and English is not necessarily their first language,” which means technical, jargon-laden language is especially inaccessible.

Salim echoes that new Canadians have the potential to fall through the cracks, and both she and Vohra-Miller point to the importance of engaging and partnering with community groups to communicate health information in places like immigration centres, community health organizations or faith centres.

Language-specific communication can also come from ethnic and multilingual news outlets across the country. Madeline Ziniak, chair of the Canadian Ethnic Media Association, says that this messaging is not just about providing translation for those who don’t speak English or French, but about building trust.

“Getting an editorial perspective which comes directly from your community is very important,” she says, adding that this perspective is also key for conversations about the importance of vaccination. “It’s a trusted source, but also, it can really reach a perspective to ethnic cultural communities that is not readily available in traditional media.”

For example, in a collaborative disinformation session with Journalists for Human Rights, Ziniak says one of the points raised was that vaccine recommendations on the website of the Russian embassy in Canada conflicted with those of Canadian health authorities, which may be confusing for recent immigrants or community members, and is something that ethnic media is better suited to address directly.

Even so, Ziniak says there’s a lack of support and direct information from the public sector, and while things are starting to evolve after a year of advocacy, engagement with ethnic media is “still wanting.”

“There needed to be a conduit and a healthy vehicle for factual information in the beginning that had to come down from the government,” she says. “In the initial early period of COVID, there really was a lack of support for ethnic media, for [things like] public service announcements, that should have been going to the various language media with factual direct information.”

There was also a lack of monetary support, Ziniak says, which is especially devastating for ethnic media because of its reliance on advertising from culturally specific retail spaces that were also struggling with lockdown losses. Ziniak says this lack of continuous public sector support for the roughly 1,280 ethnic media entities that her association estimates are in Canada is nothing new, and that things like the CRTC’s Ethnic Broadcasting Policy have been put on the backburner for decades.

But both she and Kiu Rezvanifar, founder of KVC Communications Group and president of the Canadian Ethnic Media Association, say the government understands the importance of ethnic media when it needs the vote of the communities they reach.

“When there’s an election, all these politicians come to us, and they praise us for [doing] such a great job representing Canada [and] multiculturalism,” Rezvanifar says. “As soon as the election is over, they’re gone. You can’t reach them.” 

It’s this pattern, coupled with promises for funding that never materialize, that Rezvanifar says he’s seen time and time again from many politicians, regardless of party affiliation, over the years. So when he found himself in the same situation on a Zoom call with opposition leader Erin O’Toole, he asked him point blank: “if [none of us] exist, how are you going to get your message to our community?” 

Countering extremism  

The continued rise of right-wing extremism during the pandemic is another imperative for a reimagined Canadian response to disinformation. 

Recent research from the U.K.-based Institute for Strategic Dialogue analyzed three million messages sent by over 2,400 groups, pages, accounts and channels that seemed to be based in Canada over the course of two years. The Online Environmental Scan of Right Wing Extremism in Canada suggests that right-wing extremists are one of the largest drivers of disinformation in the country, and that the pandemic has fostered an environment where these groups flourished.

While the report found that the number of Canadian right-wing extremists online remains relatively small, with one Facebook page or group for every 235,420 Facebook users, right-wing extremists in Canada are able “to generate a sizeable reaction from the public”: posts garnered over 44 million reactions on Facebook, drew 600,000 comments on YouTube, and content was viewed more than 16 million times on Telegram.

Experts say that far-right groups have co-opted or benefitted from pandemic conspiracy theories, and in some cases, blended them with existing hateful ideologies. Sources also say that financial backing as well as incentives for disinformation remain largely unexplored areas of research in Canada. 

Mackenzie Hart, one of the lead researchers on the 2021 report, says their team found a “huge overlap” between sovereignist and militia groups — an anti-government ideological subgrouping of right-wing extremists — and anti-maskers.

“We saw that on and offline, anti-government groups that existed before the pandemic were extremely active and engaging with anti-mask content,” she says, adding that disinformation was also distinct depending on the ideology of the right-wing extremist subgrouping it was coming from — like white supremacists, ethnonationalists, and anti-Muslim groups. 

She also says that one of the key challenges to understanding these groups is defining them to begin with, adding that because their report was looking more broadly at themes in conversation and public-facing engagement data, it can only partially understand if and why individual people joined right-wing extremist groups during the pandemic.

Hart also says one of the report’s recommendations points to the need for “more research focusing solely on the relationship between extremism and conspiracy theories.”

Another barrier to understanding how these groups function is the often clandestine nature of their communications. According to Lytvynenko, many far-right extremist groups have made a conscious effort to move to closed networks over the past two or three years, which poses an additional challenge to reporters trying to understand how they organize and recruit members.

Lytvynenko adds that globally, there have also been examples of messaging apps being used as tools to target specific communities with false information.

Barbara Perry, director of the Centre on Hate, Bias and Extremism at Ontario Tech University and one of the authors of the seminal report on modern far-right extremism in Canada, says we’re just at the tip of the iceberg when it comes to actually understanding how such ecosystems function in Canada.

Perry acknowledges that journalists can often only work with available information, and Canadian-specific far-right extremism is “seriously under-researched.” 

“From an academic perspective, research follows the money. And until recently, there really hadn’t been any funding explicitly devoted to what we might think of as domestic terrorism or right-wing extremism,” she says, adding that resources and programming were almost entirely dedicated to Islamic-inspired extremism.  

Perry says that in 2012, when she applied for funding to support an environmental scan of right-wing extremism in Canada, she didn’t expect to be approved because the topic was so off-the-radar for government and law enforcement, and applied mainly because of what she saw as an overemphasis on Islamic-inspired terror. Perry says her past and current work shows that there are more instances of far-right violence on Canadian soil than Islamic-inspired attacks.

“I think there’s still a lot of denial or trivialization of the threat associated with the far right,” she says. “So it’s very difficult to get an in-depth perspective from law enforcement, it’s very difficult to engage with active members of the far-right movement to do any interviews, or any kind of primary research with them. We’re much better at accessing formers.” 

Reporting still needs social media companies to provide information 

The remaining challenge for almost all online disinformation reporting and research is that it is in large part dependent on information from private, mostly U.S.-based social media companies. And despite some efforts like Facebook’s collaboration with the International Fact-Checking Network, platforms have little to no obligation to provide key metrics or tools to understand how far a post actually reaches.

“Social media gives us what they want to give,” Yates says. “You can have a rough idea of how viral something is by looking at the shares. But it’s a very incomplete picture. And Facebook itself says that.” 

He and other sources say tools like CrowdTangle are useful — but there were concerns that the Facebook-owned data analytics tool might be disbanded or changed because executives worried that the information it was providing was hurting the company’s image, according to the New York Times.

On the other hand, legislation compelling these companies to turn over data or take down posts also comes with a host of challenges. Yates says that the Décrypteurs team often gets emails, comments and questions from people at conferences asking why fake news isn’t illegal. But Yates, Lytvynenko and other sources point to how most existing fake news laws around the world have been used by authoritarian governments to censor free speech and gag journalists.

While there are government initiatives dedicated to educating citizens and giving them resources to address misinformation, Canadian tech policy experts have raised concerns about parts of recently proposed federal legislation to address online harms. For example, they say requiring social media platforms to take down illegal content and speech within 24 hours has led to over-censoring in countries like Germany.

When it comes to the issue of closed-network misinformation spreading, Canadian journalists could take cues from message-based journalism innovations like The Continent, a pan-African publication that sends its reporting directly to readers on WhatsApp and Signal (messaging apps whose encryption shielded it from government censorship of COVID-19 coverage in Tanzania).

When asked what Canadian journalism could do to better understand closed-network disinformation, Lytvynenko laughs and says, “investigate it, I guess.”  

She says that sometimes there’s an overemphasis on the algorithmic aspects of disinformation, to the point where there’s potential to lose sight of the fact that it’s still very much a fundamentally human topic. 

“It’s people who are creating this content, and it’s people who are impacted by this content. And to me, that’s what makes it worth investigating,” she says. “That’s what almost makes it a classic reporter story of uncovering influence and power. It’s just that that influence and power manifests itself on social media.”

This story was reported as part of the 2021 J-Source/CWA Canada reporting fellowship, funded at arm’s length by CWA Canada.

Emma Buchanan is the 2021 J-Source/CWA Canada reporting fellow.