The Canadian Anti-Hate Network
An analysis of over 2,400 groups, channels and accounts is charting how hate groups and their ideologies trended on social media over the last year. Conducted by researchers and published by the Institute for Strategic Dialogue (ISD), these environmental scans shed some light on how "right-wing extremists" express themselves in the largest digital social spaces, even as the companies that run those spaces claim to be cracking down on them.
Led by researchers at Ontario Tech University (OTU), in partnership with Michigan State University and the University of New Brunswick, the team followed 2,467 right-wing extremist, or RWE, accounts, channels, and pages active in 2020. Over that period, the accounts produced 3,207,332 pieces of content, according to the study, of which 61,679 were singled out for analysis across a host of different platforms.
Of those included, 4,037 posts appeared on Facebook, 38,857 were tweets, 46 YouTube videos, 448 Telegram messages, and 18,291 posts on 4chan’s /pol/ board.
The study defines RWE as a “loose movement, characterized by a racially, ethnically and sexually defined nationalism” that is built on “xenophobic and exclusionary understandings of the perceived threats” coming from people of colour, Jews, immigrants, the 2SLGBTQ+ community and feminists.
Notably, the amount of this type of content has generally increased since the previous report.
“On Facebook, Twitter and 4chan, RWEs were more active in 2020 than in 2019,” the report reads. “Building on analysis of the RWE online ecosystem in 2019, we charted increases in the amount of content produced by RWE communities on both 4chan (by 66.5%) and Facebook (by 8.2%) over the two years of the study.”
On Twitter, the number of accounts actually fell by 63.5%, though the researchers note that “RWE actors sent proportionally more Tweets,” equating to more than double the amount of “extremist messaging” posted to the bird site in 2020 compared to 2019.
“Only on YouTube were we able to note a significant drop in the volume of output across the channels analyzed.”
The report notes that this may be due to the growth of alternative streaming and video upload sites, like Bitchute and Odysee, which have virtually no enforcement mechanisms for hate content and take an anything-goes approach to moderation.
Despite being a small fraction of accounts, “RWEs in Canada are able to generate a sizable reaction from the public. Across our analysis of activity in 2020, content from RWEs on Facebook generated over 44 million reactions, was Retweeted nearly 9 million times, and generated over 600,000 comments on YouTube.
“Additionally, on Telegram, where some of the most violent and concerning communities were analyzed, content was viewed over 16 million times.”
COVID-19 Drove Hate In 2020
2020 saw the rise of a series of conspiracy-driven influencers distrustful of public health measures, as well as of the results of the US election. In Canadian spaces, discussions of Canadian politics appeared only 3.1% more often than discussions of American politics, which often focused on Donald Trump.
“We hypothesize that the increase in activity across the platforms identified above was in part driven by the impact of COVID-19 restrictions and the resulting increase in the time that many people spent online.”
The pandemic became the most widely discussed topic in the communities analyzed, “accounting for 38.8% of all messages,” according to the study. This includes increases in activity on Facebook and YouTube that specifically focused on the Canadian government’s response to the pandemic.
The authors note that “RWEs in Canada are drivers of disinformation” and point to places like 4chan as providing examples of where Canadians can be found “creating and disseminating” viral disinformation. Examples include the idea that Prime Minister Justin Trudeau was preparing to launch an invasion of Canada’s southern neighbour if Donald Trump had won the 2020 presidential election.
A “small but highly concerning” subset of the captured groups and accounts produced 30,847 posts targeting vulnerable communities and perceived political rivals. This included violent misogyny found on incel forums and calls for terrorist action from accelerationists.
Deplatforming Hate
Noting that social media companies’ attempts at moderation still have a long way to go, the research did find that a significant number of the channels studied in 2019 were no longer active in 2020.
Facebook saw significant decreases in public pages and private groups, of 50% and 52% respectively, but only a 17% decrease in public groups. On Twitter, 63.5% of the accounts tracked the year before were gone in 2020, as were 47% of those on YouTube.
However, the report notes what researchers have been suggesting for years: insufficient moderation measures, such as removing groups or pages but not individual users, don’t address the problem; they only create a whack-a-mole effect.
“Through our analysis, we were able to identify a number of new channels and pages which had sprung up to take the place of those which were removed. The result was that there was no discernible difference in the number of Facebook and YouTube channels analyzed between 2019 and 2020.”
The report identifies Twitter as the only platform with a drop in the number of active RWE accounts between 2019 and 2020.
Read the full report from the Institute for Strategic Dialogue here.