Report March, April, May 2021

YouTube Disinformation

This report examines the activity of 398 radical YouTube channels from 1 February to 31 May 2021.

Many YouTube channels are flocking to other networks. We examined video descriptions and extracted which other platforms they point to.

Share of channels linking to each platform, per month:

platform     aug '20   nov '20   feb '21   may '21
instagram     44.16%    43.40%    46.53%    47.55%
twitch         5.58%     7.55%     7.64%     7.69%
gab           40.61%    32.70%    58.33%    51.75%
parler        32.49%    44.03%    38.89%    33.57%
rumble         3.05%    12.58%    20.14%    23.08%
bitchute      38.58%    28.30%    25.00%    23.08%
odysee         0.00%     2.52%     4.17%    12.59%
locals         2.54%     1.26%     1.39%     6.29%
dlive         10.66%     7.55%     6.94%     5.59%
brighteon      3.55%     3.14%     2.08%     4.20%
lbry           4.57%     3.77%     2.78%     3.50%
trovo          0.51%     0.00%     1.39%     1.40%
ugetube        0.00%     0.63%     1.39%     1.40%

Analysis

The popularity of mainstream platforms remained consistent over the year. Slightly fewer channels seem to be active on Twitter and Facebook, while the popularity of Instagram and Twitch is slightly increasing. In the alt-tech space, Gab and Parler are the two dominant players. Fewer channels were mentioning Parler after the platform was suspended by AWS in January 2021. The video-sharing platform Bitchute seems to be losing ground to platforms such as Rumble and Odysee.

Locals, a social media platform started by YouTuber Dave Rubin, has slowly been gaining traction over the year. Its branding looks more mainstream and startup-like than that of platforms such as Bitchute, UGETube and Rumble.

Methodology

These metrics were calculated from the video descriptions of 393 QAnon, conspiracy, alt-right and anti-vax channels on YouTube. For each month, we took the number of channels that mentioned at least one platform in their descriptions during that month (the total platform mentions). This number differs per month, but roughly half of the channels mentioned at least one platform in their description. We then aggregated all mentions of each specific platform in a month and divided them by the total platform mentions. One limitation of our methodology is that we looked for mentions of a platform's name rather than URLs containing a platform reference. We will build on this in our next publication.
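The calculation above can be sketched as follows. This is a minimal illustration of one reading of the procedure, assuming descriptions are available as a simple {channel_id: description} mapping for a given month; the function name and data shapes are our own, not from the report's pipeline.

```python
from collections import defaultdict

def platform_mention_shares(channel_descriptions, platforms):
    """For one month of {channel_id: description}, return the share of
    platform-mentioning channels that mention each platform by name."""
    # Keep only channels mentioning at least one tracked platform
    # (the "total platform mentions" denominator from the methodology).
    mentioning = {
        cid: desc.lower()
        for cid, desc in channel_descriptions.items()
        if any(p in desc.lower() for p in platforms)
    }
    total = len(mentioning)
    if total == 0:
        return {}
    counts = defaultdict(int)
    for desc in mentioning.values():
        for p in platforms:
            if p in desc:
                counts[p] += 1
    return {p: 100.0 * counts[p] / total for p in platforms}
```

For example, if one channel mentions both Gab and Parler and another mentions only Gab, Gab's share is 100% and Parler's is 50% of the platform-mentioning channels.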

Our preliminary research has confirmed a pattern of creators removing their own videos, distinct from content removed under YouTube’s moderation policies.

These videos can still garner significant engagement and views before they are removed. In the case of one channel, videos removed after just three days were more successful than any of that channel’s other videos that month. However, the majority of removed videos we observed in the past three months were taken down by YouTube. Many of the videos removed, whether by creators or by YouTube, discussed themes from the QAnon conspiracy or anti-vaccination content.

Since March, we have tracked 403 video removals from 48 different channels, most of them concentrated in March and April. For almost every channel, removed videos were much more popular than videos that were still live: on average, removed videos received more than 11 times as many views as a given channel’s other videos. This suggests that YouTube’s moderation efforts fail to catch problematic content before it goes viral, and that creators gain far more attention from content that may violate YouTube’s guidelines.

Conclusion

As many of the earlier channels and videos mentioning these theories have been removed by YouTube in its efforts to moderate QAnon-related content, our research will track the emergence of similar content that manages to appear despite YouTube’s growing moderation policies and algorithms. We also seek to understand how the theories we’ve found to be of interest ("great reset", "higher consciousness" and "biblical times") cross-relate within conspiracy, QAnon, alt-right, anti-vax and marxist communities. The data we collect on removed videos will help us understand how content may transform or shift in response to ongoing moderation and removal policies. Future goals include visualizing the broader misinformation ecosystem and how it spreads on social platforms beyond YouTube.

Appendix and Further Discussion

Topic graphs

The full list of search terms used for each of the themes is shown below:

  • Global Elite: “great reset”, “new normal”, “build back better”, “g7”, “Bill Gates”, “Epstein”, “billionaire”, “elite pedophile ring”, “billionaires”, “bilderberg”, “global elite”, “soros”, “new world order”, “globalism”
  • Higher Consciousness: “spiritual warfare”, “consciousness”, “higher consciousness”, “beyond the veil”, “behind the veil”, “behind a veil”, “matrix”, “simulation”, “spiritual awakening”, “vibration”, “manifestation”, “raising vibration”, “intent”, “manifest”
  • Biblical Times: “jesus is coming”, “end time”, “end times”, “end of days”, “end of times”, “second coming”, “has risen”, “revelation”, “revelations”

The graphs were generated using videos from all tracked channels, not limited to specific categories of content. We simply looked for the occurrence of these phrases, not the context in which they were used. Despite casting an overly broad net, we feel this is still a good measure of the popularity of certain discussion terms. We are investigating methods for recognizing topics and discussions without manual analysis.
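The phrase-occurrence approach described above can be sketched as follows. This is a minimal, hypothetical version: the `THEMES` dictionary below holds only a small subset of the report's keyword lists, and the function name is our own.

```python
import re

# Hypothetical subset of the theme keyword lists from this report.
THEMES = {
    "global_elite": ["great reset", "new world order", "soros"],
    "biblical_times": ["end times", "second coming", "revelation"],
}

def tag_themes(text):
    """Return the set of themes whose phrases occur in `text`.
    Matches whole phrases case-insensitively, ignoring the context
    in which a phrase is used (the overly broad net noted above)."""
    lowered = text.lower()
    hits = set()
    for theme, phrases in THEMES.items():
        for phrase in phrases:
            # \b boundaries keep "revelation" from matching inside
            # unrelated longer words.
            if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
                hits.add(theme)
                break  # one matching phrase is enough per theme
    return hits
```

Because matching is purely lexical, a video debunking the "great reset" counts the same as one promoting it, which is the limitation acknowledged above.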

Removal

For removal, we specifically looked at the alt-right, conspiracy, Qanon, alt-health, and spirituality focused channels. As mentioned above, some videos have been removed by the site and some by the creators themselves. While we found removed videos to be consistently more popular than their still-live counterparts, we currently have little data on why the videos were removed in the first place. YouTube’s lack of transparency around moderation was one of our major motivations for starting this project. As such, future reports will investigate to what extent removal is used by creators to avoid moderation and how quickly YouTube responds to problematic content.

Offsite links

Our current methodology simply checks for the existence of these social media links. Some video descriptions link to other social media posts rather than to a creator’s alternative profile. However, from manual checking we’ve found that the vast majority of these links are creators promoting their profiles on alternative sites. Many creators also use URL shorteners or aggregators such as Bitly or Linktree to evade detection and avoid censorship. Future analysis will follow these links to other sites as well.
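A link check of the kind described above can be sketched as follows. This is a hypothetical illustration with abbreviated domain lists; the report's full platform set is larger, and the function name is our own assumption.

```python
import re

# Hypothetical, abbreviated detection lists.
PLATFORM_DOMAINS = ["gab.com", "parler.com", "rumble.com",
                    "bitchute.com", "odysee.com"]
SHORTENER_DOMAINS = ["bit.ly", "linktr.ee", "tinyurl.com"]

def classify_links(description):
    """Split URLs found in a video description into direct alt-platform
    links and shortener/aggregator links that would need resolving
    before the destination platform is known."""
    urls = re.findall(r"https?://\S+", description.lower())
    direct = [u for u in urls if any(d in u for d in PLATFORM_DOMAINS)]
    shortened = [u for u in urls if any(d in u for d in SHORTENER_DOMAINS)]
    return direct, shortened
```

Shortened links end up in a separate bucket precisely because, as noted above, the destination cannot be determined without following the redirect.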

Community classification

We built our list of channels from academic papers (https://arxiv.org/abs/1908.08313 and https://arxiv.org/abs/1912.11211) and by manually searching for other related channels. Our classification comes from manual review or from the classifications in those papers. We are adding automatic discovery of channels, and plan to use an implementation of the methods in this paper (https://arxiv.org/abs/2010.09892) for automatic tagging/classification of newly discovered channels. For a full list of channels please contact us at info@raditube.com.