According to an Atlas VPN investigation, Google’s video platform YouTube removed 1.98 million channels between January and March of 2020.
In the vast majority of cases, the channels were terminated for promoting scams, sending out spam, or posting misleading content.
YouTube deletes a channel when it violates its community guidelines three times within 90 days.
Also, a channel can be terminated because of a single severe case, such as predatory behavior.
When YouTube deletes a channel, all of its videos are removed as well.
Along with those 1.98 million channels, more than 51 million videos were removed through channel-level suspension in the first quarter of 2020, as reported in the recently released YouTube Community Guidelines enforcement report.
Google’s streaming platform removed nearly 158 thousand or around 8% of channels for nudity or sexual content.
Another 51 thousand channels, or 2.6%, were terminated for violating YouTube’s child safety rules.
YouTube deleted 11 thousand channels, or 0.6%, due to harassment and cyberbullying.
Another small portion, nearly 10 thousand channels or 0.5%, was removed for promoting violence and violent extremism.
Similarly, YouTube terminated 9.5 thousand channels, or 0.5%, for multiple policy violations.
Google’s video platform puts channels that violated several rules into this category.
Another 6 thousand channels, or 0.1%, were terminated for hateful or abusive content, reports YouTube.
A further 2 thousand channels, or around 0.1%, were deleted because they impersonated other channels, brands, or personalities.
Nearly 694 million comments removed

YouTube’s report also highlights that a staggering 693.58 million comments were removed in Q1 of 2020.
Out of those 693.58 million comments, YouTube’s algorithm flagged 99.6% automatically, indicating that the algorithm is doing most of the heavy lifting.
In contrast, humans flagged 2.5 million comments, or 0.4%.
As a side note, these numbers do not include comments removed by channel owners on their own videos.
Just like with channels, most comments, 470 million or 67.9%, were deleted because they promoted scams, contained misleading content, or violated spam guidelines.
The leading streaming platform removed another 96.6 million comments, or 13.9%, because of child safety violations.
In addition, over 81.5 million remarks, or 11.8%, were removed due to harassment and cyberbullying.
Lastly, YouTube removed more than 43.5 million comments, or 6.3%, for being hateful or abusive.
YouTube clarifies that, due to the pandemic, it reduced in-office staffing and mostly relies on machines to flag content.
Relying on technology to flag comments, channels, or videos means that more of the removed content than usual may not actually violate YouTube’s policies.
However, YouTube does have dedicated teams around the world that review flagged content and remove it if it does violate YouTube’s community guidelines.
Users increasingly take part in flagging videos

For users who wish to help make YouTube a better place, there is an option under every video, after clicking the three horizontal dots, to “Report” it for various reasons.
Also, users can report comments by clicking the three vertical dots to the right of a comment.
YouTube’s staff reviews these reports and deletes the content if it is found inappropriate.
Google’s streaming platform reports that over 90 million users have flagged videos on YouTube since 2006.
The number of reports per day is up 25% year-over-year.
In total, YouTube removed 6.11 million videos in the first quarter of 2020.
The platform’s algorithm seems to be working quickly and effectively, as 77.3% of removed videos had 10 or fewer views.
Most terminated videos originated from the United States (1 million videos).
YouTube checks the content creator’s IP address to determine from which country the video is being uploaded.
By looking at the numbers of removed channels, videos, and comments, it seems that Google will not give up on its battle against inappropriate content anytime soon.