
■ YouTube Strengthens Its Rules About Critical Conspiracy Theories, Especially QAnon Content

YouTube has announced an update to its rules around hate speech. The new rules focus on reducing the spread of misinformation and conspiracy theories, especially QAnon, which have caused some real-world violent incidents.

According to YouTube:

“Today we’re further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence. One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate.”

This update came after Facebook toughened its stance against QAnon content, given the growing danger of the group and its activity. Facebook's move went a bit further, in that it will remove all Instagram accounts, Facebook Pages, and groups that represent QAnon.

YouTube has left some room for exemptions in its updated policy:

“As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up. We will begin enforcing this updated policy today, and will ramp up in the weeks to come.”

This is the latest acknowledgment from the major social media platforms of their role in spreading content that can lead to danger in real life. While YouTube has stopped short of a complete ban on all content related to QAnon, the new measures will place more restrictions on the group and limit its impact.

YouTube also stated that it has already reduced much of the discussion about QAnon. A couple of years ago, YouTube limited the reach of dangerous misinformation through its 'Up Next' recommendations. It stated that this has resulted in a 70% decline in views coming from its discovery and search systems.

“In fact, when we looked at QAnon content, we saw the number of views that come from non-subscribed recommendations to prominent Q-related channels dropped by over 80% since January 2019. Additionally, we’ve removed tens of thousands of QAnon-videos and terminated hundreds of channels under our existing policies, particularly those that explicitly threaten violence or deny the existence of major violent events.”

Even so, there are still channels known as 'Q-related' that will not be deleted under this new update.

It seems strange. We understand YouTube's stance on this matter: it only plans to remove content that targets a group or individual. However, part of the problem with QAnon, and movements like it, is that they have been permitted to begin as harmless chatter and have grown from there into serious and concerning movements.

You could have argued, earlier, that no one knew QAnon would grow into what it has become. However, we know this now. So why let any of it stay?

The QAnon case also highlights the need for social media platforms to take expert warnings seriously, in order to stop the activity of these groups before they gain real traction. Specialists have been warning the major social platforms about the danger of QAnon for several years, yet only now are the platforms looking to seriously limit the discussion.

 

Link:

https://www.digitalmarketnews.com/youtube-strengthens-its-rules-about-critical-conspiracy-theories-especially-qanon-content/
