zeal
De Techizard
According to a report published on Mashable earlier today, YouTube is set to remove videos that spread misinformation or conspiracy theories about the 2020 U.S. elections. You can read the full report below:
Time is up for YouTubers spreading conspiracy theories about the outcome of the 2020 U.S. presidential election.
On Wednesday, YouTube announced that it will begin removing content that alleges fraud interfered with the results of the election in November. YouTube's policy change comes around five weeks after election day. That gave disinformation peddlers a considerable amount of time to spread conspiracy theories about the election results and unproven claims of voter fraud.
What took them so long? According to the Google-owned company, it was waiting for enough states to certify the election results.
"Yesterday was the safe harbor deadline for the U.S. Presidential election and enough states have certified their election results to determine a President-elect," reads the announcement on YouTube's blog. "Given that, we will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections."
While Facebook and Twitter both amped up their policies against misinformation in preparation for the election, YouTube was often criticized for its more hands-off approach to stopping the spread of potentially dangerous falsehoods on the platform.
For example, in the days following the election, YouTube allowed a video by the right-wing One America News Network (OANN) that falsely claimed President Trump won the election to spread on the site. In fact, it wasn't until OANN broke the company's COVID-19 misinformation policy that YouTube took action against the right-wing news organization's channel.
According to YouTube, the company had previously terminated thousands of channels and videos that misled voters about "where and how to vote." The company also removed a number of conspiratorial channels, such as those spreading QAnon-related falsehoods, in the weeks leading up to the election.
The new policy update means that YouTube will now also remove videos that claim "a Presidential candidate won the election due to widespread software glitches or counting errors."
There will be some exceptions to this rule, such as content that discusses these topics in an educational, scientific, or artistic way. But if your intent is to spread these unsubstantiated claims about the election, YouTube is no longer the place to do it.
Source: Mashable