YouTube Takes Steps to Stop Spread of Election Disinformation
Streaming Site Joins Other Social Media Companies in Updating Policies
YouTube is the latest social media firm to adjust its policies as the 2020 U.S. presidential election gets underway. On Monday, the company announced plans to remove misleading political content and other disinformation from its platform.
The policy update from Google-owned YouTube came just before the Iowa caucuses got underway Monday night - the first step in selecting the presidential nominees for 2020. Reporting of the caucus results was delayed by technical issues related to a new app designed to collect the results, according to The New York Times and other media reports.
While officials ruled out hacking and other cybersecurity concerns, it appears that faulty code and a lack of training resulted in the delay in tabulating the results, according to the Times.
In a Monday blog post, YouTube noted that videos containing wrong information about the voting process - such as an incorrect voting date - or videos that make false claims about a candidate or elected government official will be removed. The company will also remove videos that have been doctored to manipulate viewers.
YouTube also will terminate channels that try to impersonate another person or channel, falsify their country of origin or hide associations with a government entity.
Plus, the video-sharing service will ban channels that artificially inflate metrics, such as views, likes and subscribers, by either using automatic systems or deceiving viewers by making them watch misleadingly labeled content.
Highlighting 'Quality Journalism'
YouTube also says authoritative and reliable news and information sources will be displayed higher in its search results as well as in the platform's "watch next" panels.
"We introduced Top News and Breaking News shelves to highlight quality journalism, as well as information panels that indicate funding sources below videos from publishers that receive public or government funding," the blog states.
When users search for 2020 candidates, YouTube will show an information panel with additional details about the candidates along with their official YouTube channels, as it did during the 2018 U.S. midterms and the 2019 European Union parliamentary elections.
Other Social Media Efforts
Other social media companies have also been working to clamp down on the spread of misinformation in the run-up to the presidential election.
For example, Twitter announced last year that it was banning political advertising worldwide on its platform. Music streaming platform Spotify also said it would pause the sale of political advertisements in early 2020, according to Ad Age.
YouTube's parent company Google changed its policy on political ads late last year, saying it would stop political targeting based on public voter records and general political affiliations in the U.S.
In December, Facebook said it would give users greater control over how many political ads they see on its main platform as well as Instagram, but refused to put limits on ad targeting. Instead, the company said it would remove so-called "deepfake" videos and media as well as other content created by artificial intelligence.
In October 2019, Facebook also announced that it had removed four networks from its platform - three connected to Iran and one to Russia (see: Facebook Shuts Misleading Accounts Ahead of 2020 Election).
Some federal agencies have met with social media and tech firms to discuss election-related security issues (see: Feds, Tech Giants Meet to Coordinate 2020 Election Security).
Last year, the U.S. Senate Intelligence Committee released a report on foreign interference in the 2016 election and offered social media companies recommendations for preventing election meddling. One of the committee's recommendations was the labeling of political ads and false content (see: Preventing Election Interference: New Recommendations).