YouTube users are running into fake and dangerous videos, study finds
Pew Research found 60 percent of people run into "troubling" videos.
New data from Pew Research Center highlights both the highs and lows of YouTube. The study -- based on a survey of 4,594 US adults conducted earlier this year -- found that people of all ages use YouTube for just about everything. It's an especially popular choice for entertainment and for figuring out how to do something they have never done before. Unfortunately, a majority of YouTube users report encountering false or "troubling" videos on the platform.
At its best, YouTube is a great resource for people. According to Pew, 87 percent of survey respondents said YouTube is important for learning how to do new things. That benefit reached all age groups, with more than half of users ages 18 to 29 and 41 percent of people 65 and older saying they used YouTube to learn new skills.
And then there's the flip side of YouTube, where problematic content runs wild. Pew found that three in five users encounter videos showing people in dangerous or troubling situations. Two-thirds said they at least sometimes run into videos that seem false or untrue, with 15 percent encountering such videos frequently. That's a real problem for a platform that is increasingly becoming a news source for many visitors; 53 percent of YouTube users told Pew the site is at least somewhat important for helping them understand what is happening in the world.
YouTube has been combating disturbing and problematic content on its platform for some time now. It came under fire earlier this year when a top trending video promoted a conspiracy theory about students at Marjory Stoneman Douglas High School in Florida following the mass shooting there. The site also took heat for a rash of disturbing videos targeting kids that cropped up earlier this year.
When asked for comment, YouTube provided a summary of its ongoing efforts to keep tabs on content that violates its terms. For example, the company said it removed more than 17 million policy-violating videos in the first half of 2018, the majority of them flagged by machine learning systems. YouTube also said it has added features that surface credible news outlets, especially during major events. The latter feature was announced in July as part of a $25 million investment.