YouTube is still suggesting videos containing violent content and misinformation, including COVID-19 misinformation, according to a major new study published this month. Notably, this isn't just an issue with ...
For years YouTube’s video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and/or conspiracy ...
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
YouTube users have reported potentially objectionable content in thousands of videos recommended to them by the platform’s algorithm, according to the nonprofit Mozilla Foundation. The findings, ...
Researchers found that clicking on YouTube’s filters didn’t stop it from recommending disturbing videos of war footage, scary movies, or Tucker Carlson’s face. My YouTube ...
YouTube's recommendation algorithm repeatedly shows false information and inappropriate videos to its users, according to a study by Mozilla.