YouTube’s recommendation algorithm pushes people to watch ever more extreme content
“Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content. … What we are witnessing is the computational exploitation of a natural human desire…. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.” — Zeynep Tufekci, “YouTube, the Great Radicalizer,” The New York Times. Regardless of topic, YouTube’s recommendation algorithm recommends videos that are more extreme versions of whatever you just watched. This makes sense from a business perspective: the algorithm optimizes for watch time, more extreme content tends to keep viewers watching, and longer watch time sells more ads.
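To make the dynamic concrete, here is a minimal toy sketch, not YouTube’s actual system: a recommender that greedily maximizes predicted watch time. Every name and number here is an illustrative assumption, in particular the premise (taken from the op-ed’s description) that engagement peaks on content slightly more extreme than the last video watched.

```python
import random

# Toy model: each video has an "extremity" score in [0, 1]. We assume,
# purely for illustration, that predicted watch time is highest for
# videos slightly more extreme than whatever the user just watched.
def predicted_watch_time(candidate_extremity: float, last_watched: float) -> float:
    step = candidate_extremity - last_watched
    # Engagement peaks at a small positive step toward "more extreme";
    # the 0.1 offset and the quadratic falloff are made-up parameters.
    return max(0.0, 1.0 - 50 * (step - 0.1) ** 2)

def recommend(candidates: list[float], last_watched: float) -> float:
    # Greedy engagement maximization: pick the candidate with the highest
    # predicted watch time. Nothing penalizes extremity itself.
    return max(candidates, key=lambda c: predicted_watch_time(c, last_watched))

random.seed(0)
watched = 0.2  # the user starts on fairly mild content
for session in range(10):
    candidates = [random.random() for _ in range(50)]
    watched = recommend(candidates, watched)
    print(f"session {session}: extremity of autoplayed video = {watched:.2f}")
# Extremity ratchets upward toward the maximum: each greedy step is small,
# but the objective never pushes it back down.
```

Each individual recommendation is only a slightly more extreme version of the last video, yet the steps compound into the “rabbit hole” the quote describes, because the objective rewards the next marginal increase in watch time and contains no term that values moderation.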