There is a new and very interesting paper on this topic by Annie Y. Chen, Brendan Nyhan, Jason Reifler, Ronald E. Robertson and Christo Wilson. Here is the abstract:
Do online platforms facilitate the consumption of potentially harmful content? Despite widespread concerns that YouTube’s algorithms send people down “rabbit holes” with recommendations to extremist videos, little systematic evidence exists to support this conjecture. Using paired behavioral and survey data provided by participants recruited from a representative sample (n=1,181), we show that exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment. These viewers typically subscribe to these channels (causing YouTube to recommend their videos more often) and often follow external links to them. Contrary to the “rabbit holes” narrative, non-subscribers are rarely recommended videos from alternative and extremist channels and seldom follow such recommendations when offered.
I am traveling and have not had the chance to read this paper, but I do know the authors are very able. I am not saying this is the final word, but I would make the following observation: many claims are made about social media, and some of them may well be true, but for the most part they remain unfounded.