To what extent does the YouTube recommendation algorithm push users into echo chambers, ideologically biased content, or rabbit holes? Despite growing popular concern, recent work suggests that the recommendation algorithm is not pushing users into these echo chambers. However, existing research relies heavily on the use of anonymous data collection that does not account for the personalized nature of the recommendation algorithm. We asked a sample of real users to install a browser extension that downloaded the list of videos they were recommended. We instructed these users to start on an assigned video and then click through 20 sets of recommendations, capturing what they were being shown in real time as they used the platform logged into their real accounts. Using a novel method to estimate the ideology of a YouTube video, we demonstrate that the YouTube recommendation algorithm does, in fact, push real users into mild ideological echo chambers where, by the end of the data collection task, liberals and conservatives received different distributions of recommendations from each other, though this difference is small. While we find evidence that this difference increases the longer the user followed the recommendation algorithm, we do not find evidence that many go down “rabbit holes” that lead them to ideologically extreme content. Finally, we find that YouTube pushes all users, regardless of ideology, towards moderately conservative content and an increasingly narrow ideological range the longer they follow YouTube’s recommendations.
That is from a new research paper by Jonathan Nagler and Joshua A. Tucker. And here are previous posts on YouTube, some of them covering radicalization charges in further detail. Via the excellent Kevin Lewis.
Abe emails me:
Tyler, I really enjoyed your recent podcast with Russ Roberts talking about favorite books and reading strategies. On the podcast, you mentioned YouTube a couple of times. I was hoping Russ would ask you about your YouTube habits, but he didn’t, so I thought I’d email to ask. What type of things do you watch on YouTube? Do you have any favorite channels or strategies for finding good content? I think it would be interesting to hear your thoughts on the subject.
My habits here are primitive, and not recommended for most of you sophisticates, but here goes:
1. I don’t subscribe to YouTube channels.
2. I watch some reasonable percentage, at least in part, of what people send me.
3. I watch prospective guests for CWT, to experience their conversational rhythms and mannerisms and “tics.”
4. I listen to music, especially when I am traveling, mostly classical music recitals or “world music,” to use a much-abused phrase. For many “world musics,” the visual element is all-important. I love Led Zeppelin, but I don’t click on them in this medium. Piano and guitar recitals I enjoy much more than orchestral music, at least on YouTube.
5. Sometimes I watch videos on science, or occasionally econometrics. It is often the best way to learn new concepts in these areas.
6. I watch Magnus Carlsen play BanterBlitz and engage in related chessboard antics in other forums, mostly while I am exercising on the Peloton. If you understand chess reasonably well, he is one of the greatest entertainers of our time, in addition to being the best chessplayer ever.
7. I don’t listen on speeds other than 1x. Doing so would disrupt the purposes mentioned above! If I am just trying to absorb information rapidly, typically I would prefer a book. The information from #5 usually is difficult enough for me to stick with 1x. If it is just someone blabbing, typically I care about the true human rhythms of speech, or I just won’t do it.
There is a new and very interesting paper on this topic by Annie Y. Chen, Brendan Nyhan, Jason Reifler, Ronald E. Robertson and Christo Wilson. Here is the abstract:
Do online platforms facilitate the consumption of potentially harmful content? Despite widespread concerns that YouTube’s algorithms send people down “rabbit holes” with recommendations to extremist videos, little systematic evidence exists to support this conjecture. Using paired behavioral and survey data provided by participants recruited from a representative sample (n=1,181), we show that exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment. These viewers typically subscribe to these channels (causing YouTube to recommend their videos more often) and often follow external links to them. Contrary to the “rabbit holes” narrative, non-subscribers are rarely recommended videos from alternative and extremist channels and seldom follow such recommendations when offered.
I am traveling and have not had the chance to read this paper, but I do know the authors are very able. I am not saying this is the final word, but I would make the following observation: there are many claims made about social media, and many of them might be true, but for the most part they are still largely unfounded.
The channel behind this operation is called AroundMeBD, and its success has created a whole new economy in Shimulia, which has since been dubbed the YouTube village of Bangladesh.
The YouTube village is a prominent example of a niche but is also part of a growing online trend across South Asia: As the internet reaches villages, rural societies are finding ways to showcase and monetize their unique food cultures to audiences across the world, using platforms like YouTube and Facebook. In India, Village Cooking Channel, which posts videos of large-scale traditional cooking, has over 15 million subscribers. In Pakistan, Village Food Secrets has 3.5 million subscribers. Villagers who previously had little presence in media are now using these platforms to take ownership of the way their culture is portrayed — and building businesses that support dozens, and occasionally hundreds, of individuals.
Here is the full story, via Zach Valenta. The article is interesting throughout, and yes YouTube remains underrated.
Here is the link, it is about one hour long, with questions interspersed throughout, the title is “The future social and political implications of COVID-19.” Self-recommending!
Outsiders and critics often think of YouTube and computer gaming as entertaining and quite superficial modes of cultural consumption. I have increasingly moved away from that point of view, and to pursue the argument I will note that lately my favorite YouTube video is Magnus Carlsen doing 100 chess endgames in 30 minutes. That is not recommended for most of you, but I believe that is part of the point. I now think of YouTube as a communications medium with (often, not always) high upfront “investment in context” costs. So if a lot of videos seem stupid to you, well sometimes they are but other times you don’t have enough context to understand them, or for that matter to condemn them for the right reasons. This “high upfront costs” model is consistent with the semi-addictive behavior exhibited by many loyal YouTube users. Once you start going down a rabbit hole, it can be hard to stop, and the “YouTube is superficial” models don’t really predict that kind of user behavior, rather they predict mere channel-surfing.
Did you know that Yonas, my Ethiopian contact in Lalibela, and recipient of royalties from my book Stubborn Attachments, loves YouTube videos on early Armenian church history? He seems to know all about that topic. A lot of those same videos would not make much sense to me. I could follow them, but they wouldn’t communicate much meaning, whereas the Ethiopian and Armenian Christian churches have a fair amount in common, including in their early histories.
Has the popularity of PewDiePie — 103 million subscribers — ever mystified you? I have in fact come to understand the material is brilliant, though not in a way I care about or wish to come to grasp in any kind of detailed way. For me the entry costs are just too high relative to the kind of payoff I would achieve. You really have to watch a lot of videos to get anywhere with grasping the contexts of his various jokes and remarks.
This also helps explain why there is no simple way to find “the best videos on YouTube.”
Perhaps computer games have some of the same properties. They have great meaning to those who know their ins and outs, but leave many others quite cold. Sometimes I hear people say things like “Twenty or thirty years from now, computer games will develop into great works of art.” I doubt that. To whatever extent computer games are/will be aesthetically notable, those properties are probably already in place, just with fairly high upfront context costs and thus inaccessible to someone such as me.
The high upfront costs, of course, mean a high degree of market segmentation and thus perhaps relatively high profits for suppliers, at least in the aggregate if not in every case.
Could it be that these top cultural forms of today have higher upfront costs than say appreciating 18th century Rococo painting?
In any case, trying to understand the cultural codes of 2020 is a truly difficult enterprise.
For this material, I wish to thank a related conversation with S.
The role that YouTube and its behind-the-scenes recommendation algorithm plays in encouraging online radicalization has been suggested by both journalists and academics alike. This study directly quantifies these claims by examining the role that YouTube’s algorithm plays in suggesting radicalized content. After categorizing nearly 800 political channels, we were able to differentiate between political schemas in order to analyze the algorithm’s traffic flows out of and between each group. After conducting a detailed analysis of recommendations received by each channel type, we refute the popular radicalization claims. To the contrary, these data suggest that YouTube’s recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels, with a slant towards left-leaning or politically neutral channels. Our study thus suggests that YouTube’s recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets.
That is from a new paper by Mark Ledwich and Anna Zaitsev. That hardly settles the matter, but you may recall the last serious papers on this topic also indicated that YouTube does not radicalize. So if you are still believing that YouTube radicalizes, you will need to come up with additional facts for your point of view.
From a new and very important paper by Kevin Munger and Joseph Phillips from Penn State:
The most extreme branches of the AIN (the Alt-Right and Alt-Lite) have been in decline since mid-2017.
However, the Alt-Right’s remaining audience is more engaged than any other audience, in terms of likes and comments per view on their videos.
The bulk of the growth in terms of both video production and viewership over the past two years has come from the entry of mainstream conservatives into the YouTube marketplace.
…despite considerable energy, Ribeiro et al. (2019) fail to demonstrate that the algorithm has a noteworthy effect on the audience for Alt-Right content. A random walk algorithm beginning at an Alt-Lite video and taking 5 steps randomly selecting one of the ten recommended videos will only be recommended a video from the Alt-Right approximately one out of every 1,700 trips. For a random walker beginning at a “control” video from the mainstream media, the probability is so small that it is difficult to see on the graph, but it is certainly no more common than one out of every 10,000 trips.
The authors suggest (p. 24) that, if anything, the data suggest deradicalization as a more plausible baseline hypothesis.
Of course this is not the final word, but in the meantime so much of what you are reading about YouTube would appear to be wrong or at least off-base.
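The random-walk exercise described in the excerpt is easy to sketch in code. The graph below is a toy stand-in, with made-up video IDs and a uniform fan-out of ten recommendations per video; it illustrates the sampling procedure, not the paper’s actual crawl data:

```python
import random

# Toy recommendation graph (illustrative only, not the paper's data):
# each video ID maps to the list of ten videos recommended alongside it.
def build_toy_graph(n_videos=100, fanout=10, seed=0):
    rng = random.Random(seed)
    return {v: [rng.randrange(n_videos) for _ in range(fanout)]
            for v in range(n_videos)}

def random_walk(graph, start, steps=5, rng=None):
    """Take `steps` hops, each time picking one of the
    recommended videos uniformly at random."""
    rng = rng or random.Random()
    current = start
    for _ in range(steps):
        current = rng.choice(graph[current])
    return current

graph = build_toy_graph()
rng = random.Random(42)
# Estimate how often a 5-step walk from video 0 ends at a chosen target;
# the paper's "one in 1,700 trips" figure is this kind of frequency.
hits = sum(random_walk(graph, 0, rng=rng) == 7 for _ in range(10_000))
print(hits / 10_000)
```

Repeating many such walks and counting how often the endpoint falls in a labeled category (Alt-Right, Alt-Lite, mainstream) yields the per-trip probabilities the excerpt reports.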
Birmingham was brought to a standstill on Saturday, with motorists abandoning cars and the city gridlocked for hours after thousands of teenagers flooded the city centre to see a 19-year-old YouTuber make a 30-second public appearance at a cosmetics store.
Many shoppers were forced to cancel their trips, while parts of the bus network ground to a halt and road traffic was at a standstill, as fans hoped to catch a glimpse of James Charles, who is known for his online makeup guides.
His channel is Market Power, and he promises new economics videos every Tuesday. Here is the associated Twitter account for the channel. Here is his video “How much does vibranium cost in the Marvel movies?”:
Here is another video “How much is an Oscar nomination worth?”
And I am pleased to announce that Craig is a newly minted Emergent Ventures fellow. He also is an economic historian, and has lived for two years in Haiti, both big pluses in my view.
Angle does not have a formal residence, but he does have a job:
The park offers free wireless access, and with his laptop, Angle watches YouTube videos in exchange for bitcoins, the world’s most popular digital currency.
For every video he watches, Angle gets 0.0004 bitcoins, or about 5 cents, thanks to a service, called BitcoinGet, that shamelessly drives artificial traffic to certain online clips. He can watch up to 12 videos a day, which gets him to about 60 cents. And he can beef up this daily take with Bitcoin Tapper, a mobile app that doles out about 0.000133 bitcoins a day — a couple of pennies — if he just taps on a digital icon over and over again. Like the YouTube service, this app isn’t exactly the height of internet sophistication — it seeks to capture your attention so it can show you ads — but for Angle, it’s a good way to keep himself fed.
Angle, 42, is on food stamps, but that never quite gets him through the month. The internet provides the extra money he needs to buy a meal each and every day. Since setting up a bitcoin wallet about three or four months ago, he has earned somewhere between four and five bitcoins — about $500 to $630 today — through YouTube videos, Bitcoin Tapper, and the occasional donation. And when he does odd jobs for people around Pensacola — here in the physical world — he still gets paid in bitcoin, just because it’s easier and safer. He doesn’t have to worry as much about getting robbed.
The full story is here, excellent photos, and for the pointer I thank Mike Komaransky.
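The per-video and per-day figures in the excerpt can be checked in a few lines. The dollar price of bitcoin below is an assumption, backed out from the excerpt’s claim that 0.0004 BTC was worth about 5 cents at the time:

```python
# Figures from the excerpt; usd_per_btc is an implied assumption,
# not a quoted market price.
btc_per_video = 0.0004
usd_per_btc = 125          # implied by 0.0004 BTC ~ 5 cents
videos_per_day = 12

cents_per_video = btc_per_video * usd_per_btc * 100
daily_cents = cents_per_video * videos_per_day
print(cents_per_video, daily_cents)  # 5.0 cents per video, 60.0 per day
```

At that implied price, four to five bitcoins would indeed be worth roughly $500 to $630, consistent with the excerpt.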
I am teaching IO next fall and would like to use some lectures on YouTube. The class is at the Ph.D. level, but a good intuitive talk would be fine; I'm not expecting all of these talks to be directed at the Ph.D. level per se. Would you all have recommendations?
By the way, here is my Google talk on prizes as incentives.
I love being reminded of the history of economic thought:
It seems safe to assume that YouTube’s traffic will continue to grow, with no clear ceiling in sight. Since the majority of Google’s costs for the service are pure variable costs of bandwidth and storage, and since they’ve already reached the point at which no greater economies of scale remain, the costs of the business will continue to grow on a linear basis. Unfortunately, far more user-generated content than professional content makes its way onto the site, which means that while costs grow linearly, non-monetizable content is growing geometrically as compared against the monetizable content that YouTube really wants and needs to survive. This means less and less of YouTube’s library will be revenue-contributing, while the costs of delivering that library will continue to grow.
The article is interesting throughout, and the hat tip goes to Andrew Sullivan.
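The quoted argument — library costs scaling with total content while the revenue-contributing share shrinks — can be run as a toy model. All growth rates and starting values below are illustrative assumptions, not YouTube figures:

```python
# Toy model of the quoted claim (all parameters are illustrative):
# delivery cost scales with total library size, while user-generated
# (non-monetizable) content grows geometrically relative to
# monetizable content.
monetizable = 100.0       # units of revenue-contributing content
non_monetizable = 100.0   # units of non-monetizable content
cost_per_unit = 1.0

for year in range(5):
    total = monetizable + non_monetizable
    cost = cost_per_unit * total            # scales with library size
    share = monetizable / total             # revenue-contributing share
    print(f"year {year}: cost={cost:.0f}, monetizable share={share:.1%}")
    monetizable *= 1.1       # modest growth in professional content
    non_monetizable *= 2.0   # geometric growth in user-generated content
```

Even this crude sketch shows the claimed squeeze: costs rise every year while the monetizable share of the library falls, which is why the quote predicted the business could not work — a prediction history did not bear out.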