guglice.blogg.se

Youtube repeat

Using recommendation algorithms, YouTube's AI is designed to increase the time that people spend online. Those algorithms track and measure the previous viewing habits of the user, and of users like them, to find and recommend other videos that they will engage with.

In the case of the pedophile scandal, YouTube's AI was actively recommending suggestive videos of children to users who were most likely to engage with those videos. The stronger the AI becomes, that is, the more data it has, the more efficient it will become at recommending specific user-targeted content.

Here's where it gets dangerous: as the AI improves, it will be able to more precisely predict who is interested in this content; thus, it's also less likely to recommend such content to those who aren't. At that stage, problems with the algorithm become exponentially harder to notice, as content is unlikely to be flagged or reported.

In the case of the pedophilia recommendation chain, YouTube should be grateful to the user who found and exposed it. Without him, the cycle could have continued for years. But this incident is just a single example of a bigger issue.

Earlier this year, researchers at Google's DeepMind examined the impact of recommender systems, such as those used by YouTube and other platforms. They concluded that "feedback loops in recommendation systems can give rise to 'echo chambers' and 'filter bubbles,' which can narrow a user's content exposure and ultimately shift their worldview."

The model didn't take into account how the recommendation system influences the kind of content that's created. In the real world, AI, content creators, and users heavily influence one another. Because AI aims to maximize engagement, hyper-engaged users are seen as "models to be reproduced," and AI algorithms will then favor the content of such users.

More generally, it's important to examine the incentive structure underpinning the recommendation engine. The companies employing recommendation algorithms want users to engage with their platforms as much and as often as possible because it is in their business interests. It is sometimes in the interest of the user to stay on a platform as long as possible, when listening to music, for instance, but not always. We know that misinformation, rumors, and salacious or divisive content drives significant engagement.

Even if a user notices the deceptive nature of the content and flags it, that often happens only after they've engaged with it. By then, it's too late; they have given a positive signal to the algorithm. Now that this content has been favored in some way, it gets boosted, which causes creators to upload more of it. Driven by AI algorithms incentivized to reinforce traits that are positive for engagement, more of that content filters into the recommendation systems.
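A toy simulation makes this feedback loop concrete. Everything below is invented for illustration — the topic names, engagement probabilities, and update rule are assumptions, not YouTube's actual system. The sketch shows how a recommender that surfaces topics in proportion to past engagement, and boosts whatever gets engaged with, drifts toward showing almost nothing but its most engagement-prone topic:

```python
import random

random.seed(0)

# Toy illustration of a recommender feedback loop. All numbers and topics
# are invented for this sketch; this is not YouTube's actual algorithm.
topics = ["news", "music", "gaming", "divisive", "diy"]
scores = {t: 1.0 for t in topics}  # recommender's engagement estimate per topic

# Hypothetical user: chance of engaging when shown each topic.
# Divisive content reliably earns engagement; the rest is hit-or-miss.
engage_prob = {"news": 0.3, "music": 0.25, "gaming": 0.2,
               "divisive": 0.9, "diy": 0.15}

def recommend():
    # Show a topic with probability proportional to its current score.
    return random.choices(topics, weights=[scores[t] for t in topics])[0]

for _ in range(3000):
    topic = recommend()
    if random.random() < engage_prob[topic]:
        scores[topic] *= 1.10   # engagement is a positive signal: boost
    else:
        scores[topic] *= 0.95   # a skip slightly demotes the topic

total = sum(scores.values())
shares = {t: round(scores[t] / total, 3) for t in topics}
print(shares)  # the divisive topic ends up with nearly all of the weight
```

Even with a modest 10 percent boost per engagement, the loop is self-reinforcing: a topic that earns engagement is shown more often, which earns it more engagement, which boosts it further. Once the recommender converges on that topic for a narrow audience, the same dynamic also explains why flags arrive too late to matter.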