There's a lot to watch on YouTube. With everything from gameplay commentary to makeup tutorials to comedy sketches, it can be difficult to find the stuff you actually want to see. YouTube tries to help with its recommended videos, but if you've ever watched one cooking video only to have your YouTube app filled with nothing but cooking clips, you know it doesn't always work very well.
On January 25, YouTube announced a few changes to its recommended videos system aimed at making it genuinely helpful rather than broken and annoying. One of the first changes:
We now pull in recommendations from a wider set of topics—on any given day, more than 200 million videos are recommended on the homepage alone. In fact, in the last year alone, we've made hundreds of changes to improve the quality of recommendations for users on YouTube.
Additionally, YouTube notes that:
We'll continue that work this year, including taking a closer look at how we can reduce the spread of content that comes close to—but doesn't quite cross the line of—violating our Community Guidelines. To that end, we'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.
YouTube also used the announcement to reiterate that it relies on a mixture of machine learning and human reviewers to improve recommendations. These changes will roll out gradually, and for now, YouTube is testing them only in the United States. As the systems become more accurate, the changes will expand to other countries.