YouTube doubles down on efforts to combat abusive comments
The platform will put you in timeout if it detects violative comments.
What you need to know
- YouTube has updated its policy against spam and abusive comments.
- The platform has improved its comment removal warnings and introduced timeouts for abusive comments.
- Repeat offenders will be banned from commenting for up to 24 hours if YouTube detects abusive behavior.
YouTube's comment section is undeniably a hotspot for spam and abusive comments, and the platform's efforts to combat them have fallen short in recent years. The service now wants to bolster its systems to get better at fighting spam and deterring abusive behavior in the comment section.
According to a YouTube Help community post, the platform is introducing a new way to warn users if it detects potentially harmful comments, which will also be removed (via The Verge). The latest move comes in response to growing concerns expressed by big content creators about the prevalence of spam and abusive comments on YouTube.
On top of removing those comments, the service will ban repeat offenders from leaving comments on videos for up to 24 hours. "Our testing has shown that these warnings/timeouts reduce the likelihood of users leaving violative comments again," a YouTube representative wrote.
While this may not prevent the first instance of an offensive comment, it may discourage users from leaving multiple abusive comments. For the time being, these warnings will only be shown for comments in English, but YouTube plans to "bring it to more languages in the coming months."
The goal is to stop repeat offenders from ruining the user experience in the platform's comments area while also giving due notice to users whose comments have been removed.
To get better at detecting spam, YouTube intends to continuously improve its machine learning model responsible for spam detection. This will also keep the platform's spam detection algorithm adaptive to new methods spammers may employ in the future.
YouTube has also improved its spambot detection to kick bots out of live chats. "We know bots negatively impact the live streaming experience, as the live chat is a great way to engage and connect with other users and creators," according to YouTube.
The new changes build on updates the video-sharing service made earlier this year to combat comment spam and make impersonating creators more difficult. Those rules prohibited users from concealing their subscriber count and from using certain special characters in channel names. Additionally, creators were given comment moderation tools to filter out specific types of comments.
Jay Bonggolto always has a nose for news. He has been writing about consumer tech and apps for as long as he can remember, and he has used a variety of Android phones since falling in love with Jelly Bean. Send him a direct message via Twitter or LinkedIn.