YouTube Warns Users about Potentially Offensive Comments
Have you ever thought twice before posting a comment on YouTube? If not, the site may now warn you that your words could come across as offensive. This is one of several updates YouTube is rolling out to make conversations on its platform safer and less toxic. The video hosting service expects these measures to improve the overall atmosphere.
Any sufficiently popular video becomes a magnet for commenters who don't always care about other people's feelings. Accusations and slurs, sexism and racism, and other offensive messages can be filtered manually, but that takes too much time when comments arrive frequently. The new feature may reduce the number of such comments.
The warning not only tells users that their comment may be deleted; it also links to the guidelines that shape YouTube's commenting policy. How effective the warning proves remains to be seen, and that will take some time.
There are also new tools for channel administrators and moderators, who can now create custom lists of blocked words. Comments containing these words are held in YouTube Studio for review; moderators can later either delete them or approve them for publication. The AI detects close matches to the listed words and phrases as well as exact ones.
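YouTube has not published how its close-match detection works, but the idea of catching misspelled variants of blocked words can be illustrated with a minimal sketch. The example below is purely hypothetical: it uses simple string similarity from Python's standard library, and the function names, threshold, and tokenization are assumptions, not YouTube's actual implementation.

```python
# Hypothetical sketch of close-match comment filtering.
# YouTube's real matching logic is not public; this only illustrates the idea.
from difflib import SequenceMatcher

def is_close_match(token: str, blocked: str, threshold: float = 0.8) -> bool:
    """Return True if `token` closely resembles `blocked` (e.g. a misspelling)."""
    return SequenceMatcher(None, token.lower(), blocked.lower()).ratio() >= threshold

def should_hold_for_review(comment: str, blocked_words: list[str]) -> bool:
    """Hold a comment if any word exactly or closely matches a blocked word."""
    tokens = comment.lower().split()
    return any(
        token == blocked.lower() or is_close_match(token, blocked)
        for token in tokens
        for blocked in blocked_words
    )

blocked = ["spammy"]
print(should_hold_for_review("This is spamy content", blocked))  # True: "spamy" is a close match
print(should_hold_for_review("Nice video!", blocked))            # False
```

A similarity threshold like this is the simplest way to catch look-alike words; a production system would more likely rely on a trained classifier and normalization of leetspeak and punctuation tricks.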
Among other changes, YouTube is rolling out a more detailed creator payment system: creators can now see their YouTube payments separately from their AdSense income. Those who earned primarily through AdSense won't notice much difference, but everyone else gets better insight into their revenue.
The update, which arrived in early March, mirrors a feature long available in the YouTube apps for iOS and Android; the warnings now appear in the web version as well. So, do you hope it will make YouTube comments friendlier and less toxic? Share your thoughts and expectations in the comments. And even though we don't have such a warning yet, let's respect each other.