YouTube has released a new quarterly moderation report detailing how it removes harmful videos from the platform.
The Google-owned company said it took down more than 8 million videos between October and December for violating its community guidelines. The majority of the removed videos were spam or attempts to upload "adult content."
"This regular update will help show the progress we're making in removing violative content from our platform," the video-sharing site said in a blog post.
How Videos Are Filtered on YouTube
While YouTube employs human experts to review content, the bulk of the takedowns, 6.7 million videos or roughly 80%, were first flagged by machines. Of those, 76% were removed before a single view was registered. Most of the deleted content came from spam networks uploading adult content in violation of the site's terms of service.
Ordinary users account for 95% of the flags on the remaining videos that the automated systems haven't caught, though only about 5% of those flags lead to a removal. The other 5% of flags come from trusted flaggers, whose reports have a higher success rate of 14%.
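For a rough sense of how those figures fit together, here is a small illustrative Python sketch. The arithmetic uses only the numbers quoted above; the variable names, the rounding, and the assumption that the remaining removals all trace back to human flags are ours, not anything stated in YouTube's report:

```python
# Illustrative arithmetic based on the figures quoted in the article.
total_removed = 8_000_000      # videos taken down between October and December
machine_removed = 6_700_000    # ~80%, first flagged by automated systems
pre_view = int(machine_removed * 0.76)  # 76% removed before a single view

# Assumption: the rest of the takedowns came from human flags
# (ordinary users and trusted flaggers combined).
human_removed = total_removed - machine_removed

print(f"Machine-flagged removals:  {machine_removed:,}")
print(f"  removed before any view: {pre_view:,}")
print(f"Human-flagged removals:    {human_removed:,}")
```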
YouTube also said it will add more detail to the quarterly reports by the end of the year, such as information about comments, the speed of removals, and the policy reasons behind them.
YouTube also announced a "Reporting History" dashboard where users can check the status of videos they've flagged for review.