Due to the coronavirus pandemic, YouTube's moderation has been handled mostly by machine algorithms, which operate with fundamentally higher error rates than the human workforce. By adjusting the algorithm's parameters, YouTube can control how aggressively moderation works, and Google, as the responsible company, has opted for stricter automatic selection.
Google's parent company, Alphabet, told its employees in March that because of the coronavirus epidemic, office work would be replaced by working from home through the end of this year. This arrangement is not ideal for employees with moderation duties, however, as sensitive user data, not least the videos themselves, could end up in an unregulated environment.
Incidentally, automatic moderation removed violating videos in 42 percent of cases before a single user watched them. The report found that in the second quarter, a third of the videos were removed for content detrimental to minors' personal development, 28.3 percent of the content was classified as spam or intentionally misleading, and just over ten percent depicted violence.
Most of the slightly more than two million removed videos were uploaded in the United States, followed by India and Brazil with 1.4 million and 980,000 videos, respectively.
Gellert is Technology Editor at Counting News Media and a contributor to other major tech publications. Her interests include testing new gadgets and reading.