A world without teachers: YouTube, incentives and the problem with self-regulating platforms

in technology, regulation, media

Ally Law, a 20-year-old YouTuber, breaks into the National Theatre (Link)

YouTube has been in the news a lot recently. Whether it’s Logan Paul filming a suicide victim in a Japanese forest or Ally Law breaking into the Celebrity Big Brother house, it’s undeniable that YouTube - along with many other self-regulating platforms - has a significant problem. YouTube, and platforms like it, make more money the longer viewers stay engaged: the more eyeballs they control, the more ads they can sell. For top creators - typically young, creative and hungry for fame - this matters enormously. View count and engagement are the critical metrics that can make the difference between a video earning a few hundred dollars and one earning many thousands: between universal indifference and global fame.

Incentives & The rule of the playground

There is one deep fear among these young YouTubers: having a video demonetized. If YouTube believes, for any reason, that the content you’re producing is not advertiser friendly, it will demonetize your video and prevent you from earning money from it. Reasons why a video will be demonetized include:

Discussion of controversial issues and sensitive events, harmful or dangerous acts … [or] pranks involving sexual harassment or humiliation
Read the full list here

This content is allowed on YouTube; it just can’t be monetized by creators. The decisions about what is appropriate and what is not are taken initially by pieces of software and, if escalated, by YouTube’s (small) team of moderators. These moderators are judge and jury; though unlike most judges, the moderators are incentivised to keep well-performing content monetized - it’s YouTube’s revenue after all (YouTube splits the ad revenue 40:60 with creators).

From a regulation standpoint the incentives are clear: create content that garners the greatest number of views, for the longest possible time, whilst staying within loose regulations that amount primarily to a prohibition on sexual content and swearing. Beyond that, anything goes.

Think about the sort of events in your life that encourage people to linger and observe: cars slowing to watch an accident on the opposite side of the road, a fight breaking out on the adjacent platform, a child at school jumping from a roof. These forms of entertainment are among the most unsophisticated but effective. In YouTube’s world such content is even better: it crosses boundaries of culture and language.

The YouTuber Jake Paul's most popular videos

If you needed any indication as to what content performs best, consider Jake Paul, Logan’s 21-year-old brother and amasser of 13 million subscribers. His most popular videos include “I CHEATED ON MY WIFE PRANK (she freaked out)”, with 21 million views, and “RANDOM TATTOO SPIN WHEEL GAME (You Spin It, You Get It…)”, with 18 million views - described by Jake Paul as “ONE OF THE MOST SAVAGE THINGS WE'VE EVER DONE”. The content that thrives on YouTube is the same content that thrives in the playground: bullying and sensationalism. Unlike the playground, though, in YouTube’s world there aren’t any teachers to regulate what goes on - Jake Paul’s bullying behaviour can continue long after school (and make him a millionaire at the same time).

And Jake Paul really is a bully. The Martinez Twins, who used to live in the "Team 10" house with Jake Paul, eventually left and posted a video about the abuse they suffered. Paul left them afraid to sleep by insisting that they leave their door open at night and "pranking" them awake with "funny things" like a taser. This wasn't new behaviour for Jake Paul.

A world without teachers

In a sense this problem has existed in the media since its inception. The Daily Mail’s “sidebar of shame” is but one example of a publication driven by sensation and views. The difference is that mainstream media does have at least some regulation. YouTube’s chief business officer recently said that YouTube was "different" from traditional media outlets as it doesn’t have the same "editorial hand”. This is absolutely not the case - a moderator deciding to allow a piece of content based on guidelines follows exactly the same decision-making process as an editor at a traditional newspaper; the only difference is that YouTube refuses to accept this heavy and important responsibility. Why? Because admitting that it is a publisher carries a huge cost. 300 hours of content are uploaded to YouTube every minute - 432,000 hours every day. If you assume moderators work an eight-hour day, YouTube would need 54,000 moderators just to watch the content, let alone make any decisions. To put that in perspective, Alphabet - YouTube’s parent company - currently employs around 72,000 people globally.
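The back-of-the-envelope maths here is easy to verify. A minimal sketch, assuming (as above) 300 hours of video uploaded per minute and an eight-hour working day per moderator:

```python
# Moderation arithmetic from the paragraph above.
# Assumptions: 300 hours uploaded per minute; a moderator
# can watch 8 hours of content per working day.
UPLOAD_HOURS_PER_MINUTE = 300
MINUTES_PER_DAY = 60 * 24
MODERATOR_HOURS_PER_DAY = 8

# Total hours of new content arriving each day.
hours_uploaded_per_day = UPLOAD_HOURS_PER_MINUTE * MINUTES_PER_DAY

# Headcount needed just to watch it all, with no time left for decisions.
moderators_needed = hours_uploaded_per_day // MODERATOR_HOURS_PER_DAY

print(hours_uploaded_per_day)  # 432000
print(moderators_needed)       # 54000
```

Note that 54,000 is a floor: it assumes every moderator watches content for their entire shift and never reviews anything twice.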

This raises two fundamental questions:

1. Can AI seriously replace the moderation of human editors?

Platforms like YouTube argue that they don’t need a large army of moderators because AI can do the job for them; any mistakes made so far, they say, are simply because the AI hasn’t been trained properly. In my view this fails to account for the subjective, nuanced judgement of an editor - the decisions often aren’t clear cut and can’t be derived from previous data. Context is critical: racism or abuse might be subliminal, something a human can pick up on but a machine cannot. If that is the case, is it acceptable for YouTube to leave 99% of content unwatched, publishing potentially incendiary content in the dark and waiting for enough viewers to flag it?

2. Should YouTube be the arbiter of what is acceptable content?

YouTube has a clear motive: increase viewership whilst not going so far that advertisers stop advertising. However, publishers have responsibilities as members of society. The ideas, thoughts and content created on YouTube permeate the world. Free speech is guaranteed, of course, but if freedom of speech is YouTube’s defence then it must also bear the true responsibility of free speech. As Areeq Chowdhury, Chief Executive of WebRootsUK, said recently:

"Freedom of speech doesn't mean you can simply say whatever the f*ck you want about something, without there being consequences."

Promoting content like Ally Law’s, which involves breaking the law again and again, is YouTube's responsibility too. It shouldn’t be possible for a platform that publishes videos of law-breaking to take no responsibility whatsoever. Tacit support is still support, and simply closing your eyes doesn’t remove your responsibility.