Ally Law, a 20-year-old YouTuber, breaks into the National Theatre
YouTube has been in the news a lot recently. Whether it’s Logan Paul filming a suicide victim in a Japanese forest or Ally Law breaking into the Celebrity Big Brother house, it’s undeniable that YouTube - along with many self-regulating platforms - has a significant problem. YouTube, and platforms like it, make more money if viewers are engaged for longer. The more eyeballs they control, the more ads they can sell. For top creators - typically young, creative and hungry for fame - this is important. View count and engagement are the critical metrics that can make the difference between a video with revenues of a few hundred dollars or many thousands: universal indifference or global fame.
Incentives & The rule of the playground
There is one deep fear among these young YouTubers: having a video demonetized. If YouTube believes, for any reason, that the content you’re producing is not advertiser friendly, then it will demonetize your video and prevent you from earning money from it. Reasons why a video will be demonetized include:
Discussion of controversial issues and sensitive events, harmful or dangerous acts … [or] pranks involving sexual harassment or humiliation
Read the full list here
This content is allowed on YouTube; it just can’t be monetized by creators. The decisions about what is appropriate and what is not are taken initially by pieces of software and, if escalated, by YouTube’s (small) team of moderators. These moderators are judge and jury, though unlike most judges they are incentivised to keep well-performing content monetized - it’s YouTube’s revenue after all (YouTube splits the ad revenue 40:60 with creators).
From a regulation standpoint the incentives are clear: create content that garners the greatest number of views, for the longest possible time, whilst staying within loose rules that primarily prohibit sexual content and swearing. Beyond that, anything goes.
Think about the sort of events in your life that encourage people to linger and observe: traffic slowing to watch an accident on the opposite side of the road, a fight breaking out on the adjacent platform, a child at school jumping from a roof. These forms of entertainment are some of the most unsophisticated but effective. In YouTube’s world they are even better: content like this crosses cultural and language boundaries.
The YouTuber Jake Paul's most popular videos
If you needed any indication as to what content performs best, consider Jake Paul, Logan’s 21-year-old brother and amasser of 13 million subscribers. His most popular videos include “I CHEATED ON MY WIFE PRANK (she freaked out)” with 21 million views, and “RANDOM TATTOO SPIN WHEEL GAME (You Spin It, You Get It…)” with 18 million views - described by Jake Paul as “ONE OF THE MOST SAVAGE THINGS WE'VE EVER DONE”. The content that thrives on YouTube is the same content that thrives in the playground: bullying and sensationalism. Unlike the playground, though, in YouTube’s world there aren’t any teachers to regulate what goes on - Jake Paul’s bullying behaviour can continue long after school (and make him a millionaire at the same time).
And Jake Paul really is a bully. The Martinez Twins, who used to live in the "Team 10" house with Jake Paul, managed to leave and posted a video about the abuse they suffered. Paul left them afraid to sleep by insisting that they leave their door open at night and "pranking" them awake with "funny things" like a Taser. This wasn't new for Jake Paul:
"my literal 5th grade bully now, no joke. my mom back in 5th grade: 'he's just a bully, he won't succeed in life'" — brun brun (@Bruno_Bush), June 3, 2017 pic.twitter.com/uaMdpQ1fPd
A world without teachers
In a sense this problem has existed in the media since its inception. The Daily Mail’s “sidebar of shame” is but one example of a publication driven by sensation and views. The one difference is that mainstream media does have at least some regulation. YouTube’s chief business officer said today that YouTube is "different" from traditional media outlets as it doesn’t have the same "editorial hand”. This is absolutely not the case - a moderator deciding whether to allow a piece of content based on guidelines follows exactly the same decision-making process as an editor at a traditional newspaper; the only difference is that YouTube refuses to accept this heavy and important responsibility. Why? Because admitting that it is a publisher carries a huge cost. 300 hours of content are uploaded to YouTube every minute - 432,000 hours every day. If you assume moderators work an eight-hour day, YouTube would need 54,000 moderators just to watch the content, let alone make any decisions. To put that in perspective, Alphabet - YouTube’s parent company - currently employs around 72,000 people globally.
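The scale argument above is simple arithmetic; a quick sketch (using the figures quoted in the text, and assuming an eight-hour working day per moderator) confirms it:

```python
# Back-of-envelope check of the moderation workload figures quoted above.

UPLOAD_HOURS_PER_MINUTE = 300   # hours of content uploaded to YouTube per minute
MODERATOR_SHIFT_HOURS = 8       # assumed working day per moderator

# Hours of video uploaded each day
hours_per_day = UPLOAD_HOURS_PER_MINUTE * 60 * 24
print(hours_per_day)  # 432000 hours of new content per day

# Moderators needed just to watch everything once, in real time
moderators_needed = hours_per_day // MODERATOR_SHIFT_HOURS
print(moderators_needed)  # 54000 moderators
```

Even this understates the problem: watching content once in real time leaves no time for judgement, escalation, or review.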
This raises two fundamental questions:
1. Can AI seriously replace the moderation of human editors?
Platforms like YouTube argue that they don’t need a large army of moderators because AI can do the job for them, and that any mistakes made so far are because the AI hasn’t been trained properly. In my view this fails to take into account the subjective and nuanced judgement of an editor - the decisions made often aren’t clear cut and can’t be based on previous data. Context is critical: racism or abuse might be subliminal, something a human can pick up on but a machine cannot. If this is the case, is it acceptable for YouTube to leave 99% of content unwatched, publishing potentially incendiary content in the dark and waiting for enough viewers to flag it?
2. Should YouTube be the arbiter of what is acceptable content?
YouTube has a clear motive: increase viewership without going so far that advertisers stop advertising. However, publishers have responsibilities as members of society. The ideas, thoughts and content published by YouTube permeate the world. Free speech is guaranteed, of course, but if freedom of speech is YouTube’s defence then it must also bear the true responsibility of free speech. As Areeq Chowdhury, the Chief Executive of WebRootsUK, said recently:
"Freedom of speech doesn't mean you can simply say whatever the f*ck you want about something, without there being consequences."
Promoting content like Ally Law’s, which involves breaking the law again and again, is YouTube's responsibility too. It shouldn’t be possible for a platform that publishes videos of law-breaking to take no responsibility whatsoever. Tacit support is still support, and simply closing your eyes doesn’t remove your responsibility.
Today Transport for London (TfL), the body responsible for regulating transportation in London, announced something surprising: it will not be renewing Uber’s licence to operate in London, and Uber will have to stop providing its service in the coming months. There was an understandable uproar from the 40,000 Uber drivers currently making a living driving in the capital, as well as from the millions of passengers who use the service every day. Uber will appeal the decision, but TfL may decide to uphold the ban.
The Regulatory Decision
The reasons TfL have given to suspend Uber centre upon one major judgement. In the report, TfL states that:
Uber's approach and conduct demonstrate a lack of corporate responsibility in relation to a number of issues which have potential public safety and security implications
It is certainly true that Uber hasn’t been a perfect corporate citizen in recent years. Its corporate culture has been shown to be hideously sexist, it has bullied its staff and it has systematically underpaid its drivers. The judgement made by TfL doesn’t directly reference this behaviour, though, and despite Uber’s conduct being undeniably abhorrent, it doesn’t directly relate to the public safety and security implications TfL raised.
Surely if public safety issues were brought to TfL’s attention - as they have been in the past - it would have been appropriate for them to revoke Uber’s licence before it came up for renewal after five years. When it comes to public safety, how is it possible that the issues TfL has cited have only become apparent at the exact moment that the licence needs to be reviewed? In this way the decision seems like a general moral judgement rather than one related specifically to safety.
The pace of change in today’s world has left regulators behind. I’ve identified this in many areas - from cryptocurrencies to data and education. Sadly, the regulatory approach to technology is too often to do nothing for months or years and then, rather than engaging with the inevitable social and political issues the technology raises, to ban it completely.
On the same day that TfL made its regulatory ruling, Jamie Dimon, the Chief Executive of JPMorgan, made a comment about cryptocurrencies:
Right now these crypto things are kind of a novelty. People think they're kind of neat. But the bigger they get, the more governments are going to close them down
See full video
This is exactly the same approach that TfL has taken with Uber - permissive at the beginning so that the technology is able to proliferate, resulting in ambiguous employment relationships and unfair competition, and then - unable to control it - aggressively punitive at the end, shutting the entire service down.
This regulatory lurch - moving from absolutely zero regulation to a ban - is hugely damaging to innovation, and it characterises entire technologies and activities as illegal without identifying their benefits or actively engaging in the debate. Simply banning Uber will not solve the problem of insecure employment for workers, or the fact that many people capable of acting as taxi drivers were previously prevented from doing so by extensive regulation. Firms like Deliveroo and Amazon operate on similar principles; will the government shut down their operations as well? Where is the line drawn?
Engagement is the answer
These decisions are damaging. Not just for Uber, but for the 40,000 people who drive for the company and the 3.5 million people who use the service. By engaging early with disruptive technologies, regulators can actually help to shape services, forcing them to be socially responsible as well as profitable. By ignoring the issues until the last moment, no one wins: consumers get a worse service, drivers lose their jobs and the important social and political issues raised by peer-to-peer technology are - conveniently - swept under the rug.