Regulating the VSPs – YouTube and TikTok Under Scrutiny?
The Government has proposed that Ofcom take responsibility for the regulation of Video Sharing Platforms (VSPs) based in the UK - such as TikTok - and Ofcom has issued a call for evidence, in effect a public consultation, on the proposal.
This is intended to bring the UK into line with a European directive, the Audiovisual Media Services Directive (AVMSD). The Government proposes that Ofcom take this on for now, pending the passage of its own Online Harms Bill - which is in legislative limbo at the moment. Ofcom is likely to be the regulator under the new legislation - if it ever happens.
The plans for the regulatory regime don't envisage Ofcom regulating YouTube: the AVMSD places responsibility for regulation in the European territory where a platform is based, so the biggest VSPs - YouTube among them - will fall under Irish regulation.
The Children's Media Foundation has for some time called for online content delivery platforms to accept their responsibilities to the large numbers of children who use their services regularly - children whom, for the most part, the platform operators ignore. The platforms often claim that they only cater for users aged 13 and over, which, as all parents know, is simply not the case. Our position is that it should not be beyond the wit (and wallets) of these massively powerful and influential sources of content to reconfigure their platforms so that the default position is SAFETY ON, and to reposition themselves as family-friendly outlets, with adult-only areas behind protections which require age verification and are subject to robust parental tech controls. Basically: put the adult material in a walled garden, not the kids'.
The VSPs are hugely popular with children of all ages. They carry significant amounts of content aimed directly at kids, whether from individual "makers" or from producers and broadcasters using them as new, direct distribution platforms. Unfortunately, while the platforms do have methods for reducing offensive content - whether political, hate-based, sexual or a matter of taste - these systems are not immediate. Content is not manually moderated before it is posted. How could it be, when hundreds of hours of content are uploaded every minute?
The problem arises when their search functions or recommendation algorithms offer young users content that is inappropriate for them - and instances of this are frequent.
As we say in our response to the Ofcom consultation, "children can't un-see things", and their childhoods can be irreversibly damaged by some of the content they are offered. Not content they seek out - though some always will - but content they believe to be "for them" simply because the algorithm decided to recommend it.
The CMF approached its response to the Ofcom consultation with these issues in mind, and in the context of default SAFETY ON and family-friendly platforms.
The proposals listed in Ofcom's explanation of their likely powers are relatively toothless. First, the territory-specific limitation means that the big providers - Twitter, Facebook and YouTube - won't be in their purview. Not a great start, though they say they will collaborate with regulators in other countries.
And when it comes to sanctions, the regulatory powers appear for the most part to consist of advising the platforms on the standards they should apply, and ensuring they have in place a series of policies: age verification, parental controls, walled gardens for children, complaints procedures, escalation of complaints, and media-literacy information.
While all of these have their uses, most are already in place. So the consultation reads like a document asking the platforms to explain to Ofcom what they currently do, so that the regulator can reflect back to them: "that should work then". And many of these measures simply don't protect children under 13 from a huge variety of "online harms": they are easy to circumvent, little used by parents, and slow to be acted on by the platforms - and fundamentally they fall short of the duty of care we expect media operators to show young people.
That is the second sense in which the proposals are toothless... Ofcom see themselves as a complaint handler of last resort, and in their consultation proposal they come across as a regulator planning to keep the companies on the straight and narrow as the companies themselves define it. They don't apply that sort of logic to broadcast regulation.
Despite having to navigate a series of questions in the consultation response document that Ofcom seems to have designed to elicit the answers they want, the CMF response is forthright and demanding.
You can read our response here.