An insight into how our politicians might pull the levers of government to create a new public service media system that meets the needs of the children's audience.
Our Children’s Future
There are always more urgent issues demanding the attention of ministers and MPs than there is time in the day or space on the legislative timetable, perhaps never more so than now. And in these times, it can be difficult to argue that the arts sector should be a priority, when so many other businesses are in need of support, along with our schools and hospitals. But even as we debate how best to cope with the immediate crisis, we must look to the future. What sort of world will we build as we emerge from this crisis? What is the future that, as parents, we imagine for our children? And how will our children develop the resilience and creativity that will allow them to make the most of that future?
When we think about how we can prepare our children, we turn first to our schools and our teachers. And rightly so. Our education system is at the heart of our community efforts to open young people’s eyes to the complexities of the world they live in, whether that is the world of work, politics or even personal relationships. We ask our schools to teach and then uphold our shared values of fairness, tolerance, inclusivity and respect for our democratic institutions and the rule of law. With great dedication and professionalism, teachers take on that responsibility, guiding each generation to become informed, active citizens of the United Kingdom.
But increasingly, many of our children are getting one clear message from their teachers and parents and a very different, often confused message from the media content they access on their phones, tablets and laptops. Instead of engaging with content curated for them by a well-regulated public service broadcasting system, our children are increasingly watching content ‘curated’ for them by an algorithm. Instead of being guided toward diverse content that touches on different topics, ranging across genres and formats, they are being guided toward what they ‘like’. To be precise, they are guided toward whatever the algorithm, and its analysis of data from millions of users, has calculated is most likely to entice that specific user to keep watching video content on that platform. And what is wrong with that?
Where exactly is the problem? We all want to watch stuff we like. Certainly, if you asked a child that question, they would answer loud and clear – “I want to see more of what I like”. However, as adults, we have – hopefully – learned there is a difference between what we want and what we need. We might want to watch another episode of the latest drama ‘box set’, but we choose to watch a news bulletin because we understand we need to hear what is happening in the world. We might like the idea of watching a favourite sitcom episode for the twentieth time, and sometimes we end up doing exactly that! But the next time perhaps we choose a new show on an unusual theme or about a place that we know nothing about. As adults we understand that if you don’t challenge yourself, or only watch what you’ve always watched, you risk simply reinforcing your beliefs and prejudices.
That’s the danger of algorithmic interactions with media content. They can be limiting and, inadvertently, can restrict our freedom to choose. I stress ‘can’ because there are many benefits to the technologies we have developed that allow media companies to personalise digital experiences. But they work best when the user is savvy enough to make the algorithm work for them, instead of vice versa; when they are able to break away from some predetermined pattern of engagement. There are times when you have to resist where the algorithm takes you. And despite claims to the contrary, children and young people are not always that savvy around technology. It’s true they learn fast, but their learning tends to be focused on breaking down barriers and getting around parental restrictions to access more of what they like, rather than developing discernment. Younger audiences, in general, give little thought to getting what they need.
At this point we are all probably thinking about some of the popular video sharing platforms (VSPs) and, in particular, the big players in that space, such as YouTube, TikTok and Facebook. Those platforms are extraordinary success stories. They demonstrate just how responsive businesses can be in a free market. They are smart, agile and completely focused on what their customers want, constantly analysing the data and adjusting systems to maximise value to those customers. The only problem with that is their customers are not the young people who access content on the platforms; their customers are the businesses that are selling the products and services advertised on the platforms. The children and young people who access the video content are actually the product that is being sold. The more users on the platform, the higher the price for placing an advert.
Does that sound terrible? Really, it’s not. It’s exactly the same business model you find in commercial television. ITV is in the same game and we are all fine with that. We, the audience, get to watch ‘free’ TV and ITV makes a profit from the ad sales. We all win. Up until very recently, though, there was one important difference between ITV and YouTube, TikTok and Facebook. We come back again to that very dull but important concept, the regulatory framework.
ITV is a commercial public service broadcaster and part of our existing PSB system, along with the BBC, Channel Four and Five. It operates under the watchful eye of the communications regulator, Ofcom, and is subject to the legally-constituted Broadcasting Code, which attempts to set quality thresholds for programmes by, for example, laying down guidelines on impartiality and accuracy in news and current affairs. Ofcom regulations also offer protection to both contributors and audiences by setting down rules on fairness and by requiring broadcasters to take steps to avoid causing harm and offence.
As of October 2020, TikTok, along with some other smaller VSPs, is now regulated by Ofcom. It is a move in the right direction, although the details of how services will comply with the new regulations are still being worked out. As it stands, the VSPs are only required to take "appropriate measures" to protect children from potentially harmful content, in contrast to the stronger wording contained in the Ofcom Broadcasting Code, which lays down a guiding principle "to ensure that people under 18 are protected" and requires broadcasters to take "all reasonable steps" to protect people under 18. But, as I’ve said, this is a move in the right direction and the government is committed to extending that regulation when it introduces the Online Harms Bill.
YouTube is another matter. Although YouTube, according to Ofcom’s research, is the most popular media platform for UK children, it does not come under the new regulations for VSPs, because YouTube is based in Ireland, not in the UK. Irrespective of where a media company operates, it is regulated by the government of the country where its European business is registered. So, when there is a particular issue the UK would like to see addressed by regulation, the UK is dependent on the goodwill and judgement of the Irish government and EU directives. Of course, there is excellent cooperation between the two governments, and even if there wasn’t total unanimity on the best way forward, what are the chances of a large, global media company seeking to exploit any difference in opinion? Heaven forfend.
But again, as things currently stand, the only regulation Ofcom has brought forward with respect to VSPs is directed towards preventing harm and controlling advertising. We have not even begun to consider the thorny question of how a new regulatory framework might try to work with these media platforms to change the quality of media experiences for our children.
The point I am making is that this is going to be hard. We are dealing with powerful companies that work across national jurisdictions and have their roots in a very different tech media business culture. In the United States, businesses are familiar with the concept of regulation designed to prevent harm, but far less comfortable with regulation requiring them to do good, which is a fundamental principle of the UK’s public service broadcasting system. So, if we decide that what we have isn’t working for our children, and that to help them grow and learn about the world we need to change our PSB model into a new public service media model that will encompass those platforms beloved by the children’s audience, then we will probably have to offer the YouTubes and TikToks of this world a large carrot while waving an even bigger stick.
But let’s say that happens. Let’s suppose we find there is strong public support for taking back control of the UK’s media landscape, shaping it to our own needs and purposes. What will be on our shopping list? What exactly might we discuss with those platforms?
Setting aside the issue of harmful content - for which legislation is already planned and which the platforms are starting to address - here are a couple of things that these large and very profitable businesses could help us with. They could, for example, agree to divert a percentage of their production budget into making UK-originated, high-quality, public service media content, helping us to create safer, curated online spaces for our children.
The thing with online platforms is that you’re either nowhere or you are King of the Castle, and when you become one of the dominant players, the profits are eye-watering. In Australia, Facebook, YouTube and Google - which owns YouTube - have steadily increased their combined share of the digital advertising market to 80%. The Australian government decided that enough is enough and is forcing Facebook to pay for using media content created by Australian businesses. The situation for YouTube and TikTok is different, because they do pay their content creators. But although some people make a lot of money from YouTube channels and TikTok accounts, most do not. The profits are not shared equitably and, because these platforms have dominant market positions, it’s difficult to see how competitors will erode that profit margin. It’s a cliché, but in the online world, coming second is nowhere at all and to the victor go all the considerable spoils.
YouTube and TikTok, Netflix and Disney, do make a significant contribution to our creative economy, with massive productions made here, as well as providing platforms for UK creatives. But this is not public service broadcasting per se, nor is it necessarily culturally essential. So perhaps it's time to consider a cultural levy, similar to the one that was recently proposed for Canada by its Heritage Minister, the Honourable Steven Guilbeault. It’s a perfectly sensible approach. As Guilbeault puts it, it’s about cultural sovereignty: we would be asking these companies to invest in particular forms of British content, perhaps children’s media, in the same way that we require British commercial PSBs to make an investment.
That precedent harks back to the foundations of our current public service broadcasting system. When the government awarded broadcast licences to the original ITV companies, it was clear their dominant position as sole providers of television advertising was, in the words of the media magnate and owner of Scottish Television, Roy Thomson, "a licence to print money". In return for this business advantage, the ITV companies were required to divert some of their profits into public service programming. Today you could argue that Roy Thomson’s licence to print money has passed into the hands of the media tech giants. Perhaps it is time they assumed similar responsibilities?
Finding the funding for public service media content for children is only half the battle. At the same time, we must consider how that content will be made available to the audience. And again, some of these companies could help with that, particularly YouTube and TikTok. YouTube, as previously noted, is the audience’s ‘go to’ platform for video content, with TikTok not far behind. Furthermore, YouTube and TikTok are free to access, which is an important prerequisite for PSM content. Isn’t that an attractive scenario? The UK’s number one providers of videos for children helping to fund UK-originated PSM content that, to return to the refrain from my introduction, will help to open their eyes to the complexities of the world they live in, whether that is the world of work, politics or personal relationships, and will also help them to understand our shared culture and values: fairness, tolerance, inclusivity and respect for our democratic institutions and the rule of law.
But - I hear you ask - will they actually watch those videos? Is it just a waste of money, because children will still choose to watch their favourite influencers, celebrities, dance-craze videos, etc? That’s hard to say. It depends on the quality of the content and how it is promoted to the audience. It has always been understood that if people are to engage with public service media content, it has to be properly publicised and promoted. And that is where we could, once again, turn to some of these companies for help in creating those positive media experiences for children. It goes back to the heart of how they operate: it goes back to the algorithm.
Words like ‘algorithm’ and ‘data’ have become tainted. They now sound sinister and representative of everything that is scary about the online experience. But very few things are ‘bad’ in and of themselves, and algorithms are whatever one chooses to make them. For obvious reasons, social media platforms and VSPs choose to create algorithms that keep people engaged with their content. In the early stage of a company, it’s about attracting as many users as you can to build the buzz around the platform. For established services, it’s about making sure you don’t lose your users to some up-and-coming rival. The algorithm makes sure all its users get plenty of ‘sugar’. And just as with food and drinks, media ‘sugar’ is good for sales, but not always good for your health. (In a wonderful touch of serendipity, Zuckerberg actually means ‘sugar mountain’, which seems quite apt.)
Could it all be different? Could an algorithm be created that would have less ‘sugar’? Perhaps a public service media algorithm could help our children to experience a different perspective on the world. Such a new approach should never be dictated by government, but would involve guidelines worked out in discussions with the platforms, along similar lines to the discussions currently taking place with the food industry around actual sugar intake. Any agreement would have to apply to all platforms to prevent unfair competition, perhaps with a ‘health’ kitemark to help parents identify compliant services.
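For readers who wonder what a ‘less sugar’ algorithm might look like in practice, here is a deliberately simplified sketch. Everything in it is invented for illustration: the scores, weights and function names are hypothetical, and no platform’s actual recommendation system works this simply. It shows only the principle under discussion: blending the usual engagement score with a bonus for genres the child has not yet explored.

```python
# Toy illustration of a 'public service' ranking rule.
# All names, weights and data are hypothetical, invented for this sketch.

def rank_engagement(videos):
    """Pure 'sugar': rank solely by predicted engagement."""
    return sorted(videos, key=lambda v: v["engagement"], reverse=True)

def rank_public_service(videos, watched_genres, diversity_weight=0.5):
    """Blend engagement with a bonus for genres the child hasn't seen yet."""
    def score(v):
        novelty = 0.0 if v["genre"] in watched_genres else 1.0
        return (1 - diversity_weight) * v["engagement"] + diversity_weight * novelty
    return sorted(videos, key=score, reverse=True)

videos = [
    {"title": "Dance craze #412", "genre": "dance", "engagement": 0.95},
    {"title": "How Parliament works", "genre": "civics", "engagement": 0.40},
    {"title": "Dance craze #413", "genre": "dance", "engagement": 0.90},
]
watched = {"dance"}  # this child has so far watched only dance videos

# Pure engagement serves up more of the same; the blended rule
# surfaces the unfamiliar genre without discarding what the child likes.
print([v["title"] for v in rank_engagement(videos)])
print([v["title"] for v in rank_public_service(videos, watched)])
```

The point of the sketch is that the trade-off is a tunable dial, not an all-or-nothing choice, which is exactly the sort of parameter a kitemark scheme agreed with the platforms might address.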
Many of the ideas I have discussed in this article are quite revolutionary. As a conservative, you might think I would be uncomfortable with that. But throughout our history, from the industrial revolution to the welfare revolution that started after World War One, conservatives have embraced change when they could see it would be for the benefit of the whole nation. We now find ourselves in a position where radical changes to the regulatory framework for our media services have become essential. The pace of technological change that led to the growth of the tech giants and the streaming services means we now have to respond, and respond fast. Over the next year to eighteen months, we will be discussing what must go and what we will put in its place.
Even if you disagree with my suggestions for change, I hope I have at least convinced you that children and young people should be at the centre of this debate on the future of public service media and that their needs should be an absolute priority for government. We know lessons start at school and continue beyond school, and we know media experiences are pivotal in the development of children’s attitudes and their engagement with society. Those experiences help them imagine their future, based on their understanding of our shared culture and their perception of their place in the community. And the truth is, we know that if we ignore their needs, eventually we will pay a price.
By Ed Vaizey
Ed Vaizey (Lord Vaizey of Didcot) is a member of the House of Lords, appointed in 2020, and sits on the Communications and Media Committee. He was the Member of Parliament for Wantage between 2005 and 2019. He served as the UK Government Culture and Digital Minister from 2010 to 2016, and is the longest-serving Minister in that role.