The Children’s Media Foundation (CMF)

Online Good?


CMF Director Greg Childs contributed to the Lego Group's recent industry stakeholder research into children and technology. On April 28th 2021 he attended a roundtable to hear the results.



Lego commissioned BritainThinks to research key stakeholder views about the current relationship between children and technology, focusing in particular on:

  • The benefits and harms to children related to their use of technology
  • Perceptions of the level of responsibility the industry has when using technology to engage with children
  • Best practice for designing child-facing digital experiences
  • Future priorities for the industry to enable positive innovation in digital experiences for children

There were 24 in-depth interviews with politicians, academics, think tanks, consumer bodies, regulators, charities and industry figures.

The results were discussed at the Roundtable in late April 2021. In summary they revealed:

Technology is seen as a non-negotiable element of all children’s lives. Being able to use and understand technology is seen to be vital for children, and there is widespread rejection of the idea that children’s lives would be improved by having no access to technology. However, the benefits of technology can be hard to categorise and articulate. There is no clear dividing line between ‘offline’ and ‘online’ lives anymore, meaning that the benefits of technology are often seen as ‘self-evident’.

Most believe that the benefits of technology outweigh the downsides. Despite this, it can be difficult to think about and categorise the benefits and think of specific examples. When thinking about the benefits, education and building ‘digital skills’ are front-of-mind, although many also reference creative and interactive forms of play that are simply not replicable ‘offline’, alongside the opportunity that technology can provide to build social skills.

Concerns around the relationship between children and technology tended to focus on the ‘digital divide’. Children from lower socio-economic backgrounds are seen to have less access to good quality technology and appropriate online experiences, impacting on their lives both in the short and long-term. Beyond the digital divide, concerns around ‘online harms’ (such as exposure to inappropriate content, sexual exploitation, etc.) and children’s data privacy were frequently mentioned as important issues relating to children’s relationship with technology.

It is seen as neither possible nor desirable for parents to have ‘complete control’ over their children’s use of technology. However, there is concern that some children have ‘free rein’ over their use of technology. Older children (aged 11+) are seen to be particularly likely to have ‘free rein’ and are often using technology which is primarily designed for adults, rather than being designed specifically for children. Parents are currently seen to have too little support to help them ensure their children have positive online experiences.

Children are seen to be more at risk of a range of harms when using technology aimed primarily at adults, rather than technology specifically designed for children. Technology designed for adults is seen both to do less to protect children from harms and to deliver fewer specific benefits for children. Many of these companies are seen to be resistant to acknowledging the presence of children on their platforms and to taking effective action to ensure that they offer adequate protection from harms.

Conversely, technology designed specifically for children is seen to be more likely to be beneficial for children. This perception is driven primarily by a belief that these platforms are less likely to be unsafe (or harmful) to children than technology designed for adults, rather than perceptions of benefit.

Thinking about the future, protecting children from the risk of harm and ensuring data privacy are key priorities for designing child-focused technology. Whilst widening participation and advancing the right to play are also seen as important, they are secondary to protection and privacy. In practice this means that it tends to be seen as acceptable to make platforms and technology less accessible if doing so means that children are safer from harms and have greater privacy, whereas it is seen as unacceptable to make a platform less safe (or to reduce children’s privacy) in order to make it more accessible.

Contributors to the discussion at the Roundtable included Chris Payne, Lego's Director of Digital Responsibility, Government and Public Affairs, Jeremy Wright, former Secretary of State at DCMS (2018-19), Polly Mackenzie, CEO of Demos and Tabitha Goldstaub, co-founder of CogX and Chair of the government's AI Council.

The general thrust of the conversation at the roundtable was very much about seeking out the benefits of technology. Online harms are known, and progress is beginning to be made in their regulation or limitation. But very little is discussed about "online good". There was recognition that parental control was unlikely: a host of factors in busy parents' lives conspire against it. Equally there was recognition of the digital divide and the need to solve that problem. In fact not having online or tech access, it was suggested, could itself be seen as a "harm".

The meeting discussed the concept of constructive fun online and the need for more discussion about what works, what produces "good" rather than focusing on harm.

Chris Payne revealed that for Lego "safety by design" is built into the DNA of the company. But, as in the real world, it was suggested that it might be possible to be too careful with children: without risk there is no learning and no growth. He felt the focus should be on "responsible engagement" and very much on empowering children and parents through their online activity, and on conducting that activity in safety. Lego's agenda for internet engagement with kids was, said Chris Payne, defined by the results of their 2020 "sentiment survey".

Jeremy Wright opened a panel discussion by stating it was clear now that internet platforms could be regulated, and that this would come in the Online Harms legislation. However, he supported the need for deeper thinking about what constituted "good". For example, should we reconsider the approach to collecting data from children? If AI cannot learn from children, then how can it serve children, how can it understand and mirror their world?

Polly Mackenzie believed that "good" should be defined as well-being: "Most people would agree that childhood should be about maximising well-being". She also made a plea for more help with parenting skills. Digital inclusion is as much an issue of parenting capabilities as it is of wireless speeds.

Tabitha Goldstaub explored play as a route to learning, once again following up on the Lego idea of more freedom to play. She also felt that what we need is to improve everyone's connection with technology. Her left-field suggestion to achieve this was a promo game-show called "Tech-Off", though she didn't go into the details of the format!

The roundtable ended with a recognition from Lego that none of the technology children are currently using is designed for them: not smartphones, not the metaverse, not social media. So there is a responsibility for care and improvement. But they also stressed that while legislating against harms was important, what was needed was a parallel emphasis on the good provided by children's experiences online.


