The Children’s Media Foundation (CMF)

APPG Report on Online Harms

The All-Party Parliamentary Group for Children’s Media and the Arts

Co-Chairs: Julie Elliott MP, Baroness Floella Benjamin
Vice Chairs: Andrew Rosindell MP, Baroness McIntosh of Hudnall
Public Enquiry Point: Jayne Kirkham

Due to the Covid-19 restrictions, the Children's Media and the Arts APPG meeting to discuss Age Verification and the Online Harms Bill had to be cancelled. However, the co-chairs deemed the topic too important to postpone, so the speakers and participants were invited to put their thoughts to paper. These can be downloaded as a PDF or read in this report compiled by Jayne Kirkham, APPG Clerk and Political Liaison, The Children’s Media Foundation.

Children and Online Pornography - How will the Online Harms Bill protect our young people?

“This is not an official publication of the House of Commons or the House of Lords. It has not been approved by either House or its committees. All-Party Parliamentary Groups are informal groups of Members of both Houses with a common interest in particular issues. The views expressed in this report are those of the group.”

The contributors:

John Carr OBE Secretary of the UK Children’s Charities’ Coalition on Internet Safety, Senior Advisor to ECPAT International, adviser to the Council of Europe, the International Telecommunication Union, former member of Microsoft’s Policy Board for Europe, the Middle East and Africa.

Iain Corby Iain is the Executive Director of the Age Verifications Providers Association, the trade body for companies ranging in size from start-ups to PLCs who offer technology to provide rigorous, standards-based online age checks. Prior to this, he was Deputy CEO of the charity GambleAware, and began his career as a management consultant with Deloitte.

BBFC The British Board of Film Classification (BBFC) is an independent regulator and guide with over 100 years’ experience. Its role is to help everyone in the UK choose age-appropriate films, videos and websites, wherever and however they watch or use them.

#NOTYOURPORN Kate Isaacs started #NotYourPorn when a 'revenge porn' video of her friend was uploaded to Pornhub without her friend's consent. Her mission is to hold porn companies to account for the content they share and profit from. Kate is calling on the UK government to regulate the commercialised porn industry.

John Carr, Secretary, Children’s Charities’ Coalition on Internet Safety (CHIS)

The story begins with gambling. In the early 2000s two things happened:

  1. Faster internet connections started to become available via smartphones. This meant young people began getting access to the internet outside the home, without any realistic possibility of meaningful parental supervision.
  2. Banks and other financial institutions started issuing debit cards to children from the age of 11 e.g. “Solo” cards. Visa, Mastercard and others also started issuing plastic which could be bought for cash in corner shops, supermarkets and petrol stations.

Both of the above meant children could be economically active online in ways and on a scale that were never previously possible.

Several children’s charities soon started getting calls about children, typically boys of 14 or 15, being diagnosed as gambling addicts. CHIS went to see all the big companies running online gambling sites. Almost all of them said the same thing: “We are aware of this problem and take it very seriously.” Almost all of them did nothing. The Blair Government then announced a review of gambling policy, which resulted in a Bill (the Gambling Act 2005). During its passage through Parliament we campaigned for age verification to be made mandatory. It was, from 1/9/2007, and from that date the problem of simply ticking a box to confirm you are 18 or over disappeared. Note that prior to the Gambling Act 2005 there was no real age verification industry in the UK. Capitalism then worked its magic: new companies were established, and old ones, e.g. the credit reference agencies, started to innovate.

Having achieved the necessary change in the law in respect of gambling we turned our attention to pornography.

There was a great deal of scepticism about whether it would be possible to do the same for porn, and about the civil liberties implications. Progress was slow. Eventually a cross-party group of Parliamentarians established an inquiry, which led to the policy of age verification for porn sites being included in the 2015 Conservative Manifesto. When the Digital Economy Bill emerged, with age verification included, most of the major parties supported it.

The fundamental point about the Digital Economy Act 2017 is that it created a framework which would allow some form of control to be exercised over businesses not domiciled in the UK i.e. businesses which are normally outside the reach of our courts.

Age verification can be carried out in a way which respects the privacy rights of adults. It is not a panacea: it will not stop all persons under 18 from seeing porn but it will substantially reduce the scale of inappropriate exposure, particularly among the very young. It also establishes a new norm. It says to publishers “It is not OK to keep pumping out stuff you say is not appropriate for children without you also doing something concrete to ensure those words mean something.”

It also shows children that the grown-ups are making a serious attempt to back up their beliefs. It is about insisting that, as a society, we can and we will create a greater alignment between our physical world laws and the realities of cyberspace. Alibis for inaction are always thick on the ground. The UK is the first country to try to find a way to strike the right balance, to respect the principles of liberal democracy while at the same time trying to find reasonable ways to discharge our obligations to protect the young.

Age Verification Providers Association

A note to the All-Party Parliamentary Group for Children's Media and The Arts

  • 48% of 11-16 year olds had seen pornography online. (NSPCC Learning report.)
  • 62% of 11-13 year olds who reported having seen pornography described their viewing as mostly “unintentional”.
  • 83% of parents agreed with the statement “there should be robust age-verification (“AV”) controls in place to stop children (under-18s) seeing commercial pornography online”. Only 7% disagreed, with the remainder expressing a neutral opinion.
  • 11-13 year olds were the most positive when asked whether “I want to be locked out of websites that are for 18-plus year olds”: 56% agreed, 14% disagreed. (Revealing Reality report for the BBFC.)
  • Safe, secure online age verification technology is already well developed, and is used every day by thousands of UK customers buying age-restricted goods on the internet, such as alcohol or vaping products, or accessing online gambling.
  • The UK leads the world in this technology – other countries such as Poland and Australia have taken our advice, and just this month the Australian House of Representatives recommended to its government that it implement age verification for online pornography and wagering, based broadly on the UK approach.
  • A specific form of age verification was developed for online pornography to implement Part 3 of the Digital Economy Act 2017. It was designed to guarantee privacy, by allowing people to undertake age checks with an independent provider, and then prove their age anonymously to adult websites. This solution has been ready to go live since Easter 2019, when the government had first said it would be implemented.
  • The government abandoned the plan just before the General Election, and has suggested it would be better to develop a new solution as part of the Online Harms Bill.
  • BUT that Bill has not yet been drafted and published. Indeed, the new regulator, Ofcom, is not expected to be operational in this field before 2023/24, according to government assumptions shared with industry.
  • We cannot envisage what a different, better solution would look like, compared to what is already built. Any replacement would need to meet the same requirements as the existing answer, so is going to be practically identical.
  • The Online Harms Bill is a major endeavour, covering a wide range of issues, and the Duty of Care it will include is not yet defined. Indeed, we’ve only had an interim response to the consultation on the White Paper. Government proposals will be extensively debated, and there will be much lobbying from affected companies. This is not going to be straightforward legislation.
  • The ambition of the Bill is also enormous, as it seeks to bring the internet under the rule of law in a wide number of dimensions. Age verification is certain to be a fundamental building block on which other regulation rests. It seems pragmatic and prudent to start with something like online pornography, where government, regulators, AV suppliers and content publishers can all learn lessons, before we move on to apply AV comprehensively across multiple products, services and a wider range of content. A Big Bang approach to addressing all online harms at once is ill-advised.
  • The worldwide pornography industry was ready to adopt UK age verification. It was keen to ensure that age verification was applied universally, provided the regulator was ready to act against sites that did not comply and had the powers it needed to act globally, with website and financial blocking tools at its disposal.
  • The regulator, the BBFC, had also put in place a Certification Scheme to audit compliance with GDPR data protection, information security and privacy needs.
  • Oft-repeated claims that the system was easily circumvented have been rebutted – e.g. Virtual Private Networks can be blocked by adult sites if they choose, just as Netflix and BBC iPlayer already do. ISPs can still block sites accessed using DNS over HTTPS, just using different techniques.
  • And at the end of the day, the goal was to stop young children stumbling over hardcore porn online – those with the technical knowledge to get round the system would be the same kids whose older sibling might once have bought them an adult magazine at the local newsagent. We were just trying to bring the virtual world in line with the real world.
  • The Digital Economy Act is still on the statute books. Its powers can be given to Ofcom, the ICO, or back to the BBFC. It can be improved upon, of course, but we should not let the perfect be the enemy of the good, especially in the fast-moving world of technology, where if you never start you will have no chance of keeping up.
  • The age verification sector was a UK technology success story, and it can remain one if Ministers can be persuaded to do the right thing and act now, using existing powers, to protect young children from the inestimable mental health damage done to millions of them each year the current “Wild West Web” is allowed to operate unchecked.

For further information, please contact: Iain Corby
Executive Director
+44 (0)7811 409769

Online Harms and Age-verification

BBFC submission to the All-Party Parliamentary Group for Children's Media and the Arts

The UK Government has calculated 1.4 million children see pornography every month. First exposure is often accidental, with children as young as 7 stumbling across pornography online unintentionally. Exposure so early can have a devastating impact on children’s development and their relationships in the longer term.

The BBFC was designated as the Age-verification Regulator under Part 3 of the Digital Economy Act (DEA), recognising our expertise in classifying pornographic material and online regulation. However, in October 2019, the Government announced that they would not introduce age-verification under the DEA, and instead the child protection goals of the legislation would be met as part of its broader online harms strategy. In her statement to Parliament, the Secretary of State Rt Hon Nicky Morgan MP said that age-verification will “continue to play a key role in protecting children online”.

Age-verification remains an important child protection measure, and the BBFC will continue to contribute to the discussion on the regulation of pornography online. Our engagement with the adult industry has been positive, and we were confident the largest pornographic companies would have complied with the legislation.

There are several viable age-verification solutions ready to be deployed, including innovative methods that are light-touch while being necessarily robust. These solutions do not require that personal data be shared with pornographic services in the process of verifying age.

While age-verification is not a silver bullet, it is an effective way to prevent children stumbling across pornographic content on commercial sites. The introduction of age-verification would help the Government achieve their objective to ensure the same protections are in place online as offline.

More generally, the BBFC is engaging with wider aspects of the Government’s online harms strategy. 90% of parents believe it is important to display age ratings when watching a film online, and 92% think video-on-demand platforms should show the same age ratings they would expect at the cinema or on DVD. BBFC age ratings can be linked to parental filters, which is what 87% of parents say they would like to see to protect their children from inappropriate content. In response to this demand, since 2008, the BBFC has been working in partnership with the home entertainment industry and others to bring offline regulatory protections online. We operate a successful partnership with Netflix, and are working towards 100% coverage of BBFC age ratings on Netflix programming in the UK.

The BBFC is also the independent regulator of content delivered via the UK’s mobile networks. Using our Classification Guidelines, content which would be age rated 18 or R18 is placed behind controls to restrict access by children. This includes pornography, pro-anorexia sites and content which promotes discrimination or real-life violence.

These models for online regulation make a substantial contribution to child safety and consumer empowerment and have been welcomed by parents in particular. The BBFC supports the Government’s ambition to make the UK the safest place for children to be online, and will work with them to ensure their child protection goals are achieved in the online harms strategy.

Kate Isaacs

The Online Harms Bill is premised on protecting users of the internet, including child users, from being exposed to harm. While it is important to protect child users of the internet as consumers of online content, it is equally important to protect children from being exploited online and used as ‘content’ in circumstances beyond their control.

Children are frequently harmed online in situations where they are not the internet user. We are seeing increasing numbers of cases of children being used as ‘content’ on legal, commercialised porn websites available and operating within the UK.

Due to the lack of external regulation of the commercialised porn industry, these companies have failed to put in place processes to protect the children in the videos uploaded to their sites - and often encourage users to upload such material under explicit categories whose descriptive tags reference children. MindGeek, which has offices in the UK and reportedly owns 80% of the commercialised porn industry, has proven not only to lack any moderation of this content, but also to have failed to cooperate with victims or police investigations when non-consensual content has been reported.

In the past 12 months alone, we have seen several confirmed cases of children being used for profit on porn websites. A trafficked 15-year-old child from Florida had been made a verified Pornhub creator and was only found after several videos of her rape were used as content on the site. In a joint investigation with The Sunday Times, we found a number of confirmed cases involving children on Pornhub. A victim recently came forward in a BBC article stating that her rape at the age of 14 had been used on Pornhub and advertised throughout the site - it is also worth noting that her name became a Pornhub suggested search term the day after the article went live. The Internet Watch Foundation frequently finds illegal content on MindGeek sites.

The commercialised porn industry has been failing to ‘self-regulate’ and has continued to profit from child abuse images and non-consensual content for too long. We are calling on the UK Government to regulate the commercialised porn industry - not only in terms of protecting children who are accessing these sites, but also those who have been exploited and abused on them for profit.

The Online Harms Bill must place equal emphasis on both the child user and the child who becomes the subject matter of illegal online content. The #NotYourPorn Campaign calls for both of these facets of content regulation to be addressed together so that a “healthy” internet model, where children are fully protected, can be properly achieved.

Thank you for this opportunity to share our thoughts with you.

