The Gist: A Free Speech

The Online Safety Code, which will bring age verification to Internet platforms, is heading towards implementation with a fatal flaw.


A few days ago I agreed to deliver a keynote to an online conference on the "Next steps for online safety and regulation in Ireland". I was one of four keynote speakers. The other three were Niamh Hodnett, Online Safety Commissioner, Coimisiún na Meán; Rita Wezenbeek, Director, Platforms and Enforcement, DG Connect, European Commission; and Julie Inman Grant, Australia’s eSafety Commissioner.

The conference itself was intended to have real-world impacts. The organisers charge for attendance, and then charge for access to the after-the-fact records. They also make them available to policy-makers.

According to the organisers, the output of the conference will be shared with parliamentary, ministerial, departmental and regulatory offices. This includes the full proceedings and additional articles submitted by delegates. Those who attended include officials from DCEDIY, ROI; DTCAGSM, ROI; DfC, NI; DfE; DSIT; Ofcom; ICO; Education Scotland; Home Office; Government of Jersey; and the Welsh Government.

But you, happy Gist reader, don't need to pay your €315+VAT for, at least, my contribution. Here it comes now, largely as I delivered it.

Main point for busy readers: The Online Safety Code has, by accident or design, created two separate legal requirements for designated platforms to introduce age verification, and they don't both meet the tests set down by the CJEU for compliance with EU law.


Hello, and thanks for the invitation to speak today. I’m going to be addressing the Online Safety Code which is looming for implementation from October. I should say that I represent Digital Rights Ireland in my professional capacity, but that my submission today is a personal one, not on their behalf.

We were gifted an overview of the legislative history and basis of the Online Safety Code earlier, and so you’ll be spared listening to that from me again.

I should say that I’m aware I probably represent the Counter-Programming element of the morning. My interest in this topic arises from my wish to see the regulation of platforms, and particularly the regulation of platforms to protect children, be effective. My comments are not related to material which is illegal – there is no question that illegal content should be removed or prevented.

I’m addressing material which is explicitly legal, but defined as harmful. And there are some classes of material which can easily be placed in that definition. The canonical example is material promoting self-harm or disordered eating in young people. But the devil is in the detail when it comes to drafting regulations restricting legal public speech. A code which disproportionately interferes with the rights of privacy and free expression set out in the EU Charter of Fundamental Rights is not an effective code.

And, it’s safe to say, an effective regulation is something that everyone here wants to see.

The Online Safety Code grows out of the EU Audiovisual Media Services Directive, but it is very much an Irish interpretation of what that law intended, going beyond its quite basic requirements to build a very wide-ranging framework for the regulation and restriction of internet content.

That Directive was originally intended to harmonise laws regarding video content. That background, where the platforms were imagined as merely a new kind of broadcaster, is reflected in the final text, which speaks, in Article 6a, of restricting access to material unsuitable for minors by “selecting the time of the broadcast, age verification tools or other measures”.

The EU Commission suggested that the form of age verification it anticipated was allowing parents to set a PIN code (as is used by Netflix).

These are uncontroversial extensions of the existing and familiar Watershed norms from television broadcasting.

The difficulties arise from how the Irish State, and then Coimisiún na Meán, decided they would react to these ideas. The Online Safety Code text we have now has been submitted to the EU Commission through the TRIS procedure, and the standstill period has ended. So, this is the text.

And I’m sorry to say, this is not a good text.

The proposal is to bring in mandatory age verification for both children and adults on two separate bases. 

The first is set out in Part A, Section 10.6(f) of the Online Safety Code.

A video-sharing platform service shall establish and operate age verification systems for users of video sharing content with respect to content which may impair the physical, mental or moral development of minors.

As Elizabeth Farries pointed out earlier, this is almost a textbook example of a law which fails the CJEU’s test of being ‘clear and precise’ and ‘specific’. My idea of what content may impair the moral development of minors and Enoch Burke’s may be quite different. We are verging on revisiting Ireland’s history with the Censorship Board, where everything from Ulysses to women’s magazines could be declared immoral.

And separate and distinct from that requirement in Part A of the Code is a different, specific requirement in Section 12.11 (in Part B of the Code) for video-sharing platforms to introduce an age assurance scheme in the event that their terms and conditions allow for the uploading of “adult-only video content”. This is a specifically defined term, being video content consisting of pornography and/or violence or cruelty.

As Section 12.1 says that its Part B provisions are without prejudice to the requirements set out in Part A, Section 10, we have two parallel age verification requirements set out by the regulator, only one of which could even arguably meet the CJEU’s test for clarity in law.

But let’s take a look at what the Commission’s Executive Chairman has set out as the Regulator’s expectation of what an Age Verification scheme would look like in practice.

“Uploading documents and/or a live selfie is one example” (source: CnaM website)

“A requirement for a person to show their passport and then a selfie to verify they are the person on the passport” (source: Irish Examiner)

“A live selfie each time you want to access it and they use biometrics to check”. (source: Irish Examiner)

Dr. Karlin Lillington of the Irish Times described these original proposals as

“all absolutely mega-scale bonkers”

But this still seems to be the Commission’s plan. It was telling that the Australian eSafety Commissioner, speaking earlier, had no answer to squaring privacy and age verification beyond saying ‘hopefully we come up with some workable scheme’ and, like the Brexiteers in the past arguing for the possibility of a frictionless Border, fell back on a hope that some technology, as yet undefined, would solve this issue.

But while it is one thing to say that a plan is bonkers, a more critical question is whether this plan can be legal.

To be able to defend against claims of failure to properly implement a state-mandated age verification system, the platforms will have to retain the age verification data. Our statute of limitations for claims of breach of contract (by parents against platforms, let us say) is six years.

We have had years of very clear decisions from the CJEU as to what is and is not proportionate in laws requiring data retention which would impact the privacy and data protection rights of EU citizens.

Looking at the text of the proposed Online Safety Code in that light, we can ask some straightforward questions.

Does it, as is required in paragraph 65 of the Digital Rights Ireland Case judgment, lay down “clear and precise rules governing the extent of the interference with the fundamental rights enshrined in Articles 7 and 8 of the Charter”?

I think we’ve seen it does not.

Does it, as required by paragraph 68 of that judgment, require that the sensitive biometric or identity data be kept inside the EU?

It does not.

Does it, as required by paragraph 67 of the same judgment, in its text specifically ‘ensure the irreversible destruction of the data at the end of the data retention period’?

Again, it does not.

Amongst other things, the lack of these clauses in EU Directive 2006/24 led the CJEU to strike down that directive as having breached the legal requirement to comply with

“the principle of proportionality in the light of Articles 7, 8 and 52(1) of the Charter”.

If those shortfalls were enough to strike down a Directive for the entire EU, the question remains why Ireland’s Coimisiún na Meán would think that its Online Safety Code will avoid the same fate.

Good regulation is critical and literally everyone would like to see it.

Unfortunately, if the current Online Safety Code is implemented, we will have neither good regulation nor legal certainty, and it will significantly impact the privacy of adults and children alike just before an election.