The House of Lords recently debated the report of the Communications and Digital Select Committee entitled Free For All? Freedom of Expression in the Digital Age.
This is an edited version of what I said in the debate.
I congratulate the Select Committee on yet another excellent report relating to digital issues. It really has stimulated some profound and thoughtful speeches from all around the House. This is an overdue debate.
As someone who sat on the Joint Committee on the draft Online Safety Bill, I very much see the committee’s recommendations in the frame of the discussions we had in our Joint Committee. It is no coincidence that many of the Select Committee’s recommendations are so closely aligned with those of the Joint Committee, because the Joint Committee took a great deal of inspiration from this very report—I shall mention some of that as we go along.
By way of preface, as both a liberal and a Liberal, I still take inspiration from JS Mill and his harm principle, set out in On Liberty in 1859. I believe that it is still valid and that it is a concept which helps us to understand and qualify freedom of speech and expression. Of course, we see Article 10 of the ECHR enshrining and giving the legal underpinning for freedom of expression, which is not unqualified, as I hope we all understand.
There are many common recommendations in both reports which relate, in the main, to the Online Safety Bill—we can talk about competition in a moment. One absolutely key point made during the debate was the need for much greater clarity on age assurance and age verification. It is the friend, not the enemy, of free speech.
The reports described the need for co-operation between regulators in order to protect users. On safety by design, both reports acknowledged that the online safety regime is not essentially about content moderation; the key is for platforms to consider the impact of platform design and their business models. Both reports emphasised the importance of platform transparency. Law enforcement was very heavily underlined as well. Both reports stressed the need for an independent complaints appeals system. Of course, we heard from all around the House today the importance of media literacy, digital literacy and digital resilience. Digital citizenship is a useful concept which encapsulates a great deal of what has been discussed today.
The bottom line of both committees was that the Secretary of State’s powers in the Bill are too broad, with too much intrusion by the Executive and Parliament into the work of the independent regulator and, of course, as I shall discuss in a minute, the “legal but harmful” aspects of the Bill. The Secretary of State’s powers to direct Ofcom on the detail of its work should be removed for all reasons except national security.
A crucial aspect addressed by both committees related to providing an alternative to the Secretary of State for future-proofing the legislation. The digital landscape is changing at a rapid pace—even in 2025 it may look entirely different. The recommendation—initially by the Communications and Digital Committee—for a Joint Committee to scrutinise the work of the digital regulators and statutory instruments on digital regulation, and generally to look at the digital landscape, was enthusiastically taken up by the Joint Committee.
The Select Committee had a wider remit in many respects, notably on media plurality. I was interested to hear support around the House for this, and a desire to see the DMU in place as soon as possible and given those ex-ante powers.
Crucially, both committees raised fundamental issues about the regulation of legal but harmful content, which has taken up some of the debate today, and the potential impact on freedom of expression. However, both committees agreed that the criminal law should be the starting point for regulation of potentially harmful online activity. Both agreed that sufficiently harmful content should be criminalised along the lines, for instance, suggested by the Law Commission for communication and hate crimes, especially given that there is now a requirement of intent to harm.
Under the new Bill, category 1 services have to consider harm to adults when applying the regime. Clause 54, which is essentially the successor to Clause 11 of the draft Bill, defines content that is harmful to adults as that
“of a kind which presents a material risk of significant harm to an appreciable number of adults in the United Kingdom.”
Crucially, Clause 54 leaves it to the Secretary of State to set in regulations what is actually considered priority content that is harmful to adults.
The Communications and Digital Committee thought that legal but harmful content should be addressed through regulation of platform design, digital citizenship and education. However, many organisations argue, especially in the light of the Molly Russell inquest and the need to protect vulnerable adults, that we should retain Clause 54 but that the description of harms covered should be set out in the Bill.
Our Joint Committee said, and I still believe that this is the way forward:
“We recommend that it is replaced by a statutory requirement on providers to have in place proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities defined under the Bill”, but that
“These definitions should reference specific areas of law that are recognised in the offline world, or are specifically recognised as legitimate grounds for interference in freedom of expression.”
We set out a list which is a great deal more detailed than that provided on 7 July by the Secretary of State. I believe that this could form the basis of a new clause. As my noble friend Lord Allan said, this would mean that content moderation would not be at the sole discretion of the platforms. The noble Lord, Lord Vaizey, stressed that we need regulation.
We also diverged from the committee over the definition of journalistic content and over the recognised news publisher exemption, and so on, which I do not have time to go into but which will be relevant when the Bill comes to the House. But we are absolutely agreed that regulation of social media must respect the rights to privacy and freedom of expression of people who use it legally and responsibly. That does not mean a laissez-faire approach. Bullying and abuse prevent people expressing themselves freely and must be stamped out. But the Government’s proposals are still far too broad and vague about legal content that may be harmful to adults. We must get it right. I hope the Government will change their approach: we do not quite know. I have not trawled through every amendment that they are proposing in the Commons, but I very much hope that they will adopt this approach, which will get many more people behind the legal but harmful aspects.
That said, it is crucial that the Bill comes forward to this House. The noble Lord, Lord Gilbert, pointed to the Molly Russell inquest and the very moving evidence of Ian Russell about the damage being wrought by the operation of algorithms on social media pushing self-harm and suicide content. I echo what the noble Lord said: that the internet experience should be positive and enriching. I very much hope the Minister will come up with a timetable today for the introduction of the Online Safety Bill.