Government must ensure the regulation of election dis- and misinformation

Earlier this year, during the Elections Bill process, we debated the regulation of digital campaigning and the need for new provisions to allow the Electoral Commission to control misinformation and disinformation.

This is what I said:

Digital campaigning is of growing importance. It accounted for 42.8% of reported spend on advertising in the UK at the 2017 general election. That figure rose in 2019; academic research has estimated that political parties’ spending on platforms is likely to have increased by over 50% in 2019 compared to 2017. As the Committee on Standards in Public Life said in its report in July last year, Regulating Election Finance:

“Research conducted by the Electoral Commission following the 2019 General Election revealed that concerns about transparency are having an impact on public trust and confidence in campaigns.”

In that light, the introduction of digital imprints for political electronic material is an overdue but welcome part of the Elections Bill.

The proposed regime as it stands covers all types of digital material and all types of appropriate promoter. However, a significant weakness of the Bill may exist in the detail of where an imprint must appear. In its current form, the Bill allows promoters of electronic material to avoid placing an imprint on the material itself if it is not reasonably practicable to do so. Instead, campaigners could include the imprint somewhere else that is directly accessible from the electronic material, such as a linked webpage or social media profile or bio. The evidence from Scotland’s recent parliamentary elections is that this will lead in practice to almost all imprints appearing on a promoter’s website or homepage or on their social media profile, rather than on the actual material itself. Perhaps that was encouraged by the rather permissive Electoral Commission guidance for those elections.

Can this really be classed as an imprint? For most observers of the material, there will be no discernible change from the situation that we have now—that is, they will not see the promoter’s details. The Electoral Commission also says that this approach could reduce transparency for voters if it is harder to find the imprint for some digital campaign material. It seems that

“if it is not reasonably practicable to comply”

will afford promoters too much leeway to hide an imprint. Replacing that with

“if it is not possible to comply”

would ensure that the majority of electronic material is within the scope of the Bill’s intentions. What happened to the original statement in the Cabinet Office summary of the final policy in its response to the consultation document Transparency in Digital Campaigning in June last year? That says:

“Under the new regime, all paid-for electronic material will require an imprint, regardless of who it is promoted by.”

There is no mention of exemptions.

The commission says it is important that the meanings of the terms in the Bill are clear and unambiguous, and that it needs to know what the Government’s intent is in this area. In what circumstances do the Government really believe it reasonable not to have an imprint but to have it on a website or on a social media profile? We need a clear statement from them.

As my noble friend Lord Wallace said, Amendments 194A and 196A really should be included in the “missed opportunity” box, given the massive threat of misinformation and disinformation during election campaigns, particularly by foreign actors, highlighted in a series of reports by the Electoral Commission, the Intelligence and Security Committee and the Committee on Standards in Public Life, as well as by the Joint Committee on the Draft Online Safety Bill, on which I sat. It is vital that we have much greater regulation over this and full transparency over what has been paid for and what content has been paid for. As the CSPL report last July said,

“digital communication allows for a more granular level of targeting and at a greater volume – meaning more messages are targeted, more precisely and more often.”

The report says:

“The evidence we have heard, combined with the conclusions reached by a range of expert reports on digital campaigning in recent years, has led us to conclude that urgent action is needed to require more information to be made available about how money is spent on digital campaigning.”

It continues in paragraph 6.26:

“We consider that social media companies that permit campaign adverts in the UK should be obliged to create advert libraries. As a minimum they should include adverts that fit the legal definition of election material in UK law.”

The report recommends that:

“The government should change the law to require parties and campaigners to provide the Electoral Commission with more detailed invoices from their digital suppliers … subdivide their spending returns to record what medium was used for each activity”

and

“legislate to require social media platforms that permit election adverts in the UK to create advert libraries that include specified information.”

All those recommendations are also contained in the Electoral Commission report, Digital Campaigning: Increasing Transparency for Voters from as long ago as June 2018, and reflect what the Centre for Data Ethics and Innovation set out in its February 2020 report on online targeting in specifying what it considered should be included in any such advert library. The implementation of these recommendations, which are included in Amendment 196A, would serve to greatly increase the financial transparency of digital campaigning operations.

In their response to the CSPL report, the Government said:

“The Government is committed to increasing transparency in digital campaigning to empower voters to make decisions. As part of this, we take these recommendations on digital campaigning seriously. As with all of the recommendations made by the CSPL, the Government will look in detail at the recommendations and consider the implications and practicalities.”

The Public Administration and Constitutional Affairs Committee report last December followed that up, saying at paragraph 216:

“The Government’s response to the CSPL report on electoral finance regulation provides no indication of which of its recommendations (not already included in the Bill) the Government is likely to adopt … prioritise for consultation or when or how the Government proposes to give legislative effect to recommendations that will not be included in the Bill. The Government should give clarity on its next steps in this regard.”

So the time has come for the Government to say what their intentions are. They have had over six months to do this, and I hope they have come to a conclusion that fully safeguards our democracy. I hope the Government will now see the merits and importance of those amendments.

The CSPL also recommended changes to electoral law regarding foreign actors. The CSPL says at paragraph 6.29 of its report:

“As we discuss in chapter 4, the rules on permissible donations were based on the principle that there should be no foreign interference in UK elections. However, the rules do not explicitly ban spending on campaign advertising by foreign individuals or organisations.”

It specifically refers to the Electoral Commission’s Digital Campaigning report, which said:

“A specific ban on any campaign spending from abroad would … strengthen the UK’s election and referendum rules.”

It quoted the DCMS committee’s February 2019 report, Disinformation and “Fake News”, which said that

“the UK is clearly vulnerable to covert digital influence campaigns”,

and the Intelligence and Security Committee report, which stated that if the commission

“is to tackle foreign interference, then it must be given the necessary legislative powers.”

These are powerful testimonies and recommendations from some very well respected committees. As a result, the CSPL recommended:

“In line with the principle of no foreign interference in UK elections, the government should legislate to ban foreign organisations or individuals from buying campaign advertising in the UK.”

This is very similar to a recommendation in the Electoral Commission’s Digital Campaigning: Increasing Transparency for Voters report of 2018, which I referred to earlier. In response, the Government said: “We are extending this”—the prohibition of foreign money—

“even further as part of the Elections Bill, to cover all third-party spending above £700 during a regulated period.”

However, the current proposals in the Bill have loopholes that foreign organisations can readily use, for instance through setting up multiple channels. A foreign actor could set up dozens of entities and spend £699 on each one—something very easy for online expenditure.

Amendment 194B would ensure that foreign entities were completely banned from participating at all and would make absolutely certain that the Government’s intentions were fulfilled. Again, I hope that the Minister will readily accept this amendment as strengthening the Bill against foreign interference.

Tackling societal harms caused by misinformation and disinformation is not straightforward, as our Joint Committee on the Online Safety Bill found. However, consistent with the report of the Lords Select Committee on Democracy and Digital Technologies, Digital Technology and the Resurrection of Trust, chaired by the much-missed Lord Puttnam, we said:

“Disinformation and Misinformation surrounding elections are a risk to democracy. Disinformation which aims to disrupt elections must be addressed by legislation. If the Government decides that the Online Safety Bill is not the appropriate place to do so, then it should use the Elections Bill which is currently making its way through Parliament.”

There is, of course, always a tension with freedom of expression; as we emphasised in our Joint Committee, we must prioritise tackling specific harmful activity over restricting content. Apart from the digital imprint provisions, however, the Bill fails to take any account of mounting evidence and concerns about the impact of misinformation and disinformation on our democracy. The long-delayed report of the Intelligence and Security Committee on Russian interference, published in July 2020, was highly relevant in this context, stating:

“The UK is clearly a target for Russia’s disinformation campaigns and political influence operations and must therefore equip itself to counter such efforts.”

Protecting our democratic discourse and processes from hostile foreign interference is a central responsibility of the Government. The committee went on, very topically, to say:

“The links of the Russian elite to the UK—especially where this involves business and investment—provide access to UK companies and political figures, and thereby a means for broad Russian influence in the UK.”

It continued:

“We note—and, again, agree with the DCMS Select Committee—that ‘the UK is clearly vulnerable to covert digital influence campaigns.’”

The online harms White Paper published in April 2019 recognised the dangers that digital technology could pose to democracy and proposed measures to tackle them. Given the extensive regulatory framework being put in place for individual online harms in the Online Safety Bill, newly published last week, why are the Government reluctant to reaffirm the White Paper approach to elections and include it in this Bill? The Government responded to our Joint Committee report on this issue last week by saying that they agreed that misinformation and disinformation surrounding elections are a risk to democracy. However, they went on to say:

“The Government has robust systems in place that bring together governmental, civil society and private sector organisations to monitor and respond to interference in whatever form it takes to ensure that our democracy stays open, vibrant and transparent”

—fine words. They cite the Defending Democracy programme, saying:

“Ahead of major democratic events, the Defending Democracy programme stands up the Election Cell. This is a strategic coordination and risk reporting structure that works with relevant organisations to identify and respond to emerging issues”.

So far, so vague. They continue:

“The Counter Disinformation Unit based in DCMS is an integral part of this structure and undertakes work to understand the extent, scope and the reach of misinformation and disinformation.”

The Government, however, seem remarkably reluctant to tell us through parliamentary Questions or FoI requests what this Counter Disinformation Unit within the DCMS is. What does it actually do? Does it have a role during elections? Given that government response, it seems clear that the net result is that the Elections Bill has, and will have, no provisions relating to misinformation and disinformation.

Amendment 194B is a start and is designed to prevent one strand of disinformation, akin to the 640,000 Facebook posts that led to the Capitol riots of 6 January last year, which not only has an immediate impact but erodes trust in future elections. The Government should pick this amendment up with enthusiasm but then introduce something much more comprehensive that meets the concerns of the ISC’s Russia report and tackles online misinformation and disinformation in election campaigns.

I would of course be very happy to discuss all these amendments and all the relevant issues with Ministers between Committee and Report stages.


Lord C-J introduces new Public Authority Algorithm Bill

I recently introduced a Private Member’s Bill in the House of Lords designed to ensure that decisions made by public authorities, local and national, are fully transparent and properly assessed for the impact they have on the rights of the individual citizen.

It mandates the Government to draw up a framework for an impact assessment, following a set of principles laid out in the Bill, so that (a) decisions made in and by a public authority are responsible and comply with procedural fairness and due process requirements, and with its duties under the Equality Act; (b) the impacts of algorithms on administrative decisions are assessed and negative outcomes are minimised; and (c) data and information on the use of automated decision systems in public authorities are made available to the public. It will apply in general to any automated decision system developed or procured by a public authority other than the security services.



Lord C-J calls for review of policy on Lethal Autonomous Weapons

The debate on the limitation of Lethal Autonomous Weapons has hotted up, especially in the light of the Government's new Defence AI strategy.

This is what I said, prior to the strategy being published, when the Armed Forces Bill went through the House of Lords.


We eagerly await the defence AI strategy coming down the track but, as the noble Lord said, the very real fear is that autonomous weapons will undermine the international laws of war, and the noble and gallant Lord made clear the dangers of that. In consequence, a great number of questions arise about liability and accountability, particularly in criminal law. Such questions are important enough in civil society, and we have an AI governance White Paper coming down the track, but in military operations it will be crucial that they are answered.

From the recent exchange that the Minister had with the House on 1 November during an Oral Question that I asked about the Government’s position on the control of lethal autonomous weapons, I believe that the amendment is required more than ever. The Minister, having said:

“The UK and our partners are unconvinced by the calls for a further binding instrument”

to limit lethal autonomous weapons, said further:

“At this time, the UK believes that it is actually more important to understand the characteristics of systems with autonomy that would or would not enable them to be used in compliance with”

international humanitarian law,

“using this to set out potential norms of use and positive obligations.”

That seems to me to be a direct invitation to pass this amendment. Any review of this kind should be conducted in the light of day, as we suggest in the amendment, in a fully accountable manner.

However, later in the same short debate, as noted by the noble Lord, Lord Browne, the Minister reassured us, as my noble friend Lady Smith of Newnham noted in Committee, that:

“UK Armed Forces do not use systems that employ lethal force without context-appropriate human involvement.”

Later, the Minister said:

“It is not possible to transfer accountability to a machine. Human responsibility for the use of a system to achieve an effect cannot be removed, irrespective of the level of autonomy in that system or the use of enabling technologies such as AI.”—[Official Report, 1/11/21; col. 994-95.]

The question remains: does that mean that there will always be a human in the loop and that a fully autonomous weapon will never be deployed? If the legal duties are to remain the same for our Armed Forces, these weapons must surely remain under human control at all times, and there must never be autonomous deployment.

However, that has recently been directly contradicted. The noble Lord, Lord Browne, has described the rather chilling Times podcast interview with General Sir Richard Barrons, the former Commander Joint Forces Command. He contrasted the military role of what he called “soft-body humans”—I must admit, a phrase I had not encountered before—with that of autonomous weapons, and confirmed that weapons can now apply lethal force without any human intervention. He said that we cannot afford not to invest in these weapons. New technologies are changing how military operations are conducted. As we know, autonomous drone warfare is already a fact of life: Turkish autonomous drones have been deployed in Libya. Why are we not facing up to that in this Bill?

I sometimes get the feeling that the Minister believes that, if only we read our briefs from the MoD diligently enough and listened hard enough, we would accept what she is telling us about the Government’s position on lethal autonomous weapons. But there are fundamental questions at stake here which remain as yet unanswered. A review of the kind suggested in this amendment would be instrumental in answering them.



Coordination of Digital Regulation Crucial

The House of Lords recently debated the report of its Select Committee on Communications and Digital entitled "Digital regulation: joined-up and accountable".

This is what I said about the shape digital regulation should take and how it could best be coordinated:

In their digital regulation plan, first published last July and updated last month, the Government acknowledged that

“Digital technologies … demand a distinct regulatory approach … because they have distinctive features which make digital businesses and applications unique and innovative, but may also challenge how we address risks to consumers and wider society.”

I entirely agree, but I also agree with the noble Baroness, Lady Stowell, the noble Lord, Lord Vaizey, and the noble Earl, Lord Erroll, that we need to do this without the kind of delays in introducing regulation that we are already experiencing.

The plan for digital regulation committed to ensuring a forward-looking and coherent regulatory approach for digital technologies. The stress throughout the plan and the digital strategy is on a light-touch and pro-innovation regulatory regime, in the belief that this will stimulate innovation. The key principles stated are “Actively promote innovation”, achieve “forward-looking and coherent outcomes” and

“Exploit opportunities and address challenges in the international arena”.

This is all very laudable and reinforced by much of what the Select Committee said in its previous report, as mentioned by the noble Baroness. But one of the key reasons why the design of digital governance and regulation is important is to ensure that public trust is developed and retained in an area where there is often confusion and misunderstanding.

With the Online Safety Bill arriving in this House soon, we know only too well that the power of social media algorithms needs taming. Retention of public trust has not been helped by confusion over the use of algorithms to take over exam assessment during the pandemic and poor communication about the use of data on things like the Covid tracing app, the GP data opt-out and initiatives such as the Government’s single-ID identifier “One Login” project, which, together with the growth of automated decision-making, live facial recognition and use of biometric data, is a real cause for concern for many of us.

The fragility of trust in government use and sharing of personal data was demonstrated when Professor Ben Goldacre recently gave evidence to the Science and Technology Committee, explaining that, despite being the Government’s lead adviser on the use of health data, he had opted out of giving permission for his GP health data to be shared.

As an optimist, I believe that new technology can potentially lead to greater productivity and more efficient use of resources. But, as the title of Stephanie Hare’s new book puts it, Technology Is Not Neutral. We should be clear about the purpose and implications of new technology when we adopt it, which means regulation which has the public’s trust. For example, freedom from bias is essential in AI systems and in large part depends on the databases we use to train AI. The UK’s national AI strategy of last September does talk about public trust and the need for trustworthy AI, but this needs to be reflected in our regulatory landscape and how we regulate. In the face of the need to retain public trust, we need to be clear, above all, that regulation is not necessarily the enemy of innovation; in fact, it can be the stimulus and key to gaining and retaining public trust around digital technology and its adoption.

We may not need to go full fig as with the EU artificial intelligence Act, but the fact is that AI is a very different animal from previous technology. For instance, not everything is covered by existing equalities or data protection legislation, particularly in terms of accountability, transparency and explainability. A considerable degree of horizontality across government, business and society is needed to embed the OECD principles.


As the UK digital strategy published this month makes clear, there is a great deal of future regulation in the legislative pipeline, although, as the noble Baroness mentioned, we are lagging behind the EU. As a number of noble Lords mentioned, we are expecting a draft digital competition Bill in the autumn which will usher in the DMU in statutory form and a new pro-competition regime for digital markets. Just this week, we saw the publication of the new Data Protection and Digital Information Bill, with new powers for the ICO. We have also seen the publication of the national AI strategy, AI action plan and AI policy statement.

In the context of increased digital regulation and the need for co-ordination across regulators, the Select Committee welcomed the formation of the Digital Regulation Cooperation Forum by the ICO, CMA, Ofcom and FCA, and so do I, alongside the work plan which the noble Baroness, Lady Stowell, mentioned. I believe that this will make a considerable contribution to public trust in regulation. It has already made great strides in building a centre of excellence in AI and algorithm audit.

UK Digital Strategy elaborates on the creation of the DRCF:

“We are also taking steps to make sure the regulatory landscape is fully coherent, well-coordinated and that our regulators have the capabilities they need … Through the DRCF’s joint programme of work, it has a unique role to play in developing our pro-innovation approach to regulation.”

Like the Select Committee in one of its key recommendations, I believe we can go further in ensuring a co-ordinated approach to digital regulation, horizon scanning—which has been mentioned by all noble Lords—and adapting to future regulatory needs and oversight of fitness for purpose, particularly the desirability of a statutory duty to co-operate and consult with one another. It is a proposal which the Joint Committee on the Draft Online Safety Bill, of which I was a member, took up with enthusiasm. We also agreed with the Select Committee that it should be put on a statutory footing, with the power to resolve conflicts by directing its members. I was extremely interested to hear from noble Lords, particularly the noble Lord, Lord Vaizey, and the noble Earl, Lord Erroll, about the circumstances in which those conflicts need to be resolved. It is notable that the Government think that that is a bridge too far.

This very week, the Alan Turing Institute published a very interesting report entitled Common Regulatory Capacity for AI. As it says, the use of artificial intelligence is increasing across all sectors of the economy, which raises important and pressing questions for regulators. Its very timely report presents the results of research into how regulators can meet the challenge of regulating activities transformed by AI and maximise the potential of AI for regulatory innovation.

It takes the arguments of the Select Committee a bit further and goes into some detail on the capabilities required for the regulation of AI. Regulators need to be able to ensure that regulatory regimes are fit for AI and that they are able to address AI-related risks and maintain an environment that encourages innovation. It stresses the need for certainty about regulatory expectations, public trust in AI technologies and the avoidance of undue regulatory obstacles.


Regulators also need to understand how to use AI for regulation. The institute also believes that there is an urgent need for an increased and sustainable form of co-ordination on AI-related questions across the regulatory landscape. It highlights the need for access to new sources of shared AI expertise, such as the proposed AI and regulation common capacity hub, which

“would have its home at a politically independent institution, established as a centre of excellence in AI, drawing on multidisciplinary knowledge and expertise from across the national and international research community.”

It sets out a number of different roles for the newly created hub.

To my mind, these recommendations emphasise the need for the DRCF to take statutory form in the way suggested by the Select Committee. But, like the Select Committee, I believe that it is important that other regulators can come on board the DRCF. Some of them are statutory, such as the Gambling Commission, the Electoral Commission and the IPO, and I think it would be extremely valuable to have them on board. However, some of them are non-statutory, such as the BBFC and the ASA. They could have a place at the table and join in benefiting from the digital centre of excellence being created.

Our Joint Committee also thoroughly agreed with the Communications and Digital Committee that a new Joint Committee on digital regulation is needed in the context of the Online Safety Bill. Indeed, the Secretary of State herself has expressed support. As the Select Committee recommended, this could cover the broader digital landscape: partly overseeing the work of the DRCF, but also, importantly, addressing other objectives such as scrutiny of the Secretary of State, looking across the digital regulation landscape and horizon scanning for evolving challenges, which our Joint Committee and the Select Committee both considered very important.

The Government are engaged in a great deal of activity. The question, as ever, is whether the objectives, such as achieving trustworthy AI, digital upskilling and powers for regulators, are going to be achieved through the actions being taken so far. I believe that the recommendations of the Select Committee set out in this report would make a major contribution to ensuring effective and trustworthy regulation and should be supported.


Broadband and 5G rollout strategy needs review

During the passage of the Product Security and Telecommunications Infrastructure Bill it has become clear that the Government's rollout strategy keeps being changed and is unlikely to achieve its objectives, especially in rural areas. This is what I said when supporting a review.

We all seem to be trapped in a time loop on telecoms, with continual consultations and changes to the ECC and continual retreat by the Government on their 1 gigabit per second broadband rollout pledge. In the Explanatory Notes, we were at 85% by 2025; this now seems to have shifted to 2026. There has been much government bravado in this area, but it is clear that the much-trumpeted £5 billion announced last year for Project Gigabit, to bring gigabit coverage to the hardest-to-reach areas, has not yet been fully allocated and that barely a penny has been spent.

Then, we have all the access and valuation amendments to the Electronic Communications Code and the Digital Economy Act 2017. Changes to the ECC were meant to do the trick; then, the Electronic Communications and Wireless Telegraphy (Amendment) (European Electronic Communications Code and EU Exit) Regulations were heralded as enabling a stronger emphasis on incentivising investment in very high capacity networks, promoting the efficient use of spectrum, ensuring effective consumer protection and engagement, and supporting the Government’s digital ambitions and plans to deliver nationwide gigabit-capable connectivity.

Then we had the Future Telecoms Infrastructure Review. We had the Telecommunications Infrastructure (Leasehold Property) Act—engraved on all our hearts, I am sure. We argued about the definition of tenants, rights of requiring installation and rights of entry, and had some success. Sadly, we were not able to insert a clause that would have required a review of the Government’s progress on rollout. Now we know why. Even while that Bill was going through in 2021, we had Access to Land: Consultation on Changes to the Electronic Communications Code. We knew then, from the representations made, that the operators were calling for other changes not included in the Telecommunications Infrastructure (Leasehold Property) Act or the consultation. From the schedule the Minister has sent us, we know that he has been an extremely busy bee with yet further discussions and consultations.

I will quote from a couple of recent Financial Times pieces demonstrating that, with all these changes, the Government are still not achieving their objectives. The first is headed: “Broadband market inequalities test Westminster’s hopes of levelling up: Disparity in access to fast internet sets back rural and poorer areas, data analysis shows”. It starts:

“The UK has nearly 5mn houses with more than three choices of ultrafast fibre-optic broadband, while 10mn homes do not have a single option, according to analysis that points to the inequality in internet infrastructure across Britain.

While some parts of the country are benefiting from high internet speeds, others have been left behind, according to research conducted by data group Point Topic with the Financial Times, leading to disparities in people’s ability to work, communicate and play.”

A more recent FT piece from the same correspondent, Anna Gross, is headed: “UK ‘altnets’ risk digging themselves into a hole: Overbuilding poses threat to business model of fibre broadband groups challenging the big incumbents”. It starts:

“Underneath the UK’s streets, a billion-pound race is taking place. In many towns and cities, at least three companies are digging to lay broadband fibre cables all targeting the same households, with some areas predicted to have six or seven such lines by the end of the decade.

But only some of them will cross the finishing line … When the dust settles, will there be just two network operators—with Openreach and Virgin Media O2 dominating the landscape—or is there space for a sparky challenger with significant market share stolen from the incumbents?”

Are we now in a wild west for the laying of fibre-optic cable? Will this be like the energy market, with great numbers of companies going bust?

By contrast, INCA, the Independent Networks Cooperative Association, reports in its latest update:

“The ‘AltNets’ have more than doubled their footprint year on year since 2019”—

I think my noble friend Lord Fox quoted these figures—

“now reaching 5.5m premises and expected to reach 11.5m premises by the end of this year. Investment remains buoyant with an additional £5.7bn committed during 2021 bringing total estimated investment in the independent sector to £17.7bn for the period to 2025.”

We have two very different stories there. What contingencies have the Government made? Who will pick up the tab if the former scenario is correct—the poor old consumer? In any event, will rural communities get any service in the end?

What of rural broadband rollout? It appears that DCMS is currently assessing policy options on the means of best addressing the shortfall. I was interested to hear the very pointed question that the noble Baroness, Lady Merron, asked about what working groups were examining some of these issues, following a call for evidence on improving broadband for very hard-to-reach areas. What is the department actually doing? Can we expect more changes to the ECC?

The policy justification for the 2017 reforms was that rent savings by operators would be reinvested in networks, with the then Minister saying that the Government would hold operators’ feet to the fire to ensure that they delivered, noting that to

“have real impact, savings must be invested in expanding network infrastructure”.—[Official Report, 31/1/17; col. 1157.]

and saying that the revised code secured real investment. This was supported by confirmation, in the impact assessment accompanying the reforms to the ECC in 2017, that the Government would review the impact of the policy by June 2022. But this has not been met, despite the Government’s future infrastructure review confirming that they were already considering undertaking a formal review of the code reforms to assess their impact in 2019. The Government’s decision to introduce new legislation proves that the 2017 reforms have not actually achieved their aims.

Instead of leading to faster and easier deployment, as we have heard today, changes to the rights given to operators under the code have stopped the market working as it should and led to delays in digital rollout, as well as eroding private property rights. This has resulted in small businesses facing demands for rent reductions of over 90%; a spike in mobile network operators bringing protracted litigation; failure by mobile operators to reinvest their savings in mobile infrastructure; and delayed 5G access for up to 9 million people, at a cost of over £6 billion to the UK economy. The Government’s legislation and their subsidies now show they know the reforms have failed. That is why they are passing new legislation to revise the code, as well as announcing £500 million in new subsidies for operators through the shared rural network.

In Committee in the other place, the Minister, Julia Lopez, claimed:

“If a review takes place, stakeholders will likely delay entering into agreements to enable the deployment of infrastructure. Only when the review has concluded and it is clear whether further changes are to be made to the code will parties be prepared to make investment or financial commitments”.—[Official Report, Commons, Product Security and Telecommunications Infrastructure Bill Committee, 22/3/22; col. 122.]

In addition to there being no evidence for this claim, this extraordinary line of reasoning would allow the Government to escape scrutiny and commitments in a wide variety of policy areas, were it applied more broadly. To maintain public faith in policy-making, it is vital that there is an accessible evidence base on which decisions are made. The Government’s decisions in this Bill do not meet the standard.

Moreover, I know that Ministers are sceptical about the Centre for Economics and Business Research’s report. The noble Lord, Lord Parkinson, has said that it oversimplifies the issue, but I do not believe that the Government have properly addressed some of the issues raised in it. The CEBR is an extremely reputable organisation and although the research was commissioned by Protect and Connect, the Government need to engage in that respect.

Our amendment would insert a new clause obliging the Government to commission an independent review of the impact of the legislation and prior reforms within 18 months. The review would assess the legislation’s impact on the rate of additional investment in mobile networks and infrastructure deployment, the costs borne by property owners and the wider benefit or costs of the legislation. It would also oblige the Government to publish a response to the review within 12 weeks of its publication and lay that before Parliament, to ensure parliamentary accountability for the Government’s action and to allow debate.

Another amendment would insert a new clause placing obligations on operators to report certain information to Ofcom each year. Operators would have to report on such information as their overall investment in mobile networks, the rent paid to site providers, the number of new mobile sites built within the UK, and upgrades and renewals.

It is the final group in Committee, so where in all this—as my noble friend Lord Fox and I have been asking each time we debate these issues—are the interests of the consumer, especially the rural consumer? How are they being promoted, especially now that market review is only once every five years? That is why we need these reviews in these amendments. We tried in the last Bill to make the Government justify their strategy. Now it is clear that changes to the ECC are not fit for purpose and we will try again to make the Government come clean on their strategy.


Government AI Procurement needs ethical and data compliance obligation

The Procurement Bill lacks any kind of obligation on the Government to ensure that AI systems procured comply with ethical and data protection principles, despite numerous guidelines having been issued. This is what I said when proposing a new clause designed to ensure this.

In our report AI in the UK: Ready, Willing and Able?, our AI Lords Select Committee, which I chaired, expressed its strong belief in the value of procurement by the public sector of AI applications. However, as a recent research post put it:

“Public sector bodies in several countries are using algorithms, AI, and similar methods in their administrative functions that have sometimes led to bad outcomes that could have been avoided.”

The solution is:

“In most parliamentary democracies, a variety of laws and standards for public administration combine to set enough rules to guide their proper use in the public sector.”

The challenge is to work out what is lawful, safe and effective to use.

The Government clearly understand this, yet one of the baffling and disappointing aspects of the Bill is the lack of connection to the many government guidelines applying to the procurement and use of tech, such as artificial intelligence and the use and sharing of data by those contracting with government. It is unbelievable, but it is almost as if the Government wanted to be able to issue guidance on the ethical aspects of AI and data without at the same time being accountable if those guidelines are breached and without any duty to ensure compliance.

There is no shortage of guidance available. In June 2020, the UK Government published guidelines for artificial intelligence procurement, which were developed by the UK Government’s Office for Artificial Intelligence in collaboration with the World Economic Forum, the Government Digital Service, the Government Commercial Function and the Crown Commercial Service. The UK was trumpeted as the first Government to pilot these procurement guidelines. Their purpose is to provide central government departments and other public sector bodies with a set of guiding principles for purchasing AI technology. They also cover guidance on tackling challenges that may occur during the procurement process. In connection with this project, the Office for AI also co-created the AI procurement toolkit, which provides a guide for the public sector globally to rethink the procurement of AI.

As the Government said on launch,

“Public procurement can be an enabler for the adoption of AI and could be used to improve public service delivery. Government’s purchasing power can drive this innovation and spur growth in AI technologies development in the UK.

As AI is an emerging technology, it can be more difficult to establish the best route to market for your requirements, to engage effectively with innovative suppliers or to develop the right AI-specific criteria and terms and conditions that allow effective and ethical deployment of AI technologies.”

The guidelines set out a number of AI-specific considerations within the procurement process:

“Include your procurement within a strategy for AI adoption … Conduct a data assessment before starting your procurement process … Develop a plan for governance and information assurance … Avoid Black Box algorithms and vendor lock in”,

to name just a few. The considerations in the guidelines and the toolkit are extremely useful and reassuring, although not as comprehensive or risk-based as some of us would like, but where does any duty to adhere to the principles reflecting them appear in the Bill?

There are many other sets of guidance applicable to the deployment of data and AI in the public sector, including the Technology Code of Practice, the Data Ethics Framework, the guide to using artificial intelligence in the public sector, the data open standards and the algorithmic transparency standard. There is the Ethics, Transparency and Accountability Framework, and this year we have the Digital, Data and Technology Playbook, which is the government guidance on sourcing and contracting for digital, data and technology projects and programmes. There are others in the health and defence sectors. It seems that all these are meant to be informed by the OECD’s and the G20’s ethical principles, but where is the duty to adhere to them?

It is instructive to read the recent government response to Technology Rules?, the excellent report from the Justice and Home Affairs Committee, chaired by my noble friend Lady Hamwee. That response, despite some fine-sounding phrases about responsible, ethical, legitimate, necessary, proportionate and safe AI, displays a marked reluctance to be subject to specific regulation in this area. Procurement and contract guidelines are practical instruments to ensure that public sector authorities deploy AI-enabled systems that comply with fundamental rights and democratic values, but without any legal duty backing up the various guidelines, how will they add up to a row of beans beyond fine aspirations? It is quite clear that the missing link in the chain is the lack of a legal duty to adhere to these guidelines.

My amendment is formulated in general terms to allow for guidance to change from time to time, but the intention is clear: to make sure that the Government turn aspiration into action and to prompt them to adopt a legal duty and a compliance mechanism, whether centrally via the CDDO, or otherwise.

 


Debate on AI in the UK: No Room For Complacency report

Recently the House of Lords belatedly debated the follow-up report to the original House of Lords AI Committee report: AI in the UK: No Room for Complacency. This is how I introduced it:

My Lords, the Liaison Committee report No Room for Complacency was published in December 2020, as a follow-up to our AI Select Committee report, AI in the UK: Ready, Willing and Able?, published in April 2018. Throughout both inquiries and right up until today, the pace of development here and abroad in AI technology, and the discussion of AI governance and regulation, has been extremely fast moving. Today, just as then, I know that I am attempting to hit a moving target. Just take, for instance, the announcement a couple of weeks ago about the new Gato—the multipurpose AI which can do 604 functions —or perhaps less optimistically, the Clearview fine. Both have relevance to what we have to say today.

First, however, I say a big thank you to the then Liaison Committee for the new procedure which allowed our follow-up report and to the current Lord Speaker, Lord McFall, in particular and those members of our original committee who took part. I give special thanks to the Liaison Committee team of Philippa Tudor, Michael Collon, Lucy Molloy and Heather Fuller, and to Luke Hussey and Hannah Murdoch from our original committee team who more than helped bring the band, and our messages, back together.

So what were the main conclusions of our follow-up report? What was the government response, and where are we now? I shall tackle this under five main headings. The first is trust and understanding. The adoption of AI has made huge strides since we started our first report, but the trust issue still looms large. Nearly all our witnesses in the follow-up inquiry said that engagement continued to be essential across business and society in particular to ensure that there is greater understanding of how data is used in AI and that government must lead the way. We said that the development of data trusts must speed up. They were the brainchild of the Hall-Pesenti report back in 2017 as a mechanism for giving assurance about the use and sharing of personal data, but we now needed to focus on developing the legal and ethical frameworks. The Government acknowledged that the AI Council’s roadmap took the same view and pointed to the ODI work and the national data strategy. However, there has been too little recent progress on data trusts. The ODI has done some good work, together with the Ada Lovelace Institute, but this needs taking forward as a matter of urgency, particularly guidance on the legal structures. If anything, the proposals in Data: A New Direction, presaging a new data reform Bill in the autumn, which propose watering down data protection, are a backward step.

More needs to be done generally on digital understanding. The digital literacy strategy needs to be much broader than digital media, and a strong digital competition framework has yet to be put in place. Public trust has not been helped by confusion and poor communication about the use of data during the pandemic, and initiatives such as the Government’s single identifier project, together with automated decision-making and live facial recognition, are a real cause for concern that we are approaching an all-seeing state.

My second heading is ethics and regulation. One of the main areas of focus of our committee throughout has been the need to develop an appropriate ethical framework for the development and application of AI, and we were early advocates for international agreement on the principles to be adopted. Back in 2018, the committee took the view that blanket regulation would be inappropriate, and we recommended an approach to identify gaps in the regulatory framework where existing regulation might not be adequate. We also placed emphasis on the importance of regulators having the necessary expertise.

In our follow-up report, we took the view that it was now high time to move on to agreement on the mechanisms on how to instil what are now commonly accepted ethical principles—I pay tribute to the right reverend Prelate for coming up with the idea in the first place—and to establish national standards for AI development and AI use and application. We referred to the work that was being undertaken by the EU and the Council of Europe, with their risk-based approaches, and also made recommendations focused on development of expertise and better understanding of risk of AI systems by regulators. We highlighted an important advisory role for the Centre for Data Ethics and Innovation and urged that it be placed on a statutory footing.

We welcomed the formation of the Digital Regulation Cooperation Forum. It is clear that all the regulators involved—I apologise for the initials in advance—the ICO, CMA, Ofcom and the FCA, have made great strides in building a centre of excellence in AI and algorithm audit and making this public. However, despite the publication of the National AI Strategy and its commitment to trustworthy AI, we still await the Government’s proposals on AI governance in the forthcoming White Paper.

It seems that the debate within government about whether to have a horizontal or vertical sectoral framework for regulation still continues. However, it seems clear to me, particularly for accountability and transparency, that some horizontality across government, business and society is needed to embed the OECD principles. At the very least, we need to be mindful that the extraterritoriality of the EU AI Act means a level of regulatory conformity will be required and that there is a strong need for standards of impact, as well as risk assessment, audit and monitoring, to be enshrined in regulation to ensure, as techUK urges, that we consider the entire AI lifecycle.

We need to consider particularly what regulation is appropriate for those applications which are genuinely high risk and high impact. I hope that, through the recently created AI standards hub, the Alan Turing Institute will take this forward at pace. All this has been emphasised by the debate on the deployment of live facial recognition technology, the use of biometrics in policing and schools, and the use of AI in criminal justice, recently examined by our own Justice and Home Affairs Committee.

My third heading is government co-ordination and strategy. Throughout our reports we have stressed the need for co-ordination between a very wide range of bodies, including the Office for Artificial Intelligence, the AI Council, the CDEI and the Alan Turing Institute. On our follow-up inquiry, we still believed that more should be done to ensure that this was effective, so we recommended a Cabinet committee which would commission and approve a five-year national AI strategy, as did the AI road map.

In response, the Government did not agree to create a committee but they did commit to the publication of a cross-government national AI strategy. I pay tribute to the Office for AI, in particular its outgoing director Sana Khareghani, for its work on this. The objectives of the strategy are absolutely spot on, and I look forward to seeing the national AI strategy action plan, which it seems will show how cross-government engagement is fostered. However, the Committee on Standards in Public Life—I am delighted that the noble Lord, Lord Evans, will speak today—report on AI and public standards made the deficiencies in common standards in the public sector clear.

Subsequently, we now have an ethics, transparency and accountability framework for automated decision-making in the public sector, and more recently the CDDO-CDEI public sector algorithmic transparency standard, but there appears to be no central and local government compliance mechanism and little transparency in the form of a public register, and the Home Office appears to be still a law unto itself. We have AI procurement guidelines based on the World Economic Forum model but nothing relevant to them in the Procurement Bill, which is being debated as we speak. I believe we still need a government mechanism for co-ordination and compliance at the highest level.

The fourth heading is impact on jobs and skills. Opinions differ over the potential impact of AI but, whatever the chosen prognosis, we said there was little evidence that the Government had taken a really strategic view about this issue and the pressing need for digital upskilling and reskilling. Although the Government agreed that this was critical and cited a number of initiatives, I am not convinced that the pace, scale and ambition of government action really matches the challenge facing many people working in the UK.

The Skills and Post-16 Education Act, with its introduction of a lifelong loan entitlement, is a step in the right direction and I welcome the renewed emphasis on further education and the new institutes of technology. The Government refer to AI apprenticeships, but apprentice levy reform is long overdue. The work of local digital skills partnerships and digital boot camps is welcome, but they are greatly underresourced and only a patchwork. The recent Youth Unemployment Select Committee report Skills for Every Young Person noted the severe lack of digital skills and the need to embed digital education in the curriculum, as did the AI road map. Alongside this, we shared the priority of the AI Council road map for more diversity and inclusion in the AI workforce and wanted to see more progress.

At the less rarefied end, although there are many useful initiatives on foot, not least from techUK and Global Tech Advocates, it is imperative that the Government move much more swiftly and strategically. The All-Party Parliamentary Group on Diversity and Inclusion in STEM recommended in a recent report a STEM diversity decade of action. As mentioned earlier, broader digital literacy is crucial too. We need to learn how to live and work alongside AI.

The fifth heading is the UK as a world leader. It was clear to us that the UK needs to remain attractive to international research talent, and we welcomed the Global Partnership on AI initiative. The Government in response cited the new fast-track visa, but there are still strong concerns about the availability of research visas for entrance to university research programmes. The failure to agree and lack of access to EU Horizon research funding could have a huge impact on our ability to punch our weight internationally.

How the national AI strategy is delivered in terms of increased R&D and innovation funding will be highly significant. Of course, who knows what ARIA may deliver? In my view, key weaknesses remain in the commercialisation and translation of AI R&D. The recent debate on the Science and Technology Committee’s report on catapults reminded us that this aspect is still a work in progress.

Recent Cambridge round tables have confirmed to me that we have a strong R&D base and a growing number of potentially successful spin-outs from universities, with the help of their dedicated investment funds, but when it comes to broader venture capital culture and investment in the later rounds of funding, we are not yet on a par with Silicon Valley in terms of risk appetite. For AI investment, we should now consider something akin to the dedicated film tax credit which has been so successful to date.

Finally, we had, and have, the vexed question of lethal autonomous weapons, which we raised in the original Select Committee report and in the follow-up, particularly in the light of the announcement at the time of the creation of the autonomy development centre in the MoD. Professor Stuart Russell, who has long campaigned on this subject, cogently raised the limitation of these weapons in his second Reith Lecture. In both our reports we said that one of the big disappointments was the lack of definition of “autonomous weapons”. That position subsequently changed, and we were told in the Government’s response to the follow-up report that NATO had agreed a definition of “autonomous” and “automated”, but there is still no comprehensive definition of lethal autonomous weapons, despite evidence that they have clearly already been deployed in theatres such as Libya, and the UK has firmly set its face against LAWS limitation in international fora such as the CCW.

For a short report, our follow-up report covered a great deal of ground, which I have tried to cover at some speed today. AI lies at the intersection of computer science, moral philosophy, industrial education and regulatory policy, which makes how we approach the risks and opportunities inherent in this technology vital and difficult. The Government are engaged in a great deal of activity. The question, as ever, is whether it is focused enough and whether the objectives, such as achieving trustworthy AI and digital upskilling, are going to be achieved through the actions taken so far. The evidence of success is clearly mixed. Certainly there is still no room for complacency. I very much look forward to hearing the debate today and to what the Minister has to say in response. I beg to move.


Government should use procurement process to secure good work

Recently, in the context of its duties under the Procurement Bill, I argued for an obligation on the Government to have regard to the need to secure good work for those carrying out contracts under its procurement activities. This is what I said:

My own interests, and indeed concerns, in this area go back to the House of Lords Select Committee on AI. I chaired this ad hoc inquiry, which produced two reports: AI in the UK: Ready, Willing and Able? and a follow-up report via the Liaison Committee, AI in the UK: No Room for Complacency, which I mentioned in the debate on a previous group.

The issue of the adoption of AI and its relationship to the augmentation of human employment or substitution is key. We were very mindful of the Frey and Osborne predictions in 2013, which estimated that 47% of US jobs are at risk of automation—since watered down—relating to the sheer potential scale of automation over the next few years through the adoption of new technology. The IPPR in 2017 was equally pessimistic. Others, such as the OECD, have been more optimistic about the job-creation potential of these new technologies, but it is notable that the former chief economist of the Bank of England, Andrew Haldane, entered the prediction game not long ago with a rather pessimistic outlook.

Whatever the actual outcome, we said in our report that we need to prepare for major disruption in the workplace. We emphasised that public procurement has a major role in terms of the consequences of AI adoption on jobs and that risk and impact assessments need to be embedded in the tender process.

The noble Lord, Lord Knight, mentioned the All-Party Parliamentary Group on the Future of Work, which, alongside the Institute for the Future of Work, has produced some valuable reports and recommendations in the whole area of the impact of new technology on the workplace. In their reports—the APPG’s The New Frontier and the institute’s Mind the Gap—they recommend that public authorities be obliged to conduct algorithmic impact assessments as a systematic approach to and framework for accountability and as a regulatory tool to enhance the accountability and transparency of algorithmic systems. I tried to introduce in the last Session a Private Member’s Bill that would have obliged public authorities to complete an algorithmic impact assessment where they procure or develop an automated decision-making system, based on the Canadian directive on artificial intelligence’s impact assessments and the 2022 US Algorithmic Accountability Act.

In particular, we need to consider the consequences for work and working people, as well as the impact of AI on the quality of employment. We also need to ensure that people have the opportunity to reskill and retrain so that they can adapt to the evolving labour market caused by AI. The all-party group said:

“The principles of Good Work should be recognised as fundamental values … to guide development and application of a human-centred AI Strategy. This will ensure that the AI Strategy works to serve the public interest in vision and practice, and that its remit extends to consider the automation of work.”

The Institute for the Future of Work’s Good Work Charter is a useful checklist of AI impacts for risk and impact assessments—for instance, in a workplace context, issues relating to

“access … fair pay … fair conditions … equality … dignity … autonomy … wellbeing … support”

and participation. The noble Lord, Lord Knight, and the noble Baroness, Lady Bennett, have said that these amendments would ensure that impacts on the creation of good, local jobs and other impacts in terms of access to, terms of and quality of work are taken into account in the course of undertaking public procurement.

As the Work Foundation put it in a recent report,

“In many senses, insecure work has become an accepted part of the UK’s labour market over the last 20 years. Successive governments have prioritised raising employment and lowering unemployment, while paying far less attention to the quality and security of the jobs available.”

The Taylor review of modern working practices, Good Work—an independent report commissioned by the Department for Business, Energy and Industrial Strategy that remains largely unimplemented—concluded that there is a need to provide a framework that better reflects the realities of the modern economy and the spectrum of work carried out.

The Government have failed to legislate to ensure that we do not move even further down the track towards a preponderantly gig economy. It is crucial that they use their procurement muscle to ensure, as in Good Work, that these measures are taken on every major public procurement involving AI and automated decision-making.


Music Touring: The problems remain

The Earl of Clancarty recently initiated a debate on music touring. Many of us have been campaigning for a number of years to ensure that the huge impact of Brexit on touring by music artists and other performers and creative artists is mitigated.

This is what I said:

As we have continuously emphasised in the last two years, we are talking about not only touring by the music industry—one of the most successful and fastest growing sectors, where real jobs and livelihoods now risk being lost—but by a number of other important parts of the creative sector as well: museums, theatre and the wider visual arts sector, as described by the Contemporary Visual Arts Network, and indeed the sports sector, as described by the noble Lord, Lord Moynihan. The ramifications are very broad. The right reverend Prelate reminded us that this impacts on levelling up and on values. We heard from the noble Baroness, Lady Fleet, about the impact on the talent pipeline and the potential to impact on communities through music education.

The dual registration deal on cabotage, which we have debated previously, falls short of satisfying the greater number of smaller specialist hauliers and own-account operators—it was described as a sticking plaster by my noble friend Lord German, and he is correct. On these Benches, we pointed out that the issues on cabotage were just one part of a huge cloud now hanging over the creative sector as a result of Brexit. The noble Viscount, Lord Stansgate, my noble friend Lord Strasburger and the noble Lord, Lord Hannay, all described that, including the requirement for work permits or visa exemptions in many EU countries, CITES certificates for musical instruments, ATA carnets for all instruments and equipment, and proof of origin requirements for merchandise. It is a real return to the past, as described by my noble friend Lord Jones.

The failure to secure a reciprocal exemption to permit freedom of movement for creatives on tour or short-term paid engagements and their support staff when we left the EU has been catastrophic for UK and EU touring creatives. The sheer disparity of treatment was described by my noble friend Lord German. As the noble Lord, Lord Hannay, said, it was very clear from the outset that that would be the impact.

The reason we are in this mess is that the Home Office refused to grant particular categories of EU citizens, including sportspersons or artists performing an activity on an ad hoc basis, the right to 90 days' permitted paid engagement, and so the EU would not reciprocate. We are still pursuing freedom of information requests to find out exactly what the UK Government put forward. The problems with merchandise, carnets and CITES are, if anything, worse, as described by a number of noble Lords. As the noble Baroness, Lady Bull, confirmed, the ISM says:

“In fact, almost nothing has changed since the TCA came into effect, as recent accounts from musicians resuming EU tours have demonstrated.”

As the Classical Music APPG, LIVE, UK Music, the ISM and many others have advocated, what is urgently needed are permanent solutions which will secure the kind of future that the noble Viscount, Lord Stansgate, referred to.

Some require bilateral negotiation and some can be done unilaterally through greater engagement, but the key to this is multilateral action. As a number of noble Lords have said, we need more productive, collaborative relationships. This was mentioned by the noble Lords, Lord Hannay and Lord Cormack, my noble friend Lord German and the noble Baroness, Lady Bull. The noble Baroness made some very constructive, detailed suggestions about how we can get to that point on those multilateral negotiations. We need comprehensive negotiation on road haulage for cultural purposes, a cultural waiver in relation to ATA carnets and CITES, and a visa waiver agreement.

There is a very depressing letter from former Minister Lopez to my colleague in the Commons Jamie Stone, which sets out very few constructive proposals. I hope the Minister here today does rather better. Will we get the kind of new beginning that the noble Lord, Lord Cormack, mentioned? We need something simple and effective.

 

A couple of weeks earlier I had an exchange with Baroness Vere, the Transport Minister, when I asked the following Question. The Government's response was clearly totally unsatisfactory.

Music Touring

Lord Clement-Jones To ask Her Majesty’s Government, further to their announcement on 6 May regarding “dual registration” for specialist touring hauliers, what assessment they have made of the impact this will have on artists and organisations which tour in their own vehicles and operate under “own account”; and whether they have considered support for smaller hauliers which do not have the resources to operate dual registration.

The Parliamentary Under-Secretary of State, Department for Transport (Baroness Vere of Norbiton) (Con)

My Lords, specialist touring hauliers operating under “own account” can utilise the dual-registration measure if they have a standard international operator licence, which they must apply for, and a base in Great Britain and another country. Operators will need to make their own decisions on whether they choose to do so based on business need and resources available to them.

Lord Clement-Jones (LD)

My Lords, this is all very much half a loaf. If a comprehensive solution is not found, the damage to the UK music industry and the events support industry will be massive. The Prime Minister has assured us that the Government are working “flat out” on the touring issue. Can the Minister assure the House that her department is urgently working on finding a wider solution, such as an exemption from cabotage for all trucks engaged on cultural events?

Baroness Vere of Norbiton (Con)

Certainly, the department has worked incredibly hard on this and continues to do so. We had a public consultation back in February, and we are deeply engaged with the industry, particularly the specialist haulage industry, which is so important. We know that about one in five hauliers has already set up within the EU, and many more have plans to do so. We recognise that the dual-registration system will not benefit absolutely everybody. However, it is the case under the TCA that many hauliers will be able to make use of their two cross-trades within the bilateral EU-UK movements that they can make. So it does not mean that all touring is off the table. We believe that, at the moment, we have the best possible solution, in light of the current response from the EU.

Lord Clement-Jones 

My Lords, is the gist of what the Minister has said today that everything is satisfactory and nothing further needs to be done?

Baroness Vere of Norbiton 

I completely reject that—that is not what I am saying at all. The Government absolutely recognise that the measures that we have put in place help the sector and mean that a large proportion of the UK industry can continue to operate, but we acknowledge that not all specialist operators will be in a position to establish a base overseas. As I have said before, our door remains open; we would wish to discuss this with the EU but so far, unfortunately, it has not wanted to do so.

 


We need to end the confusion and build public trust over health data

During recent debates on the Health and Care Bill I helped move amendments designed to end the confusion over the use and sharing of health data. The complications involved in using health data for public benefit and the lack of public engagement have led to a massive loss of trust. The transfer of health data responsibilities to NHS England from NHS Digital without proper consultation and the GP data opt-out fiasco are particular examples of how public trust can be lost. The review by Professor Ben Goldacre has recognised this and, I hope, will lead to a much more comprehensive and clear framework for the protection and use of health data.

 

The first debate was on the subject of the digital transformation of the health service generally.

I start by warmly thanking the noble Lord, Lord Hunt of Kings Heath, for allowing me to speak to and lead on this set of amendments, to which his is the leading name. By the same token, I am delighted to see that he is now back in his place and able to advocate much more knowledgeably than I can the merits of the amendments in this group, which relate to the digital aspects of the NHS and the importance of digital transformation in the health service. They are designed to ensure that a digital transformation duty is set out, five-year plans are made, digital issues are high up on the agenda of the ICBs, and progress in this area is assessed and reported on.

I am sorry that I was not able to contribute at Second Reading on digital or data matters. However, as Chris Hopson, chief executive of NHS Providers, said in his Observer piece two Sundays ago,

“we need a national transformation programme that embeds modern technology, 21st century medicine, integrated care closer to home and much greater emphasis on prevention at the heart of our health and care system.”

There is huge potential for technology to help health and care professionals to communicate better and to enable people to access the care they need quickly and easily when it suits them. Quite apart from its impact on planning and administration, the technology, as the NHSE digital transformation website emphasises, goes all the way from ambulance iPads through fitness apps to digital home care technology. It ranges from websites and apps that make care and advice easy to access wherever you are to connected computer systems that give NHS staff the test results, history and evidence they need to make the best decisions for patients.

As the recent Wade-Gery report points out:

“Digital technology is transforming every industry including healthcare. Digital and data have been used to redesign services, raising citizen expectations about self-service, personalisation, and convenience, and increasing workforce productivity.”

It says that the NHS should be in the vanguard. It goes on to say:

“The pandemic has accelerated the shift to online and changed patient expectations and clinical willingness to adopt new ways of working.”

It also says that

“the vaccine programme, supported by so many brilliant volunteers and staff, was only possible through the use of advanced data analytics to drive the risk stratification, population segmentation and operational rollout.”

However, the review also says:

“The need is compelling. The NHS faces unprecedented demand and severe operational pressure as we emerge from the coronavirus pandemic, and we need new ways of working to address this. Now is the moment to put data, digital and technology at the heart of  how we transform health services … Effective implementation will require a significant cultural shift away from the current siloed approach in the centre with conscious management to ensure intentions translate to reality … This system leadership should be responsible, in a partnership model between the centre and ICSs, for setting out the business and technology capability requirements of ICSs and the centre with the roadmaps to realise these, and for determining the appropriate high level technical standards, and blueprints for transformed care pathways.”

I have quoted the Wade-Gery review at length but the What Good Looks Like framework set out by NHSX last year is an important document too, designed as it is to be used to accelerate digital and data transformation. It specifies in success measure 1:

“Your ICS has a clear strategy for digital transformation and collaboration. Leaders across the ICS collectively own and drive the digital transformation journey, placing citizens and frontline perspectives at the centre. All leaders promote digitally enabled transformation to efficiently deliver safe, high quality care. Integrated Care Boards (ICBs) build digital and data expertise and accountability into their leadership and governance arrangements, and ensure delivery of the system-wide digital and data strategy.”

Wade-Gery recommends, inter alia, that we

“reorientate the focus of the centre to make digital integral to transforming care”.

In the light of all this, surely that must apply to ICBs as well.

We need to adopt the measures set out in the amendments in this group: namely, that there should be a director of digital transformation for each ICB. ICBs need clear leadership to devise, develop and deliver the digital transformation that the NHS so badly needs, in line with all the above. There also needs to be a clear duty placed on ICBs to promote digital transformation. It must be included as part of their performance assessment—otherwise, none of this will happen—and in their annual report.

The resources for digital transformation need to be available. Capital expenditure budgets for digital transformation must not be raided for other purposes, and digital transformation should take place as planned. It is clear from the Wade-Gery report that we should be doubling our NHS capital expenditure, lifting it to 5% of total NHS expenditure, as recommended by the noble Lord, Lord Darzi, and the Institute for Public Policy Research back in June 2018. We should have done that by June 2022 to accord with his recommendations, but we are still suffering from chronic underinvestment in digital technology. Indeed, what are the Government’s expenditure plans on NHS digital transformation? We should be ring-fencing the 5% as firmly as we can. As Wade-Gery says:

“NHSEI should therefore as a matter of urgency determine the levels of spend on IT across the wider system and seek to re-prioritise spend from within the wider NHSE budget to support accelerated digital transformation.”

It adds up to asking why these digital transformation aspirations have been put in place without willing the means.

I am also mindful of the other side of the coin of the adoption of digital transformation: there needs to be public information and engagement. 

Our amendments are designed to ensure the provision of information about the deployment of treatments and technology as part of ICBs’ patient involvement and patient choice duties. Without that kind of transparency, there will not be the patient and public trust in the NHS adoption of digital technology that is needed. Rightly, success measure 1 of the NHSX What Good Looks Like framework includes that an ICS should, inter alia,

“identify ICS-wide digital and data solutions for improving health and care outcomes by regularly engaging with partners, citizen and front line groups”.

Success measure 5, titled “Empower citizens”, says:

“What does good look like? Citizens are at the centre of service design and have access to a standard set of digital services that suit all literacy and digital inclusion needs. Citizens can access and contribute to their healthcare information, taking an active role in their health and well-being.”

So in the NHS’s view the engagement and provision of information about the deployment of new technologies is absolutely part of the delivery of a digital transformation strategy.

In essence, the amendments would enshrine what is already there in Wade-Gery and best practice guidance where it relates to digital technology and transformation. We should be making sure that our NHS legislation is fully updated in line with that report and with the guidance on what success looks like for the digital age. I hope the Minister agrees to take the amendments on board, and I look forward to hearing his reply.

The two Committee debates that followed were specifically on health data.

These amendments relate to the abolition of the Health and Social Care Information Centre and the implications for the integrity of patient data. Clauses 88 and 89 give the Secretary of State powers through regulations to transfer a function from one relevant body to another, and the relevant bodies are defined as Health Education England, the Health and Social Care Information Centre, the Health Research Authority, the Human Fertilisation and Embryology Authority, the Human Tissue Authority and NHS England. Other than NHS England, each of those bodies can be abolished under the clause as the result of a transfer of functions.

 Clause 88 provides for the abolition of the Health and Social Care Information Centre. The Government have announced that they will be using the powers in that clause to merge NHS Digital to form part of the new transformation directorate within NHSE, and of course we have seen that NHSX has now been abolished and the relevant personnel have moved into the transformation directorate. The Health and Social Care Information Centre is an executive non-departmental public body created by statute, usually known by the term “NHS Digital”. This amendment, which would prevent that from happening to the HSCIC, is designed to ensure that NHS Digital continues as an entity to safeguard patient data. The merger of NHS Digital with NHSE risks losing the skills and experience that currently sit within NHS Digital. I have mentioned that NHSX has ceased to exist.

There are two risks for patients. One is that important knowledge and skills will be lost as talented people leave the organisation and time is devoted to the nuts and bolts of making the organisation function rather than on achieving its aims. The other is that the new merged organisation will just be too big and unwieldy to respond in an agile way to major challenges such as workforce planning and digital innovation. If NHSE leaders understand how important these challenges are then they will be able to prioritise them and make them part of the organisation’s core function.

I turn to the functions of the statutory safe havens in relation to Clause 89. Part 9 in Chapter 2 of the Health and Social Care Act 2012 lays out the functions and obligations of what is described as the statutory safe haven for patient data from across the health and social care system required for the production of national statistics and for commissioning, regulatory and research purposes, in addition to supporting patient care. Amendment 228 seeks to keep these statutory protections in place and ensure that NHS England does not take on that responsibility, because of a potential conflict of interest in its role.

The bottom line is that we need to retain NHS Digital’s statutory safe haven functions separate from NHS England. As the BMA has said, it is of the utmost importance to retain a quasi-autonomous body for the purposes of collecting, storing and distributing sensitive patient data—something that would be lost under a merger of NHSD and NHSE.

There is one other major advantage of keeping NHS Digital as the digital safe haven. The statutory safe haven’s legal name is the Health and Social Care Information Centre, so there is some obligation to social care. NHSD has always given some thought to integration, even when there was very little on the social care side to integrate with, and little interest from NHSE in doing that work itself. If it all gets merged into NHSE then how will the obligation to collect social care data continue to exist, since NHSE’s responsibility is to the NHS? If this transfer of functions takes place, who will be responsible for the national collection of social care data? Each bit of the social care world will see NHSE as a different entity from NHS Digital. What are the Government’s joining-up plans in respect of the future governance of this kind of data? I beg to move.

I take this opportunity to come back to the Minister to add a query about the data governance regime which he has described this evening and into which we dipped our toe with the last group of amendments. My noble friend anticipated me in discussing the White Paper, which, in turn, follows from the Data Saves Lives draft strategy. I hope we will have the opportunity to meet the Minister to discuss this further because it is a very complex area.

I want to add to that conversation the fact that we variously have IGARD, CAG and the National Data Guardian for Health and Care—as well as NHS Digital, which we hope will remain separate, but we will come to that shortly. We have all these different bodies, but we need a simple regime which helps us understand, for instance, whether the Minister will say, “Yes, it’s already happening”, to the noble Baroness, Lady McIntosh, or, “No, it’s not going to happen.” I could not tell you the answer to that question in my current state of knowledge about the ability to transfer information across the health service and internationally.

There is a balance to be struck between the established protections and new provisions which might expedite the development of access to new and improved treatments and technologies—but it must be done in a safe way. I hope that, between Committee and Report, the Minister will take the opportunity to ensure that we have all the information we need on plans to perform a so-called reset of or new direction for—or however he might like to describe it—the NHS’s use of our health data.

At Report Stage we made some progress.

The Minister in his letter—which the noble Lord, Lord Hunt, addressed in his response—seemed a bit affronted that we should raise the credentials of NHSEI as a holder and protector of NHS data. I would refer to the BMJ letter, which I think came online yesterday, from Kingsley Manning, a former chair of NHS Digital. He really does set it all out. I shall not go into great detail but, for instance, he says that merging NHS Digital with NHSE

“is an important and retrograde step.”

Your Lordships may dispute this, but from where he sat this is important. He said:

“In my experience the general approach of NHS England, including of its clinicians, was that much of the guidance and regulations with respect to the use of patient [data] was seen as unnecessary”. That is a pretty big statement and a fairly damning verdict from the former chair of NHS Digital. I do not think that the Minister can simply remedy the situation by assurances, so I support the amendment in the name of the noble Lord, Lord Hunt, and if it is put to a vote, I very much hope that the House will support it.

Finally, whether or not these amendments are pressed, I hope that the Minister will reconsider whether the Goldacre review should be published before the final version of the new NHS data strategy, Data Saves Lives. I welcome the fact that the Goldacre review is going to deal with information governance, but it is important that we should see that before the final version of Data Saves Lives.

Later, when the Bill came back to the Lords but before the Goldacre review was published, I acknowledged that the Government had promised action.

My Lords, briefly, I support the remarks of the noble Lord, Lord Hunt, regarding Motions F and F1. He, assisted by my noble friend Lady Brinton and I, has pursued the question of the future of data governance in the NHS with great determination and persistence. I pay tribute to him and to medConfidential in that respect. I know that the Minister, the noble Lord, Lord Kamall, is equally determined to make sure that data governance in the new structures is designed to secure public trust. I very much hope that he will give the assurances sought by the noble Lord, Lord Hunt.

The key problem we identified early on was the conflict of interest referred to by the noble Lord, Lord Hunt, with NHS England in effect marking its own homework, and those who have data governance responsibility reporting directly to senior managers within the digital transformation directorate. I hope that the assurances to be given by the Minister will set out a clear set of governance duties on transparency of oversight, particularly where NHS England is exercising its own statutory powers internally. I look forward to what the Minister has to say.