What is the role of government policy in protecting society and democracy from threats arising from misinformation? Two leading experts and members of the UK Parliament, House of Lords, help us understand the report Digital Technology and the Resurrection of Trust.
About the House of Lords report on trust, technology, and democracy
Michael Krigsman: We’re discussing the impact of technology on society and democracy with two leading members of the House of Lords. Please welcome Lord Tim Clement-Jones and Lord David Puttnam. David, please tell us about your work in the House of Lords and, very briefly, about the report that you’ve just released.
Lord David Puttnam: Well, the most recent 18 months of my life were spent doing a report on the impact of digital technology on democracy. In a sense, the clue is in the title because my original intention was to call it The Restoration of Trust because a lot of it was about misinformation and disinformation.
The evidence we took, for just under a year, from all over the world made it evident the situation was much, much worse, I think, than any other committee, any of the 12 of us, had understood. I ended up calling it The Resurrection of Trust and I think that, in a sense, the switch in those words tells you how profound we decided that the issue was.
Then, of course, along comes January the 6th in Washington, and a lot of the things that we had alluded to and things that we regarded as kind of inevitable all, in a sense, came about. We’re feeling a little bit smug at the moment, but we kind of called it right at the end of June last year.
Michael Krigsman: Our second guest today is Lord Tim Clement-Jones. This is his third time on CXOTalk. Tim, welcome back. It’s great to see you again.
Lord Tim Clement-Jones: It’s great to be back, Michael. As you know, my interest is very heavily in the area of artificial intelligence, but I have this crossover with David. David was not only on my original committee, but artificial intelligence is right at the heart of these digital platforms.
I speak on digital issues in the House of Lords. They are absolutely crucial. The whole area of online harms is, to quite a high degree, driven by the algorithms at the heart of these digital platforms. I’m sure we’re going to unpack that later on today.
David and I do work very closely together in trying to make sure we get the right regulatory solutions within the UK context.
Michael Krigsman: Very briefly, Tim, just tell us (for our U.S. audience) about the House of Lords.
Lord Tim Clement-Jones: It is a revising chamber, but it’s also a chamber with real expertise because it contains people who are maybe at the end of their political careers, if you like, with a small p, but who have deep expertise and a great interest in a number of areas that they’ve worked on for years, or sometimes all their lives. We can draw on real experience and understanding of some of these issues.
We call ourselves a revising chamber but, actually, I think we should really call ourselves an expert chamber because we examine legislation, we look at future regulation much more closely than the House of Commons. I think, in many ways, actually, government does treat us as a resource. They certainly treat our reports with considerable respect.
Key issues covered by the House of Lords report
Michael Krigsman: David, tell us about the core issues that your report covered. Tim, please jump in.
Lord David Puttnam: I think Tim, in a sense, set it up quite nicely. We were looking at the potential danger to democracy—of misinformation, disinformation—and the degree to which the duty of care was being exercised by the major platforms (Facebook, Twitter, et cetera) in understanding what their role was in a new 21st Century democracy, both looking at the positive role they could play in terms of information, generating information and checking information, but also the negative in terms of the amplification of disinformation. That’s an issue we looked at very carefully.
This is where Tim’s interests and mine absolutely coincide because within those black boxes, within those algorithmic structures, is where the problem lies. The problem, essentially (maybe this will spark people a little, I think), is that these are flawed business models. The business model that drives Facebook, Google, and others is an advertising-related business model. That requires volume. That requires hits, and their incomes are generated on the back of those hits.
One of the things we tried to unpick, Michael, which was, I think, pretty important, was that we took the view that it’s about reach, not about freedom of speech. We felt that a lot of the freedom of speech advocates misunderstood the problem here. Really, the problem was the amplification of misinformation, which in turn was an enormous boost to the revenues of those platforms. That’s the problem.
We are convinced, through evidence, that they could alter their algorithms, that they can actually dial down and solve many, many of the problems that we perceive. But, actually, it’s not in their business interest to do so. They’re trapped, in a sense, between the demands of their shareholders to optimize share value and the role and responsibility they have as massive information platforms within a democracy.
Lord Tim Clement-Jones: Of course, governments have been extremely reluctant, in a sense, to come up against big tech in that sense. We’ve seen that in the competition area over the advertising monopoly that the big platforms have. But I think many of us are now much more sensitive to this whole aspect of data, behavioral data in particular.
I think Shoshana Zuboff did us all a huge benefit by really getting into detail on what she calls exhaust data, in a sense. It may seem trivial to many of us but, actually, the use to which it’s put in terms of targeting messages, targeting advertising, and, in a sense, helping drive those algorithms, I think, is absolutely crucial. We’re only just beginning to come to grips with that.
Of course, David and I are both, if you like, tech enthusiasts, but you absolutely have to make sure that we have a handle on this and that we’re not giving way to unintended consequences.
Impact of social media platforms on society
Michael Krigsman: What is the deep importance of this set of issues that you spend so much time and energy preparing that report?
Lord David Puttnam: If you value, as certainly I do—and I’m sure we all do value—the sort of democracy we were born and brought up in, for me it’s rather like carrying a porcelain bowl across a very slippery floor. We should be looking out for it.
I did a TED Talk in 2012 … [indiscernible, 00:07:19] entitled The Duty of Care, where I made the point that we apply the concept of duty of care to many, many things: in the medical sense, in the educational sense. Actually, we haven’t applied it to democracy.
Democracy, of all the things that we value, may end up looking like the most fragile. Our tolerance, if you like, of the growth of these major platforms, our encouragement of the reach because of the benefits of information, has kind of blindsided us to what was also happening at the same time.
Someone described the platforms as outrage factories. I’m not sure if anyone has come up with a better description. We’ve actually actively encouraged outrage instead of intelligent debate.
The whole essence of democracy is compromise. What these platforms do not do is encourage intelligent debate or reflect the atmosphere of compromise that any democracy requires in order to be successful.
Lord Tim Clement-Jones: The problem is that the culture has been, to date, against us really having a handle on that. I think it’s only now changing, and it’s very interesting to see what the Biden Administration is doing, too, particularly in the competition area.
One of the real barriers, I think, is thinking of these things only in terms of individual harm. I think we’re now getting to the point where, if somebody is affected by hate speech or racial slurs or whatever as an individual, then governments are beginning to accept that that kind of individual harm is something that we need to regulate and make sure that the platforms deal with.
The area that David is raising, which is so important and where there is still resistance in governments, is, if you like, the societal harms that are being caused by the platforms. Now, this is difficult to define, but the consequences could be severe if we don’t get it right.
I think, across the world, you only have to look at Myanmar, for instance, [indiscernible, 00:09:33]. If that wasn’t societal harm, in terms of the military’s use of Facebook, then I don’t know what is. But there are others.
David has used the analogy of January the 6th, for instance. There are analogies and there are examples across the world where democracy is at risk because of the way that these platforms operate.
We have to get to grips with that. It may be hard, but we have to get to grips with it.
Michael Krigsman: How do you get to grips with a topic that, by its nature, is relatively vague and unfocused? Unlike individual harms, when you talk about societal harm, you’re talking about very diffuse and broad impacts.
Lord David Puttnam: Michael, I sit on the Labour benches in the House of Lords and, probably unsurprisingly, I’m a Louis Brandeis fan, so I think the most interesting thing taking place at the moment is people looking back to the early part of the 20th Century and the railroads, the breaking up of the railroads, and understanding why that had to happen.
It wasn’t just about the railroads. It was about the railroads’ ability to block and distort all sorts of other markets. The obvious one was the coal market, but there were others. Then, indeed, they blocked and made extraordinary inroads into shipping.
What I think legislators have woken up to is, this isn’t just about platforms. This is actually about the way we operate as a society. The influence of these platforms is colossal, but most important of all, the fact that what we have allowed to develop is a business model which acts inexorably against our society’s best interest.
That is, it inflames fringe views. It inflames misinformation. Actually, not only inflames it. It then profits from that inflammation. That can’t be right.
Lord Tim Clement-Jones: Of course, it is really quite opaque because, if you look at this, the consumer is getting a free ride, aren’t they? Because of the advertising, it’s being redirected back to them. But it’s their data which is part of the whole business model, as David has described.
It’s very difficult sometimes for regulators to say, “Ah, this kind of consumer detriment,” or whatever it may be. That’s why you also need to look at the societal aspects of this.
If you purely look (in conventional terms) at consumer harm, then you’d actually probably miss the issues altogether because—with things like advertising monopoly, use of data without consent, and so on, and misinformation and disinformation—it is quite difficult (without looking at the bigger societal picture) just to pin it down and say, “Ah, well, there’s a consumer detriment. We must intervene on competition grounds.” That’s why, in a sense, we’re all now beginning to rewrite the rules so that we do catch these harms.
Balancing social media platforms’ rights against the “duty of care”
Michael Krigsman: We have a very interesting point from Simone Jo Moore on LinkedIn who is asking, “How do you strike this balance between intelligent questioning and debate versus trolling on social media? How should lawmakers and policymakers deal with this kind of issue?”
Lord David Puttnam: We identified an interesting area, if you like, of compromise, for want of a better word. As I say, we looked hard at the impact of reach.
Now, on Facebook, if you’re a reasonably popular person, you can quite quickly have 5,000 people following what you’re saying. At that point, you get a tick.
It’s clear to us that the algorithm is able to identify you as a super-spreader at that point. What we’re saying is, at that moment not only have you got your tick but you then have to validate and verify what it is you’re saying.
That state of outrage, if you like, is what gets blocked at the 5,000 mark and then has to be explained and justified. That seemed to us an interesting area to begin to explore. Is 5,000 the right number? I don’t know.
But what was evident to us is the things that Tim really understands extremely well. These algorithmic systems inside that black box can be adjusted to ensure that, at a certain moment, validation takes place. Of course, we saw it happen in your own election that, in the end, warnings were put up.
Now, you have to ask yourself, why wasn’t that done much, much, much sooner? Why? Because we only reasonably recently became aware of the depth of the problem.
In a sense, the whole Russian debacle in the U.S. 2016 election kind of got us off on the wrong track. We were looking in the wrong place. It wasn’t what Russia had done. It was what Russia was able to take advantage of. That should have been the issue, and it took us a long time to get there.
Lord Tim Clement-Jones: That’s why, in a sense, you need new ways of thinking about this. It’s the virality of the message, exactly as David has talked about, the super-spreader.
I like the expression used by Avaaz in their report that came out last year looking at, if you like, the anti-vaxx messages and the disinformation over the Internet during the COVID pandemic. They talked about detoxing the algorithm. I think that’s really important.
In a sense, I don’t think it’s possible to lay down absolutely hard and fast rules. That’s the benefit of the duty of care: it is a blanket legal concept, with a code of practice, which is effectively enforced by a regulator. It means that it’s up to the platform to get it right in the first place.
Then, of course – David’s report talked about it – you need forms of redress. You need a kind of ombudsman, or whatever may be the case, independent of the platforms who can say, “They got it wrong. They allowed these messages to impact on you,” and so on and so forth. There are mechanisms that can be adopted, but at the heart of it, as David said, is this black box algorithm that we really need to get to grips with.
Michael Krigsman: You’ve both used terms that are very interestingly put together, it seems to me. One, Tim, you were just talking about duty of care. David, you’ve raised (several times) this notion of flawed business models. How do these two, duty of care and the business model, intersect? It seems like they’re kind of diametrically opposed.
Lord David Puttnam: It depends on your concept of what society might be, Michael. In the type of society I’ve spent my life arguing for, they’re not opposed at all; they’re all of a piece, because that society would have a combination of regulation but also personal responsibility on the part of the people who run businesses.
One of the things that I think Tim and I are going to be arguing for, which we might have problems in the UK, is the notion of personal responsibility. At what point do the people who sit on the board at Facebook have a personal responsibility for the degree to which they exercise duty of care over the malfunction of their algorithmic systems?
Lord Tim Clement-Jones: I don’t see a conflict either, Michael. I think that you may see different regulators involved. You may see, for instance, a regulator imposing a way of working over content, user-generated content on a platform. You may see another regulator (more specialist, for instance) on competition. I think it is going to be horses for courses, but I think that’s the important thing to make sure that they cooperate.
I just wanted to say that people in this context often raise the question of freedom of expression. I suspect that people will come on the chat and want to raise that issue. But again, I don’t see a conflict in this area because we’re not talking about ordinary discourse. We’re talking about extreme messages: anti-vaxxing, incitement of violence, and so on and so forth.
The one thing David and I absolutely don’t want to do is to impede freedom of expression. But that’s sometimes used certainly by the platforms as a way of resisting regulation, and we have to avoid that.
How to handle the cross-border issues with technology governance?
Michael Krigsman: We have another question coming now from Twitter from Arsalan Khan who raises another dimension. He’s talking about if individual countries create their own policies on societal harm, how do you handle the cross-border issues? It seems like that’s another really tricky one here.
Lord David Puttnam: I think what is happening—and this is quite determined, I think, on the part of the Biden Administration; the UK and, actually, the EU are probably further advanced than anybody else on this—is an alignment of our regulatory frameworks. I think that will happen.
Now, in a sense, these are big marketplaces. The Australian situation with Facebook has stimulated this. Once you get these major markets aligned, it’s extremely hard to see how Facebook, Google, and the rest of them could continue with their current advertising model. They would have to adjust to what those marketplaces require.
Bear in mind, what troubles me a lot, Michael, is that, if you think back, Mr. Putin and President Xi must be laughing their heads off at the mess we got ourselves into because they’ve got their own solution to this problem – a lovely, simple solution.
We’ve got our knickers in a twist in an extraordinary situation, quite unintended in most cases. The obligation is on the great Western democracies to align their regulatory frameworks and work together. This can’t be done on a country-by-country basis.
Lord Tim Clement-Jones: Once the platforms see the writing on the wall, in a sense, Michael, I think they will want to encourage people to do that. As you know, I’ve been heavily involved in the AI ethics agenda. That is coming together on an international basis. This, if anything, is more immediate and the pressures are much greater. I think it’s bound to come together.
It’s interesting that we’ve already had a lot of interest in the duty of care from other countries. The UK, in a sense, is a bit of a frontrunner in this despite the fact that David and I are both rather impatient. We feel that it hasn’t moved fast enough.
Nevertheless, even so, by international standards, we are a little bit ahead of the game. There is a lot of interest. I think, once we go forward and we start defining and putting in regulation, that’s going to be quite a useful template for people to be able to legislate.
Lord David Puttnam: Michael, it’s worth mentioning that it’s interesting how things bubble up and then become accepted. When the notion of fines of up to 10% of turnover was first mooted, people said, “What?! What?!”
Now, that’s regarded as kind of a standard around which people begin to gather, so there is momentum. Tim is absolutely right. There is momentum here. The momentum is pretty fierce.
Ten percent of turnover is a big fine. If you’re sitting on a board, you’ve got to think several times before you sign up on that. That’s not just the cost of doing business.
Michael Krigsman: Is the core issue then the self-interest of platforms versus the public good?
Lord David Puttnam: Yes, essentially it is. You understand it if you look back at the big anti-trust decisions that were made in the first decade of the 20th Century. I think we’re at a similar moment and, incidentally, I think it is certain that these things will be resolved within the next ten years in a very similar manner.
I think it’s going to be up to the platforms. Do they want to be broken up? Do they want to be fined? Or do they want to rejoin society?
Lord Tim Clement-Jones: Yeah, I mean I could get on and really bore everybody with the different forms of remedies available to our competition regulators. But David talked about big oil, which was broken up by what are called structural remedies.
Now, it may well be that, in the future, regulators—because of the power of the tech platforms—are going to have to think about exactly doing that, say, separating Facebook from YouTube or from Instagram, or things of that sort.
We’re now out of the era of “move fast and break things.” We now expect a level of corporate responsibility from these platforms because of the power they wield. I think we have to think quite big in terms of how we’re going to regulate.
Should governments regulate social media?
Michael Krigsman: We have another comment from Twitter, again from Arsalan Khan. He’s talking about, do we need a new world order that requires technology platforms to be built in? It seems like as long as you’ve got this private sector set of incentives versus the public good, then you’re going to be at loggerheads. In a practical way, what are the solutions, the remedies, as you were just starting to describe?
Lord Tim Clement-Jones: What are governments for? Arsalan always asks the most wonderful questions, by the way, as he did last time.
What are governments for? That is what the role of government is. It is, in a sense, a brokerage. It’s got to understand what is for the benefit of, if you like, society as a whole and, on the other hand, what are the freedoms that absolutely need preserving and guaranteeing and so on.
I would say that we have some really difficult decisions to make in this area. But David and I come from the point of view of actually creating more freedom because the impact of the platforms (in many, many ways) will be to reduce our freedoms if we don’t do something about it.
Lord David Puttnam: Very much so, and that’s why I would argue, Michael, that Facebook’s response in Australia was so incredibly clumsy, because what it did was raise a question we could really have done without, which is: are they more powerful than sovereign nations?
Now, you can’t go there because, when you get the G7 or the G20 together, you’re not going to get into a situation where any prime minister is going to concede, “I’m afraid there’s nothing we can do about these guys. They’re bigger than us. We’re just going to have to live with it.” That’s not going to happen.
Lord Tim Clement-Jones: The only problem there was the subtext. The legislation was prompted by one of the biggest media organizations in the world. In a sense, I felt pretty uncomfortable taking sides there.
Lord David Puttnam: I think it was just an encouragement to create a new series of an already long-running TV series.
Lord Tim Clement-Jones: [Laughter]
Lord David Puttnam: You’re absolutely right about that. I have to put that down as an extraordinary irony of history. The truth is you don’t take on nations, though many have tried.
Some of your companies have, and genuinely believed that they were bigger. But I would say don’t go there. Frankly, if I were a shareholder in Facebook – I’m not – I’d have been very, very cross with whoever made that decision. It was stupid.
Michael Krigsman: Where is all of this going?
Lord Tim Clement-Jones: We’re still heavily engaged in trying to get the legislation right in the UK. But David and I believe that our role is to kind of keep government honest and on track and, actually, go further than they’ve pledged because this question of individual harm, remedies for that, and a duty of care in relation to individual harm isn’t enough. It’s got to go broader into societal harm.
We’ve got a road to travel. We’ve got draft legislation coming in very, very soon this spring. We’ve got then legislation later on in the year, but actually getting it right is going to require a huge amount of concentration.
Also, we’re going to have to fight off objections on the basis of freedom of expression and so on and so forth. We are going to have to remain resolute in our determination, basically. I think there’s a great deal of support out there, particularly in terms of the protection of young people and things of that sort that we’re determined to see happen.
Political messages and digital literacy
Michael Krigsman: Is there the political will, do you think, to follow through with these kinds of changes you’re describing?
Lord David Puttnam: In the interest of a vibrant democracy, when any prime minister or president of any country looks at the options, I don’t think they’re facing many alternatives. I can imagine Macron, Johnson, or anybody else looking at the options available to them.
They may find those options quite uncomfortable, and the ability of some of these platforms to embarrass politicians is considerable. But when they actually look at the options, I’m not sure they’re faced with that many alternatives other than pressing down the route that Tim just laid out for you.
Lord Tim Clement-Jones: I think the real Achilles heel, though, which David’s report pointed out really clearly and the government failed to answer satisfactorily, was the whole question of electoral regulation: the use of misleading political messaging during elections, the impact of, if you like, opaque political messaging where it’s not obvious where it’s coming from, those sorts of things.
I think, because governments are in control and are benefiting from some of that messaging, there’s a great reluctance to take on the platforms in those circumstances. Most platforms are pretty reluctant to take down any form of political advertising or messaging or, in a sense, to moderate political content.
That for me is the bit that I think is going to be the agenda that we’ll probably be fighting on for the next ten years.
Lord David Puttnam: Michael, it’s quite interesting that both of the major parties – not Tim’s party, as they behaved very well – actually misled us. I wouldn’t say lied to us, but they misled us in the evidence they gave about their use of the digital environment during an election, which was really lamentable. We called them out, but the fact that, in both cases, they felt they needed to break the law where necessary to give themselves an edge is a very worrying indicator of what we might be up against here.
Lord Tim Clement-Jones: The trouble is, political parties love data because targeted messaging, microtargeting as it’s called, is potentially very powerful in gaining support. It’s like a drug. It’s very difficult to wean politicians off what they see as a new, exciting tool to gain support.
Michael Krigsman: I work with various major software companies. Personalization based on data is such a major focus of every aspect of technology, with tentacles that invade our lives. When done well, it’s intuitive and helpful. But you’re talking about the often indistinguishable case where it’s done invasively, insinuating itself into the pattern of our lives. How do you even start to grapple with that?
Lord David Puttnam: It kind of bubbled up in the Cambridge Analytica case, where the guy who ran the company was stupid enough to boast about what they were able to do. What it illustrated is that that was the tip of a very, very worrying nightmare for all of us.
No, I mean this is where you come back to individual responsibility. The idea that the management of Facebook and the management of Google are not appalled by that possibility and aren’t doing everything they can to prevent it is, I think, what gives everyone at Twitter nightmares.
I don’t think they ever intended or wanted to have the power they have in these fringe areas, but they’re stuck with them. The answer is, how do we work with governments to make sure they’re minimized?
Lord Tim Clement-Jones: This, Michael, brings in one of David’s and my favorite subjects, which is digital literacy. I’m an avid reader of people who try and buck the trend. I love Jaron Lanier’s book Ten Arguments for Deleting Your Social Media Accounts Right Now. I love the book by Carissa Véliz called Privacy is Power.
Basically, that kind of understanding of what you are doing when you sign up to a platform—when you give your data away, when you don’t look at the terms and conditions, you tick the boxes, you accept all cookies, all these sorts of things—it’s really important that people understand the consequences of that. I think it’s only a tiny minority who have this kind of idea they might possibly live off-grid. None of us can really do that, so we have to make sure that when we live with it, we are not giving away our data in those circumstances.
I don’t practice what I preach half the time. We’re all in a hurry. We all want to have a look at what’s on that website. We hit the accept all cookies button or whatever it may be, and we go through. We’ve got to be more considerate about how we do these things.
Lord David Puttnam: Chapter 7 of our report is all about digital literacy. We went into it in great depth. Again, there has been a fairly lamentable failure by most Western democracies to address this.
There are exceptions. Estonia is a terrific exception. Finland is one of the exceptions. They’re exceptions because they understand the danger.
Estonia sits right on the edge of its vast neighbor Russia, with 20% of its population being Russian. It can’t afford misinformation. Misinformation, for them, is a catastrophe. Necessarily, they make sure their young people are really educated in the way they receive information and how they check facts.
We are very complacent in the West, I’ve got to say, and I’ll say this about the United States too. We’re unbelievably complacent in those areas and we’re going to have to get smart. We’ve got to make sure that young people get extremely smart about the way they’re fed information and the way they react and respond to it.
Lord Tim Clement-Jones: Absolutely. Our politics, right across the West, demonstrate that there’s an awful lot of misinformation, which is believed – believed as the gospel, effectively.
Balancing freedom of speech on social media and cyberwarfare
Michael Krigsman: We have another question from Twitter. How do you balance social media reach versus genuine freedom of speech?
Lord David Puttnam: I thought I’d answered it. Obviously, I didn’t. It’s that you accept the fact that freedom of speech requires that people can say what they want. This goes back to the black boxes. At a certain moment, the box intervenes and says, “Whoa. Just a minute. There is no truth in what you’re saying,” or, worse, in the case of anti-vaxxers, “There is actual harm and damage in what you’re saying. We’re not going to give you reach.”
What you do is you limit reach until the person making those statements can validate them or affirm them or find some other way of, as it were, being allowed to amplify. It’s all about amplification. It’s trying to stop the amplification of distortion and lies and really quite dangerous stuff like the anti-vaxx.
We’ve got a perfect trial run, really, with anti-vaxxing. If we can’t get this right, we can’t get much right.
Lord Tim Clement-Jones: There are so many ways. When people say, “Oh, how do we do this?” well, you’ve got sites like Reddit, which have different communities, with rules applying to each community that have to conform to a particular standard.
Then you’ve got Avaaz, not only detoxing the algorithm but also proposing the duty of correction. Then you’ve got great organizations like NewsGuard who, in a sense, have a sort of star system to verify the accuracy of news outlets. We do have the tools; we just have to be a bit determined about how we use them.
Michael Krigsman: We have another question from Twitter that I think addresses or asks about this point, which is, how can governments set effective constraints when partisan politics benefits from misusing digital technologies and even spreading misinformation?
Lord David Puttnam: Tim laid it out for you early on why the House of Lords existed. This is where it actually gets quite interesting.
We, both Tim and I, during our careers—and we both go back, I think, 25 years—have managed to get amendments into legislation against the head. That is to say, amendments that didn’t suit either the government of the day or even the lead opposition of the day. The independence of the House of Lords is wonderfully, wonderfully valuable. It is expert and it does listen.
Just a tiny example: if someone asked me or Tim, “Why were you not surprised that your report didn’t get more traction?” it’s 77,000 words long. Yeah, it’s 77,000 words long because it’s a bloody complicated subject. We had the time and the luxury to do it properly.
I don’t think that will necessarily prove to be a stumbling block. We have enough … [indiscernible, 00:37:01] embarrassment. The quality of the House of Lords and its ability to generate public opinion, if you like, around good, sane, sensible solutions still does function within a democracy.
But if you go down the road Tim was just describing, if you allow the platforms to go down the route they appear to have taken, we’ll be dealing with autocracy, not democracy. Then you’re going to have a different set of problems.
Lord Tim Clement-Jones: David is so right. The power of persuasion still survives in the House of Lords. Because the government doesn’t have a majority, we can get things done if that power of persuasion is effective. We’ve done that quite a few times over the last 25 years, as David says.
Ministers know that. They know that if you espouse a particular cause that is clearly sensible, they’re going to find themselves on a pretty sticky wicket, or whatever the appropriate baseball analogy would be, Michael, in those circumstances. We have had some notable successes in that respect.
For instance, only a few years ago, we got a new code for age-appropriate design, which means that webpages now need to take account of the age of the individuals accessing them. It’s now called the Children’s Code. It came into effect last year and is a major addition to our regulation. It was quite heavily resisted by the platforms and others when it came in, but a single colleague of David’s and mine (supported by us) drove it through, greatly to her credit.
Michael Krigsman: We have two questions now, one on LinkedIn and one on Twitter, that relate to the same topic. That is the speed of government, the speed of change, and government’s ability to keep up. On Twitter, for example: future wars are going to be cyber, and government is just catching up. The technology is changing so rapidly that it’s very difficult for the legal system to track it. How do we manage that aspect?
Lord Tim Clement-Jones: Funnily enough, governments do think about that. Their first thought is about cybersecurity. Their first thought is about their own cyber, basically, their own data.
We’ve got a brand-new National Cyber Security Centre, about a year or two old now. The truth is, particularly in view of Russian activities, we now have quite good cyber controls. I’m not sure that our risk management is fantastic but, operationally, we are pretty good at this.
For instance, things like the SolarWinds hack of last year have been looked at pretty carefully. We don’t know what the outcome is, but it’s been looked at pretty carefully by our National Cyber Security Centre.
Strangely enough, the criticism I have with government is, if only they thought of our data in the way that they thought about their data, we’d all be in a much happier place, quite honestly.
Lord David Puttnam: I think that’s true. Michael, I don’t know whether this is absolutely true in the U.S. because it’s such a vast country, but my experience of legislation is it can be moved very quickly when there’s an incident. Now, I’ll give you an example.
I was at the Department for Education at the time when a baby was allowed to die through a very unfortunate, catastrophic failure by different systems of government. The entire department ground to a halt for about two months while this was looked at, whilst the department tried to explain itself, and any amount of legislation was brought forward. Governments deal in crises, and this is going to be a series of crises.
The other thing governments don’t like is judicial review. I think we’re looking at an area here where judicial review—either by the platforms for a government decision or by civil society because of a government decision—is utterly inevitable. I actually think, longer-term, these big issues are going to be decided in the courts.
Advice for policymakers and business people
Michael Krigsman: As we finish up, can I ask you each for advice to several different groups? First is the advice that you have for governments and for policymakers.
Lord Tim Clement-Jones: Look seriously at societal harms. The duty of care is not simply about protecting individual citizens. It is all about looking at the wider picture because, if you don’t, you’re going to find it’s too late and your own democracy is going to suffer.
I think you’re right, Michael, in the sense that some politicians appear to have a conflict of interest on this. If you’re in control, you don’t think about what it’s like to be in opposition. Nevertheless, that’s what they have to think about.
Lord David Puttnam: I was very impressed, indeed, tuning in to some of the judicial subcommittees at the congressional hearings on the platforms. I thought that the chairman … [indiscernible, 00:42:35] did extremely well.
There is a lot of expertise. You’ve got more expertise, actually, Michael, in your country than we have in ours. Listen to the experts, understand the ramifications, and, for God’s sake, politicians, it’s in all their interests, irrespective of whether they’re Republicans or Democrats, to get this right. Getting it wrong means you are inviting the possibility of a form of government that very, very, very few people in the United States wish to even contemplate.
Michael Krigsman: What about advice to businesspeople, to the platform owners, for example?
Lord David Puttnam: Well, we had an interesting spell, didn’t we, where a lot of advertisers started to take issue with Facebook, and that kind of faded away. But I would have thought that, again, it’s a question of regulatory oversight and of businesses understanding.
How many businesses in the U.S. want to see democracy crumble? I was quite interested, immediately after the January 6th events, in the way businesses walked away, not so much from the Republican party, but from Trump.
I just think we’ve got to begin to hold up a mirror to ourselves and also look carefully at what the ramifications of getting it wrong are. I don’t think there’s a single business in the U.S. (or if there are, there are very, very few) who wish to go down that road. They’re going to realize that that means they’ve got to act, not just react.
Lord Tim Clement-Jones: I think this is a board issue. This is the really important factor.
Looking at the other side, not the platform side (because I think the platforms are only too well aware of what they need to do): if I’m on the other side, somebody who is using social media, then as a board member you have to understand the technology and you have to take the time to do that.
The advertising industry—really interesting, as David said—is developing all kinds of new technology solutions, like blockchain, to actually track where their advertising messages are going. If the messages are directed the wrong way, advertisers find out, and there’s accountability down the blockchain, which is really smart in the true sense of the word.
It’s using technology to understand technology. I think you can’t leave it to the chief information officer or the chief technology officer. As the CEO or the chair, you have to understand it.
Lord David Puttnam: Tim is 100% right. I’ve sat on a lot of boards in my life. If you really want to grab a board’s attention – I’m not saying which part of the body you’re going to grab – start looking at the register and then have a conversation about how adequate directors’ insurance is. It’s a very lively discussion.
Lord Tim Clement-Jones: [Laughter]
Lord David Puttnam: This whole issue of personal responsibility, the things that insurance companies will and won’t take on in terms of protecting companies and boards, that’s where a lot of this could land, and very interestingly.
Importance of digital education
Michael Krigsman: Let’s finish up by any thoughts on the role of education and advice that you may have for educators in helping prepare our citizens to deal with these issues.
Lord Tim Clement-Jones: Funnily enough, I’ve just developed (with a group of people) a framework for ethical AI for use in education. We’re going to be launching that in March.
The equivalent is needed in many ways because, of course, digital literacy and digital education are incredibly important. Actually, parents and teachers too; this isn’t just a younger-generation issue. It needs to go all the way through. I think we need to be much more proactive about the tools that are out there for parents and others, even main board directors.
You cannot spend enough time talking about these issues. That’s why, when David mentioned Cambridge Analytica, suddenly everybody got interested. But it’s a very rare example of people suddenly becoming sensitized to an issue that they previously didn’t really think about.
Lord David Puttnam: There’s a parallel, really, with climate change. These are our issues. If we’re going to prepare our kids – I’ve got three grandchildren – if we’re going to prepare them properly for the remainder of their lives, we have an absolute obligation to explain to them what challenges their lives will face, what forms of society they’re going to have to rally around, what sort of governance they should reasonably expect, and how they’ll participate in all of that.
If they’re left in ignorance—be it on climate change or, frankly, on all the issues we’ve been discussing this evening—we are making them incredibly vulnerable to a form of challenge, and a form of life, very different from the privileged lives we’ve lived. I think that the lives of our grandchildren, unless we get this right for them and help them, will be very diminished.
I’ve used that word a lot recently. They will live diminished lives and they’ll blame us, and they’ll wonder why it happened.
Michael Krigsman: Certainly, one of the key themes that I’ve picked up from both of you during this conversation has been this idea of responsibility, individual responsibility for the public welfare.
Lord David Puttnam: Unquestionably. It’s summed up in that phrase, duty of care. We have an absolutely overwhelming duty of care for future generations, and it applies as much to the digital environment as it does to climate.
Lord Tim Clement-Jones: Absolutely. In a way, what we’re now having to overturn is this whole idea that online is somehow completely different to offline, to the physical world. Some of us have been living in the online, remote world for the whole of the last year, but why should standards be different in that online world? They shouldn’t be. We should expect the same standards of behavior, and we should expect people to be accountable for them, in the same way as in the offline world.
Michael Krigsman: Okay. Well, what a very interesting conversation. I would like to express my deep thanks to Lord Tim Clement-Jones and Lord David Puttnam for joining us today.
David, before we go, I just have to ask you. Behind you and around you are a bunch of photographs and awards that seem distant from your role in the House of Lords. Would you tell us a little bit more about your background very quickly?
Lord David Puttnam: Yes. I was a filmmaker for many years. That’s an Emmy sitting behind me. The reason the Emmy is sitting there is the shelf isn’t deep enough to take it. But I got my Oscar up there. I’ve got four or five Golden Globes and three or four BAFTAs, David di Donatello, and Palme d’Or from Cannes. I had a very, very happy, wonderfully happy 30 years in the movie industry, and I’ve had a wonderful 25 years working with Tim in the legislature, so I’m a lucky guy, really.
https://www.cxotalk.com/episode/digital-technology-trust-social-impact
5th April 2021