In our digital world, data and identity are increasingly intertwined. As the world responds to data-related scandals with new business models, checks, and balances, what is the appropriate governance for digital identity?
Regulation can play a powerful role in ensuring everyone can fully and fearlessly engage in digital society. Good policy decisions can not only empower individuals but also protect their right to privacy.
In this series, we talk to industry leaders about the policies, technologies, and practices that affect digital identity systems on the road to Good ID.
We spoke to Tom Wheeler, former chairman of the United States Federal Communications Commission (FCC), author, businessman, and senior research fellow at the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School.
Tom was pivotally involved in policy creation for a raft of fast-moving communications technology issues at the FCC and has written several books on technology and leadership.
Hello Tom. Thanks for joining us today. So many know you as the former FCC Chairman, a giant of telecoms and a hugely influential figure in the regulation of communications technology, not least of all, of course, the high profile issue of net neutrality. You stepped out of the FCC in 2017 and you’re clearly thinking hard now about data and privacy. Can you tell us about your work at the Shorenstein Center? Why the focus on privacy?
Well, I was in privacy – in a way – when I was chairman of the FCC. We enacted a privacy rule for broadband network providers. Unfortunately when the Trump administration and the Republican FCC took over, they repealed those rules. But privacy is a seminal issue of the 21st century - some say it’s going to be one of the key civil rights issues of the 21st century.
And what we’re trying to do at the Shorenstein Center and Harvard Kennedy School is to focus on what we call the Platform Accountability Project, which is, “Just what should the accountability of both networks and the platforms that use those networks be?"
Because what we’ve seen thus far is that they get to make their own rules. Should society, through its elected officials, have the ability to say, “No, wait a minute, here’s some other ground rules”? So that’s the kind of question that we’re looking at.
When you boil everything down, the whole issue of privacy is kind of the gateway drug into this issue, because on one side of the equation you have companies who are unilaterally taking your privacy and making it their product, their asset, and then turning around and using their control of that asset to monopolize the market. And so the flow of information, information that is collected by impinging on your privacy, is at the heart of the economic model of the internet.
People will be very interested in your specific experiences in creating policy in a time of rapid technological advancement. Do you have any advice about how you go about doing that yourself? On how you’ve adapted to the constant change?
You know, the challenge of government is that our current structures – and I speak only for the United States and my experience in government there, though I think it also applies to other nations around the world – were created in the industrial revolution. They were set up to provide an offsetting force, if you will, against the power of the industrialists.
When the policy-makers went to build that structure, they looked to the existing management model of the companies, because that was how things were managed in those days. And what do we know about the industrial management model? We know that it was rules-based: the guy on the factory floor had a set of rules, and so on.
And so, should we be surprised that we find that government is a rules-based bureaucracy? Because that was the kind of management that existed until recently.
Now we find ourselves in an era where things are changing so fast that agility is an essential criterion.
Years ago, I ran a software company, and we developed software linearly – the waterfall process, where each stage falls off the edge into the next. But now software is never done. It is constantly evolving as the realities in which it has to exist evolve, and that's called agile software development.
We need to bring the concepts of agile software development – 21st century management concepts – into government, to replace the 19th century management concepts on which government is currently structured.
So for leaders and policy-makers, just adapting to that principle of agile development is really key, you think?
Well, for policy-makers to begin to think that way, clearly, and for companies to begin to recognise that.
So I tried this in three different areas when I was chairman of the FCC. I tried to have agile regulation in net neutrality, privacy, and cybersecurity. And the industry all screamed, "We don't know what the rules are! We need certainty!"
The difficulty is that the industry ends up talking out of both sides of its mouth. On one side, they're saying, "Oh, you can't have these hard and fast rules because things move too fast".
And then when you say, “OK, we’ll work in an agile way”, they say, “Oh no, we need hard and fast rules”. So everybody has to evolve.
Regarding developing policy at the FCC, there’s a lot of interest in how you address privacy, anti-trust, and monopolies amongst the telecoms. Can you speak about that?
Yes, happily. I think the key thing is that you have to say you’re going to step up and deal with the new technology, not flee the new technology.
I’ve got a new book out, called From Gutenberg to Google: The History of Our Future. And I think that’s one of the conclusions you reach when you say, “Look at how the history of technology has moved for the last multiple hundreds of years”.
You must confront it, you can’t run away from it. The challenge is: what kind of a standard are you going to use when you confront it?
How do you think policy is going to evolve this year? Do you think it’s going to have a big leap forward, where everybody’s collectively working towards making systems work for governments, for people, for business, or do you think there’s a long time before that light bulb moment happens at a mass scale?
That’s a great question, and clearly the sensitivity is increasing on the importance of all of these issues. I’m not sure that you end up putting a calendar date on things, saying, “Well, the big year’s going to be 2019.”
It’s an evolutionary process, it’s Darwinian if you will, where things build on each other and evolve. But the challenge is to step up, instead of to flee.
There are a lot of folks who throw up their hands and say, “Oh well, let’s let the companies worry about that”, or “Oh, this is too complex”, or “I don’t understand the technology”. And that luxury doesn’t exist.
So moving onto Good ID as a movement and as a concept, is this something you believe in? Do you think we could achieve Good ID?
Well, I think it is a terrific initiative. I think that the idea of having an open platform, that is flexible for local laws and mores to be able to use, makes a lot of sense.
I think at the heart of the issue is the inherent contradiction between empowering identification and exploiting identification.
You know, there is a concept that goes back hundreds of years to English common law, called the duty of care, that when someone provides a service, they have the duty of care to make sure that adverse things don’t happen as a result of that.
And it seems to me that at the heart of the whole privacy and identity issue is the duty of care. That today the responsibility to protect privacy, for instance, has been fobbed off onto consumers, when really it ought to be the responsibility of the companies who are taking that information, and that is the duty of care.
So in the Good ID world, there needs to be embedded – beyond the technology – this concept of a duty of care.
So some very high-profile people are talking about data protection and privacy, a lot of corporate coalitions: clearly it’s a very busy space. What do you say to the many coalitions that are now emerging, say, on self-regulation? People and corporations who want to do the right thing before they are obliged to do the right thing?
Well, God bless ‘em. As someone who earlier in my career helped build these kinds of voluntary operating standards for wireless carriers, for instance, the things I learned are:
One, they’re only as strong as your weakest link; and two, there needs to be some kind of enforcement authority associated with them.
In voluntary efforts, both of those realities tend to detract from their success. So I think it’s great that they’re being done, but they’re probably not the be-all and end-all.
Mark Zuckerberg, for whom I have a great deal of respect, wrote an article in the Wall Street Journal last week in which he said that there needed to be regulation providing for transparency, choice and control. Those are three great terms.
What they mean needs to be carefully defined, number one; and number two, they are not the be-all and end-all. There needs to be, as we discussed before, a responsibility of a duty of care that supersedes or helps define what those three things are. We need to have that kind of discussion, and not just retreat behind buzzwords.
How can competition be considered in the creation of systems that place the user or the individual at the centre? Can the two really work together: competition and user rights?
I think we have to recognize that what we’re dealing with in the digital market is a two-sided market, OK?
You’ve got one side of the market that acquires data, oftentimes through questionable processes, and then turns around and sells the use of that data to another market – advertisers, for instance. And the monopoly position these companies hold, because of the way they have hoovered up, to use a great British term, so much personal data, allows them to extract monopoly rents on the other side of the equation.
Do you have any watchwords this year?
Watchwords! I think what history teaches us is that when the industrial revolution came around, driven by technology, and principally network technology – the railroad and the telegraph – we found that the rules for agrarian mercantilism didn’t work any more. They were insufficient.
Those who took advantage of the new technology ended up making new rules themselves, and those rules, as is the nature of capitalism, benefitted themselves.
I’m a capital ‘C’ Capitalist. But there was also a need to create some guardrails around behavior. And we came up with the kind of regulatory structure that has governed us since the industrial revolution – it has served society well, served the economy well, and protected capitalism from its excesses.
But now we find ourselves in a situation where the rules for industrial capitalism are probably inadequate for internet capitalism. So we’ve got to go through the same kind of process of saying, “How do we put guardrails in place that protect consumers, protect competition, and in the process protect internet capitalism?”
That’s the watchword I see going forward. I think those goals are all consistent, and we have to get to the point where people begin to see them as working towards the same ends.
So “guardrails” is the word?