Martin E. Hellman is a remarkable man. He is perhaps best known for his invention, with Diffie and Merkle, of public key cryptography: the technology which (amongst other uses) enables secure internet transactions and is used to transfer trillions of dollars each day. His work has been recognised by numerous honours, including election to the National Academy of Engineering, induction into the National Inventors Hall of Fame and, most recently, the 2015 ACM Turing Award, the most prestigious honour in computer science.
Hellman has a deep interest in the ethics of technological development. As he says in his book, Breakthrough: Emerging New Thinking, “…In the present state of world affairs, one of the major sources of disparity is the discrepancy between our scientific and technical progress and our level of societal and individual development. The magnitude of the forces we command today are such that mankind can alter the environment of the planet as a whole, as we are now doing. The subsequent emergence of global problems and the recognition of their importance is certainly one of the great intellectual events of our time.”
In this exclusive interview, I spoke to Professor Hellman about how, as our capabilities accelerate, our society must approach the ethics of technology.
Q: In your experience, how do we fool ourselves into thinking that we’re being ethical?
[Martin Hellman]: It was in July 1981 when I came to realise how we fool ourselves. My wife had been dragging me to meetings and seminars (she’d actually been ready to leave me, which I didn’t know, and was desperately trying to find something that would make our relationship work). We were watching a video called The Day After Trinity (‘Trinity’ was the codename for the first nuclear test, in 1945, at Alamogordo, New Mexico). In the video, they ask each of the Manhattan Project scientists what their motivation was for working on this horrible weapon of mass destruction that killed 200,000 men, women and children indiscriminately. They all had the same response: ‘fission, which was the basis for this weapon, had been discovered in Germany. If Hitler had got the bomb before we did, it would have been the end of civilisation as we know it. We had no choice but to work on it…’ This was all-out war. But later in the documentary the scientists were asked, ‘…so when Hitler was defeated, why did you continue working on these weapons?’ Their faces dropped; they didn’t know why… One of the scientists, Robert Wilson, said, ‘…I don’t know why I didn’t just walk away from Los Alamos…’
Watching this documentary took me back five years earlier, to 1976. DES, the Data Encryption Standard, had been announced in March 1975. My partner, and fellow Turing Award winner, Whit Diffie and I had realised that the 56-bit key in DES was inadequate; at best, it was marginal. At first, the 100 thousand million, million keys you would need to search might appear impossible, but Moore’s law was increasing processing power, and we quickly figured that if you could search a million keys per second on a chip, and you bought a million chips, you could search a million, million keys per second. It would then take only 100,000 seconds to search 100,000 million, million keys: roughly one day. The equivalent cost? Around $10,000 per solution. We thought we’d found a bug, and so we wrote to the government expecting them to fix it. After six months of letter writing and exchanges, we realised that the bug was, in fact, a feature: not from our point of view, but from the point of view of the NSA. They didn’t want a publicly available cryptographic system they couldn’t break, and so we realised we had to go public, get media attention, and aim for a congressional hearing to improve security. It was now January 1976, we were gearing up to go public, and two high-level NSA people flew out and told us, ‘if you keep talking this way, you’re going to cause grave harm to national security…’
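The back-of-the-envelope arithmetic behind that estimate can be checked in a few lines. This is only an illustrative sketch of the reasoning described above, not Hellman and Diffie’s original analysis; the chip count and per-chip search rate are the interview’s round numbers, and the keyspace follows from the 56-bit key (2^56 ≈ 7.2 × 10^16, the “100 thousand million, million” in round figures):

```python
# Illustrative sketch of the DES brute-force estimate described above.
# Assumed figures (from the interview's round numbers): one million keys
# per second per chip, and one million chips working in parallel.

KEY_BITS = 56
KEYSPACE = 2 ** KEY_BITS              # ~7.2e16 keys to search in the worst case

KEYS_PER_SEC_PER_CHIP = 1_000_000     # assumed search rate per chip
NUM_CHIPS = 1_000_000                 # assumed number of chips

total_rate = KEYS_PER_SEC_PER_CHIP * NUM_CHIPS   # 1e12 keys per second overall
seconds = KEYSPACE / total_rate                  # ~72,000 seconds
days = seconds / 86_400                          # under a day

print(f"Keyspace: {KEYSPACE:.2e} keys")
print(f"Exhaustive search: {seconds:,.0f} seconds ≈ {days:.2f} days")
```

Running this gives roughly 72,000 seconds, i.e. under a day for a full sweep of the keyspace, which matches the “roughly one day” figure in the interview.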
That night, I went home to try to figure out the right thing to do. As I did, at one point it was almost as if there were a devil standing on my shoulder, whispering in my ear like in the movies: ‘…forget about right and wrong, you’ll never have a better chance to be famous, go for it!’ At the time, I thought I’d brushed the devil off my shoulder and made a logical, ethical decision to go public, and in many ways it was the logical, ethical decision… but five years later, watching The Day After Trinity, I realised I’d fooled myself. I’d made my decision incorrectly. Fortunately, it was the right decision, but it could very easily not have been.
I realised that instead of figuring out the right thing to do and then doing it, whether we want to or not, most of the time we figure out what we want to do and then come up with a rationalisation for doing it, whether it’s right or wrong. We fool ourselves. I vowed I would never do that again. That’s how I came to see that I had fooled myself five years earlier, when I thought I’d brushed the devil off my shoulder: I had only submerged him back into the deep unconscious from which he’d emerged.
I moved my focus from cyber-security to international and nuclear security when Ronald Reagan became president and brought the nuclear threat into sharp focus. He spoke far too loosely about fighting and winning a nuclear war, and the cyber threat was far in the future. I still believe that nuclear is the greatest risk we face as a society today.
There are a whole bunch of post-Cold-War nuclear risks. As an example, in 1999, when the Kosovo ceasefire went into effect, an American four-star general gave orders to a British three-star general to take action that could (in the mind of the British general) have risked starting World War 3. A heated argument ensued, which resulted in the British general telling his American counterpart, ‘Sir, I’m not starting World War 3 for you…’ He didn’t technically disobey the order, as London backed him; the way NATO works is that if your own government backs you, you are not disobeying the order.
Russian troops had occupied Pristina airport, and a tense stand-off ensued. The American general felt that if we confronted the Russians with a determined show of force, they would probably back down. And you know what? He was probably right; but what does probably mean? If it’s 90%, then there was a 10% chance he was wrong. These are the kinds of challenges we face, amplified by access to weapons of mass destruction, particularly our nuclear arsenal.
Q: Is there any way for us to be sure of the decisions we make?
[Martin Hellman]: I don’t think we can ever be sure of the decisions we make; but there are things we can do.
Firstly, get outside help. I’d vowed (after watching The Day After Trinity) never to fool myself again, but how do I know I made the right decision at the time? Admiral Inman, who was Director of the NSA at the time and who had been fighting me, has since changed his mind; in an interview he said that national security depended on strong encryption, something they could not see clearly back then.

Fast forward to around 1986. We’d invented public key cryptography ten years earlier, and RSA (whom almost everyone has heard of) credited us in their paper for inventing public key cryptography; but when we asked them to pay royalties, it was a different matter… I was really angry with them at the time. RSA sold their company for $250 million, and we made virtually nothing from our patents, which RSA had been fighting, even though $5 trillion a day in foreign exchange is protected by the cryptography that we invented. At the time, Jim Omura, a friend of mine, had been a professor at UCLA and an information theorist. He had since left UCLA to co-found a company called Cylink with the late Lew Morris. Lew came to me with an introduction from Jim and said, ‘…you help me get an exclusive license for Cylink to Stanford’s patents on public key cryptography, and we’ll get those bastards [RSA] by the balls…’ I was angry at RSA, and I couldn’t be sure whether I was fooling myself. I went to my wife and told her my problem, and she came up with a brilliant solution, because she wasn’t emotionally invested in the problem and didn’t have the tunnel vision I had. Her solution was to give the decision to Stanford’s Director of Technology Licensing, who had just worked at MIT on loan for a year and had no emotional involvement in this. His name was Niels Reimers, and his advice was simple: ‘of course we should go with Cylink’s offer!’
It’s the same decision I would have made on my own, but this way I could be sure I hadn’t fooled myself again and made it unethically.
Q: Are our ethics evolving?
[Martin Hellman]: Alan Turing was a brilliant man; he made tremendous contributions to the war effort that defeated Nazism, yet he was hounded to death over his sexual orientation. Today, we look back and say, ‘how could they have been so unethical…’ but at the time, practically every state in the USA considered homosexual acts a felony, and Britain was similar. Without a doubt, what happened to Turing was awful, but it was ‘ethical’ within the framework of the time.
Conversely, 10,000 years ago, it was ethical to make war on the tribe next door before they killed you. If you didn’t do that, you didn’t survive.
Our ethics are not static, they evolve over time and that gives me tremendous hope. If we had remained static, the progress we have made would have been impossible. The evolution of ethics is essential to our survival.
Q: How can we apply ethics to powerful technologies?
[Martin Hellman]: When I think about all the existential threats that we have now, and those developing on the horizon, sometimes it makes me think, ‘oh my god… we’re done for! Even if we solve one, there are so many more!’
In spite of those moments I do maintain optimism, and one of the things I’ve come to realise is that powerful technologies like nuclear weapons, genetic engineering, climate engineering… all of these are just symptoms of a deeper underlying problem. We are at a stage of development where we have ‘god-like’ physical powers. In the Bible, only God could create new life forms, yet with the genetic technologies we have, we can do that. In the Bible, only God was supposed to be able to throw down thunderbolts that could incinerate whole cities; we call them nuclear weapons.
We need to be responsible adults, yet society behaves, at best, like an irresponsible adolescent. Our society isn’t worried about long-term consequences; it’s ‘live for today, party now, forget about the hangover…’ I’m hoping that if we can deal with these issues as one systemic problem, closing the chasm between our god-like powers and our irresponsible behaviours, then solving one will solve them all.
Q: Are you hopeful about that for the future? I don’t wish to sound negative, but it seems that society is increasingly diverging from trust, from trusting experts, and from collaboration, yet these seem intrinsically required for us to move forward, become more ethical, and deal with these challenges.
[Martin Hellman]: Let’s see. First of all, I assume you’re referring to the desire to return to the good old days. That’s what President Trump was elected on; he’s going to ‘Make America Great Again’. By the way, the 1950s were not so great. I grew up in that era. Economically, most families had one car at most, whereas today in America two, even three cars is not unusual. And medical care is much better than in those “good old days”; people died from heart attacks all the time, whereas today many live on for years after bypass and other surgery. So the good old days were not as good as we think they were.

This carries over to the FBI, and their equivalent in Britain, who just want to go back to the good old days when they got a wiretap authorisation and could listen in, whereas now they get encrypted data. What they miss is that in the 50s, yes, they could get a wiretap and hear what was being said; but they didn’t have cellphone location, and they didn’t have surveillance cameras with hard drives they could go to. They are probably getting 100x as much raw intelligence today, maybe a few percent of which is encrypted. Of course they focus on that few percent, and when there’s a true criminal case I wish they could get it, but we need to weigh the trade-offs.

So, coming back to this question of the good old days, I think there’s a nostalgia whose assumptions we need to re-examine. Were the good old days really as good as we think they were? Alan Turing was hounded to death; that wasn’t so good. We had wars; we’re much less war-like now than we were then, and it’s not just nuclear weapons. We don’t behead people publicly in England anymore. We don’t have public hangings in the United States. We are evolving.
Q: Are you optimistic about the future?
[Martin Hellman]: Firstly, what got me to shift from cybersecurity to international security initially had nothing to do with saving the world; it was to save my marriage. My wife and I met in 1966 and got married in 1967, so by 1980, 13 years into our marriage, we’d screwed it up really well! The divorce rate is 50%, so it wasn’t surprising that my wife was ready to leave me; what was surprising is that she decided not to, and wanted to figure out how to make it work. We fell madly in love when we met, and that was the spark that kept us going through inconceivable challenges in our relationship. Today? We’ve not had a fight in almost 20 years. We’ll have been married 53 years next month, so it took us a long time to get here; but where we are now is not the end point, it’s a process that goes on and on, with new things to learn all the time.
I firmly believe that ‘inconceivable does not mean impossible’. If we look at history, Gorbachev was inconceivable before he came on the scene, but he clearly was not impossible. Getting rid of sodomy laws in Britain and the United States would at one time have seemed inconceivable, but was not impossible. Women voting, in 1900, was inconceivable, but not impossible.
People need to apply peacebuilding skills in their interpersonal relationships, and it’s a process. Disagreements are an opportunity to find better solutions.
I also carry a lot of hope because of something I learned from one of my key mentors, Stanford Professor Harry Rathbun. He had a bachelor’s degree in electrical engineering but became an attorney and taught business law. Sandra Day O’Connor, the first woman on the Supreme Court, credits Harry with being a key influence in her life, if not the key influence. Harry was born in the 1890s and died in the 1980s. He taught me several things that have stuck with me, one of which is Harry’s nobler hypothesis. Harry said there are two hypotheses: either humanity is capable of the radical changes required to survive the nuclear age (the nobler hypothesis), or we’re doomed (the less noble hypothesis). He said that if we accept the less noble hypothesis, we’re doomed even if we have the capacity to change, because we won’t believe ourselves capable of it. If we accept the nobler hypothesis, the worst that happens is we go down fighting; and the best that happens? We cheat the grim reaper. So, Harry concluded: “Why not accept the nobler hypothesis?” It made sense to me then, and makes sense to me now.
Martin E. Hellman is best known for his invention, with Diffie and Merkle, of public key cryptography, the technology that, among other uses, enables secure Internet transactions. It is used to transfer literally trillions of dollars every day. He has been a long-time contributor to the computer privacy debate, and was a key participant in the “first crypto war” of the late 1970s and early 80s that established the right of academic cryptographic researchers to publish their papers, free of government interference.
His work has been recognized by a number of honors and awards, including election to the National Academy of Engineering, induction as one of the first two dozen “Stanford Engineering Heroes,” the National Inventors Hall of Fame, and the Marconi International Fellowship – and, most recently, the 2015 ACM Turing Award, often called “the Nobel Prize of Computer Science.”
Hellman has a deep interest in the ethics of technological development, and one of his current activities is applying risk analysis to a potential failure of nuclear deterrence. That approach has been endorsed by a number of prominent individuals including former Director of the National Security Agency (NSA) Adm. Bobby Inman and Stanford’s President Emeritus Donald Kennedy.
He and his wife Dorothie wrote a book, A New Map for Relationships: Creating True Love at Home & Peace on the Planet, now on sale at Amazon, Barnes & Noble, and other booksellers. Former Secretary of Defense William Perry recommended that it “should be read by married couples seeking peace at home, as well as by diplomats seeking peace in the world.” It shows how the changes needed to build a strong marriage or other relationship are the same ones needed to build a more peaceful, sustainable world. They are using his half of the $1 million ACM Turing Award to promote those ideas.
During the 1980s, Prof. Hellman helped develop a meaningful dialog between the Soviet and American scientific communities on how human thinking had to evolve for survival in the nuclear age. This effort culminated in his co-editing a book with Prof. Anatoly Gromyko of Moscow, Breakthrough: Emerging New Thinking, published simultaneously in Russian and English in 1987 during a period of rapid change in Soviet-American relations.
His efforts to overcome ethnic tension within the university have been recognized by three awards from minority student organizations.
Born in New York, NY in October 1945, he received his B.E. from New York University in 1966, and his M.S. and Ph.D. from Stanford University in 1967 and 1969, all in Electrical Engineering.
Prof. Hellman was at IBM’s Watson Research Center from 1968-69 and an Assistant Professor of Electrical Engineering at MIT from 1969-71. Returning to Stanford in 1971, he served on the regular faculty until becoming Professor Emeritus in 1996. He has authored over seventy technical papers and holds twelve US patents and a number of foreign equivalents.
Hellman has been involved with a number of high-tech startups, serving variously as a founder, advisor, and investor. In his spare time, he enjoys people, soaring, speed skating, and hiking, although the latter three are infrequent in recent years.