Norm Hardy’s Place in History
by Mark S. Miller
My speech at Norm Hardy’s memorial service, edited for clarity.
I am glad this memorial service is at the Computer History Museum. Having known Norm for many years, I have many personal things to say. But today I want to make clear Norm’s place in history. I will focus on my personal journey with Norm, since that is the part of Norm’s history I know best.
Solving the Whole Problem
In 1988, Eric Drexler and I co-authored a set of papers explaining the foundational requirements for general-purpose, secure, distributed, market-based computation. We stated the core requirements as encapsulation and communication of information, access, and resources.
Encapsulation is essentially a kind of ownership right. Communication is essentially a form of rights transfer. Together they form a foundational rights theory for computational entities to coordinate with each other. Information, access, and resources map cleanly to confidentiality, integrity, and availability. We were not looking for a theory of security to be added to computing by other means, but rather, a theory of computation which is inherently modular and secure.
When we wrote this, we knew of no systems that met all these requirements; nor did we know how to meet them ourselves. We were not certain that a simultaneous solution was possible.
In 1988, Norm found the papers and contacted us. Over lunch, Norm explained that he, with others at Key Logic, had built KeyKOS, an operating system that already met all of these requirements simultaneously. Although they had not yet built computational markets on these foundations, Norm designed KeyKOS with this goal in mind. In addition, Norm explained how KeyKOS solves the confinement problem, which we had omitted from our papers because I had thought it impossible to solve.
In the 1990s, Norm and I, together with others in this room today, founded Agorics to pursue the combined vision we now shared: market-based cooperative computation spanning the world, with no centralized controls or centralized vulnerabilities. (Not to be confused with Agoric.) At Agorics, I shared an office with Norm for two and a half years. In retrospect, that was my apprenticeship. It was the most profound learning experience of my life. That is when I learned to become a really good computer scientist, someone who could think the right way about hard problems.
During that time, I kept trying to do what I always try to do: find flaws and ways to improve things. I kept poking at the KeyKOS design, thinking I could find something to improve, some alternate way that would be better in some dimension. At least a trade-off: at least some way to make it better on one dimension at the price of making it worse on another.
For two and a half years, I did not find a single flaw or thing to improve, even in the sense of a trade-off. With every suggestion, Norm convinced me that the way KeyKOS already does it was better than my suggestion.
Norm’s 1985 Operating Systems Review paper, KeyKOS Architecture, defines everything mutually recursively in terms of everything else. When I first read it, I found it completely incomprehensible. I had to read it several times, letting the structure wash over me. Then eventually, there was one reading in which everything fit together perfectly. I understood how everything related to everything else. It was so mutually supportive that afterwards I could never think of building an operating system in any other way.
KeyKOS is the most perfect jewel of a technological artifact that I have ever seen emerge from a human mind.
Many Moments of Genius
During the time Norm and I shared an office, Agorics was translating ideas from capability operating system designs into object-capability programming languages. Even though these ideas were secure in their operating system context, I worried: how might the differences between operating systems and programming languages introduce new vulnerabilities? For example, the programming languages we were designing had garbage collection. KeyKOS, the operating system, does not.
I tried asking Norm what additional dangers garbage collection brings. After several attempts, I realized that the way to ask Norm a hard question was to pose it as a question about KeyKOS: “Norm, what if you added garbage collection to KeyKOS? What security problems would you have?” As I expected, I then had to sit through a long lecture about why it would be bad engineering to add garbage collection to KeyKOS. The only way to get to my question was to sit through that, which I did.
I patiently waited until that ran its course and then I said “Okay Norm, I understand all of that. But if you went ahead and added garbage collection to KeyKOS anyway, what would happen?” Norm closed his eyes for maybe a second or two, opened them, and said “Sensory keys would have to be weak.”
Please understand how weird this is. Weak references are off in one weird, obscure corner of garbage collection. Sensory keys are off in another weird, obscure corner of capability operating systems. It could easily have taken me five years of doing object-capability languages before I spotted the information leak, the covert channel through the garbage collector, that Norm spotted by closing his eyes for two seconds.
Dean Tribble and I are now proposing weak references for the JavaScript standard. Our proposal’s security builds on this insight of Norm’s. Without it, we might have introduced a security hole that nobody would have noticed until it was too late.
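To show the shape of the leak Norm anticipated, here is a minimal sketch of my own in TypeScript, using the WeakRef API from that standards work. The sender/receiver framing and all names are mine, not KeyKOS’s sensory keys: a party that drops a reference can signal one bit to a party that merely watches whether the garbage collector reclaims a shared object.

```typescript
// A sender that should have no channel to the receiver holds the only strong
// reference to a shared object.
let senderRef: { payload: string } | null = { payload: "shared object" };

// The receiver holds only a weak reference to the same object.
const observed = new WeakRef(senderRef);

// The sender encodes one secret bit by dropping (or keeping) its reference.
const secretBit: 0 | 1 = 1;
if (secretBit === 1) {
  senderRef = null;
}

// The WeakRef spec keeps a target alive until the end of the job in which the
// WeakRef was created or dereferenced, so the receiver must look in a later
// turn. Collection timing is engine-dependent; in Node, running with
// --expose-gc makes globalThis.gc available so the demo behaves predictably.
setTimeout(() => {
  (globalThis as { gc?: () => void }).gc?.();
  const inferredBit = observed.deref() === undefined ? 1 : 0;
  console.log("receiver inferred bit:", inferredBit); // a bit the sender never sent overtly
}, 0);
```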
Asking the Right Question
When KeyKOS started as Gnosis, it was one of many capability operating systems of the OS golden age of the 1970s. Many of these capability operating systems were intellectually fascinating.
Aside from the line of thinking rooted in Norm’s work, all the others from the OS golden age have withered, fossilized, or died out. Only the line of thinking starting with Norm has continued to bear fruit, to grow, to become important in the world, to influence project after project. Why is Norm’s approach to computation so fertile?
KeyKOS is not just an incredible feat of engineering. KeyKOS is a philosophical achievement. Most people reasoning about computer security might ask: Is this program malicious or not? What permissions does this program have? Does it have adequate permissions to do what it needs to do? Does it have permissions with which it can misbehave? They look at the permission structure alone.
Norm thought about computation differently. Norm’s 1988 paper, The Confused Deputy, looked at knowledge, purpose, and permissions together. A “deputy” is any program built to carry out requests from multiple clients for multiple purposes. Norm asked a question about the deputy nobody had ever asked:
In what kind of architecture can the deputy use each permission only for the purpose for which it was given, without being led to use it for other purposes?
The Confused Deputy is the most important, yet still subtle, criticism of the dominant paradigm: identity-based access control. Within that paradigm, by the time the deputy gets its permissions, it does not know why it has which permission. Further, when the deputy acts, it cannot choose which permissions will determine whether the action is allowed. The architecture commingles permissions in a way the deputy cannot untangle. Therefore the deputy cannot act in a way that respects the purposes for which it has been given each permission.
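To make the commingling concrete, here is a toy sketch of my own in TypeScript, loosely modeled on the compiler-and-billing-file story from Norm’s paper; the names and API are invented for illustration. The deputy has standing permission, under its own identity, to write the billing log, and it also writes wherever the client’s request names, so a single write cannot say which authority it means to use.

```typescript
// Identity-based ("ambient authority") version. All names here are illustrative.
const files = new Map<string, string>([["/var/billing.log", "charges so far\n"]]);

// An ACL-style check: the compiler service's identity may write anything.
function writeFileAs(identity: string, path: string, data: string): void {
  if (identity !== "compiler-service") throw new Error("access denied");
  files.set(path, data);
}

// The deputy: compiles source and writes the result to a client-chosen *name*.
function compile(clientOutputPath: string, source: string): void {
  const objectCode = `compiled(${source})`;
  // The deputy acts with its own identity, for whatever path it was handed.
  writeFileAs("compiler-service", clientOutputPath, objectCode);
}

// A malicious client simply names the billing log as its "output file".
compile("/var/billing.log", "print('hello')");
console.log(files.get("/var/billing.log")); // the billing record has been clobbered
```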
“A capability combines designation with authority.”
To express its purpose, a request says which object it is about. By saying so with a capability, the request also grants the deputy permission to that object, so the deputy can carry out that request. When the deputy acts, it says what the action is about by using only the capabilities it holds for that purpose. This coupling enables the deputy to know why it has been given each permission, and to use each permission only for its intended purpose.
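Here is the same toy reworked in capability style, again only a sketch with invented names: the client designates its output by handing the deputy a write capability to that one file, while the deputy keeps its billing capability separate, so each permission is used only for its purpose.

```typescript
type WriteCap = { write(data: string): void };

const files = new Map<string, string>([["/var/billing.log", "charges so far\n"]]);

// Stands in for the trusted storage layer: only it holds broad authority, and
// it hands out capabilities to individual files selectively.
function capabilityTo(path: string): WriteCap {
  return { write: (data) => files.set(path, data) };
}

// The deputy is handed its billing capability once, at setup time.
function makeCompiler(billing: WriteCap) {
  return function compile(output: WriteCap, source: string): void {
    output.write(`compiled(${source})`);      // the client's authority, for the client's purpose
    billing.write("1 compilation charged\n"); // the deputy's authority, for the deputy's purpose
  };
}

const compile = makeCompiler(capabilityTo("/var/billing.log"));

// The client passes only a capability it was granted: one to its own output file.
compile(capabilityTo("/tmp/client-output.o"), "print('hello')");
console.log(files.get("/tmp/client-output.o")); // "compiled(print('hello'))"
console.log(files.get("/var/billing.log"));     // touched only by the deputy, for billing
```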
We’ve seen subtle anomalies trigger scientific revolutions before. Until the Michelson-Morley experiment, Newtonian physics seemed to have a perfect record. The Michelson-Morley experiment detected a tiny, tiny anomaly in the most precise measurement anyone had ever done. Many kludges were invented to explain away this special case.
However, with further experiments, this tiny anomaly would always reappear. It ended up destroying the dominant paradigm. The Confused Deputy is the anomaly of identity-based access control that will always reappear. Of all the flaws we trot out, this one is irreducible.
The Norm Hardy Prize
The Foresight Institute will award an annual Norm Hardy Prize in computer security. Norm and I talked about what the prize should be about and what the criteria should be. Norm focused on knowledge, purpose, and permission again, but now with regard to the user.
How can we build computer systems so that normal human users can understand what their systems can do for them, what their systems can do to them, and what the security implications are of the actions they might take? This field is sometimes called secure usability or usable secure computing. Two seminal works, by Marc Stiegler and by Ka-Ping Yee, discovered an approach to user interface security that couples knowledge, purpose, and permission much as capabilities do. As Norm explains:
“The clipboard is inherently hostile to capabilities. Drag-and-drop is inherently friendly to capabilities.”
In the user interface, a drag-and-drop action is an act of designation. Our user interfaces are vast engines of designation, of selecting, of pointing, providing all these different rituals for the user to say: this thing should operate on that thing.
Norm was saying these acts of designation should also be the means by which this thing is given permission to operate on that thing. Applications would be given just the permission needed to carry out the user’s request, and know which permission to use for which purpose, without adding security-specific interactions to the user interface.
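As a rough sketch of my own, with invented names and not drawn from Stiegler’s or Yee’s actual designs, the pattern might look like this: the trusted user agent holds the real authority, the user’s act of picking a file yields a capability to exactly that file, and that capability is all the application receives.

```typescript
type FileCap = { read(): string };

// The trusted shell / user agent owns the real filesystem authority.
function makeUserAgent(fs: Map<string, string>) {
  return {
    // In a real UI this would be the drag-and-drop gesture or file picker.
    userPicks(path: string): FileCap {
      return { read: () => fs.get(path) ?? "" };
    },
  };
}

// The untrusted application can operate only on what it is handed.
function viewer(doc: FileCap): void {
  console.log("viewing:", doc.read());
}

const shell = makeUserAgent(new Map([["~/notes.txt", "meeting at noon"]]));
// The user's act of picking the file is also the grant of permission to it.
viewer(shell.userPicks("~/notes.txt"));
```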
This insight, and these seminal works, point the way and give us hope that users may someday be able to use secure systems securely. But they only scratch the surface. We need new research to enable normal, inattentive human beings to tacitly understand the security implications of the actions they might take. May the Norm Hardy Prize encourage progress.
Norm Becomes Persistent
Norm is a pioneer of orthogonal persistence, first technically and then personally.
To cope with machine failures, today we code in terms of ephemeral processes and persistent files, making software both more complex and less robust. KeyKOS’s orthogonal persistence masks machine failures. KeyKOS software is written as if machines never fail. KeyKOS processes are immortal; they survive the destruction of the hardware they run on.
To achieve orthogonal persistence, you take a checkpoint, i.e., you snapshot the computation in a consistent and viable state to non-volatile storage. At a later time, you restore the suspended computation from that non-volatile storage onto a working machine.
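As a toy illustration only, with a file name and structure of my own choosing (KeyKOS does this transparently and system-wide, beneath every application), the checkpoint/restore cycle looks something like this:

```typescript
// Minimal sketch of checkpoint/restore: snapshot a consistent copy of the
// running state to non-volatile storage, and on restart resume from the last
// snapshot as if the failure never happened.
import { existsSync, readFileSync, writeFileSync } from "node:fs";

interface State { counter: number }

const CHECKPOINT_FILE = "checkpoint.json"; // hypothetical path

function checkpoint(state: State): void {
  writeFileSync(CHECKPOINT_FILE, JSON.stringify(state)); // snapshot to disk
}

function restore(): State {
  return existsSync(CHECKPOINT_FILE)
    ? (JSON.parse(readFileSync(CHECKPOINT_FILE, "utf8")) as State)
    : { counter: 0 }; // first run: fresh state
}

// The "immortal" computation: it always resumes where the last snapshot left off.
const state = restore();
state.counter += 1;
checkpoint(state);
console.log(`this computation has run ${state.counter} time(s) across restarts`);
```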
Norm practices what he preaches. I refer to Norm in the present tense because he is not dead. Norm is suspended. Norm has been checkpointed to non-volatile storage.
Norm saw that his body was going to fail. He knew that if he did not checkpoint, the process would be lost. In order to checkpoint in a consistent and viable state, Norm became the first person to use California’s so-called “Death with Dignity” legislation to choose the time of his cryonics suspension. Alcor reports this is one of the best suspensions they have ever had.
Our Mission
Norm’s situation leads to a mission for all of us. We cannot yet build the machines that can restore the Norm process from this checkpoint. In order to get to a future where we can build those machines, civilization has to survive and thrive long enough.
Civilization currently rests on the infrastructure that our industry has built. If this infrastructure had been built according to Norm’s prescriptions, I would be much less worried. It was not. Right now, civilization rests on computing systems that are not only insecure, but insecurable.
Norm showed us a better way. In order to survive into a future where we can revive him and see him again, we must rebuild our infrastructure by the principles Norm has taught us — so when Norm is revived he can look at it, he can appreciate it, and he can tell us what we did wrong.