Multics Security
[Photo: Paul Karger and Roger Schell, Oakland, 1994.]
How the Air Force cracked Multics Security
Security was one of the "second system" issues we determined to do right on Multics. At that time in the mid-1960s, all then-existing computer systems could be cracked: that is, their file access controls could be defeated, and any user who could run a program on the machine could take the machine over. We wanted to build a system whose access controls couldn't be bypassed. This turned out to be a hard problem.
MIT's share of the development money for Multics came from Project MAC's contract with the US government's Advanced Research Projects Agency (ARPA), a part of the Department of Defense. (You can see an acknowledgment of ARPA support at the bottom of published Multics papers, such as the Multics intro paper.) ARPA never asked us for features that would help design bombs or destroy villages, but they did encourage us to work on secure systems, and we made this a central design goal for the system. (This was where we came into conflict with some MIT programmers who indulged in the traditional undergraduate practice of breaking into everything. In our eyes, these activities, especially when they crashed a system, justified our attempts to make the system uncrackable.)
I was going to say that we never saw a colonel, only civilians; but it did turn out that our ARPA connection got us some extra people from other government agencies who worked on Multics development around 1967 and 1968. One fellow was from the Weather Bureau, another from the CIA. They were just programmers, though, and didn't make a big deal of injecting requirements into the system for their employers. And one of the brightest and hardest-working graduate students, who did his thesis on dynamic reconfiguration of the system, was Roger Schell, who it turned out was a US Air Force major sent to MIT to get an advanced degree.
As Multics developed further, Honeywell contracted with the Air Force to extend Multics access control to match the traditional military security model of SECRET, TOP SECRET, and so on. This was a natural extension of the system, and it came with money we needed. (Many technical decisions on Multics were ones that led to extra people or funding.) The goal of the Air Force project was to come up with a time-sharing system that could serve users at more than one clearance level, such that no user could get at data they weren't cleared to have. The Air Force team was led by Roger Schell; they also had a brilliant team from MITRE working with them. This project ran from about 1972 to 1974.
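The heart of that military model is a simple comparison of security labels. Here is a minimal sketch in C, assuming a plain linear ordering of levels; the real Access Isolation Mechanism described below also carried category sets, and the two rules are the standard "no read up" and "no write down" properties from the Bell-LaPadula theory developed at MITRE.

    #include <stdbool.h>
    #include <stdio.h>

    typedef enum { UNCLASSIFIED, CONFIDENTIAL, SECRET, TOP_SECRET } level;

    /* "No read up": a subject may read an object only if the subject's
       clearance dominates the object's classification. */
    static bool may_read(level subject, level object)  { return subject >= object; }

    /* "No write down": a subject may write an object only if the object's
       classification dominates the subject's level, so a TOP SECRET
       process can't copy what it sees into a SECRET file. */
    static bool may_write(level subject, level object) { return subject <= object; }

    int main(void) {
        printf("SECRET user reads TOP SECRET file: %s\n",
               may_read(SECRET, TOP_SECRET) ? "allowed" : "denied");
        printf("SECRET user writes CONFIDENTIAL file: %s\n",
               may_write(SECRET, CONFIDENTIAL) ? "allowed" : "denied");
        return 0;
    }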
The whole project was called Project GUARDIAN. Honeywell was responsible for adding features to the system's resource management and access control. The MITRE crew laid down some basic theory. A team from MITRE and the Air Force looked for security problems in the existing system: this tiger team called themselves Project ZARF. (A zarf, in their private lingo, was one of those plastic holders for disposable coffee-cup inserts, which they called finjans. I think I have this the right way around.) They tried to break Multics security on the MIT GE-645 that we all used as our time-sharing utility and development build and exposure site.
And break it they did, as described in a 1977 New Yorker article, "Dead Souls in the Computer," by Thomas Whiteside (and later in his book Computer Capers). At a meeting in a Honeywell conference room, they handed me my password on a slip of paper. They'd exploited a bug in the obsolete interface put in for the XRAY facility, Jerry Grochow's thesis. This supervisor entry didn't do anything, but it accessed its arguments incorrectly, in a way that let the team cause the hardcore to patch itself. They'd used that hole to permanently install a tool that let them patch any location and read any file, and they'd obtained a copy of the password file from the MIT Multics site.
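To show the shape of that hole, here is a minimal sketch in C; the function and check names are invented for illustration, and the real flaw involved 645 cross-ring argument addressing rather than C pointers. The essence is a privileged entry that stores through a caller-supplied address without validating it:

    #include <stddef.h>

    static int  supervisor_table[64];  /* stand-in for hardcore (ring-0) data */
    static char user_space[4096];      /* stand-in for the caller's own segment */

    /* Does the argument lie entirely within the caller's memory? */
    static int user_address_ok(const void *p, size_t n)
    {
        const char *c = (const char *)p;
        return c >= user_space && n <= sizeof user_space
            && c <= user_space + sizeof user_space - n;
    }

    /* BROKEN: the do-nothing entry still stores a status code through its
       argument pointer, so a caller who passes the address of a hardcore
       word gets the supervisor to patch itself. */
    void xray_status_broken(int *result)
    {
        *result = 0;
    }

    /* FIXED: validate the argument first -- the check that later moved
       into the 6180 hardware for cross-ring calls. */
    int xray_status_fixed(int *result)
    {
        if (!user_address_ok(result, sizeof *result))
            return -1;                 /* reject addresses outside user space */
        *result = 0;
        return 0;
    }

    int main(void)
    {
        /* The attack in miniature: aim the "argument" at supervisor data. */
        xray_status_broken(&supervisor_table[3]);  /* succeeds: kernel patched */
        xray_status_fixed(&supervisor_table[3]);   /* fails: returns -1 */
        return 0;
    }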
My code in the Multics User Control subsystem stored passwords one-way encrypted, at the suggestion of Joe Weizenbaum. (This was prompted by an accident on CTSS in 1965, when two users edited files in the same directory, not realizing that the editor created intermediate files with a fixed name: the whole CTSS password file typed out in the message of the day on every login, at 4 PM on a Friday, until an alert user (Bill Mathews) noticed it and entered an XEC* instruction in the debugger to freeze the system. But I digress.) I was no cryptanalyst; Joe had suggested I store the square of the password, but I knew people could take square roots, so I squared each password and ANDed it with a mask to discard some bits. The Project ZARF folks then had to try 32 values instead of one, no big deal: except that there was a PL/I compiler bug in squaring long integers that gave wrong answers. If the compiler bug had been discovered and fixed, nobody would have been able to log in. The crackers had to construct some fancy tables to compensate for the "Martian" arithmetic, but they still had only to try a few hundred values to invert the transform. (We quickly changed the encryption to a new, stronger method, before Barry Wolman fixed the compiler bug.)
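For the curious, here is a toy 32-bit reconstruction of that transform and its inversion in C. The mask below is an assumption chosen to discard five bits, matching the "32 values" figure; the real system used 36-bit words, and the PL/I squaring bug made the actual table-building messier:

    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    #define DISCARD 0x000001F0u   /* five bits the mask throws away (assumed) */
    #define MASK    (~DISCARD)

    /* The stored "one-way" value: square, then drop some bits. */
    static uint32_t pw_hash(uint32_t p) { return (p * p) & MASK; }

    /* Invert a stolen hash: try every fill of the discarded bits; any
       fill that yields a perfect square gives a password candidate. */
    static void invert(uint32_t h)
    {
        for (uint32_t fill = 0; fill <= DISCARD; fill++) {
            if (fill & ~DISCARD) continue;       /* only bits inside DISCARD */
            uint32_t candidate = h | fill;
            uint32_t r = (uint32_t)sqrt((double)candidate);
            for (uint32_t p = r ? r - 1 : 0; p <= r + 1; p++)
                if (p * p == candidate)
                    printf("candidate password value: %u\n", p);
        }
    }

    int main(void)
    {
        /* Password value kept below 2^16 so its square fits in 32 bits. */
        invert(pw_hash(31337));   /* recovers 31337 after at most 32 fills */
        return 0;
    }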
The second 645 Multics system sold went to the Air Force, at Rome Air Development Center. Project GUARDIAN went on to specify the Access Isolation Mechanism, an extensive set of changes to the system that enforced the classification model, labeled output, and so on; these features were made part of the standard system, set up so that if you didn't invoke them they imposed no penalty.

The argument-handling exploit behind the ZARF break-in led Mike Schroeder and Jerry Saltzer to propose hardware changes to the ring implementation, which Honeywell incorporated into the 6180 and subsequent Multics machines; the new hardware checked cross-ring arguments automatically. In the 6180, crucial security features of 645 Multics migrated from software to hardware, and many software security bugs, such as the original argument-handling bug, were tracked down and removed. No back door was ever discovered in any 6180 Multics.

Five large Multics systems were also sold to the Air Force Data Services Center in the Pentagon and to other government agencies concerned with secure operation. The final report of the Air Force team, Karger, P. A., and Schell, R. R., "Multics Security Evaluation: Vulnerability Analysis," ESD-TR-74-193 Vol. II, ESD/AFSC, Hanscom AFB, Bedford, MA (June 1974), mentioned, in a few lines, that one could hide a back door in the binary of the compiler.
Roger Schell, of course, went on to direct the NSA National Computer Security Center (NCSC), which created the famous Orange Book that defined what was necessary for different levels of computer security. There is a strong Multics flavor through the whole Orange Book and related documents. In 1985, Multics became the first system to be rated B2 by the NCSC.
When the Air Force bought the Multics for the Pentagon and ran it with multiple security levels, they insisted that we provide them with source for everything, so that they could, at least in theory, recompile the whole system and inspect every line of source. This provided some additional assurance, roughly at the level of today's much-touted Open Source (hosanna music), but as Karger and Schell had pointed out previously, it was no guarantee. Ken Thompson, in his Turing Award lecture "Reflections on Trusting Trust" (1984), said,
The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code.
Unless the Air Force developed their own PL/I compiler from scratch, a hacked compiler could have added the back door to itself when bootstrapping, and to some critical supervisor routine, opening a hole in security.
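A minimal sketch, in C, of how such a self-perpetuating back door works; the "compiler" here is reduced to line-at-a-time pattern matching, and the marker strings and master password are invented (a real attack recognizes code structure, not literal strings):

    #include <stdio.h>
    #include <string.h>

    /* Stand-in for real code generation: just print what we "compile". */
    static void emit(const char *code) { printf("    %s\n", code); }

    static void compile_line(const char *line)
    {
        /* Trojan 1: compiling the login program?  Also accept a fixed
           master password that appears in no source listing. */
        if (strstr(line, "check_password"))
            emit("if (strcmp(pw, \"zarf\") == 0) return OK;  /* injected */");

        /* Trojan 2: compiling the compiler itself?  Re-insert both of
           these tests, so the back door survives recompilation from
           perfectly clean source. */
        if (strstr(line, "compile_line"))
            emit("/* ...code that re-emits trojans 1 and 2... */");

        emit(line);   /* then compile the line normally */
    }

    int main(void)
    {
        compile_line("int check_password(const char *pw) {");
        compile_line("static void compile_line(const char *line) {");
        return 0;
    }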
References
Karger, Paul A., and Roger R. Schell, "Thirty Years Later: Lessons from the Multics Security Evaluation," Proceedings of the 18th Annual Computer Security Applications Conference (ACSAC), 2002. It also contains a reprint of the 1974 evaluation report mentioned above.