Alan Kay - Academia.edu
Papers by Alan Kay
Lecture Notes in Computer Science, 1994
Proceedings of the Programming Experience 2016 (PX/16) Workshop, 2016
One of the primary goals of our research group is to improve education through computing. We are interested in unleashing the power of the computer to create automated "intelligent" tutor systems (ITS). This paper presents ideas that may guide the design of such a system, targeting the problem of computer-programming education in particular. We also outline a research and development plan to build this system. While this plan is just a straw man (there is a lot of uncertainty), our hope is to get a discussion started on this important topic.
Proceedings of the ACM Conference on The history of personal workstations, 1986
Proceedings of the 4th ACM international conference on Embedded software, 2004
If we compare with "The Printing Revolution" and "Modern Science and Engineering in the Physical World", then we have to conclude that the computer versions of these haven't happened yet. The printing revolution was not the hardware technology that allowed the automation of handwritten texts such as bibles, but a much longer learning curve that brought about new ways to think and argue, culminating in the 17th century with new ways to understand the physical and political universes in which humans live. This created our modern world. By contrast, we are still generally "automating bibles" with computers, and the "new ways to think and argue" are still being invented and haven't reached most computer users. Similarly, if we compare the state of software building against modern engineering - such as the building of the Empire State Building in New York City from scratch in less than one year with fewer than 3,000 coordinated workers - then we can hardly claim "software engineering" to be much past ancient Egyptian pyramid building: millions of lines of code-bricks piled on top of each other with little coordination or discernible architecture. A science is generally about finding better models of structures in the universe. As Simon pointed out, ours is a "science of the artificial", like a "science of bridges": we first have to make structures and then study them to create better theories. We might regard McCarthy's LISP-in-itself as a kind of "Maxwell's Equations", but we still lack the equivalent "Special Theory" that includes the handling of time in a reasonable and useful way. The immense commercialization of computing that happened in the 80s and 90s disrupted much of the progress in each of these areas, especially in the US.
It's now time to ask again what these three ideas should mean, and then get back to the real work of advancing our field and making the real computer revolution happen.
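The "LISP-in-itself as Maxwell's Equations" remark refers to McCarthy's half-page eval/apply definition of Lisp written in Lisp. A minimal sketch of that idea, written here in Python for readability rather than in Lisp itself, and with the supported forms and names chosen only for illustration:

```python
# A toy McCarthy-style evaluator: expressions are nested Python lists,
# symbols are strings, environments are dicts. This is an illustrative
# sketch of the "interpreter on one page" idea, not McCarthy's exact code.

def evaluate(expr, env):
    if isinstance(expr, str):              # symbol: look it up
        return env[expr]
    if not isinstance(expr, list):         # literal (e.g. a number)
        return expr
    op, *args = expr
    if op == "quote":                      # (quote e) -> e, unevaluated
        return args[0]
    if op == "if":                         # (if cond then else)
        cond, then, alt = args
        return evaluate(then if evaluate(cond, env) else alt, env)
    if op == "lambda":                     # (lambda (params) body) -> closure
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)                 # application: evaluate head and args
    return fn(*[evaluate(a, env) for a in args])

base_env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

# ((lambda (x) (* x x)) (+ 1 2))  evaluates to 9
result = evaluate([["lambda", ["x"], ["*", "x", "x"]], ["+", 1, 2]], base_env)
```

Notably, the whole interpreter is a handful of cases; what such kernels still lack, as the abstract argues, is a comparably elegant account of time and state.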
Conference on Object-Oriented Programming Systems, Languages, and Applications, 1997
Proceedings of the May 16-18, 1972, spring joint computer conference on - AFIPS '72 (Spring), 1971
Computer Design in the seventies will be delineated by a number of past ghosts and present spectres.
Proceedings of the 11th international conference on Software engineering - ICSE '89, 1989
Specification-Based Development: The history of Computer Science has been chronicled by advances in programming languages, and these advances will surely continue. However, we've also always been limited by our compiler technology: our programming languages contain only what we know how to (efficiently) compile. But now (uncompilable) specification languages have begun to appear. For these languages to become practical, they must be not just documentation for an implementation but the means of developing it. Since these languages are uncompilable, human guidance (i.e. design) must be used to reach the stage where compilers can take over, and reused when the specification is subsequently evolved. The challenge is providing technology to support this translation and later retranslation, so that we can ride the wave of improving specification languages instead of the compiler-limited programming-language wave.
ACM Turing Centenary Celebration on - ACM-TURING '12, 2012
Part of Turing's fame and inspiration came from showing how a simple computer can simulate every other computer, and so "anything is possible". The "Turing Tarpit" is getting caught by "anything is possible but nothing is easy". One way to get caught is to stay close to the underlying machine with our languages, so that things seem comprehensible in the …
Fourth International Conference on Creating, Connecting and Collaborating through Computing (C5'06), 2006
Collaborative environments range from the very primitive (IM) to a rich mixture of media, computation, and three-dimensional graphics. Critical to all such systems is scalability: how many users can a system support in a single session? How many simultaneous sessions can be supported? These questions ultimately reduce to the bandwidth and computation consumption of the collaborative system; the values, in turn, depend on the architectural choices made in the implementation. Should the system be client-server or peer-to-peer? Should data or computation be replicated? Are there tradeoffs? Does it depend on the nature of the collaboration? In this paper, we derive bounds on bandwidth and latency for consistent collaborative systems. A consistent system is one in which all participants agree on the order of events across all peers. We examine and calculate the bandwidth and latency bounds on various system architectures, calculate a theoretical lower bound for computational and bandwidth complexity, and compare the architectural bounds to the lowest global bound.
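The architectural tradeoff the abstract poses can be made concrete with a toy model. This is our back-of-the-envelope illustration, not the paper's actual derivation, and it assumes one message per link, uniform link delay, and no acknowledgements:

```python
# Per-event message counts and hop latency for two ways of keeping n
# collaborating peers consistent (all peers see events in the same order).
# The cost model is an assumption chosen purely for illustration.

def client_server(n):
    # The originating client sends the event to a central sequencer, which
    # assigns the global order and rebroadcasts it to all n clients.
    messages = 1 + n      # 1 uplink + n downlinks
    hops = 2              # client -> server -> clients
    return messages, hops

def peer_to_peer(n):
    # Fully replicated peer-to-peer: the originator sends directly to the
    # other n - 1 peers (agreeing on a global order must be handled separately).
    messages = n - 1
    hops = 1
    return messages, hops

for n in (2, 16, 128):
    print(f"n={n:>3}  client-server={client_server(n)}  p2p={peer_to_peer(n)}")
```

Even this crude model shows the shape of the tradeoff the paper formalizes: the sequencer buys a cheap total order at the price of an extra hop, while peer-to-peer minimizes latency but must pay elsewhere for consistency.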
Scientific American, 1984
Presenting a single-topic issue on the concepts and techniques needed to make the computer do one's bidding. It is software that gives form and purpose to a programmable machine, much as a sculptor shapes clay.