Clark Thomborson | The University of Auckland
Papers by Clark Thomborson
Empirical Software Engineering, 2015
Communications in Computer and Information Science, 2014
Microelectronics Reliability, 1996
IFIP On-Line Library in Computer Science, 2005
Lecture Notes in Computer Science, 2008
ABSTRACT We develop a simple model of the processes by which identity and anonymity are managed by complex systems. We explain New Zealand’s recently-proposed Identity Verification Service in terms of our model. We also indicate how our model might be used to guide the architecture of a next generation of trustworthy computing, and how it might be used to define a precise taxonomy of authentication.
IFIP TC11 Publications, 2004
Software fixes, patches and updates are issued periodically to extend the functional life cycle of software products. In order to facilitate the prompt notification, delivery, and installation of updates, the software industry has responded with update and patch management systems. Because of the proprietary nature of these systems, improvement efforts by academic researchers are greatly restricted. One solution to increasing …
Proceedings International Conference on Computer Design. VLSI in Computers and Processors (Cat. No.98CB36273), 1998
[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference, 1989
A description is given of several modifications to the Ziv-Lempel data compression scheme that improve its compression ratio at a moderate cost in run time (J. Ziv, A. Lempel, 1976, 1977, 1978). The best algorithm reduces the length of a typical compressed text file by about 25%. The enhanced coder compresses approximately 2000 bytes of text every second before optimization, …
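The abstract above summarises enhancements to Ziv-Lempel coding without detailing them. For background, the baseline LZ78 scheme that such coders build on can be sketched as follows; this is an illustrative sketch of the classic algorithm only, not the paper's modified coder:

```python
def lz78_compress(text):
    """Minimal LZ78: emit (dictionary index, next char) pairs."""
    dictionary = {"": 0}          # phrase -> index; empty phrase is index 0
    output = []
    phrase = ""
    for ch in text:
        candidate = phrase + ch
        if candidate in dictionary:
            phrase = candidate    # keep extending the longest known phrase
        else:
            output.append((dictionary[phrase], ch))
            dictionary[candidate] = len(dictionary)
            phrase = ""
    if phrase:                    # flush any trailing phrase
        output.append((dictionary[phrase], ""))
    return output

def lz78_decompress(pairs):
    """Rebuild the phrase dictionary while decoding."""
    phrases = [""]                # index 0 is the empty phrase
    out = []
    for index, ch in pairs:
        phrase = phrases[index] + ch
        out.append(phrase)
        phrases.append(phrase)
    return "".join(out)
```

The dictionary grows by one phrase per emitted pair, so repeated substrings are coded ever more cheaply; the paper's improvements to compression ratio target exactly this phrase-selection behaviour.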
[1991] Proceedings. Data Compression Conference, 1991
Summary form only given, substantially as follows. The V.42bis standard for data-compressing modems is discussed from algorithmic, experimental, practical, theoretical, and marketing standpoints. On a test set of five 100-Kbyte text and executable files, the compression ratios are found to be markedly inferior to those of several other algorithms, notably the Cleary-Witten adaptive Markov code and the Fiala-Greene Ziv-Lempel variant. However, since all the better-compressing methods have much larger data structures, none is superior to V.42bis for use in a contemporary modem.
ABSTRACT Cognitive Informatics (CI) is a transdisciplinary approach to the cognitive and information sciences, emphasising the informational aspects of cognitive processes, with applications in the engineering of complex systems. Human cognition is a transcultural phenomenon; however, to date all contributions to CI have been based on Western philosophy and science. In this article, we indicate how some of the fundamental concepts in Buddhist epistemology may be modeled in the CI framework. In particular, we develop a logical specification, in the Z notation, of cognitive processes which occur at levels 1 through 4 of the Layered Reference Model of the Brain (LRMB). We call these processes the Dhammic Framework. As with any axiomatic system, the validity of the Dhammic Framework cannot be proved by experimentation; but it could be invalidated if any of its implications were either logically inconsistent or in disagreement with experimental observation. Our formal statement of the Dhammic Framework will allow its axioms to be tested, scientifically, for contradiction within the framework of cognitive informatics. To this end, we propose a testable hypothesis about a way to avoid failures in systems engineering.
Computer security is addressed from the economic point of view rather than the more traditional technical one. The reasons why security fails, such as the cost of security, incentive failures, the Tragedy of the Commons, and a lack of knowledge, are investigated. Finally, some of the proposed (economics-based) solutions to security problems, such as enforcing liabilities and government regulation, are discussed.
Information Processing Letters, 1985
Information Processing Letters, 1998