Daniel Brand - Academia.edu
Papers by Daniel Brand
… IEEE Transactions on, 1980
Abstract--The production of error-free protocols or complex process interactions is essential to reliable communications. This paper presents techniques both for the detection of errors in protocols and for the prevention of errors in their design. The methods have been used successfully to detect and correct errors in existing protocols. A technique based on a reachability analysis is described which detects errors in a design. This "perturbation technique" has been implemented and has successfully detected inconsistencies or errors in existing protocol designs, including both X.21 and X.25. The types of errors handled are state deadlocks, unspecified receptions, nonexecutable interactions, and state ambiguities. The errors are classified and their effects considered. An interactive design technique is then described that prevents design errors. The technique is based on a set of production rules which guarantee that complete reception capability is provided in the interacting processes. These rules have been implemented in the form of a tracking algorithm that prevents a designer from creating unspecified receptions and nonexecutable interactions and monitors for the presence of state deadlocks and ambiguities.
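The reachability-based "perturbation technique" can be illustrated with a small sketch. The two-process model and all names below are illustrative, not the paper's implementation: two finite-state machines exchange messages over FIFO queues, and a breadth-first exploration of the global states flags state deadlocks and unspecified receptions.

```python
from collections import deque

# Each process maps state -> {('!', msg): next_state} for sends and
# {('?', msg): next_state} for receptions. A global state is
# (state1, state2, queue_1to2, queue_2to1). Queues are FIFO and
# assumed bounded by the examples, so exploration terminates.
def explore(p1, p2, start=('A', 'A')):
    errors = []
    seen = set()
    frontier = deque([(start[0], start[1], (), ())])
    while frontier:
        s1, s2, q12, q21 = frontier.popleft()
        if (s1, s2, q12, q21) in seen:
            continue
        seen.add((s1, s2, q12, q21))
        moves = []
        for (kind, msg), nxt in p1.get(s1, {}).items():
            if kind == '!':
                moves.append((nxt, s2, q12 + (msg,), q21))
            elif kind == '?' and q21 and q21[0] == msg:
                moves.append((nxt, s2, q12, q21[1:]))
        for (kind, msg), nxt in p2.get(s2, {}).items():
            if kind == '!':
                moves.append((s1, nxt, q12, q21 + (msg,)))
            elif kind == '?' and q12 and q12[0] == msg:
                moves.append((s1, nxt, q12[1:], q21))
        # Unspecified reception: a message is at the head of a queue
        # but the receiver's current state defines no reception for it.
        if q12 and ('?', q12[0]) not in p2.get(s2, {}):
            errors.append(('unspecified reception', (s1, s2, q12, q21)))
        if q21 and ('?', q21[0]) not in p1.get(s1, {}):
            errors.append(('unspecified reception', (s1, s2, q12, q21)))
        # State deadlock: no process can move and no messages are in transit.
        if not moves and not q12 and not q21:
            errors.append(('deadlock', (s1, s2, q12, q21)))
        frontier.extend(moves)
    return errors
```

For example, a process that sends `x` and then waits for a `y` that its partner never sends is reported as a deadlock; if the partner has no reception defined for `x` at all, an unspecified reception is reported instead.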
ACM SIGPLAN Notices, 1986
Frequently a domain of discourse can be described as a hierarchical structure. Then, instead of using one-place predicates to describe the type of a term, one can associate types with terms directly and let the unification procedure take advantage of them. This prunes the search space without the use of the Cut operator.
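As a sketch of the idea, assuming a toy sort hierarchy and term encoding (none of it from the paper): a typed variable unifies only with terms whose sort lies at or below its own, so ill-typed combinations are rejected during unification itself rather than filtered later by one-place type predicates.

```python
# A toy sort hierarchy mapping each sort to its parent; the hierarchy,
# term encoding, and function names are invented for illustration.
HIERARCHY = {'dog': 'animal', 'cat': 'animal', 'animal': 'thing'}

def is_subsort(a, b):
    # True if sort a equals b or lies below b in the hierarchy.
    while a is not None:
        if a == b:
            return True
        a = HIERARCHY.get(a)
    return False

def unify(t1, t2):
    # Terms are ('var', name, sort) or ('const', name, sort).
    # Returns a binding dict, or None if the terms cannot unify;
    # the sort check prunes ill-typed branches of the search space.
    kind1, name1, sort1 = t1
    kind2, name2, sort2 = t2
    if kind1 == 'var' and is_subsort(sort2, sort1):
        return {name1: t2}
    if kind2 == 'var' and is_subsort(sort1, sort2):
        return {name2: t1}
    if kind1 == kind2 == 'const' and name1 == name2:
        return {}
    return None
```

A variable of sort `animal` binds to a constant of sort `dog`, while a variable of sort `dog` fails immediately against a `cat` constant instead of producing a branch a type predicate would later have to cut.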
BEAM is a tool for finding errors in C++ programs by source code analysis. It uses data flow analysis to find paths leading to an error. Classical data flow analysis propagates information along all paths, including non-executable ones, which results in reporting errors that are not real. To avoid this problem, BEAM restricts data flow analysis to paths that are executable and, in addition, consistent with the user's assumptions on the initial state of memory. However, the user is not required to specify these assumptions. The main contribution of this paper is an algorithm for finding an executable path to an error while avoiding path enumeration.
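The benefit of restricting analysis to executable paths can be shown with a minimal sketch (an illustration of the general idea only, not BEAM's algorithm): a path whose branch conditions are contradictory cannot execute, so any error reported along it is a false positive.

```python
# Keep an error report only if the branch conditions on its path are
# mutually satisfiable, i.e. the path is executable.
def path_feasible(conditions):
    required = {}
    for var, value in conditions:  # (branch variable, value it must take)
        if required.setdefault(var, value) != value:
            return False  # the path needs var to be both True and False
    return True

# Classical data flow analysis would report an error along both paths;
# only the first can actually execute.
paths_to_error = [
    [('p', True), ('q', False)],   # feasible
    [('p', True), ('p', False)],   # non-executable, a false positive
]
real_errors = [path for path in paths_to_error if path_feasible(path)]
```

BEAM's actual contribution is finding such a feasible path without enumerating all paths; the sketch only shows why infeasible paths must be excluded.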
Journal of the ACM, 1983
1. Introduction. The trend toward distributed computing and computer networks is increasing the complexity of communication protocols. Formal methods of specification and analysis are being gradually introduced to handle the complexity. We will briefly review the most common ...
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 1993
Simulation is today the most common form of verification. One disadvantage of simulation is the excessive number of tests needed for complete coverage. However, as will be shown, the number of tests may be substantially reduced if test case generation is combined with a structural analysis. The resulting set of test cases for exhaustive simulation may be smaller than exponential, which might make exhaustive simulation feasible. Even if the set of test cases is still too large, choosing tests from this reduced set results in better coverage than otherwise.
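One way the combination of structural analysis and test generation can shrink an exhaustive test set, sketched under simplifying assumptions (the function and encoding are invented for illustration): if analysis shows each output depends only on a small "support" of inputs, it suffices to exercise every assignment to each support, with the remaining inputs held constant, rather than all 2^n input vectors.

```python
from itertools import product

# supports: for each output, the tuple of input indices it depends on.
# Inputs outside a support are held at 0 while that support is
# exercised exhaustively.
def reduced_exhaustive_tests(n_inputs, supports):
    tests = set()
    for support in supports:
        for values in product([0, 1], repeat=len(support)):
            vector = [0] * n_inputs
            for index, value in zip(support, values):
                vector[index] = value
            tests.add(tuple(vector))
    return sorted(tests)
```

With 8 inputs and two outputs depending on inputs {0,1} and {2,3} respectively, this needs 7 distinct vectors instead of 256, illustrating how the reduced set can be smaller than exponential.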
Proceedings of the 1993 IEEE/ACM international …, 1993
Verification of Large Synthesized Designs. Daniel Brand ... In our experiments the number of stages of separation did not significantly affect the performance of the algorithm, provided that for any miter that is too large we consider smaller miters in its place. ...
Artificial Intelligence, 1976
Analytic resolution is a proof procedure for predicate calculus based on the ideas of semantic trees and analytic tableaux. It is related to the unit preference with set-of-support strategy, and incorporates some features of model elimination. The philosophy is to expect and compensate for "blind alleys" by a stack discipline. This eliminates pollution of the search space by a bad choice of the next step in a proof. Experimental results included compare favourably with others from the literature.
… Design: VLSI in …, 1994
This paper describes some of the synthesis controls in the BooleDozer synthesis system which are unique in concept and implementation. Rather than attempting to achieve the maximum amount of optimization in the minimum amount of run time, the designer specifies the restructuring ...
Computer Networks (1976), 1978
A protocol verifier using symbolic execution has been designed and implemented as part of a general verifier (oriented towards microcode). This part describes how this method works for communication protocols involving timing assumptions, state changes depending on message contents, unreliable medium, an arbitrary number of communicating processes, etc. The method can detect design errors such as deadlock and tempo-blocking; in addition the user can add his own assertions to express other desired properties.
Proceedings of International Conference on Computer Aided Design, 1996
This paper studies the confidence with which power can be estimated at various levels of design abstraction. We report the results of experiments designed to identify and evaluate the sources of inaccuracies in gate-level power estimation. In particular, we are interested in power estimation during logic synthesis. Factors that may invalidate or diminish the accuracy of power estimates include optimization, ...
16th Design Automation Conference, 1979
... at this level for documentation, specification, and simulation [1,2,4]. This paper will describe the use of a particular machine description language, LSS, to construct an abstract state machine [6] whose quantities are amenable to symbolic manipulation by computer program. ...
Science of Computer Programming, 2010
The size of today's programs continues to grow, as does the number of bugs they contain. Testing alone is rarely able to flush out all bugs, and many lurk in difficult-to-test corner cases. An important alternative is static analysis, in which correctness properties of a program are checked without running it. While it cannot catch all errors, static analysis can catch many subtle problems that testing would miss.
Proceedings of IEEE International Conference on Computer Aided Design (ICCAD), 1995
This paper describes optimization techniques using don't-care conditions that span the domain of high-level and logic synthesis. The following three issues are discussed: 1) how to describe and extract don't-care conditions from high-level descriptions; 2) how to pass don't-care ...
Proceedings of the 2008 ACM symposium on Applied computing - SAC '08, 2008
Abstract: We propose a new abstraction for pointer analysis that represents reads and writes to memory instead of traditional points-to relations. Compared to points-to graphs, our Assign-Fetch Graph (AFG) leads to concise procedure summaries that can be used in any calling context. Also, its flexibility supports new analysis techniques with different trade-offs between speed and precision.
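A toy sketch of the read/write-centric view (the class and encodings are invented for illustration and are not the paper's data structure): a procedure summary records "assign" and "fetch" nodes against symbolic locations, leaving unresolved locations for each calling context to resolve, instead of computing points-to sets up front.

```python
# Instead of points-to sets, record each write ("assign") and read
# ("fetch") of a symbolic location; the summary is reusable because
# nothing in it depends on the caller's aliasing.
class AssignFetchSummary:
    def __init__(self):
        self.assigns = []   # (location, value) pairs written
        self.fetches = []   # locations read

    def assign(self, location, value):
        self.assigns.append((location, value))

    def fetch(self, location):
        self.fetches.append(location)
        return ('fetched', location)

# Summary of a procedure body like:  *p = x;  y = *q;
def summarize():
    summary = AssignFetchSummary()
    summary.assign(('deref', 'p'), ('var', 'x'))
    summary.fetch(('deref', 'q'))
    return summary
```

A caller that knows `p` and `q` alias can then connect the recorded write to the recorded read; a caller where they do not alias simply leaves them unconnected, which is the context-independence the abstract claims.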
Proceedings of IEEE International Conference on Computer Aided Design (ICCAD), 1995
Abstract: It is commonly expected that any correct implementation can replace its specification inside a larger design without violating the correctness of the whole design. This property (called replaceability) is automatically satisfied in the absence of don't cares ...
IBM Journal of Research and Development, 2000
... several varieties of these, but the one to be chosen depends on information not available to synthesis, such as the amount and type of fan-out or wire lengths on the ... John V. GerW, IBM Data Systems Division, PO Box 390, Poughkeepsie, New York 12602. ... William H. Joyner, Jr. ...
A method of synthesizing scan designs that are testable with pseudorandom patterns is presented. The logic is first simplified by various transformations in a logic synthesis system. A fault simulator is then used to guide the placement of control points and observation points. In order to reduce the overhead, control points are shared when possible and a condensation network is used with the observation points. Experimental results which indicate that pseudorandom testability can be achieved with small area overheads using simple techniques are presented.