Derrick Kourie - Academia.edu
Papers by Derrick Kourie
Fundamental Approaches to Software Engineering
Correctness-by-Construction (CbC) is an approach to incrementally creating formally correct programs guided by pre- and postcondition specifications. A program is created using refinement rules that guarantee the resulting implementation is correct with respect to the specification. Although CbC is supposed to lead to code with a low defect rate, it is not prevalent, largely because appropriate tool support is missing. To promote CbC, we provide tool support for CbC-based program development. We present CorC, a graphical and textual IDE for creating programs in a simple while-language following the CbC approach. Starting from a specification, our open-source tool supports CbC developers in refining a program through a sequence of refinement steps and in verifying the correctness of these refinement steps using the theorem prover KeY. We evaluated the tool on a set of standard CbC examples, in the course of which we revealed errors in the provided specifications. The evaluation shows that our tool reduces verification time in comparison to post-hoc verification.
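The contract-first development style that CbC prescribes can be illustrated informally. The following is a minimal sketch in Python, not CorC's while-language or KeY syntax; the linear-search example and all names are illustrative only, with runtime assertions standing in for the formal conditions that CorC would discharge with a prover.

```python
# Illustrative only: developing a program against a pre-/postcondition pair
# in the spirit of Correctness-by-Construction. The assertions play the role
# of the formal specifications that CorC verifies with KeY.

def linear_search(xs: list[int], key: int) -> int:
    """Return an index i with xs[i] == key, or -1 if key is absent."""
    i = 0
    # Loop invariant maintained by each refinement step: key is not in xs[0:i].
    while i < len(xs) and xs[i] != key:
        i += 1
    result = i if i < len(xs) else -1
    # Postcondition: either key is absent and result == -1,
    # or result points at an occurrence of key.
    assert (result == -1 and key not in xs) or xs[result] == key
    return result

print(linear_search([3, 1, 4, 1, 5], 4))  # -> 2
print(linear_search([3, 1, 4], 9))        # -> -1
```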
South African Computer Journal
Failure deterministic finite automata (FDFAs) represent regular languages more compactly than deterministic finite automata (DFAs). Four algorithms that convert arbitrary DFAs to language-equivalent FDFAs are empirically investigated. Three are concrete variants of a previously published abstract algorithm, the DFA-Homomorphic Algorithm (DHA). The fourth builds a maximal spanning tree from the DFA to derive what it calls a delayed-input DFA. A first suite of test data consists of DFAs that recognise randomised sets of finite-length keywords. Since the classical Aho-Corasick algorithm builds an optimal FDFA from such a set (and only from such a set), it provides benchmark FDFAs against which the performance of the general algorithms can be compared. A second suite of test data consists of random DFAs generated by a specially designed algorithm that also builds language-equivalent FDFAs, some of which may have non-divergent cycles. These random FDFAs provide (not necessarily tight) lo...
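The compactness of an FDFA comes from replacing many per-symbol transitions with a single failure transition per state. A minimal Python sketch (the data layout and names are our own, not from the paper): a state first tries its sparse symbol transitions and otherwise follows failure links, as in Aho-Corasick matching of the keyword set {"he", "she"}.

```python
# Hand-built FDFA for the keywords {"he", "she"}, Aho-Corasick style.
# Symbol transitions are stored sparsely; a failure transition is consulted
# whenever a state lacks a transition on the current symbol.

delta = {                       # sparse symbol transitions: state -> {symbol: state}
    0: {"h": 1, "s": 3},
    1: {"e": 2},
    3: {"h": 4},
    4: {"e": 5},
}
fail = {1: 0, 2: 0, 3: 0, 4: 1, 5: 2}   # failure transitions (start state 0 has none)
accepting = {2, 5}                       # "he" ends at state 2, "she" at state 5

def step(state: int, symbol: str) -> int:
    """Deliver one symbol, following failure links until a transition exists."""
    while symbol not in delta.get(state, {}) and state != 0:
        state = fail[state]              # no symbol transition: fail over
    return delta.get(state, {}).get(symbol, 0)  # start state absorbs unknown symbols

state = 0
for i, c in enumerate("ushers"):
    state = step(state, c)
    if state in accepting:               # (output sets via failure links omitted)
        print(f"match ending at position {i}")   # -> position 3 ("she")
```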
Proceedings of the 2007 Summer Computer Simulation Conference, 2007
A legacy non-distributed logical time simulator was previously migrated to a distributed architecture to parallelise execution. The existing Discrete Time System Specification (DTSS) modelling formalism was retained to simplify the reuse of existing models. This decision, however, means that the high simulation frame rate of 100 Hz used in the legacy system has to be retained in the distributed one, a known difficulty for existing distribution technologies due to interprocess communication latency. The specialised discrete time distributed peer-to-peer message passing architecture that resulted to support the parallelised simulator requirements is analysed, and the questions surrounding its performance and flexibility are answered. The architecture is shown to be a suitable and cost-effective distributed simulator architecture for supporting a four to five times parallelised implementation of a 100 Hz logical time DTSS modelling formalism. From the analysis results it is nevertheless clear that the discrete time architecture poses a significant technical challenge in supporting large-scale distributed parallel simulations, mainly due to sequential communication components within the discrete time architecture and system specification that cannot be parallelised. A hybrid DTSS/Discrete Event System Specification (DEVS) modelling formalism and simulator is proposed to lower the communication and synchronisation overhead between models and to improve the scalability of the discrete time simulator while still economically reusing the existing models. The proposed hybrid architecture is discussed, and ideas on implementing and then analysing the new architecture to complete the author's master's dissertation are touched upon. (This paper reports progress on a master's dissertation following on the original article [1] of the same name.)
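The cost pressure described above stems from the fixed-step synchronisation pattern of a DTSS simulator: every model must step and exchange state once per logical frame, whether or not anything of interest happened. A minimal single-process sketch of that loop (the 100 Hz frame constant mirrors the paper's setting; the model class and all names are illustrative, and the comment marks where the interprocess barrier would sit in the distributed version):

```python
# Minimal logical-time DTSS loop: every model steps once per fixed frame.
# In a distributed deployment each frame additionally requires a barrier plus
# message exchange, which is the overhead the paper analyses.

FRAME_HZ = 100
DT = 1.0 / FRAME_HZ                 # 10 ms logical frame

class Integrator:
    """Toy DTSS model: integrates its input once per frame."""
    def __init__(self) -> None:
        self.state = 0.0
    def step(self, u: float, dt: float) -> float:
        self.state += u * dt
        return self.state

models = [Integrator() for _ in range(4)]
t = 0.0
while t < 1.0:                      # simulate one logical second
    outputs = [m.step(1.0, DT) for m in models]
    # <-- in the distributed version, a barrier + message exchange sits here,
    #     once per 10 ms frame, regardless of model activity
    t += DT

print(models[0].state)              # ~1.0 after one simulated second
```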
SA Computer Journal, 2006
Editorial introduction to the special ARIMA / SACJ joint issue on advances in end-user data mining techniques. South African Computer Journal, Issue 36, 2006, pp. 1-3. Authors: Sadok Ben Yahia, Derrick Kourie, Engelbert Mephu Nguifo.
ISSA, 2005
This paper addresses some of the discriminants that make IPSec tunnel fingerprinting possible. Fingerprinting of VPN-tunnel endpoints may be desirable for forensic purposes, but in the hands of individuals of ill intent it undermines an enterprise network's perimeter security. Three ways of preventing the ill-use of this type of fingerprinting are presented. The first two apply to enterprises wishing to make their VPN tunnels immune to fingerprinting. The third delves deeper into the conceptual and is directed at the standards-definition process as used by the Internet Engineering Task Force (IETF), and at authors of security-related RFCs in particular. It addresses aspects of the Internet Key Exchange version 1 (IKEv1) RFC that have led to misinterpretations on the part of IPSec implementers, and describes the use of a form of process algebra known as Communicating Sequential Processes (CSP) in defining security-related standards to overcome RFC-related ambiguities.
As long as software has been produced, there have been efforts to strive for quality in software products. In order to understand quality in software products, researchers have built models of software quality that rely on metrics in an attempt to provide a quantitative view of software quality. The aim of these models is to give software producers the capability to define and evaluate metrics related to quality, and to use these metrics to improve the quality of the software they produce over time. The main disadvantage of these models is that they require effort and resources to define and evaluate metrics from software projects. This article briefly describes some prominent models of software quality in the literature and then describes a new approach to gaining insight into quality in software development projects. A case study based on this new approach is described, and results from the case study are discussed.
Proceedings of the 2006 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, Oct 9, 2006
A Service Oriented Architecture for Wireless Sensor and Actor Network Applications. Jaco M. Prinsloo, Christian L. Schulz, Derrick G. Kourie, W. H. Morkel Theunissen, Tinus Strauss, Roelf van ...
Stringology, 2006
We present a data structure to index a specific kind of factor (that is, substring) called a gapped-factor. A gapped-factor is a factor containing a gap that is ignored during indexation. The data structure presented is based on the suffix tree and indexes all the gapped-factors of a text with a fixed gap size, and only those. The construction of this data structure is done online in O(n × |Σ|) time and space, where n is the length of the text and |Σ| the size of the alphabet. Such a data structure may play an important role in some pattern matching and motif inference problems, for instance in text filtration.
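The objects this structure indexes can be made concrete with a brute-force enumeration. The sketch below is not the paper's suffix-tree construction (which is online and runs in O(n × |Σ|)); the function name, the fixed left/right factor lengths, and the dictionary layout are our own simplifications, chosen only to show what a gapped-factor is.

```python
# Naive enumeration of gapped-factors: occurrences of u·gap·v in the text,
# where the gap of fixed size `gap` is ignored (masked) during indexation.
# The paper's suffix-tree-based index stores exactly these factors; this
# brute-force version merely illustrates the indexed objects.

from collections import defaultdict

def gapped_factors(text: str, left: int, gap: int, right: int) -> dict:
    """Map each gapped-factor (u, v) to the text positions where it occurs."""
    index: dict = defaultdict(list)
    window = left + gap + right
    for i in range(len(text) - window + 1):
        u = text[i:i + left]                     # part before the gap
        v = text[i + left + gap:i + window]      # part after the gap
        index[(u, v)].append(i)                  # gap characters are ignored
    return index

idx = gapped_factors("abracadabra", left=2, gap=1, right=2)
print(idx[("ab", "ac")])   # -> [0]: "ab?ac" matches "abrac" at position 0
```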
Proceedings of the 2005 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, Jul 20, 2005
Refactoring a design before updating and modifying software has become accepted practice as a way of preparing the design for the upcoming changes. This paper describes a refactoring of the design of a particular application to illustrate a suggested approach. In this approach, we represent the design using UML: the structure with class diagrams and the behaviour of each class with statecharts. Examining metrics of the specific design, we suggest a refactoring that changes a centralised control structure into one that employs more delegation, and show the refactored class and statechart diagrams. As preserving behaviour is one of the defining attributes of a refactoring, we use a CSP-based formalism to describe the refactoring and show that it indeed preserves behaviour.
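The centralised-control-to-delegation move can be sketched in code. This is a generic illustration, not the paper's application or its CSP formalisation; all class and method names are invented for the example, and a simple assertion stands in for the behaviour-preservation proof.

```python
# Before: centralised control -- the controller inspects each device's state
# and owns the decision logic.

class DeviceBefore:
    def __init__(self, on: bool) -> None:
        self.on = on

class ControllerBefore:
    def shutdown(self, devices: list) -> int:
        stopped = 0
        for d in devices:
            if d.on:                  # controller decides on the device's behalf
                d.on = False
                stopped += 1
        return stopped

# After: delegation -- each device decides how to stop itself; the controller
# merely coordinates. Externally visible behaviour (the count) is unchanged.

class DeviceAfter:
    def __init__(self, on: bool) -> None:
        self.on = on
    def stop(self) -> bool:
        if self.on:
            self.on = False
            return True
        return False

class ControllerAfter:
    def shutdown(self, devices: list) -> int:
        return sum(d.stop() for d in devices)

before = ControllerBefore().shutdown([DeviceBefore(True), DeviceBefore(False)])
after = ControllerAfter().shutdown([DeviceAfter(True), DeviceAfter(False)])
assert before == after == 1           # same observable behaviour after refactoring
```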
Proceedings of the 2005 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, Jul 20, 2005
Corporate software developers are faced with many difficulties. Development windows are decreasing; scale and complexity are increasing; business requirements are vague and changing; and the underlying technology moves ever on. Agile methods have emerged as leading contenders to tame these challenges. Small teams, face-to-face communication, an emphasis on simplicity and a selection of development best practices contribute to software development that is relevant, yet fast and flexible. At the same time, Open Source Software is increasingly providing infrastructure, tools and components to companies. Progressive development teams are beginning to work in more open, collaborative and distributed ways. In some respects these practices are similar to agile practices, but in important ways they are very different. Yet both are important and offer unique benefits. This paper discusses the prospects of combining the two in the context of corporate software development, and the approach we suggest for doing so.
Proceedings Elmar 2010, Sep 1, 2010
To make provision for irregularly occurring updates, a "variable-step" polynomial filter is derived which improves the smoothing and coasting capabilities of the 1-step predictor and the current-estimate polynomial filters. Like the original filters, the variable-step polynomial filter comprises an auto-initialising expanding-memory filter which later switches to a fading-memory filter. Results are compared by running the proposed and original filter versions in parallel.
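To give a feel for the filter family being extended, here is a simplified fading-memory (critically damped g-h, alpha-beta) filter that tolerates irregular update intervals by using the actual elapsed dt when coasting to each measurement. This sketch omits the auto-initialising expanding-memory phase, the switch-over logic, and the paper's variable-step gain derivation; the class name, the gain pair g = 1 - θ², h = (1 - θ)², and the sample data are our own assumptions.

```python
# Simplified fading-memory (critically damped g-h) filter with irregular
# update intervals: the prediction coasts over the actual elapsed dt.
# Not the paper's derivation -- an illustration of the filter family only.

class FadingMemoryFilter:
    def __init__(self, theta: float, x0: float, v0: float = 0.0) -> None:
        self.x, self.v = x0, v0        # position and velocity estimates
        self.g = 1.0 - theta ** 2      # position gain (critically damped pair)
        self.h = (1.0 - theta) ** 2    # velocity gain; 0 < theta < 1 discounts old data

    def update(self, z: float, dt: float) -> float:
        x_pred = self.x + self.v * dt  # coast to the measurement time
        r = z - x_pred                 # innovation (residual)
        self.x = x_pred + self.g * r
        self.v = self.v + (self.h / dt) * r
        return self.x

# Irregularly spaced noisy samples of a target moving as x(t) = 2t:
f = FadingMemoryFilter(theta=0.7, x0=0.0)
t_prev = 0.0
for t, z in [(0.1, 0.25), (0.3, 0.55), (0.35, 0.74), (0.8, 1.62)]:
    est = f.update(z, dt=t - t_prev)
    t_prev = t
    print(f"t={t:.2f}  estimate={est:.3f}")
```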
Proceedings of the Southern African Institute for Computer Scientists and Information Technologists Annual Conference 2014, Sep 29, 2014
Proceedings of the 2007 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries - SAICSIT '07, 2007
A decision-supporting framework was applied in a pilot study to assist in deciding which software development methodology to use at a software engineering company. This paper critically assesses the decision-making process and the framework that were used at that company to decide on an appropriate software methodology for the analysis and design of business processes and software systems.
Telecommunication technology is increasingly converging towards software-intensive solutions. As the role of software increases, it naturally follows that traditional software development procedures need to be reassessed. In South Africa, this need is especially urgent because of recent legislation that presages an increase in the number of competitors. This paper assesses two important contemporary software development approaches, namely agile and open source, and their relationship to one another. A variant of open source software development that is highlighted is Hewlett-Packard's so-called 'Progressive Open Source', because it is specifically tailored to corporate use. The benefits and disadvantages of adopting the various development methods are assessed. Guidelines based on project characteristics are provided to indicate when each of the respective approaches should be considered.
Open Source Software has become a noticeable factor in the Information and Communication Technology arena. This paper investigates this claim and its relevance to the telecommunication industry. A number of misconceptions surrounding open source are explored before introducing the progression from exclusively using open source products to participating in their development. Taking the next step with open source adoption is investigated with the aim of reaping the associated benefits of this development approach. We provide some examples of how these benefits might be gained through scenarios related to local telecommunication software development activities.