Serge Demeyer | University of Antwerp

Papers by Serge Demeyer

Research paper thumbnail of Automatic deployment space exploration using refinement transformations

To manage the complex engineering information for real-time systems, the system under development may be modelled in a high-level architecture description language. This high-level information provides a basis for deployment space exploration as it can be used to generate a low-level implementation. During this deployment mapping, many platform-dependent choices have to be made whose consequences cannot be easily predicted. In this paper we present an approach to the automatic exploration of the deployment space based on platform-based design. All possible solutions of a deployment step are generated using a refinement transformation. Non-conforming deployment alternatives are pruned as early as possible using simulation or analytical methods. We validate the feasibility of our approach by deploying part of an automotive power window optimized for its real-time behaviour using an AUTOSAR-like representation. First results are promising and show that the optimal solution can indeed be found efficiently with our approach.
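
To make the generate-and-prune idea concrete, here is a minimal Python sketch under invented numbers: components carry a WCET, ECUs a speed factor, and partial deployments are pruned with a deliberately simplistic analytical bound (the paper itself also uses simulation for pruning).

```python
COMPONENTS = {"sensor": 2.0, "controller": 5.0, "actuator": 1.5}  # WCET in ms (invented)
ECUS = {"ecu1": 1.0, "ecu2": 0.5}                                 # relative speed (invented)
DEADLINE_MS = 12.0

def load(mapping):
    """Analytical bound: worst per-ECU load, WCET scaled by ECU speed."""
    per_ecu = {}
    for comp, ecu in mapping.items():
        per_ecu[ecu] = per_ecu.get(ecu, 0.0) + COMPONENTS[comp] / ECUS[ecu]
    return max(per_ecu.values())

def explore(remaining, mapping, solutions):
    if mapping and load(mapping) > DEADLINE_MS:
        return                        # prune non-conforming alternatives early
    if not remaining:
        solutions.append(dict(mapping))
        return
    comp = remaining[0]
    for ecu in ECUS:                  # refinement step: bind one more component
        mapping[comp] = ecu
        explore(remaining[1:], mapping, solutions)
        del mapping[comp]

solutions = []
explore(list(COMPONENTS), {}, solutions)
best = min(solutions, key=load)
print(f"{len(solutions)} conforming deployments; best: {best} ({load(best):.1f} ms)")
```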

Research paper thumbnail of DEVS for AUTOSAR platform modelling

AUTOSAR (AUTomotive Open System ARchitecture) is an open and standardized automotive software architecture, developed by automobile manufacturers, suppliers and tool developers. Its design is a direct consequence of the increasingly important role software plays in vehicles. As design choices during the software deployment phase may have a large impact on the real-time properties of the system, designers need a method to explore various trade-offs. In this paper we evaluate the appropriateness of DEVS, the Discrete-Event System Specification, for modelling and subsequent performance evaluation of AUTOSAR-based systems. We demonstrate and validate our work by means of a power window and ABS case study.
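
For readers unfamiliar with DEVS, the toy below illustrates the atomic-model interface (time advance, output, internal transition) with a hypothetical periodic task generator; it is nowhere near a full classic-DEVS simulator.

```python
class PeriodicTaskGenerator:
    """Atomic DEVS-style model emitting a task activation every period (hypothetical)."""
    def __init__(self, period):
        self.period = period
        self.count = 0
    def time_advance(self):          # ta(s): time until the next internal event
        return self.period
    def output(self):                # lambda(s): output emitted just before the transition
        return f"activate task #{self.count}"
    def internal_transition(self):   # delta_int(s): new state after the event
        self.count += 1

def simulate(model, until):
    """Drive a single atomic model without external events."""
    t = 0.0
    while t + model.time_advance() <= until:
        t += model.time_advance()
        print(f"t={t:4.1f}: {model.output()}")
        model.internal_transition()

simulate(PeriodicTaskGenerator(period=2.5), until=10.0)
```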

Research paper thumbnail of Calibration of deployment simulation models: a multi-paradigm modelling approach

In embedded systems development, software engineers increasingly rely on modelling and simulation to produce optimal design solutions. A bottleneck in the Modelling and Simulation Based Design (MSBD) process is model calibration. Setting up experiments to estimate parameter values such that the model accurately reflects real-world system structure and behaviour is technically complex and labour intensive. Parameters to be estimated include, for example, the effective processor speed, memory consumption and network throughput of the hardware platform on which the software is deployed. In this paper we show how Multi-Paradigm Modelling (MPM) allows for the synthesis of a model calibration infrastructure. This includes the synthesis, from a model, of a simulator for the "environment" in which a system to be built will operate. To demonstrate the feasibility of our approach, we calibrate the model of an automotive power window controller running on the AUTOSAR platform.
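
As a minimal illustration of one calibration step, the sketch below fits a single parameter (effective processor speed) to measured timings by least squares; the platform model and all numbers are invented and far simpler than the paper's setup.

```python
# Simplest possible platform model: execution time = instruction count / effective speed.
instructions = [1.2e6, 3.4e6, 0.8e6, 2.1e6]   # workload per test run (invented)
measured_ms  = [14.8, 41.0, 10.1, 25.9]        # timings from the real platform (invented)

# Fit rate r = 1/speed so that t_i ~ r * n_i; least squares gives
# r = sum(n_i * t_i) / sum(n_i^2), a closed form.
rate = sum(n * t for n, t in zip(instructions, measured_ms)) / \
       sum(n * n for n in instructions)
speed = 1.0 / rate                             # instructions per millisecond

print(f"calibrated effective speed: {speed:,.0f} instructions/ms")
for n, t in zip(instructions, measured_ms):
    print(f"  predicted {n * rate:5.1f} ms vs measured {t:5.1f} ms")
```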

Research paper thumbnail of Migrating from a Proprietary RTOS to the OSEK Standard Using a Wrapper

Lecture Notes in Electrical Engineering, 2011

The drive towards standardization in the automotive sector puts a lot of pressure on software suppliers to comply with standards such as OSEK and AUTOSAR. However, many of these suppliers have a vested interest in proprietary software and are seeking ways to migrate their existing code base to comply with these standards. This paper reports on a feasibility study to migrate an automotive off-highway application to the OSEK-OS using a wrapper around the proprietary real-time operating system (RTOS). Besides investigating whether this is feasible, we also assess the performance impact in terms of computation time and memory consumption, as this is critically important for real-time systems. Finally, we point out some pitfalls encountered when porting an application to an OSEK-compliant RTOS. As such, we evaluate the typical trade-offs one has to make when adopting an incremental migration strategy towards a standard-compliant interface.

Research paper thumbnail of Wrapping a real-time operating system with an OSEK compliant interface—a feasibility study

The drive towards standardization in the automotive sector puts a lot of pressure on software suppliers to comply with standards such as OSEK and AUTOSAR. However, many of these suppliers have a vested interest in proprietary software and are seeking ways to migrate their existing code base to comply with these standards. This paper reports on a feasibility study to wrap a proprietary real-time operating system with an OSEK compliant interface. Besides investigating whether this is feasible, we also assess the performance impact in terms of computation time and memory consumption, as this is critically important for real-time systems. As such, we evaluate the typical trade-offs one has to make when adopting an incremental migration strategy towards a standard-compliant interface.
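
The wrapper idea itself is language-agnostic; the Python sketch below illustrates it by mapping OSEK-style services (ActivateTask, TerminateTask) onto a purely hypothetical proprietary API. A real wrapper would be written in C against the vendor's headers.

```python
E_OK, E_OS_ID = 0, 3  # OSEK status codes

class ProprietaryRTOS:
    """Stand-in for the vendor RTOS; this API is entirely hypothetical."""
    def __init__(self):
        self.tasks = {}
    def create_thread(self, name):
        self.tasks[name] = "suspended"
    def start_thread(self, name):
        self.tasks[name] = "ready"
    def kill_thread(self, name):
        self.tasks[name] = "suspended"

class OsekWrapper:
    """Maps OSEK OS services onto the proprietary primitives."""
    def __init__(self, rtos):
        self.rtos = rtos
    def ActivateTask(self, task_id):
        if task_id not in self.rtos.tasks:
            return E_OS_ID                 # OSEK error: invalid task identifier
        self.rtos.start_thread(task_id)    # the indirection is the wrapper's runtime cost
        return E_OK
    def TerminateTask(self, task_id):
        self.rtos.kill_thread(task_id)
        return E_OK

rtos = ProprietaryRTOS()
rtos.create_thread("WindowLift")
osek = OsekWrapper(rtos)
assert osek.ActivateTask("WindowLift") == E_OK
```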

Research paper thumbnail of Change Impact Analysis for UML Model Maintenance

Research paper thumbnail of Studying Versioning Information to Understand Inheritance Hierarchy Changes

Fourth International Workshop on Mining Software Repositories (MSR'07:ICSE Workshops 2007), 2007

With the widespread adoption of object-oriented programming, changing the inheritance hierarchy became an inherent part of today's software maintenance activities. Unfortunately, little is known about the "state-of-the-practice" with respect to changing an application's inheritance hierarchy, and consequently we do not know how the change process can be improved. In this paper, we report on a study of the hierarchy changes stored in a versioning system to explore the answers to three research questions.

Research paper thumbnail of International workshop on software quality and maintainability

2014 Software Evolution Week - IEEE Conference on Software Maintenance, Reengineering, and Reverse Engineering (CSMR-WCRE), 2014

SQM 2014 (http://sqm2014.sig.eu), the 8th International Workshop on Software Quality and Maintainability, was organized as a satellite event of the CSMR-WCRE 2014 conference in Antwerp, on February 3, 2014. The workshop received 18 submissions focusing on research, empirical studies, industry practices and experiences in the area of software quality, maintainability and traceability. The special theme of this year's workshop was "exploring the boundaries between the theory and practice of software quality".

Research paper thumbnail of An Initial Investigation of a Multi-layered Approach for Optimizing and Parallelizing Real-Time Media and Audio Applications

2013 Eighth International Conference on P2P, Parallel, Grid, Cloud and Internet Computing, 2013

Research paper thumbnail of Characterizing the Relative Significance of a Test Smell

2006 22nd IEEE International Conference on Software Maintenance, 2006

Test code, just like any other code we write, erodes when frequently changed. As such, refactoring, which has been shown to impact maintainability and comprehensibility, can be part of a solution to counter this erosion. We propose a metric-based heuristic approach that ranks occurrences of so-called test smells (i.e. symptoms of poorly designed tests) according to their relative significance. This ranking can subsequently be used to decide where to start refactoring. Through an open-source case study, ArgoUML, we demonstrate that we are able to identify those test cases that violate unit-test criteria.
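
A minimal sketch of metric-based ranking, assuming each detected smell instance carries a few size metrics; the metrics, weights and smell instances are illustrative, not the paper's actual metric suite.

```python
detected_smells = [
    {"test": "testParseAll", "smell": "Eager Test",      "asserts": 14, "loc": 120},
    {"test": "testInit",     "smell": "General Fixture", "asserts": 2,  "loc": 300},
    {"test": "testRender",   "smell": "Eager Test",      "asserts": 6,  "loc": 45},
]

def normalise(values):
    """Rescale a metric to [0, 1] so metrics on different scales can be combined."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

a_scores = normalise([s["asserts"] for s in detected_smells])
l_scores = normalise([s["loc"] for s in detected_smells])

# Relative significance: a weighted sum of the normalised metrics (weights invented).
for smell, a, l in zip(detected_smells, a_scores, l_scores):
    smell["score"] = 0.6 * a + 0.4 * l

for s in sorted(detected_smells, key=lambda s: s["score"], reverse=True):
    print(f'{s["score"]:.2f}  {s["smell"]:<16} {s["test"]}')
```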

Research paper thumbnail of Supporting inconsistency resolution through predictive change impact analysis

Proceedings of the 6th International Workshop on Model-Driven Engineering, Verification and Validation - MoDeVVa '09, 2009

Today, model-driven software processes rely on consistency management approaches to deal with the multitude of inconsistencies that occur in large systems. To resolve a detected inconsistency, the software designer applies one resolution out of a set of resolution options to each inconsistency. To do so, the designer needs to understand the extent of the changes posed by each resolution. In this paper we propose change impact analysis to support the designer in this task. We present a simple algorithm for predicting the impact of inconsistency resolutions by checking the instantiation of different meta-model relationships. Based on one small case study, we demonstrate that our algorithm provides a reasonable estimate of the number of changes that will actually be applied. We demonstrate the usage of impact analysis for inconsistency resolution and take a first step towards a decision support tool to help a software designer resolve inconsistencies.
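
The following toy sketch conveys the flavour of such a prediction: count the model elements reachable from the changed element over impact-propagating meta-model relationships. The model and relationship names are invented.

```python
# Model element -> list of (relationship, target element); a toy UML-like model.
model = {
    "ClassA": [("ownsAttribute", "attr1"), ("ownsOperation", "op1")],
    "op1":    [("hasParameter", "p1"), ("hasParameter", "p2")],
    "attr1":  [],
    "p1": [], "p2": [],
}

IMPACT_RELATIONS = {"ownsAttribute", "ownsOperation", "hasParameter"}

def predicted_impact(element):
    """Count elements reachable over impact-propagating relationship instances."""
    seen, stack = set(), [element]
    while stack:
        current = stack.pop()
        for rel, target in model.get(current, []):
            if rel in IMPACT_RELATIONS and target not in seen:
                seen.add(target)
                stack.append(target)
    return len(seen)

# Estimated number of changes if a resolution deletes ClassA:
print(predicted_impact("ClassA"))   # -> 4
```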

Research paper thumbnail of Considering Polymorphism in Change-Based Test Suite Reduction

Lecture Notes in Business Information Processing, 2014

With the increasing popularity of continuous integration, algorithms for selecting the minimal test suite to cover a given set of changes are in demand. This paper reports on how polymorphism can handle false negatives in a previous algorithm which uses method-level changes in the base code to deduce which tests need to be rerun. We compare the approach with and without polymorphism on two distinct cases (PMD and CruiseControl) and discover an interesting trade-off: incorporating polymorphism results in more relevant tests being included in the test suite (improving accuracy), but comes at the cost of a larger test suite (increasing the time to run the minimal test suite).
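
A minimal sketch of the selection step under toy data: without polymorphism, only tests calling the changed method are selected; with it, tests calling the overridden super method are selected as well.

```python
# Test -> set of observed callees (toy data, not the paper's recorded changes).
calls = {
    "testReport": {"Renderer.render"},
    "testHtml":   {"HtmlRenderer.render"},
}
# Subclass method -> the method it overrides.
overrides = {"HtmlRenderer.render": "Renderer.render"}

def select(changed, polymorphism=True):
    targets = set(changed)
    if polymorphism:
        # A changed override may be exercised through calls to the overridden
        # (super) method, so add those as selection targets too.
        targets |= {overrides[m] for m in changed if m in overrides}
    return {test for test, callees in calls.items() if callees & targets}

changed = {"HtmlRenderer.render"}
print(select(changed, polymorphism=False))  # only testHtml
print(select(changed, polymorphism=True))   # testHtml and testReport: larger, more accurate
```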

Research paper thumbnail of The Eclipse and Mozilla defect tracking dataset: A genuine dataset for mining bug information

2013 10th Working Conference on Mining Software Repositories (MSR), 2013

The analysis of bug reports is an important subfield within the mining software repositories community. It explores the rich data available in defect tracking systems to uncover interesting and actionable information about the bug triaging process. While bug data is readily accessible from systems like Bugzilla and JIRA, a common database schema and a curated dataset could significantly enhance future research because they allow for easier replication. Consequently, in this paper we propose the Eclipse and Mozilla Defect Tracking Dataset, a representative database of bug data, filtered to contain only genuine defects (i.e., no feature requests) and designed to cover the whole bug-triage life cycle (i.e., store all intermediate actions). We have used this dataset ourselves for predicting bug severity, for studying bug-fixing time and for identifying erroneously assigned components. Sharing these data with the rest of the community will allow for reproducibility, validation and comparison of the results obtained in bug-report analyses and experiments.
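
The sketch below shows the kind of schema such a dataset implies: one table for reports, one for every intermediate field change. The column names are our guess, not the dataset's published schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE report (
    bug_id     INTEGER PRIMARY KEY,
    product    TEXT,
    component  TEXT,
    severity   TEXT,
    opened_at  TEXT
);
CREATE TABLE field_change (        -- every intermediate triage action stays queryable
    bug_id     INTEGER REFERENCES report(bug_id),
    field      TEXT,               -- e.g. 'assignee', 'severity', 'status'
    old_value  TEXT,
    new_value  TEXT,
    changed_at TEXT
);
""")
conn.execute("INSERT INTO report VALUES (1, 'Eclipse', 'UI', 'major', '2010-01-04')")
conn.execute("INSERT INTO field_change VALUES (1, 'severity', 'major', 'critical', '2010-01-06')")
print(conn.execute("SELECT field, new_value FROM field_change WHERE bug_id = 1").fetchall())
```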

Research paper thumbnail of Predicting the severity of a reported bug

2010 7th IEEE Working Conference on Mining Software Repositories (MSR 2010), 2010

The severity of a reported bug is a critical factor in deciding how soon it needs to be fixed. Unfortunately, while clear guidelines exist on how to assign the severity of a bug, it remains an inherently manual process left to the person reporting the bug. In this paper we investigate whether we can accurately predict the severity of a reported bug by analyzing its textual description using text mining algorithms. Based on three cases drawn from the open-source community (Mozilla, Eclipse and GNOME), we conclude that, given a training set of sufficient size (approximately 500 reports per severity), it is possible to predict the severity with reasonable accuracy (both precision and recall vary between 0.65 and 0.75 for Mozilla and Eclipse, and between 0.70 and 0.85 for GNOME).
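
A minimal sketch of the text-mining setup, using a bag-of-words Naive Bayes classifier on toy one-line descriptions; the paper's experiments use far larger training sets and possibly different algorithms.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training data; real experiments need hundreds of reports per severity class.
summaries = [
    "application crashes on startup with null pointer",
    "segfault when opening large project",
    "typo in preferences dialog label",
    "misaligned icon in toolbar",
    "data loss after saving workspace",
]
severity = ["critical", "critical", "trivial", "trivial", "critical"]

# Bag-of-words features feeding a multinomial Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(summaries, severity)

print(model.predict(["crash while saving, workspace corrupted"]))  # likely 'critical'
```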

Research paper thumbnail of Filtering Bug Reports for Fix-Time Analysis

2012 16th European Conference on Software Maintenance and Reengineering, 2012

Several studies have experimented with data mining algorithms to predict the fix-time of reported bugs. Unfortunately, the fix-times as reported in typical open-source cases are heavily skewed, with a significant number of reports registering fix-times of less than a few minutes. Consequently, we propose to include an additional filtering step that improves the quality of the underlying data in order to obtain better results. Using a small-scale replication of a previously published bug fix-time prediction experiment, we show that the additional filtering of reported bugs indeed improves the results.
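
A minimal sketch of the proposed filtering step: discard reports whose recorded fix-time is implausibly small before any prediction is attempted. The threshold and records are illustrative.

```python
from datetime import datetime, timedelta

reports = [
    {"bug_id": 101, "opened": "2012-01-02 09:00", "fixed": "2012-01-02 09:01"},
    {"bug_id": 102, "opened": "2012-01-03 10:00", "fixed": "2012-01-20 16:30"},
    {"bug_id": 103, "opened": "2012-01-05 08:00", "fixed": "2012-01-05 08:00"},
]

FMT = "%Y-%m-%d %H:%M"
MIN_FIX_TIME = timedelta(minutes=5)   # below this, the timestamp is suspect (invented cut-off)

def fix_time(r):
    return datetime.strptime(r["fixed"], FMT) - datetime.strptime(r["opened"], FMT)

# Keep only reports with a plausible fix-time before training any predictor.
filtered = [r for r in reports if fix_time(r) >= MIN_FIX_TIME]
print([r["bug_id"] for r in filtered])   # -> [102]
```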

Research paper thumbnail of Detecting move operations in versioning information

Conference on Software Maintenance and Reengineering (CSMR'06), 2006

Recently, there has been increasing research interest in mining versioning information, i.e. the analysis of the transactions made on version systems to understand how and when a software system evolves. One particular area of interest is the identification of move operations, as these are key indicators for refactorings. Unfortunately, there exists no evaluation which identifies the quality (expressed in precision …)
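
A toy sketch of the underlying detection idea: within a single commit, an entity removed in one location and added in another with the same signature is a candidate move. The commit data is invented.

```python
# One version-control transaction, as (class, method signature) pairs.
commit = {
    "removed": [("Order", "computeTotal()"), ("Order", "log()")],
    "added":   [("Invoice", "computeTotal()")],
}

# Index removals by signature, then match additions against them.
removed = {sig: cls for cls, sig in commit["removed"]}
moves = [(removed[sig], cls, sig)
         for cls, sig in commit["added"] if sig in removed]

for src, dst, sig in moves:
    print(f"candidate move: {sig} from {src} to {dst}")
# -> candidate move: computeTotal() from Order to Invoice
```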

Research paper thumbnail of Aspect-Oriented Programming

Lecture Notes in Computer Science, 1999

Aspect-oriented programming is a promising idea that can improve the quality of software by reducing the problem of code tangling and improving the separation of concerns. At ECOOP'97, the first AOP workshop brought together a number of researchers interested in aspect-orientation. At ECOOP'98, during the second AOP workshop, the participants reported on progress in some research topics and raised more issues that were further discussed. This year, the ideas and concepts of AOP have spread and been adopted more widely, and, accordingly, the workshop received many submissions covering areas from the design and application of aspects to the design and implementation of aspect languages.

Research paper thumbnail of Redocumentation of a legacy banking system

Proceedings of the Joint ERCIM Workshop on Software Evolution (EVOL) and International Workshop on Principles of Software Evolution (IWPSE) on - IWPSE-EVOL '10, 2010

Successful software systems need to be maintained. In order to do that, profound knowledge about their architecture and implementation details is required. This knowledge is often kept implicit (inside the heads of the experts) and sometimes made explicit in documentation. The problem is that systems often lack up-to-date documentation and that system experts are frequently unavailable (because they have moved to other jobs or retired). Redocumentation addresses that problem by recovering knowledge about the system and making it explicit in documentation. Automating the redocumentation process can limit the tedious and error-prone manual effort, but it is no silver bullet. In this paper, we report on our experience with applying redocumentation techniques in industry. We provide insights on what (not) to document, what (not) to automate and how to automate it. A concrete lesson learned during this study is that the "less is more" principle also applies to redocumentation.

Research paper thumbnail of A Layered Approach to Dedicated Application Builders Based on Application Frameworks

Research paper thumbnail of Why Unified Is not Universal

Lecture Notes in Computer Science, 1999

UML is currently embraced as "the" standard in object-oriented modeling languages, the recent work of the OMG on the Meta Object Facility (MOF) being the most noteworthy example. We welcome these standardisation efforts, yet warn against the tendency to use UML as the panacea for all exchange standards. In particular, we argue that UML is not sufficient to serve as a tool-interoperability standard for integrating round-trip engineering tools, because one is forced to rely on UML's built-in extension mechanisms to adequately model the reality of source code. Consequently, we propose an alternative meta-model (named FAMIX), which serves as the tool-interoperability standard within the FAMOOS project and which includes a number of constructive suggestions that we hope will influence future releases of the UML and MOF standards. Keywords: meta-model, Unified Modeling Language (UML), Meta Object Facility (MOF), interoperability standard, FAMOOS Information Exchange (FAMIX)
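
To hint at what FAMIX-style modelling looks like, the fragment below models source-code entities and a source-level invocation directly, instead of going through UML extension mechanisms; it is a drastically reduced, unofficial rendering of the idea.

```python
from dataclasses import dataclass, field

@dataclass
class FamixClass:
    name: str
    methods: list = field(default_factory=list)

@dataclass
class Method:
    name: str
    parent: FamixClass

@dataclass
class Invocation:            # a source-level fact UML has no first-class element for
    sender: Method
    receiver: Method

order = FamixClass("Order")
total = Method("computeTotal", order)
order.methods.append(total)

printer = FamixClass("InvoicePrinter")
emit = Method("print", printer)
printer.methods.append(emit)

calls = [Invocation(sender=emit, receiver=total)]
for inv in calls:
    print(f"{inv.sender.parent.name}.{inv.sender.name} -> "
          f"{inv.receiver.parent.name}.{inv.receiver.name}")
```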
