Loïc Correnson - Academia.edu
Papers by Loïc Correnson
This article deals with the usage of Frama-C to detect runtime errors. As static analysis for runtime-error detection is not a novelty, we present significant new usages in industrial contexts, which represent a change in the way this kind of tool is employed. The main goal is a scalable methodology for using static analysis throughout the development process and by a development team. This goal is achieved by performing analysis on partial pieces of code, by using the ACSL language for interface definitions, by choosing a bottom-up strategy to process the code, and by enabling a well-balanced definition of actors and skills. The methodology, designed during the research project U3CAT, has been applied in industrial contexts with good results, both in the quality of the verifications and in the performance of the industrial process.
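The bottom-up strategy mentioned above can be pictured as a topological traversal of the call graph: leaf functions are analyzed first against their ACSL interfaces, and callers are then analyzed assuming the contracts of their callees. The sketch below is not taken from the paper; it is a minimal OCaml illustration of such an ordering, with hypothetical function names.

```ocaml
(* Minimal sketch (not from the paper): order functions bottom-up in the
   call graph, so each function is analyzed after the callees whose ACSL
   contracts it relies on. Names and types are hypothetical. *)
module SMap = Map.Make (String)

(* call graph: function name -> names of functions it calls *)
let calls : string list SMap.t =
  SMap.of_seq (List.to_seq [
    ("main",    ["parse"; "compute"]);
    ("parse",   ["compute"]);
    ("compute", []);
  ])

(* post-order DFS: callees are emitted before their callers *)
let bottom_up_order graph =
  let visited = Hashtbl.create 16 in
  let order = ref [] in
  let rec visit f =
    if not (Hashtbl.mem visited f) then begin
      Hashtbl.add visited f ();
      List.iter visit (try SMap.find f graph with Not_found -> []);
      order := f :: !order
    end
  in
  SMap.iter (fun f _ -> visit f) graph;
  List.rev !order   (* e.g. ["compute"; "parse"; "main"] *)

let () =
  bottom_up_order calls |> List.iter print_endline
```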
The V&V practices of safety-critical industries (e.g. avionics) are currently based on either unit testing or unit proof to verify that a function satisfies its low-level requirements in order to be compliant with the highest certification levels [26] (e.g. DO-178C level A for avionic software). In this context, the verification engineer must assess sufficient coverage of both code (structural coverage) and specification (functional coverage). However, there is no shared method for test and proof to measure structural coverage. In practice, this prevents the verification engineer from combining test and automatic proof to verify low-level requirements of a common piece of code in order to mitigate the verification cost. This paper fills this gap between test and proof by introducing a new notion of verification coverage based on mutation coverage. It subsumes functional coverage and structural coverage for both unit testing and unit proof. Consequently, it allows the verification en...
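The core idea of mutation coverage can be pictured on a toy example. The OCaml sketch below is not the paper's tool: it generates operator mutants of a tiny expression, and counts how many are "killed" by a verification oracle (here, a handful of test inputs standing in for a test or a proved property).

```ocaml
(* Toy illustration (not the paper's technique as implemented): mutation
   coverage on a tiny expression language. A mutant is "killed" when the
   oracle distinguishes it from the original program. *)
type expr =
  | Var
  | Const of int
  | Add of expr * expr
  | Mul of expr * expr

let rec eval x = function
  | Var -> x
  | Const n -> n
  | Add (a, b) -> eval x a + eval x b
  | Mul (a, b) -> eval x a * eval x b

(* generate simple operator mutants: swap + and * at each position *)
let rec mutants = function
  | Var | Const _ -> []
  | Add (a, b) ->
      Mul (a, b)
      :: List.map (fun a' -> Add (a', b)) (mutants a)
      @  List.map (fun b' -> Add (a, b')) (mutants b)
  | Mul (a, b) ->
      Add (a, b)
      :: List.map (fun a' -> Mul (a', b)) (mutants a)
      @  List.map (fun b' -> Mul (a, b')) (mutants b)

(* oracle: the low-level requirement "p(x) = 3x + 1", checked on a few inputs *)
let killed p m =
  List.exists (fun x -> eval x m <> eval x p) [0; 1; 2; 5]

let () =
  let p = Add (Mul (Const 3, Var), Const 1) in      (* 3x + 1 *)
  let ms = mutants p in
  let k = List.length (List.filter (killed p) ms) in
  Printf.printf "verification coverage: %d/%d mutants killed\n"
    k (List.length ms)
```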
The deforestation of a functional program is a transformation that eliminates the construction of the intermediate data structures introduced by function composition. Descriptional composition, initially introduced by Ganzinger and Giegerich, is a specific deforestation method that applies to the composition of two attribute grammars. This article proposes a new deforestation technique, called symbolic composition, which extends and improves descriptional composition. By automatically translating a functional program into an equivalent attribute grammar, symbolic composition can be applied to it, and the result can be translated back into a functional program (for example, using Johnsson's transformation). This yields a source-to-source transformation of functional programs. The resulting deforestation method gives better results than functional meth...
Communications of the ACM, 2021
A panoramic view of a popular platform for C program analysis and verification.
Proceedings of the 40th International Conference on Software Engineering, 2018
Lecture Notes in Computer Science, 2012
Static analyzers usually return partial results. They can assert that some properties are valid during all possible executions of a program, but generally leave other properties to be verified by other means. In practice, it is common to combine results from several methods manually to achieve the full verification of a program. In this context, Frama-C is a platform for analyzing C source programs with multiple analyzers. Hence, one analyzer may conclude on properties assumed by another one, in the same environment. We present the semantic foundations of the validity of program properties in such a context, and propose a correct and complete algorithm for combining several partial results into a fully consolidated validity status for each program property. We illustrate how such a framework provides meaningful feedback on partial results.
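A rough picture of such a consolidation, which is not the actual Frama-C algorithm: each analyzer emits a local status for a property under some hypotheses, and the consolidated status is only as strong as the hypotheses it depends on. The three-valued lattice and the analyzer names below are simplifications.

```ocaml
(* Simplified sketch, not Frama-C's actual consolidation algorithm:
   combine the statuses emitted by several analyzers for one property.
   A local status only counts under the hypotheses the analyzer assumed. *)
type status = Valid | Invalid | Unknown

type emitted = {
  analyzer   : string;
  local      : status;       (* status under the analyzer's hypotheses *)
  hypotheses : status list;  (* consolidated statuses of those hypotheses *)
}

(* a local Valid only counts if every hypothesis is itself Valid *)
let effective e =
  match e.local with
  | Valid when List.for_all (( = ) Valid) e.hypotheses -> Valid
  | Invalid -> Invalid          (* simplification: take a reported failure as-is *)
  | _ -> Unknown

let consolidate emissions =
  let statuses = List.map effective emissions in
  if List.mem Invalid statuses && List.mem Valid statuses then
    failwith "inconsistent results"   (* analyzers contradict each other *)
  else if List.mem Invalid statuses then Invalid
  else if List.mem Valid statuses then Valid
  else Unknown

let () =
  let p =
    [ { analyzer = "value-analysis"; local = Unknown; hypotheses = [] };
      { analyzer = "wp";             local = Valid;   hypotheses = [Valid] } ]
  in
  match consolidate p with
  | Valid -> print_endline "property consolidated as valid"
  | Invalid -> print_endline "property consolidated as invalid"
  | Unknown -> print_endline "property remains unknown"
```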
Lecture Notes in Computer Science, 1999
Software engineering has to reconcile modularity with efficiency. One way to grapple with this dilemma is to automatically transform a modularly specified program into an efficiently implementable one. This is the aim of deforestation transformations, which get rid of the intermediate data structure constructions that occur when two functions are composed. Beyond classical compile-time optimization, these transformations are undeniable tools for generic programming and software component...
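As a concrete, hand-written illustration of what deforestation removes (not output of the paper's transformation): composing two list functions builds an intermediate list, while the fused version traverses the input once and allocates no intermediate structure.

```ocaml
(* Hand-written illustration of the effect of deforestation (not the
   paper's algorithm): the composed version builds an intermediate list,
   the fused version does not. *)
let square x = x * x

(* modular specification: sum of the squares, via an intermediate list *)
let sum_squares_composed xs =
  List.fold_left ( + ) 0 (List.map square xs)

(* deforested version: one traversal, no intermediate list *)
let sum_squares_fused xs =
  List.fold_left (fun acc x -> acc + square x) 0 xs

let () =
  let xs = [1; 2; 3; 4] in
  assert (sum_squares_composed xs = sum_squares_fused xs);
  Printf.printf "%d\n" (sum_squares_fused xs)   (* prints 30 *)
```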
The deforestation of a functional program is a transformation which gets rid of the intermediate data structure constructions that appear when two functions are composed. Descriptional composition, initially introduced by Ganzinger and Giegerich, is a deforestation method dedicated to the composition of two attribute grammars. This article presents a new functional deforestation technique, called symbolic composition, based on the descriptional composition mechanism but extending it. An automatic translation from a functional program into an equivalent attribute grammar allows symbolic composition to be applied, and the result can then be translated back into a functional program. This yields a source-to-source functional program transformation. The resulting deforestation method provides better deforestation than other existing functional techniques. Symbolic composition, which uses the declarative and descriptional features of attribute grammars, is intrinsically more powerful t...
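A typical case where fold-based functional fusion struggles is composing a producer with a consumer that carries an accumulating parameter. The fused version below is written by hand, only to show the kind of result such a transformation targets; it is not generated by symbolic composition.

```ocaml
(* Hand-fused example (not generated by symbolic composition): composing a
   map with an accumulator-based reverse normally builds an intermediate
   list; the fused function produces the reversed, mapped list directly. *)
let square x = x * x

let rec rev_acc acc = function
  | [] -> acc
  | x :: xs -> rev_acc (x :: acc) xs

(* composed: List.map square builds an intermediate list *)
let rev_map_composed xs = rev_acc [] (List.map square xs)

(* fused: a single traversal, mapping while accumulating *)
let rec rev_map_fused acc = function
  | [] -> acc
  | x :: xs -> rev_map_fused (square x :: acc) xs

let () =
  let xs = [1; 2; 3] in
  assert (rev_map_composed xs = rev_map_fused [] xs)  (* both: [9; 4; 1] *)
```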
Generic programming has recently become of major interest in several programming paradigms. A recurrent idea to achieve genericity is to specify algorithms on their most convenient data structure, and to allow these specifications to be instantiated on a large number of neighboring data structures. Polytypic programming, shapely types and generic attribute grammars are generic programming methods related to this approach. A framework for generic programming is proposed to embed these methods. It consists of tools for automatic generation of morphisms between data structures, and for program composition. Thanks to this compositional approach, the complete specialization of generic programs can be advantageously delegated to a general and powerful mechanism of "symbolic composition", which performs deforestation and partial evaluation. 1 Introduction: In several programming paradigms, generic programming is emerging. Although this concept is not new, genericity currently...
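One way to read "specify an algorithm on its convenient data structure and instantiate it on neighbors" in code is sketched below. This is a deliberately small OCaml example, not the paper's framework: the algorithm is written once against a reduction interface, and the morphisms to each concrete structure are written by hand here rather than generated.

```ocaml
(* Small sketch (not the paper's framework): an algorithm written once
   against a reduction interface, instantiated on two "neighboring"
   structures. The morphisms here are written by hand. *)
module type REDUCIBLE = sig
  type 'a t
  val reduce : ('b -> 'a -> 'b) -> 'b -> 'a t -> 'b
end

(* generic algorithm: works for any REDUCIBLE structure *)
module Size (S : REDUCIBLE) = struct
  let size s = S.reduce (fun n _ -> n + 1) 0 s
end

module ListR = struct
  type 'a t = 'a list
  let reduce = List.fold_left
end

module TreeR = struct
  type 'a t = Leaf | Node of 'a t * 'a * 'a t
  let rec reduce f acc = function
    | Leaf -> acc
    | Node (l, x, r) -> reduce f (f (reduce f acc l) x) r
end

module ListSize = Size (ListR)
module TreeSize = Size (TreeR)

let () =
  Printf.printf "%d %d\n"
    (ListSize.size [1; 2; 3])
    (TreeSize.size TreeR.(Node (Leaf, 0, Node (Leaf, 1, Leaf))))
```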
Many methods exist to perform program transformations, but most of them are dedicated to a few programming languages. We propose a new formalism able to encode an abstract representation of the operational semantics of a program. With this formalism, we define simple transformations that lead to complex ones, such as deforestation or partial evaluation, in several programming languages. Though highly theoretical and language-independent, this method can be implemented and, especially, interfaced with real programming languages. For instance, a prototype dealing with a simple higher-order functional programming language has been implemented (with a call-by-value operational semantics). This prototype produces more powerful transformations than other known functional methods, especially with deforestation on functions with accumulative parameters. Keywords: program transformation, partial evaluation, deforestation. 1 Introduction: A program can be seen as a black box that produces output...
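Partial evaluation, one of the transformations mentioned above, can be pictured on a toy example. The specialization below is done by hand, not by the paper's prototype: fixing a statically known exponent evaluates the recursion away entirely.

```ocaml
(* Hand-specialized illustration of partial evaluation (not the paper's
   prototype): `power` specialized on the static exponent 3. *)

(* general program: both arguments dynamic *)
let rec power n x =
  if n = 0 then 1 else x * power (n - 1) x

(* residual program after specializing n = 3: the recursion and the test
   on n have been evaluated away at "specialization time" *)
let power3 x = x * (x * (x * 1))

let () =
  assert (power 3 5 = power3 5);        (* both are 125 *)
  Printf.printf "%d\n" (power3 5)
```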
Informatica
Attribute grammars are well designed for constructing complex algorithms by composing several simpler ones. There exists a powerful transformation, called descriptional composition, which greatly simplifies the composition of two attribute grammars by removing useless intermediate constructions. However, most non-linear algorithms cannot be expressed with attribute grammars, so many compositions cannot be simplified by descriptional composition. In this paper, we present Equational Semantics, a formalism largely inspired by attribute grammars but in which non-linear algorithms can be encoded. More precisely, instead of being restricted to a single static input tree, as is the case for attribute grammars, an algorithm encoded with Equational Semantics may use dynamically constructed trees. The formalism is built on a very small abstract syntax. We present its semantics and some of its transformations, such as partial evaluation and descriptional composition (also called defo...
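To make the "dynamically constructed trees" point concrete, the sketch below (ordinary OCaml, not Equational Semantics, and only my reading of the abstract) sorts a list by first building a search tree that does not exist in the input and then flattening it: the main traversal runs over a structure created during the computation rather than over the static input.

```ocaml
(* Ordinary OCaml sketch of an algorithm whose main traversal runs over a
   tree built during the computation, not over the static input. *)
type tree = Leaf | Node of tree * int * tree

let rec insert x = function
  | Leaf -> Node (Leaf, x, Leaf)
  | Node (l, y, r) ->
      if x <= y then Node (insert x l, y, r) else Node (l, y, insert x r)

(* first pass: build an intermediate tree from the input list *)
let build xs = List.fold_left (fun t x -> insert x t) Leaf xs

(* second pass: traverse the dynamically constructed tree *)
let rec flatten = function
  | Leaf -> []
  | Node (l, x, r) -> flatten l @ (x :: flatten r)

let tree_sort xs = flatten (build xs)

let () =
  assert (tree_sort [3; 1; 2] = [1; 2; 3])
```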
Lecture Notes in Computer Science, 2013
Numerical algorithms lie at the heart of many safety-critical aerospace systems. The complexity and hybrid nature of these systems often require the use of interactive theorem provers to verify that these algorithms are logically correct. Usually, proofs involving numerical computations are conducted in the infinitely precise realm of the field of real numbers. However, numerical computations in these algorithms are often implemented using floating-point numbers. The use of a finite representation of real numbers introduces uncertainties as to whether the properties verified in the theoretical setting hold in practice. This short paper describes work in progress aimed at addressing these concerns. Given a formally proven algorithm, written in the Prototype Verification System (PVS), the Frama-C suite of tools is used to identify sufficient conditions and verify that, under such conditions, the rounding errors arising in a C implementation of the algorithm do not affect its correctness. The technique is illustrated using an algorithm for detecting loss of separation among aircraft.
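The flavor of the "sufficient condition" idea can be sketched as follows; this is a toy model, not the paper's development, and `measure` and `eps` are made up for illustration. If a decision compares a computed quantity against a threshold, and the accumulated rounding error is bounded by some eps, then the floating-point decision matches the real-number decision whenever the exact value lies farther than eps from the threshold.

```ocaml
(* Toy model of the idea (not the paper's formal development): a decision
   taken on a floating-point value agrees with the ideal real-valued
   decision as long as the exact value is farther than `eps` from the
   threshold, where `eps` bounds the accumulated rounding error. *)

(* hypothetical separation-like measure, computed in floating point *)
let measure dx dy = sqrt ((dx *. dx) +. (dy *. dy))

(* assumed bound on the rounding error of `measure` (illustrative value) *)
let eps = 1e-9

type verdict = Safe | Unsafe | TooCloseToCall

(* conservative decision: only commit when rounding cannot flip the answer *)
let check_separation dx dy threshold =
  let d = measure dx dy in
  if d >= threshold +. eps then Safe
  else if d <= threshold -. eps then Unsafe
  else TooCloseToCall   (* the sufficient condition does not hold *)

let () =
  match check_separation 3.0 4.0 5.5 with
  | Safe -> print_endline "separated"
  | Unsafe -> print_endline "loss of separation"
  | TooCloseToCall -> print_endline "inconclusive within error bound"
```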
Lecture Notes in Computer Science, 1997
The functional programming community is paying increasing attention to static structure-based transformations. For example, generic control operators, such as fold, have been introduced in functional programming to increase the power and applicability of a particular kind of static transformation, called deforestation, which prevents the construction of useless intermediate data structures in function composition. This is achieved by making the structure of the data more explicit in program specifications. We argue that one of the original concepts of Attribute Grammars is precisely to make data structures explicit in program specifications. Furthermore, there exists a powerful static deforestation-like transformation in their context. In this paper, we present similarities between deforestation methods, on the one hand with the functional approach, and on the other hand with the Attribute Grammars approach. In order to gain a grasp of these similarities, we first make a simple comparison: purely-synthesized Attribute Grammars and first-order folds. In this context, deforestation transformations are equivalent. This allows us to highlight the limitations of the fold formalism and to present how the hylomorphism approach generalizes it; hylomorphisms and attribute grammars are surprisingly alike. Finally, we show how the inherited attribute notion in Attribute Grammars solves some transformation problems in higher order functional programs.
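For readers unfamiliar with the vocabulary, here is a small self-contained sketch using standard textbook definitions, not material from the paper: a fold consumes a list, an unfold produces one, and the corresponding hylomorphism composes the two without ever materializing the list.

```ocaml
(* Standard textbook definitions (not from the paper) illustrating the
   fold / hylomorphism vocabulary used in the abstract. *)

(* fold: consume a list *)
let rec fold f e = function
  | [] -> e
  | x :: xs -> f x (fold f e xs)

(* unfold: produce a list from a seed *)
let rec unfold next seed =
  match next seed with
  | None -> []
  | Some (x, seed') -> x :: unfold next seed'

(* hylomorphism: unfold then fold, fused so no list is ever built *)
let rec hylo f e next seed =
  match next seed with
  | None -> e
  | Some (x, seed') -> f x (hylo f e next seed')

(* example: factorial as a hylomorphism over the "virtual" list [n; ...; 1] *)
let count_down n = if n = 0 then None else Some (n, n - 1)
let fact n = hylo ( * ) 1 count_down n

let () =
  assert (fact 5 = 120);
  assert (fold ( + ) 0 (unfold count_down 4) = 10)
```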
Lecture Notes in Computer Science, 1999
Attribute grammars are well designed for constructing complex algorithms by composing several simpler ones. There exists a powerful transformation, called descriptional composition, which greatly simplifies the composition of two attribute grammars by removing useless intermediate constructions. However, most non-linear algorithms cannot be expressed with attribute grammars, so many compositions cannot be simplified by descriptional composition. In this paper, we present Equational Semantics, a formalism largely inspired by attribute grammars but in which non-linear algorithms can be encoded. More precisely, instead of being restricted to a single static input tree, as is the case for attribute grammars, an algorithm encoded with Equational Semantics may use dynamically constructed trees. The formalism is built on a very small abstract syntax. We present its semantics and some of its transformations, such as partial evaluation and descriptional composition (also called deforestation). In some sense, Equational Semantics is a kind of lambda-calculus dedicated to program transformations.