Declarative specialization of object-oriented programs
Related papers
Towards automatic specialization of Java programs
Rapport de recherche, Institut National de Recherche en …
Automatic program specialization can derive efficient implementations from generic components, thus reconciling the often opposing goals of genericity and efficiency. This technique has proved useful within the domains of imperative, functional, and logical languages, but so far has not been explored within the domain of object-oriented languages. We present experiments in the specialization of Java programs. We demonstrate how to construct a program specializer for Java programs from an existing specializer for C programs and a Java-to-C compiler. Specialization is managed using a declarative approach that abstracts over the optimization process and masks implementation details. Our experiments show that program specialization provides a fourfold speedup of an image-filtering program. Based on these experiments, we identify optimizations of object-oriented programs that can be carried out by automatic program specialization. We argue that program specialization is useful in the field of software components, allowing a generic component to be specialized to a specific configuration.
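To make the idea concrete, here is a minimal, hand-written sketch of the kind of residual code a specializer might produce when a generic image filter is specialized to one configuration. The class, method, and kernel names are illustrative assumptions, not taken from the paper.

```java
// Sketch: a generic 1-D convolution component, and a hand-specialized
// version that mimics what an automatic specializer could emit for the
// fixed kernel {1, 2, 1} with divisor 4. All names are hypothetical.
public class FilterSpecialization {
    // Generic component: the kernel is a runtime parameter.
    static int applyGeneric(int[] pixels, int i, int[] kernel, int div) {
        int acc = 0;
        for (int k = 0; k < kernel.length; k++) {
            acc += kernel[k] * pixels[i + k];
        }
        return acc / div;
    }

    // Residual program for kernel = {1, 2, 1}, div = 4:
    // the loop is unrolled and the kernel constants are inlined.
    static int applyBlur3(int[] pixels, int i) {
        return (pixels[i] + 2 * pixels[i + 1] + pixels[i + 2]) / 4;
    }

    public static void main(String[] args) {
        int[] img = {10, 20, 30, 40};
        System.out.println(applyGeneric(img, 0, new int[]{1, 2, 1}, 4)); // 20
        System.out.println(applyBlur3(img, 0));                          // 20
    }
}
```

The specialized body contains no loop, no bounds test on the kernel, and no array accesses for the kernel itself, which is where the reported speedups come from.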
Automatic program specialization for Java
ACM Transactions on Programming Languages and Systems, 2003
The object-oriented style of programming facilitates program adaptation and enhances program genericness, but at the expense of efficiency. We demonstrate experimentally that state-of-the-art Java compilers fail to compensate for the use of object-oriented abstractions in the implementation of generic programs, and that program specialization can eliminate a significant portion of these overheads. We present an automatic program specializer for Java, illustrate its use through detailed case studies, and demonstrate experimentally that it can significantly reduce program execution time. Although automatic program specialization could be seen as being subsumed by existing optimizing compiler technology, we show that specialization and compiler optimization are in fact complementary.
Selective Specialization for Object-Oriented Languages
Sigplan Notices, 1995
Dynamic dispatching is a major source of run-time overhead in object-oriented languages, due both to the direct cost of method lookup and to the indirect effect of preventing other optimizations. To reduce this overhead, optimizing compilers for object-oriented languages analyze the classes of objects stored in program variables, with the goal of bounding the possible classes of message receivers enough so that the compiler can uniquely determine the target of a message send at compile time and replace the message send with a direct procedure call. Specialization is one important technique for improving the precision of this static class information: by compiling multiple versions of a method, each applicable to a subset of the possible argument classes of the method, more precise static information about the classes of the method's arguments is obtained. Previous specialization strategies have not been selective about where this technique is applied, and therefore tended to significantly increase compile time and code space usage, particularly for large applications. In this paper, we present a more general framework for specialization in object-oriented languages and describe a goal-directed specialization algorithm that makes selective decisions to apply specialization to those cases where it provides the highest benefit. Our results show that our algorithm improves the performance of a group of sizeable programs by 65% to 275% while increasing compiled code space requirements by only 4% to 10%. Moreover, when compared to the previous state-of-the-art specialization scheme, our algorithm improves performance by 11% to 67% while simultaneously reducing code space requirements by 65% to 73%.
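The dispatch-elimination effect the abstract describes can be sketched in a few lines. The example below is a hypothetical Java illustration (the paper's own examples use a different, multi-method language): a generic method that dispatches on `area()` at every call, next to a version specialized for one receiver class, where the call target is statically known and can be inlined.

```java
// Sketch of method specialization for dispatch elimination.
// Class and method names are illustrative, not from the paper.
abstract class Shape { abstract double area(); }

class Circle extends Shape {
    final double r;
    Circle(double r) { this.r = r; }
    double area() { return Math.PI * r * r; }
}

public class SelectiveSpecialization {
    // Generic version: area() is a dynamically dispatched call.
    static double totalArea(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes) sum += s.area();
        return sum;
    }

    // Version specialized to Circle arguments: the receiver class is
    // statically known, so area() can be devirtualized and inlined.
    static double totalAreaCircles(Circle[] circles) {
        double sum = 0;
        for (Circle c : circles) sum += Math.PI * c.r * c.r; // inlined area()
        return sum;
    }

    public static void main(String[] args) {
        Circle[] cs = { new Circle(1), new Circle(2) };
        System.out.println(totalArea(cs) == totalAreaCircles(cs)); // true
    }
}
```

The selectivity question in the paper is precisely when generating such extra method versions pays for their compile-time and code-space cost.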
Towards Unifying Inheritance and Automatic Program Specialization
DAIMI Report Series, 2002
Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs and specialization of classes (inheritance) are considered different abstractions. We present a new programming language, Lapis, that unifies inheritance and program specialization at the conceptual, syntactic, and semantic levels.

This paper presents the initial development of Lapis, which uses inheritance with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented and a simple yet powerful partial evaluator for an object-orient...
A Lightweight Approach to Program Specialization
2000
Within the imperative programming paradigm, program slicing has been widely used as a basis to solve many software engineering problems, like debugging, testing, differencing, specialization, and merging. In this work, we present a lightweight approach to program specialization of lazy functional logic programs which is based on dynamic slicing. The kind of specialization performed by our approach
An extension mechanism for the Java language
1999
This thesis presents the design and implementation of an extensible dialect of the Java language, named OpenJava. Although the Java language is well polished and dedicated to covering applications across a wide range of computational domains, it still lacks some mechanisms necessary for typical kinds of applications. Our OpenJava enables programmers to extend the Java language and implement such mechanisms on demand. It is an advanced macro processor based on the technique called compile-time reflection.
An Effective Automated Approach to Specialization of Code
2007
Application performance is heavily dependent on compiler optimizations. Modern compilers rely largely on the information made available to them at the time of compilation. In this regard, specializing the code according to input values is an effective way to communicate necessary information to the compiler. However, static specialization suffers from possible code explosion, and dynamic specialization requires runtime compilation activities that may degrade the overall performance of the application. This article proposes an automated approach for specializing code that is able to address both the problem of code size increase and the overhead of runtime activities. We first obtain optimized code through specialization performed at static compile time and then generate a template that can work for a large set of values through runtime specialization. Our experiments show significant improvement for different SPEC benchmarks on Itanium-II (IA-64) and Pentium-IV processors using the icc and gcc compilers.
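The contrast between generic code and the residual code that value specialization produces can be illustrated with a classic example. The sketch below is a hand-written illustration, not the article's template mechanism: a power loop next to the body a specializer could emit once the exponent is known to be 3. The article's approach would generate one such body statically and patch the remaining runtime values into it, avoiding both code explosion and full runtime compilation.

```java
// Sketch of value specialization; names are illustrative.
public class ValueSpecialization {
    // Generic version: the loop bound depends on a runtime value.
    static long powGeneric(long base, int exp) {
        long r = 1;
        for (int i = 0; i < exp; i++) r *= base;
        return r;
    }

    // Residual version for exp = 3: loop unrolled, loop test gone.
    static long powCubed(long base) {
        return base * base * base;
    }

    public static void main(String[] args) {
        System.out.println(powGeneric(5, 3)); // 125
        System.out.println(powCubed(5));      // 125
    }
}
```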
Experiments in program compilation by interpreter specialization
1998
Thème 2 : Génie logiciel et calcul symbolique, Projet COMPOSE, Rapport de recherche n° 3588, Décembre 1998, 20 pages. Abstract: Interpretation and run-time compilation techniques are becoming increasingly important due to the need to support heterogeneous architectures, evolving programming languages, and dynamically downloaded code. Although interpreters are easy to write and maintain, they are inefficient. On the other hand, run-time compilation provides efficient execution, but is costly to implement. One way to get the best of both approaches is to apply program specialization to an interpreter in order to generate an efficient implementation automatically. Recent advances in program specialization technology have resulted in important improvements in the performance of specialized interpreters. This paper presents and assesses experimental results for the application of program specialization to both bytecode and structured-language interpreters. The results show that for general-purpose bytecode, program specialization can yield speedups of up to a factor of four, while specializing certain structured-language interpreters can yield performance equivalent to code compiled by a general-purpose compiler.
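Interpreter specialization (the first Futamura projection) is easiest to see on a toy example. Below is a hypothetical sketch, not the interpreters from the report: a tiny stack-bytecode interpreter, next to the residual code a specializer could produce for one fixed program, where the opcode dispatch loop has been evaluated away entirely.

```java
// Sketch of specializing an interpreter to a fixed program.
// Opcodes and program are illustrative.
public class InterpSpec {
    static final int PUSH = 0, ADD = 1, MUL = 2;

    // General interpreter: dispatches on each opcode at run time.
    static int run(int[] code) {
        int[] stack = new int[16];
        int sp = 0, pc = 0;
        while (pc < code.length) {
            switch (code[pc++]) {
                case PUSH: stack[sp++] = code[pc++]; break;
                case ADD:  stack[sp - 2] += stack[sp - 1]; sp--; break;
                case MUL:  stack[sp - 2] *= stack[sp - 1]; sp--; break;
            }
        }
        return stack[sp - 1];
    }

    // Residual program for the fixed bytecode
    // {PUSH 2, PUSH 3, ADD, PUSH 4, MUL}: the dispatch loop,
    // program counter, and stack have all been specialized away.
    static int specialized() {
        return (2 + 3) * 4;
    }

    public static void main(String[] args) {
        int[] prog = {PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL};
        System.out.println(run(prog));      // 20
        System.out.println(specialized());  // 20
    }
}
```

Removing the per-instruction dispatch is exactly where the reported factor-of-four speedups for bytecode interpreters come from.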
Applicability of Method Specialization Techniques to Java
2010
Method specialization is an optimization used to eliminate virtual call sites and open up opportunities for other compiler optimizations. Existing method specialization techniques do not explicitly handle dynamic class-loading, nor are they suitable for a dynamic compilation environment. This thesis examines previous method specialization techniques and illustrates the transformations with a running example. These techniques are also reviewed to determine the applicability of each method for use in a dynamic compilation environment that supports dynamic class-loading (such as Java). Additionally, a new method specialization framework is given that is designed for a dynamic compilation environment and handles dynamic class-loading. Aspects that need to be examined when making method specialization decisions for a dynamic compiler are listed and analyzed. Finally, numbers regarding opportunities for method specialization in the SPECjvm98 and SPECjbb2000 benchmark suites are listed and investigated.