Incremental Compilation of Locally Optimized Code
Abstract
Although optimizing compilers have been used successfully to reduce the size and
running times of compiled programs, present incremental compilers do not support
the incremental update of optimized code.
In this work, we extend the notion of incremental compilation to include optimized
code. Techniques to incrementally compile locally optimized code, given
intermediate code modifications, are developed using a program representation
based on flow graphs and DAGs.
A model is designed to represent both unoptimized and optimized code and to
maintain an optimization history. Program changes which either destroy
optimizations or create conditions for further optimizations are incorporated into the
model and the optimized code without recompiling unaffected optimizations.
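As a rough illustration of the kind of model the abstract describes, the following C sketch pairs a flow graph of basic blocks with per-block DAGs and a list of optimization-history records. The names and fields are assumptions for illustration, not the paper's actual data structures.

#include <stddef.h>

/* Kinds of local optimizations recorded in the history (assumed set). */
typedef enum { OPT_REDUNDANT_STORE, OPT_COMMON_SUBEXPR } OptKind;

/* One intermediate-code statement; 'eliminated' marks code removed by
   an optimization but retained in the model.                          */
typedef struct Stmt {
    int id;                 /* position in the unoptimized code        */
    const char *text;       /* three-address form, e.g. "X = Y + Z"    */
    int eliminated;         /* nonzero if optimized away               */
    struct Stmt *next;      /* next statement in the same basic block  */
} Stmt;

/* A basic block: a flow graph node plus a DAG over its statements.    */
typedef struct Block {
    int id;
    Stmt *stmts;            /* statements in source order              */
    struct DagNode *dag;    /* local DAG built over the statements     */
    struct Block **succ;    /* successor blocks in the flow graph      */
    size_t nsucc;
} Block;

/* One entry of the optimization history: which optimization was
   applied and which statements it involved, so it can be reversed.    */
typedef struct OptRecord {
    OptKind kind;
    Stmt *earlier, *later;  /* the two stores / expression evaluations */
    struct OptRecord *next;
} OptRecord;

/* The model: unoptimized and optimized views share the same blocks;
   the history maps between them.                                      */
typedef struct Model {
    Block **blocks;
    size_t nblocks;
    OptRecord *history;
} Model;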
Features of an Incremental Optimizer
(I) Detects any change in the source program that invalidates current
optimized code.
(II) Detects any change in the source program that creates optimizations.
(III) Correctly and efficiently incorporates changes into the optimized code and
updates the optimizing history to maintain consistency between the
optimized code and the optimization history.
(IV) Uses techniques which are consistent with, and can be integrated
into, existing incremental compilers.
(V) Enables the interactive symbolic debugging of optimized code.
Effects of Program Change on Optimized Code
I. We first consider the effects of program modification on optimized code.
II. We examine program changes that do not alter the flow graph structure, as well
as changes that create a basic block.
III. We also consider changes that divide a basic block into smaller blocks or
combine two blocks into one.
IV. Edit changes that create, delete, or replace intermediate code statements are
considered, as well as modifications to individual operands within an intermediate
code statement.
V. The analysis consists of examining the requirements that must be satisfied in order to
correctly perform each optimization, and then detailing those types of edits which
either destroy an optimization or create conditions for a new one (see the sketch
following this list).
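To make the edit taxonomy above concrete, here is a minimal C sketch of the kinds of intermediate-code edits and the structural effects an incremental optimizer would have to classify. The names are my own, not the paper's.

/* Kinds of intermediate-code edits considered by the analysis. */
typedef enum {
    EDIT_INSERT_STMT,      /* create a new intermediate code statement  */
    EDIT_DELETE_STMT,      /* delete an existing statement              */
    EDIT_REPLACE_STMT,     /* replace a statement (delete then insert)  */
    EDIT_CHANGE_OPERAND    /* modify one operand within a statement     */
} EditKind;

/* Structural effects an edit may have on the flow graph. */
typedef enum {
    FLOW_UNCHANGED,        /* edit stays within one basic block         */
    FLOW_CREATE_BLOCK,     /* edit creates a new basic block            */
    FLOW_SPLIT_BLOCK,      /* edit divides a block into smaller blocks  */
    FLOW_MERGE_BLOCKS      /* edit combines two blocks into one         */
} FlowEffect;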
Local Redundant Store Elimination
The following conditions destroy a redundant store optimization (a detection sketch
follows this list):
I. Insertion of a use of A between the two stores to A.
II. Deletion of the later store to A. (It should be noted that deletion of the first store
does not affect the optimized code.)
III. Changing the flow of control between the two stores to A, separating the stores
into different basic blocks.
The following conditions create a redundant store optimization:
IV. Deletion of the only use of A between two stores to A within a basic block.
V. Insertion of a store to A when there exists an earlier or later store to A within the
same basic block with no intervening uses of A.
VI. Changing the flow of control so as to merge two basic blocks such that there
exist two stores to A within the same basic block with no intervening uses of A.
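A minimal sketch of how the insertion-related destruction conditions above might be tested, assuming a model like the one sketched earlier; the helper functions and field names are hypothetical.

#include <stdbool.h>

typedef struct Stmt Stmt;   /* intermediate-code statement (see model sketch) */

/* Assumed helper queries over the model (implementations not shown).   */
bool uses_var(const Stmt *s, const char *var);      /* does s read var?  */
bool same_block(const Stmt *a, const Stmt *b);      /* same basic block? */
bool between(const Stmt *s, const Stmt *first, const Stmt *last);

/* Does inserting 'ins' destroy a redundant-store optimization recorded
   for two stores to 'var' (the earlier store having been eliminated)?  */
bool insert_destroys_redundant_store(const Stmt *ins,
                                     const Stmt *earlier_store,
                                     const Stmt *later_store,
                                     const char *var)
{
    /* Condition I: a use of var now appears between the two stores.    */
    if (between(ins, earlier_store, later_store) && uses_var(ins, var))
        return true;
    /* Condition III: the stores no longer lie in the same basic block
       (this arises from control-flow changes, handled separately, but
       the check is shown here for completeness).                       */
    return !same_block(earlier_store, later_store);
}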
Local Common Subexpression Elimination
Among the conditions that destroy a common subexpression optimization (see the
sketch following this list):
I. Insertion of a store to, or use of, the variable being defined in the later expression (D) between
the common subexpression evaluations.
II. Changing the flow of control between the two common subexpression evaluations, separating
the common subexpressions into different basic blocks.
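A parallel sketch for common subexpression elimination, again assuming the hypothetical model helpers; only insertions are checked here, and the operand test is an additional standard condition beyond the two listed above.

#include <stdbool.h>

typedef struct Stmt Stmt;   /* intermediate-code statement, as sketched above */

/* Assumed helper queries over the model (not shown). */
bool between(const Stmt *s, const Stmt *first, const Stmt *last);
bool stores_to(const Stmt *s, const char *var);
bool uses_var(const Stmt *s, const char *var);
bool same_block(const Stmt *a, const Stmt *b);

/* Does inserting 'ins' destroy a common-subexpression optimization that
   combined two evaluations of the same expression?  'operands' are the
   expression's source operands; 'defined' is the variable defined by
   the later (eliminated) evaluation.                                    */
bool insert_destroys_cse(const Stmt *ins,
                         const Stmt *first_eval, const Stmt *second_eval,
                         const char *operands[], int noperands,
                         const char *defined)
{
    if (!same_block(first_eval, second_eval))
        return true;                 /* control flow already split them  */
    if (!between(ins, first_eval, second_eval))
        return false;
    /* A store to any operand of the expression invalidates the value.   */
    for (int i = 0; i < noperands; i++)
        if (stores_to(ins, operands[i]))
            return true;
    /* Condition I above: a store to, or use of, the variable defined by
       the later evaluation.                                             */
    return stores_to(ins, defined) || uses_var(ins, defined);
}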
Interconnection of Optimizations
Although each code optimization was analyzed in isolation to determine the program changes which
invalidate or create conditions for it, the complete effects of a particular program change can
involve a combination of destruction and creation of several different optimizations.
This can occur because a program change directly causes multiple optimization changes, or as a
result of modifying the code in order to update affected optimizations.
For example, we consider the effects on the optimized code of inserting the statement Z=B+C
after statement 3 in the following unoptimized code segment (the resulting update is worked out
after the example).
The three-address code has been slightly modified so that multiple operands can appear on
the left-hand side of an assignment.
(i.e., X,B=Y+Z means X=Y+Z and B=X.)
Unoptimized code                  Optimized code
1. A=Y+Z                          1. A=Y+Z
2. Z=X+W                          2. Z=X+W
3. X=Y+Z  (insert Z=B+C here)     3. X,B=Y+Z
4. B=Y+Z
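Under the conditions listed earlier, the inserted statement Z=B+C redefines Z, an operand of the common subexpression Y+Z, and also uses B, the variable defined by the later evaluation; the optimization that produced X,B=Y+Z is therefore destroyed and must be reversed. One consistent update of the optimized code (shown as an illustration, not the paper's figure) is:

1. A=Y+Z
2. Z=X+W
3. X=Y+Z
4. Z=B+C
5. B=Y+Z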
Incremental Algorithms - Locally Optimized Code
When an editing action could invalidate the necessary conditions for an
optimization, the algorithms use the model to determine whether the optimization
has actually been destroyed. If it has, the model and mappings are updated, as well
as the optimized code, to reflect the reversal of the optimization. Similarly, when an
action is known to create a necessary condition for an optimization, the algorithms
use the model to determine whether all conditions for the new optimization are
satisfied.
Incremental Optimization Algorithms
The current algorithms support common subexpression and redundant store
elimination and allow program edits which do not affect the flow graph
structure.
The insert and delete algorithms are presented in this paper. The replace
algorithm consists of execution of the delete algorithm for the replaced
intermediate code statement followed by execution of the insert algorithm to
incorporate the new intermediate code statement.
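A minimal C sketch of the replace algorithm as described above, assuming hypothetical incremental_delete and incremental_insert entry points that perform the detection and update steps.

typedef struct Model Model;   /* the optimizer's internal model          */
typedef struct Stmt  Stmt;    /* an intermediate-code statement          */

/* Assumed entry points of the insert and delete algorithms (not shown). */
void incremental_delete(Model *m, Stmt *old_stmt);
void incremental_insert(Model *m, Stmt *new_stmt, Stmt *insert_after);

/* Replace = run the delete algorithm on the replaced statement, then the
   insert algorithm on its replacement, letting each pass detect any
   destroyed or created optimizations.                                   */
void incremental_replace(Model *m, Stmt *old_stmt, Stmt *new_stmt,
                         Stmt *insert_after)
{
    incremental_delete(m, old_stmt);
    incremental_insert(m, new_stmt, insert_after);
}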
The algorithms use the model to detect the effects of program changes,
disregarding any eliminated redundant store statement (represented by a
NOSTORE status) and using the current location of an optimized common
subexpression (rather than the original location).
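One way to represent the per-statement bookkeeping this relies on is sketched below; the NOSTORE status comes from the text, while the other names and the layout are assumptions.

/* Status of an intermediate-code statement in the optimized code.       */
typedef enum {
    STATUS_ACTIVE,    /* statement appears in the optimized code         */
    STATUS_NOSTORE,   /* store eliminated by redundant-store elimination */
    STATUS_MOVED      /* common subexpression now computed elsewhere     */
} StmtStatus;

/* Mapping kept alongside each statement so the algorithms can ignore
   eliminated stores and use the current location of an optimized common
   subexpression rather than its original location.                      */
typedef struct StmtMap {
    StmtStatus status;
    int original_position;   /* position in the unoptimized code         */
    int current_position;    /* position in the optimized code, if any   */
} StmtMap;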
In the discussion of the insert and delete algorithms in the next two sections,
we concentrate on the detection process for affected optimizations. The
model and optimized code updates are more fully described for both creation
and destruction of optimizations.
Flow Graph Changes to Locally Optimized Code
The incremental optimization algorithms are currently being generalized to support program
edits which affect the flow graph structure. In order to detect optimizations affected by
control flow changes, the changes are identified as a sequence of insertions, deletions, or
changes to flow graph edges, and separations, creations, deletions, or merges of basic blocks.
The intermediate code changes which can cause these control flow changes include the insertion
of a label on an intermediate code statement.
Unoptimized code                  Optimized code
1. A=G+F                          1. A=G+F
2. F=D+E                          2. F,B=D+E
3. H=A+2                          3. H,D=A+2
4. B=D+E
5. D=A+2
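To connect the classification at the start of this section to code, here is a minimal C sketch of the elementary control-flow changes into which a program edit would be decomposed; the classification follows the text, but the names are assumptions.

/* Elementary control-flow changes into which an edit is decomposed. */
typedef enum {
    CF_INSERT_EDGE,    /* insertion of a flow graph edge   */
    CF_DELETE_EDGE,    /* deletion of a flow graph edge    */
    CF_CHANGE_EDGE,    /* change to a flow graph edge      */
    CF_SEPARATE_BLOCK, /* separation of a basic block      */
    CF_CREATE_BLOCK,   /* creation of a basic block        */
    CF_DELETE_BLOCK,   /* deletion of a basic block        */
    CF_MERGE_BLOCKS    /* merge of two basic blocks        */
} FlowChange;

/* A control-flow edit is processed as a sequence of such changes.   */
typedef struct {
    FlowChange *changes;
    int nchanges;
} FlowChangeSeq;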
Summary
The ideal incremental programming environment would closely integrate the incremental compiler,
incremental optimizer, and a symbolic debugger.
The similarity of the information needed for incremental compilation and for the symbolic debugging
of optimized code suggests that variations of the same basic internal representation can be used for
both purposes.
In summary, we present the first step in a project to develop an incremental optimizing compiler.
The model and algorithms presented in this paper can be used to incrementally update locally
optimized code.
The algorithms modify the optimized code to reflect any destroyed or created optimizations due to
the source code changes.
They are currently being implemented in C running under UNIX on a VAX 11/780, and form the basis
of a prototype incremental optimization system that we are building.