The addition of xdef data has made qfo_to_progs unusable in qfprogs,
resulting in various invalid memory accesses. It always was an ugly hack
anyway, so this is the first step to proper qfo support in qfprogs.
I was originally going to put it in the debug syms file, but I realized
that the data persistence code would need access to both the def type and
reliably correct def offsets for defs in far data.
This far better reflects the actual meaning. It is very likely that
ty_none is a holdover from long before there was full type encoding and
it meant that the union in qfcc's type_t had no data. That is still
true for basic types, but never was for function, field or pointer
types, so it was misnamed pretty much from the start.
The encoding is 3:5 giving 3 bits for alignment (log2) and 5 bits for
size, with alignment in the 3 most significant bits. This keeps the
format backwards compatible: until doubles were added, all types were
aligned to 1 word, which encodes as 0, and the size is unaffected.
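Roughly, the packing amounts to this (a plain C sketch; the helper names
are invented for illustration and are not qfcc's actual code):

    /* Illustrative sketch of the 3:5 packing; names are invented for this
       example and are not qfcc's actual API. */
    #include <stdint.h>

    static inline uint8_t
    pack_align_size (unsigned log2_align, unsigned size_words)
    {
        return (uint8_t) (((log2_align & 0x7) << 5) | (size_words & 0x1f));
    }

    static inline unsigned
    unpack_size (uint8_t packed)
    {
        return packed & 0x1f;           // low 5 bits: size in words
    }

    static inline unsigned
    unpack_alignment (uint8_t packed)
    {
        return 1u << (packed >> 5);     // high 3 bits: log2 of the alignment
    }

With 1-word alignment the top three bits stay 0, so an old size-only value
unpacks unchanged.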
This fixed the uninitialized temp warning in HUD.r. The problem was
caused by the flow analyzer not being able to detect that the struct
temp was being initialized by the move statement due to the address of
the temp being in a pointer temp. While it would be good to use a
constant pointer for the address of the struct temp, or to improve the
flow analyzer to track actual data, avoiding the temp in the first place
results in nicer code as it removes a move statement.
Only as scalars, I still need to think about what to do for vectors and
quaternions due to param size issues. Also, doubles are not yet
guaranteed to be correctly aligned.
I don't remember what the goal was (stopped working on it eight months
ago), but some possibilities include:
- better handling of nil (have trouble with assigning into structs)
- automatic forward declarations a la C# and jai (I was watching vids
about jai at the time)
- something for pascal
- simply that the default symbol type should not be var (in which case,
goal accomplished)
Currently, they can represent either vectors or quaternions, and the
quaternions can be in either [s, v] form or [w, x, y, z] form.
Many things will not actually work yet as the vector expression needs to be
converted into the appropriate form for assigning the elements to the
components of the "vector" type.
This is a nice feature found in fteqcc (also a bit of a challenge from
Spike). Getting bison to accept the new expression required rewriting the
state expression grammar, so this is mostly for the state expression. A
test to ensure the state expression doesn't break is included.
This goes towards complementing the "if not" logic extension. I need to
check if fteqcc supports "not" with "while" (the version I have access to
at the moment does not), and also whether it would be good to support
"not" with "for", and if so, what form the syntax should take.
It is syntactic sugar for if (!(foo)), but is useful for avoiding
inconsistencies between such things as if (string) and if (!string), even
though qcc can't parse if not (string). It also makes for easier to read
code when the logic in the condition is complex.
Dead nodes are those that generate unused values (unassigned leaf nodes,
expressions or destinationless move(p) nodes). The removal is done by the
flow analysis code (via the dags code) so that any pre and post removal
flow analysis and manipulation may be done (eg, available expressions).
qfcc now does local common subexpression elimination. It seems to work, but
is optional (default off): use -O to enable. Also, uninitialized variable
detection is finally back :)
The progs engine now has very basic valgrind-like functionality for
checking pointer accesses. Enable with pr_boundscheck 2
When an alias def (or aliased def) is used, any overlapping aliases that
have previously been assigned need to be marked as live, and edges to the
aliases added to the new node. However, when assigned to, live-forcing
needs to be turned off.
This fixes the lost assignments to .super.
When the naive uninitialized variable detection finds a node with possible
uses of uninitialized variables, the statements in the node are scanned one
at a time checking each usage and removing uninitialized definitions as
appropriate. vectest.r now compiles without warnings. As an added bonus,
accurate line number information is reported for uninitialized variables.
Unfortunately, there is still a problem with uninitialized temps in
switch.r, but that might just be poor handling of temp op aliases.
The dummy nodes are for detecting uninitialized variables (entry dummy)
and making globals live at function exit (exit dummy). The reaching defs
and live vars code currently segfaults because neither node has had its sets
initialized.
Also move the ALLOC/FREE macros from qfcc.h to QF/alloc.h (needed for
set.c).
Both modules are more generally useful than just for qfcc (eg, set
builtins for ruamoko).
Set of everything is implemented by inverting the meaning of bits in the
bitmap: 1 becomes non-member, 0 member. This means that set_size and
set_first/set_next effectively become inverted and report non-members,
as counting the members of such a set is impossible :)
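A rough sketch of the idea in C (names invented here, not QF's actual
set.c):

    #include <limits.h>

    #define XSET_BITS  1024
    #define XSET_WORDS (XSET_BITS / (sizeof (unsigned) * CHAR_BIT))

    /* Illustrative only: an "everything" set is an empty bitmap with the
       inverted flag set, so membership tests simply flip the bit's meaning. */
    typedef struct {
        unsigned words[XSET_WORDS];
        int      inverted;  // 1: bits mark non-members, 0: bits mark members
    } xset_t;

    static int
    xset_is_member (const xset_t *set, unsigned x)
    {
        unsigned bits = sizeof (unsigned) * CHAR_BIT;
        int      bit = (set->words[x / bits] >> (x % bits)) & 1;
        return set->inverted ? !bit : bit;
    }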
With the need to handle aliasing in the optimizer, it has become apparent
that having the flow data attached to symbols is not nearly as useful as
having it attached to defs (which are views of the actual variables).
This also involves a bit of a cleanup of operand types: op_pointer and
op_alias are gone (this seems to greatly simplify the optimizer)
There is a bit of a problem with enums in switch statements, but this might
actually be a sign that something is not quite right in the switch code
(other than enums not being recognized as ints for jump table
optimization).
Simply "backed" and "virutal". Backed spaces have memory allocated to them
while virtual spaces do not. Virtual spaces are intended for local
variables and entity fields.
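Roughly, the split might be expressed like this (an illustrative enum; the
actual declarations in qfcc may differ):

    /* Illustrative sketch only; not copied from qfcc's defspace code. */
    typedef enum {
        ds_backed,      // space has memory backing it; data can be stored
        ds_virtual,     // space only hands out offsets (locals, entity fields)
    } ds_type_t;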
With this, alias defs become singletons based on the def they alias and the
type and offset of the alias. Thus, the removal of the free_def call in
emit.c.
alias_def now always creates an offset def (though the usual case has an
offset of 0). If the alias escapes the bounds of the base def, an
internal error will be generated.
It really doesn't seem wise to allow the compiler to do so as it would
overwrite unrelated defs. The only time such a thing is valid is the return
statement (silly vm design), and that's read-only.
Also remove the extern for current_storage as it belongs in shared.h.
I'm not satisfied with the documentation for initialize_def, but it will do
for now. I probably have to rewrite the thing as it's a bit of a beast.
With the introduction of the statement type enum came a prefix clash. As
"st" makes sense for "statement type", I decided that "storage class"
should be "sc". Although there haven't been any problems as of yet, I
decided it would be a good idea to clean up the clash now. It also helps
avoid confusion (I was a bit surprised after working with st_assign etc to
be reminded of st_extern etc).
It doesn't quite work yet, but...
It has proven necessary to know what type .return has at any point in the
function. The segfault in ctf is caused by the return statement added to
the end of the void function messing with the expr pointer stored in the
daglabel for .return. While this is by design (though the statement
really should have a valid expr pointer), it highlights a bigger
problem: there's no stable knowledge of the current
type of .return. This is not a problem in expression statements as the
dagnodes for expression statements store the desired types of all operands.
However, when assigning from .return to attached variables in a leaf node,
the type of .return is not stored anywhere but the expression last
accessing .return.
Now information like dags or live variables is dumped separately, and the
live variable information replaces the flow node in the diagram (like dags
have recently).
They really should have been in statements.[ch] in the first place
(actually, they sort of were: is_goto etc, so some redundant code has been
removed, too).
The evil comment is not just "pragmas are bad, ok?", but switching between
advanced, extended and traditional modes when compiling truly is evil and
not guaranteed to work. However, I needed it to make building test cases
easier (it's mostly ok to go from advanced to extended or traditional, but
going the other way will probably cause all sorts of fun).
In the process, opcode_init now copies the opcode table data rather than
modifying it.
After running across a question about lists of animation frames and states,
I decided giving qfcc the ability to generate such lists might be a nice
distraction from the optimizer :) Works for both progs.src and separate
compilation. No frame file is generated if no macros have been created.
It is necessary to know if a def is a function parameter so it can be
treated as initialized by the flow analyzer. The support for the flag in
object files is, at this stage, purely for debugging purposes.
.return and .param_N are not classed as global variables for data flow
analysis. .return is taken care of by return statements, and .param_N by
call statements.
With this, the menus work up to attempting to load the menu plist.
Something is corrupting zmalloc's blocks.
With temp types changing and temps being reused within the one instruction,
the def type is no longer usable for selecting the opcode. However, the
operand types are stable and more correct.
Nicely, the need for dag_gencode to recurse seems to have been removed.
At least for a simple case, correct code is generated :)
switch.r:49: case 1: *to = *from++;
003b loadbi.i *(from + 0), .tmp10
003c add.i from, .imm, from
003d storep.i .tmp10, *to
It doesn't make any difference yet, but that's because I need to add extra
edges indicating inter-node dependencies. However, the sort does seem to
work for its limited input.
While things are quite broken now (very incorrect code is being generated),
the dag is much easier to work with. The dag is now stored in an array of
nodes (the children pointers are still used for dagnode operands), and sets
are used for marking node parents, attached identifiers and (when done,
extra edges).
Instead of storing the generating statement in the dagnode, the generating
expression is stored in the daglabel. The daglabel's expression pointer is
updated each time the label is attached to a node. Now I know why debugging
optimized code can be... interesting.
It now seems to generate correct code for each node. However, node order is
still incorrect in places (foo++ is being generated as ++foo). quattest.r
actually executes and produces the right output :)
flow_analyze_statement uses the statement type to quickly determine which
operands are inputs and which are outputs. It takes (optional) sets for
used variables, defined variables and killed variables (only partially
working, but I don't actually use kill sets yet). It also takes an optional
array for storing the operands: index 0 is the output, 1-3 are the inputs.
flow_analyze_statement clears any given sets on entry.
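In caller terms the interface described above would look roughly like this
(a hedged sketch only; the parameter order and types are inferred from the
description, not copied from the source):

    // Hedged sketch of the described interface; details are inferred from
    // the text above, not copied from flow.c.
    static void
    analyze_one (statement_t *st)
    {
        set_t      *use = set_new ();
        set_t      *def = set_new ();
        operand_t  *operands[4];

        // use/def are cleared on entry, then filled for this statement.
        // operands[0] is the output (if any), operands[1..3] are the inputs.
        // The kill set is optional here, so 0 is passed.
        flow_analyze_statement (st, use, def, 0, operands);
    }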
Live variable analysis now uses the sets rather than individual vars. Much
cleaner code :).
Dags are completely broken.
The types are expression, assignment, pointer assignment (ie, write to a
dereferenced pointer), move (special case of pointer assignment), state,
function call/return, and flow control. With this classification, it will
be easier (less code:) to determine which operands are inputs and which are
outputs.
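The classification might look something like the following enum (the names
are illustrative; only the st_ prefix and st_assign are confirmed elsewhere
in these notes):

    /* Illustrative only; the actual enumerator names in qfcc may differ. */
    typedef enum {
        st_none,        // not a (valid) statement
        st_expr,        // expression evaluated for its result
        st_assign,      // direct assignment
        st_ptrassign,   // write through a dereferenced pointer
        st_move,        // block move (special case of pointer assignment)
        st_state,       // state statement
        st_func,        // function call/return
        st_flow,        // goto, conditional branches, etc.
    } st_type_t;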
Surprisingly, I don't yet have to "throw one out", but things are still
problematic: rcall1 is getting two arguments, goto and return get lost,
rcall2 got an old temp rather than the value it was supposed to, but
progress :)
First, it turns out using daglabels wasn't such a workable plan (due to
labels being flushed every sblock). Instead, flowvars are used. Each actual
variable (whether normal or temp) has a pointer to the flowvar attached to
that variable.
For each variable, the statements that use or define the variable are
recorded in the appropriate set attached to each (flow)variable.
The flow graph nodes are now properly separated from the graph, and edge
information is stored in the graph struct. This actually made for much
cleaner code (partly thanks to the use of sets and set iterators).
Flow graph reduction has been (temporarily) ripped out as the entire
approach was wrong. There was also a bug in that I didn't really understand
the dragon book about selecting nodes and thus messed things up. The
depth-first search tree "fixed" the problem, but was really the wrong
solution (sledge hammer :P).
Also, now that I understand that dot's directed graphs must be acyclic, I
now have much better control over the graphs (back edges need to be
flipped).
The reduction is performed iteratively until the graph is irreducible, but
such that each reduction wraps the previous graph. Unfortunately, due to
depth-first searching not being implemented, graphs that should be reduced
(ie, those with natural loops) currently are not.
set_first() now returns a pointer to a setstate_t struct that holds the
state necessary for scanning a set. set_next() will automatically delete
the state block when the end of the set is reached. set_delstate() is also
provided to allow early termination of the scan.
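A sketch of the resulting scan loop (the element field name "member" is an
assumption here, not necessarily what set.h calls it):

    #include <stdio.h>

    static void
    print_set_elements (set_t *set)
    {
        setstate_t *ss;

        for (ss = set_first (set); ss; ss = set_next (ss))
            printf ("%d\n", ss->member);    // visit one element
        // set_next () frees the state block once the set is exhausted;
        // set_delstate (ss) would be used to stop the scan early.
    }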
They're now dot_sblock.c and print_sblock. The new names both better
reflect their purpose and free up "flow" for outputting the real flow
analysis graphs.
Much of the data recently added to sblock_t has been moved to flownode_t.
No graph reduction is carried out yet, but the initial (innermost level)
graph has been built.