And other related fields, so integer is now int (and uinteger is uint).
I really don't know why I went with integer in the first place, but this
will make using macros easier when dealing with types.
This includes calls and unconditional jumps, relative and through a
table. The parameters are all lumped into one object, with some being
unused by the different types (eg, args and ret_type are used only by
call expressions). Just having descriptive names for the parameters
(instead of e1 and e2) makes things much nicer, even with all the
sub-types lumped together.
No mysterious type aliasing bugs this time ;)
build_builtin_function does the right thing, and only legacy syntax
functions were affected anyway. Certainly, external variables should not
be initialized, but klik uses @extern { } wrapped around several builtin
functions, and I had added the feature to allow just that as it is
rather convenient.
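As a rough sketch of the pattern being supported here (the declarations
and builtin numbers are made up, not klik's actual code):

    @extern {
        // builtin "initializers" are allowed inside @extern as a convenience
        void (string s) bprint = #23;
        void (entity client, string s) sprint = #24;
    }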
It now takes a context pointer (opaque data) that holds the buffers it
uses for the temporary strings. If the context pointer is null, a static
context is used (making those uses of va NOT thread-safe). Most calls to
va use the static context, but all such calls have been formatted
consistently so they are easy to find when it comes time to do a full
audit.
There's still some cleanup to do, but everything seems to be working
nicely: `make -j` works, `make distcheck` passes. There is probably
plenty of bitrot in the package directories (RPM, debian), though.
The vc project files have been removed since those versions are way out
of date and quakeforge is pretty much dependent on gcc now anyway.
Most of the old Makefile.am files are now Makemodule.am. This should
allow for new Makefile.am files that allow local building (to be added
on an as-needed basis). The current remaining Makefile.am files are for
standalone sub-projects.
The installable bins are currently built in the top-level build
directory. This may change if the clutter gets to be too much.
While this does make a noticeable difference in build times, the main
reason for the switch was to take care of the growing dependency issues:
now it's possible to build tools for code generation (eg, using qfcc and
ruamoko programs for code-gen).
It is now "consistent" with the rest of the type building in that it
uses find_type(append_type(return, params)) like the C version, thus
allowing append_type to do its thing with type aliases. This fixes the
overload test.
When a type is aliased, the alias has two type chains: the simple type
chain with all other aliases stripped, and the full type chain. There
are still plenty of bugs in it, but having the clean type chain takes
care of the major issue from the previous attempt, as only the head of
the type-chain needs to be skipped for type comparison.
Most of the bugs are in finding the locations where the head needs to be
skipped.
All simple type checks are now done using is_* helper functions. This
will help hide the implementation details of the type system from the
rest of the compiler (especially the changes needed for type aliasing).
This fixes the problem of using the return value of a function as an
element in a compound initializer. The cause of the problem is that
compound initializers were represented by block expressions, but
function calls are contained within block expressions, so def
initialization saw the block expression and thought it was a nested
compound initializer.
Technically, it was a bug in the nested element parsing code in that it
wasn't checking the result value of the block expression, but using a
whole new expression type makes things much cleaner and the work done
paves the way for labeled initializers and compound assignments.
It proved to be too fragile in its current implementation. It broke
pointers to incomplete structs and switch enum checking, and getting it
to work for other things was overly invasive. I still want the encoding,
but need to come up with something more robust.
Such declarations were being lost, thus in the following, the id field
never got added:
typedef struct qwaq_mevent_s {
    int id;
    int x, y, z;
    int buttons;
} qwaq_mevent_t;
This is where constant folding should have happened all along. While
unary_expr should fold constants too, it seems to already try to do so,
and it's a bit too much of a mess to clean up right now.
I don't remember what the goal was (stopped working on it eight months
ago), but some possibilities include:
- better handling of nil (have trouble with assigning into structs)
- automatic forward declarations a la C# and jai (I was watching vids
  about jai at the time)
- something for pascal
- simply that the default symbol type should not be var (in which case,
goal accomplished)
Currently, they can represent either vectors or quaternions, and the
quaternions can be in either [s, v] form or [w, x, y, z] form.
Many things will not actually work yet, as the vector expression needs
to be converted into the appropriate form for assigning the elements to
the components of the "vector" type.
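For illustration, the two quaternion forms look something like this
(the variable names are just placeholders):

    vector     v  = [1, 2, 3];
    quaternion q1 = [0, [1, 0, 0]];   // [s, v] form: scalar plus vector
    quaternion q2 = [0, 1, 0, 0];     // [w, x, y, z] form: four scalars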
This is a nice feature found in fteqcc (also a bit of a challenge from
Spike). Getting bison to accept the new expression required rewriting the
state expression grammar, so this is mostly for the state expression. A
test to ensure the state expression doesn't break is included.
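For reference, the sort of state expression the test exercises looks
roughly like this (frame and function names are hypothetical):

    void () ogre_stand1 = [$stand1, ogre_stand2] { ai_stand (); };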
This goes towards complementing the "if not" logic extension. I need to
check if fteqcc supports "not" with "while" (the version I have access to
at the moment does not), and also whether it would be good to support
"not" with "for", and if so, what form the syntax should take.
It is syntactic sugar for if (!(foo)), but is useful for avoiding
inconsistencies between such things as if (string) and if (!string),
even though qcc can't parse if not (string). It also makes for
easier-to-read code when the logic in the condition is complex.
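A quick sketch of what this looks like in practice (the condition is
just an example):

    if not (self.enemy && self.health > 0) {   // same as if (!(...))
        return;
    }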
Empty structs are now (correctly) invalid. The hack of using an empty
struct to represent a handle returned from a builtin has been
unnecessary since opaque structs were implemented: now a pointer to an
opaque struct can be used. This is mostly safe as handles are always
negative and thus attempting to dereference such a pointer should result
in a VM error. It will be even safer once const is implemented and the
pointers can be made constant (eg, typedef struct handle * const handle;)
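The resulting pattern looks something like this (the window functions
are hypothetical names, not from the actual headers):

    typedef struct window_s *window_t;   // struct window_s is never defined
    window_t create_window (int x, int y, int w, int h) = #0;
    void destroy_window (window_t win) = #0;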
This is needed to allow compile-time protocol conformance checks, though
nothing along those lines has been implemented yet.
id has been changed from TYPE to OBJECT, required to allow id <proto> to be
parsed. OBJECT uses symbol, allowing id to be redefined once suitable work
has been done on the parser.
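That is, declarations of roughly this shape should now get through the
parser (SomeProtocol is just a placeholder):

    id <SomeProtocol> delegate;
    id anonymous_object;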
Turns out there was only one place to fix (for qc, anyway: I don't have
tests for qp yet). func-static now passes :)
Hmm, how to test for static var naming... (not implemented yet)
With the introduction of the statement type enum came a prefix clash.
As "st" makes sense for "statement type", I decided that "storage
class" should be "sc". Although there haven't been any problems as of
yet, I decided it would be a good idea to clean up the clash now. It
also helps avoid confusion (I was a bit surprised, after working with
st_assign etc, to be reminded of st_extern etc).
This required throwing out the primary rules that snax did up to help me
with conflicts many years ago, but they were now getting in the way. Now
the productions from primary are merged in with unary_expr.
Since GNU bison and flex are required anyway, there is no harm in using
their API prefix options. Now, qfcc can compile both QC/Ruamoko and
Pascal files (Pascal is (currently?) NOT supported in progs.src mode),
selecting the language based on the extension: .r, .qc and .c select
QC/Ruamoko, .pas and .p select Pascal, while anything else is treated as
an object file (as before).
Assignment of nil to a field function is permitted, but trying to use one
as a builtin or as a normal function is treated as an error.
    .void (float y) func;       OK
    .void (float y) bi = #0;    error
    .void (float y) ni = nil;   OK
    .void (float y) co = { };   error
The parser wants to treat .float () foo; as a function returning a
float field, but qcc treats it as a field holding a function variable.
Fortunately, field types are always "simple" (ie, at worst, just more
field type wrappers around the non-field type), so all that's needed to
obtain qcc grammar is to reach into the field type layers and do the
function type calculation based on the non-field type found there.
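In other words, a declaration like the one below (foo is just a
placeholder) comes out the way qcc intends, as a field holding a
float-returning function rather than a function returning a float field:

    .float () foo;    // field whose value is a function returning float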
Due to ambiguities in the grammar, qc-style function params and c-style
function params had to be completely separated. This means that qc-style
functions can not use pointers and must use qc-style function declarations
for parameters, and c-style functions must use c-style param declarations.
While this rule is tedious for converting the Ruamoko library, it does
actually make for a more consistent language.
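As a rough illustration of the split (hypothetical declarations):

    // qc-style: qc-style params only, no pointers in the parameter list
    void (entity targ, float damage) hurt;
    // c-style: c-style param declarations, pointers allowed
    void hurt2 (entity targ, float *damage);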
The root class's ivar symbol table needs to be connected to the global
symbol table while building the current class's ivar symbol table. This
allows access to other classes and typedefs.
find_type now operates recursively (depth-first) so built up typechains
work as expected.
@overload is treated as a specifier (directly as a storage class,
similar to typedef).
While broken, it is good enough to compile modulo.r.
This gives qfcc C's type system with QC specifics bolted on top (rather
than trying to cram C's type system into QC's). Pointers and arrays now
use C syntax and semantics (though I think there's a lot of breakage in
there at the moment). Functions themselves are still dual mode: both QC
style and C style are available.
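For example, with the C-style type syntax, declarations like these (the
names are just placeholders) are now the expected form:

    int *ptr;         // C-style pointer declaration
    int counts[8];    // C-style array declaration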