Nifty: if you pass a struct via reference to a function, and a field of
that struct may be both set and not set (e.g., set only in an if
statement), gcc will report that field; fields that are never set at all
appear to be assumed to be set by the function (my interpretation).
* taniwha ponders the flow analysis for that
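For illustration, a minimal sketch of the pattern (the names and the
setup() function are made up, not from the actual code):

    /* With optimization and warnings enabled, gcc reports the
     * conditionally set field, but stays quiet about the field the
     * caller never touches, apparently assuming setup() will
     * initialize it. */
    struct opts {
        int mode;
        int size;
    };

    void setup (struct opts *opts);     /* takes the struct by reference */

    int
    use_opts (int verbose)
    {
        struct opts opts;

        if (verbose)
            opts.mode = 1;  /* set only in an if: gcc reports this field */
        /* opts.size is never set here */
        setup (&opts);
        return opts.mode + opts.size;
    }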
The configuration file had been renamed.
Also, mention docstrap in INSTALL and make the documentation configure check
for the required tools (doxygen, dot (graphviz), mscgen, and transfig).
Native versions of qfcc and pak are now built automatically, and the
android toolchain now defaults to a more sensible place. Also, the separate
pkg-config replacement is no longer necessary.
It turns out gcc has a way to force functions to inline even when it thinks
doing so would not be a good idea (call to a modest sized function unlikely).
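Presumably this is the always_inline attribute; a minimal sketch of its use:

    /* always_inline forces gcc to inline the call even when its
     * heuristics would refuse (eg, when it decides a call to a modest
     * sized function is unlikely).  Plain "inline" is only a hint the
     * optimizer is free to ignore. */
    static inline __attribute__((always_inline)) int
    clamp (int value, int min, int max)
    {
        if (value < min)
            return min;
        if (value > max)
            return max;
        return value;
    }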
The depth limits in the gl and glsl renderers and in the trace code really
bothered me, but then the fix hit me: at load-time, recurse the trees
normally and record the depth in the appropriate place. The node stacks can
then be allocated as necessary (I chose to add a paranoia buffer of 2, but
I expect the maximum depth will rarely be used).
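A sketch of the idea (illustrative names, not the actual renderer code):

    /* Walk the tree once at load time to find its maximum depth, then
     * size the node stack from that instead of using a hard-coded
     * limit.  The "+ 2" is the paranoia buffer mentioned above. */
    typedef struct node_s {
        struct node_s *children[2];
    } node_t;

    static int
    measure_depth (const node_t *node)
    {
        int left, right;

        if (!node)
            return 0;
        left = measure_depth (node->children[0]);
        right = measure_depth (node->children[1]);
        return 1 + (left > right ? left : right);
    }

    /* at load time:
         depth = measure_depth (root);
         node_stack = malloc ((depth + 2) * sizeof (node_t *));
     */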
These are based on the ps3dev scripts, so native qfcc and pak are built
automatically.
Note that there may be a need to replace or even just nuke bison in the
toolchain as it is too old and can't build qfcc.
While accessing short foo[2][4]; as foo[0][0..7] should work in theory, who
knows what gcc does with it when optimizing. I don't know if this will fix
johnnyonflame's bsp loading problem, but no point in having rhinodemonic
code hanging around.
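The pattern in question, roughly, along with one bounds-respecting
alternative (illustrative only; the actual fix may differ):

    short foo[2][4];

    /* In a flat memory model foo[0][4..7] lands in foo[1], but as far
     * as the C standard is concerned it is out of bounds for foo[0],
     * and an optimizing gcc is allowed to act on that. */
    static short
    risky (int i)       /* i in 0..7 */
    {
        return foo[0][i];
    }

    /* Equivalent access that always stays within bounds. */
    static short
    safe (int i)        /* i in 0..7 */
    {
        return foo[i / 4][i % 4];
    }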
After running across a question about lists of animation frames and states,
I decided giving qfcc the ability to generate such lists might be a nice
distraction from the optimizer :) Works for both progs.src and separate
compilation. No frame file is generated if no macros have been created.
Using "=" was rather confusing, so changing it to "<CONV>" seems to be a
good idea. As the string is used only for selecting opcodes at compile
time, only qfcc is affected.
Because of the way it is used, the data in the type encodings space needs
to always be correct (ie, relocated), even for partially linked object
files.
Rather, only that it is neither external nor local. The idea was to catch
myself swapping the arguments to resolve_external_def, but for some reason
I decided type encoding defs would not be global (save game reasons?).
Fixes the bogus redefined errors when entity fields are used.
Also, rename extern_defs and defined_defs to extern_data_defs and
defined_data_defs (more consistent with the other tables).
The problem was caused by add_relocs and process_loose_relocs adjusting the
reloc offset based on the reloc's space's base address. This is fine for
most relocs, but as relocs for the type space have already been adjusted by
process_type_space, those relocs must be left alone by add_relocs and
process_loose_relocs. As a bonus, the duplicate code has been refactored
into a separate function :)
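A hypothetical sketch of the shape of the fix (the types and names here
are illustrative, not the real qfcc structures):

    typedef struct {
        int space;          /* index of the space the reloc points into */
        int offset;         /* offset within that space */
    } reloc_t;

    typedef struct {
        int base;           /* base address of the space in the output */
    } space_t;

    static void
    adjust_relocs (reloc_t *relocs, int num_relocs, const space_t *spaces,
                   int type_space)
    {
        for (int i = 0; i < num_relocs; i++) {
            if (relocs[i].space == type_space)
                continue;   /* already adjusted by process_type_space */
            relocs[i].offset += spaces[relocs[i].space].base;
        }
    }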
Now each encoding is copied across def by def using memcpy, with the
expectation that any references to other types will be handled via the
reloc system. Unfortunately, it seems there's an off-by-4 (hmm, suspicious
number...) in the reloc offsets, but I'll look into that after I get some
sleep.
defspace_alloc_loc can cause a realloc, which will break the work qfo's
space data pointers, so wrap it with alloc_data, which updates the
appropriate pointers and sizes.
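For illustration, a minimal sketch of the wrapper pattern (made-up names
and structures, not the actual qfcc code):

    #include <stdlib.h>

    typedef struct {
        char *data;         /* backing store; may move on realloc */
        int   size;
    } space_t;

    /* the underlying allocator; the realloc may move space->data */
    static int
    space_alloc (space_t *space, int size)
    {
        int offset = space->size;

        space->size += size;
        space->data = realloc (space->data, space->size);
        return offset;
    }

    typedef struct {
        space_t space;
        char   *type_data;  /* cached pointer into space.data */
        int     type_size;
    } work_t;

    /* wrapper: allocate, then refresh the cached pointer and size so
     * no stale pointer survives a realloc */
    static int
    alloc_data (work_t *work, int size)
    {
        int offset = space_alloc (&work->space, size);

        work->type_data = work->space.data;
        work->type_size = work->space.size;
        return offset;
    }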