The qc source path is specified via the dirpath property in the QF Entity
Classes panel of the scene data. The scanned entity classes are stored in a
plist in a blender text file for persistent storage (so the directory
doesn't need to be scanned every time). Also, to avoid reparsing the plist
every time, the parsed data is kept in a normal python class hanging off the
properties class (evil hack?).
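A minimal sketch of the caching idea (a module-level dict here stands in for
the class hanging off the properties class; the text block name and the
scan/parse helpers are invented, only bpy.data.texts is the real Blender API):

    import bpy

    # Module-level cache: Blender properties can't hold arbitrary python
    # objects, so the parsed entity classes live in plain python data.
    entity_class_cache = {}

    def get_entity_classes(dirpath):
        name = "entityclasses.plist"        # assumed text block name
        if name not in entity_class_cache:
            text = bpy.data.texts.get(name)
            if text is None:
                # Scan the qc source tree once and keep the plist around.
                text = bpy.data.texts.new(name)
                text.write(scan_entity_class_dir(dirpath))
            entity_class_cache[name] = parse_plist(text.as_string())
        return entity_class_cache[name]
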
Slightly cleaner EntityClass building, and now the directory scanner is
part of the EntityClassDict class, which also supports reading/writing
plists (for persistent storage in blender).
Int and float values are written as normal string items, so anything using
the plist later on will need to know the expected types from context, but at
least now there's no need to convert int or float data to strings before
writing a plist.
More or less.
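A tiny sketch of what that means for consumers (the key name and helper are
hypothetical): the writer just stringifies numbers, so the reader has to
re-impose whatever type the field is expected to have.

    def plist_item(value):
        # ints and floats become plain string items in the plist
        if isinstance(value, (int, float)):
            return str(value)
        return value

    # reading side: the context says this field is an integer
    spawnflags = int(entity_class_fields["spawnflags"])
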
This is a bit of an experiment (which seems to work nicely) in that qfcc
and pak are built natively in one build tree, then the full system is
cross-built in another tree using the natively built qfcc and pak. Both
trees are created by cross-configure.sh as sub-directories of the current
directory. However, cross-configure.sh still assumes it is being run in a
subdirectory of the main quakeforge directory.
cross-make.sh checks whether the native tree is in the current directory and,
if so, builds it; otherwise it just runs make for the cross-build tree (this
allows for running cross-make in a sub-directory for things like sorting
out build issues).
CC_FOR_BUILD is the recommended name these days (HOST_CC was from an old
gcc version, and is confusing anyway). Also, CC_FOR_BUILD should be set by
configure.
Unfortunately, just because the header is there doesn't mean anything will
actually work :(. Also, the check is based on the host vendor/os for now.
Yes, it's rather lame but it will do for now.
With this, QF will build on an almost fresh ps3toolchain install. Only two
"fixes" are needed:
o In $PS3DEV/ppu/powerpc64-ps3-elf: ln -s ../include sys-include
o libsamplerate cross-built and installed.
One common use for a mesh having multiple UV maps is when combining several
mesh objects into one: the base UV map is the result of joining the meshes
(and will be a right mess of overlapping UV islands), but an additional UV
map is then set up as a copy of the first but with the islands re-packed so
nothing overlaps. The export script now searches for the active UV map and
uses that for both UV coordinates and the skin texture (when none is
specified).
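A sketch of the UV map lookup under current Blender python names
(mesh.uv_layers.active; older releases expose the same idea through
uv_textures, so treat the exact attributes as assumptions):

    def active_uv_coords(mesh):
        # Use whichever UV map the user has made active, not the first one.
        uv_layer = mesh.uv_layers.active
        if uv_layer is None:
            return None
        # One entry per loop (face corner): d.uv is a 2d vector.
        return [tuple(d.uv) for d in uv_layer.data]
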
If there's no export script, or the export script has no frame information,
animation data will be collected by running through blender frames 1 to the
current frame (inclusive). Each frame will be exported as a single frame
rather than as members of a frame group.
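A sketch of that fallback, using the real scene.frame_set()/frame_current
calls but an invented capture_frame helper:

    def collect_frames(scene, obj):
        frames = []
        # Frames 1 to the current frame, inclusive; each becomes a single
        # frame rather than a member of a frame group.
        for f in range(1, scene.frame_current + 1):
            scene.frame_set(f)
            frames.append(capture_frame(obj))
        return frames
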
I'd forgotten I hadn't implemented exporting vertex normals. While I've
modified things to make better use of blender's tools and avoid the
unnecessary use of objects, the code is taken from the ajmdl blender 2.4
export script.
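For reference, the usual MDL-style treatment of vertex normals is to quantize
each one to the closest entry in the format's precomputed normal table; a
sketch of that idea (the table name and per-vertex loop are assumptions, not
the script's actual code):

    def closest_normal_index(normal, table):
        # Largest dot product == smallest angle to the true vertex normal.
        best, best_dot = 0, -2.0
        for i, (x, y, z) in enumerate(table):
            d = normal[0] * x + normal[1] * y + normal[2] * z
            if d > best_dot:
                best, best_dot = i, d
        return best

    # mesh.vertices[n].normal is blender's per-vertex normal:
    # indices = [closest_normal_index(v.normal, MDL_NORMALS)
    #            for v in mesh.vertices]
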
Return statements never flow to the next block (or any other block, for
that matter), so drawing arrows leaving them not only messes up dot's
graphs, but is quite misleading.
When merging if/goto (i.e., an if skipping a goto), the rest of the dead code
remover is used to delete the goto. That part of the code unuses the goto's
label. The if was getting the goto's label without the label's used count
being incremented (the usage temporarily increases by one). I have no idea
why the problem showed up randomly, but this seems to fix it (it fixes /a/
bug, anyway).
The naive implementation of the if/goto merging was letting the old target
of the if get dropped: the block would lose its label and thus be judged
unreachable because the preceding goto block was still in the list.
Instead, when the if/goto are "merged", mark the goto block as unreachable,
the following block as reachable, and break out of the analysis loop to
force the removal of the goto block. Since the dead block removal function
loops until no action is taken, all other dead blocks will be removed.
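A rough sketch (python here, not qfcc's actual C) of that loop-until-no-action
structure, with invented field names:

    def remove_dead_blocks(blocks):
        def live(block):
            # A block survives if it's the entry block, has been marked
            # reachable, or still owns a label some jump actually uses.
            return (block is blocks[0] or block.reachable
                    or any(label.used for label in block.labels))

        removed = True
        while removed:                  # loop until no action is taken
            removed = False
            for block in [b for b in blocks if not live(b)]:
                for label in block.referenced_labels:
                    label.used -= 1     # may let another block die next pass
                blocks.remove(block)
                removed = True
        return blocks
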
The output can be controlled via --block-dot (not yet documented). The
files are named <sourcefile>.<function>.<stage>.dot. Currently, stage will be
one of "initial" (after expression to statement conversion), "thread"
(after jump threading), "dead" (after dead block removal), or "final" (the
final state before actual code emission).
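For example, the dot file for a function main in a hypothetical foo.qc at the
jump threading stage would be foo.qc.main.thread.dot (names purely
illustrative of the pattern above).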
Labels can be shared between multiple flow-control instructions, so use the
label's used counter to determine when to remove the label. This was
causing problems with the jump threading.
The common cause seems to be casting a cast (very common, and I'm not sure
just realiasing the expression would be right). It doesn't cause any harm
(particularly, it doesn't trigger alias def chains), so I won't worry about
it.
The actual bug might still be elsewhere, but at least now I know the alias
chains were coming from accessing .return and .param_N, which are unions
(not directly usable by the progs engine). Emitting a reference to a union
(or struct) would create an alias def, but an alias expression was created
in the expression tree to simplify return/param access. The double layer
(sometimes 3 or 4) alias isn't really needed, so rather than layering the
aliases, just re-alias the aliased def.
It is intended for flagging buggy conditions in the compiler, particularly
after having fixed the original bug (in case something comes back from the
dead).
v6 progs expects .zero to be only 1 word. The code actually tried to keep
vector out of .zero, but it seems I'd rearranged the structure definition
without updating the code that kills the vector field. Problem spotted by
divVerent.
Not sure if it actually works as the clients don't render the result
properly (can't see anything where the model should be), but the output
model does import back into blender properly.
Since qf does linear interpolation of verts, this seems to be reasonable.
Certainly better than the rose-thorns I got because I haven't figured out
how to kick the auto-clamp.