With this, renamed cvars can be rewritten when config.cfg is first
loaded. Please note that once this has been done, older YQ2 versions
can't parse that config.cfg anymore.
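A minimal sketch of how such a rename table could look; the mapping,
names and function below are illustrative assumptions, not the actual
YQ2 code:

    #include <string.h>

    typedef struct
    {
        const char *oldname;
        const char *newname;
    } replacement_t;

    static const replacement_t replacements[] = {
        {"cd_shuffle", "ogg_shuffle"},   /* hypothetical mapping */
    };

    /* return the current name for a possibly renamed cvar */
    static const char *Cvar_TranslateName(const char *name)
    {
        for (size_t i = 0; i < sizeof(replacements) / sizeof(replacements[0]); i++)
        {
            if (strcmp(name, replacements[i].oldname) == 0)
            {
                return replacements[i].newname;
            }
        }

        return name;
    }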
gl_maxfps > 1000 breaks things, and cl_maxfps starts to behave weirdly
above 90. Up to about 125 you still get the bug-feature of higher
jumping, but beyond that things just get even buggier, at some point
causing bugs like #261.
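One plausible guard, assuming the usual cvar_t API; the limits are
taken from the observations above, the details are assumptions:

    /* clamp the framerate cvars to known-safe ranges */
    if (cl_maxfps->value > 90)
    {
        Cvar_SetValue("cl_maxfps", 90);    /* >90 starts to misbehave */
    }

    if (gl_maxfps->value > 1000)
    {
        Cvar_SetValue("gl_maxfps", 1000);  /* >1000 breaks things */
    }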
If too many of these sounds are started in one frame (for example if the
player shoots with the super shotgun into the power screen of a Brain)
things get too loud and OpenAL is forced to scale down the volume of
several other sounds and the background music. That leads to a
noticeable and annoying drop in the overall volume.
Work around that by limiting the number of sounds started per frame.
16 was chosen by empirical testing.
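A minimal sketch of the workaround, with assumed names; the counter is
reset once per frame and everything above the limit is silently
dropped:

    #define MAX_SOUNDS_PER_FRAME 16   /* chosen by empirical testing */

    static int sounds_this_frame;

    /* returns true if another sound may be started in this frame */
    static qboolean S_MayStartSound(void)
    {
        if (sounds_this_frame >= MAX_SOUNDS_PER_FRAME)
        {
            return false;   /* drop it instead of overdriving the mixer */
        }

        sounds_this_frame++;
        return true;
    }

    /* called once per frame, e.g. from S_Update() */
    static void S_ResetSoundCounter(void)
    {
        sounds_this_frame = 0;
    }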
This was so broken... Casting the type of an array to silence a
warning... It worked on x86, of course, but gave a SIGBUS on ARM.
Do it right: copy the contents of the array, element by element, into
another array of the correct type. Yeah.
This fixes issue #231.
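An illustrative example of the pattern (not the actual YQ2 code): the
pointer cast reinterprets memory and may trap on strict-alignment
targets like ARM, while an element-wise copy converts each value
correctly:

    short src[3] = {1, 2, 3};
    int   dst[3];

    /* broken: int *p = (int *)src;  -- worked on x86, SIGBUS on ARM */

    for (int i = 0; i < 3; i++)
    {
        dst[i] = src[i];   /* well-defined conversion, no alignment trap */
    }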
There're two possible problems with the calculation of the number of
sound buffers for Vorbis if OpenAL is in use:
* We assume that the (more or less) maximum number of buffers is
allocated during map load. This is not correct if, in a multiplayer
game, a lot of clients with custom models and custom sounds connect
at a later time.
* 64 buffers (about 3 seconds worth of music) may be too low in some
situations.
Work around this by recalculating the number of buffers if necessary.
We're now reserving about 256 buffers (== 12 seconds).
This may fix issue #252.
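A rough sketch of the recalculation with assumed names; the 256 buffer
target comes from the description above, everything else is an
assumption:

    #define OGG_BUFFERS 256   /* about 12 seconds worth of music */

    static int music_buffers;

    static void OGG_RecalcBuffers(int total_buffers, int sfx_buffers)
    {
        /* late-loaded sounds (e.g. players with custom models joining
           mid-game) may have eaten into the music reservation */
        music_buffers = total_buffers - sfx_buffers;

        if (music_buffers < OGG_BUFFERS)
        {
            music_buffers = OGG_BUFFERS;
            /* ... allocate the additional buffers here ... */
        }
    }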
Turns out clock_get_time() uses mach_timespec_t, which is very similar
to the POSIX struct timespec, so we're back to just one
Sys_Microseconds() function with an #ifdef __APPLE__ for the
(relatively small) differences.
Older versions of OS X don't implement clock_gettime() and no(?) version
seems to implement CLOCK_MONOTONIC. Work around this by implementing an
OS X specific variant of Sys_Microseconds() that relies on Mach APIs
provided by all OS X versions...
While at it, alter the generic variant so that CLOCK_MONOTONIC is used
only if it's available. CLOCK_REALTIME as a fallback should be good
enough in most cases.
This is believed to fix issue #239.
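A sketch of what the resulting function might look like, combining the
Mach-based OS X path with the generic clock_gettime() path; the actual
YQ2 code may differ in details:

    #include <stdint.h>
    #include <time.h>

    #ifdef __APPLE__
    #include <mach/clock.h>
    #include <mach/mach.h>
    #endif

    int64_t Sys_Microseconds(void)
    {
    #ifdef __APPLE__
        /* clock_gettime() may be missing on older OS X; the Mach APIs
           exist everywhere and mach_timespec_t mirrors struct timespec */
        clock_serv_t cclock;
        mach_timespec_t now;

        host_get_clock_service(mach_host_self(), SYSTEM_CLOCK, &cclock);
        clock_get_time(cclock, &now);
        mach_port_deallocate(mach_task_self(), cclock);
    #else
        struct timespec now;

    #ifdef CLOCK_MONOTONIC
        clock_gettime(CLOCK_MONOTONIC, &now);
    #else
        clock_gettime(CLOCK_REALTIME, &now);   /* fallback */
    #endif
    #endif

        return (int64_t)now.tv_sec * 1000000ll + now.tv_nsec / 1000ll;
    }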
We need to take into account that scaling the characters makes them
bigger, thus they need to be placed depending on the scale and not
at a precalculated position. This should fix issue #247.
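An illustrative fragment; the scale accessor is an assumption:

    int   x = 0;                          /* pen position */
    float scale = SCR_GetConsoleScale();  /* assumed accessor */

    /* wrong: x += 8;  -- a fixed advance misplaces scaled characters */
    x += (int)(8 * scale);   /* advance by the scaled character width */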
Returning 'microseconds / 1000ll' at the first call is wrong, the game
would think that the first frame took way too much time. For some
reason this works on (my) Win10, but breaks on (my) Win7...
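A sketch of the fix with assumed names: remember the timestamp of the
first call and return the elapsed time relative to it, so the first
frame reports (close to) zero instead of the raw counter value:

    long long Sys_Milliseconds(void)
    {
        static long long base;   /* timestamp of the first call */

        long long microseconds = Sys_Microseconds();

        if (!base)
        {
            base = microseconds;
        }

        return (microseconds - base) / 1000ll;
    }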
The original client used single precision mode on Windows and the
default mode on all other platforms. Most platforms (at least OS X,
FreeBSD, NetBSD up to 6.0, OpenBSD and Solaris) set double precision
as default, Linux sets extended double precision... When playing a
network game there're several possibilities:
* Same precision on both sides: This one is okay, of course.
* single precision <-> double precision: This one is okay, too. I guess
this is because the code allows a small deviation between client and
server to work around imprecisions introduced by the network protocol.
* double precision <-> extended double precision: This one is okay,
likely for the same reasons given above.
* single precision <-> extended double precision: This one gives a lot
of mispredictions on the client side.
All of these are more or less academic these days. Yamagi Quake II used
the platform's default mode for ages. And both gcc and clang default to
SSE2 math (with double precision as default on all platforms) when
compiling for amd64. So the only reasonable case is Linux/i386 on one
side and the original client or another source port on Windows/i386 at
the other side.
Work around this by forcing the x87 to double precision mode.
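For the glibc/i386 case this is the standard idiom (the function name
is illustrative); other platforms would need their own equivalents, or
already default to double precision:

    #include <fpu_control.h>

    static void Sys_SetupFPU(void)
    {
        fpu_control_t cw;

        _FPU_GETCW(cw);
        cw &= ~_FPU_EXTENDED;   /* clear the precision control bits */
        cw |= _FPU_DOUBLE;      /* select 53 bit (double) precision */
        _FPU_SETCW(cw);
    }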
Miscframes are coupled to renderframes and are just checking for
renderer changes (very cheap) and advancing CD audio if implemented.
There's no reason not to do that at every frame.