There's an "enable alt joy keys" command now. If a key is bound to that
command, all joystick buttons (incl. hat and triggers) are turned from
K_JOYx into K_JOYx_ALT, which allows two keybindings on the same key,
one with the altselector pressed and one without.
If there's no keybinding for K_JOYx_ALT, it will use the binding for
just K_JOYx as a fallback (if it exists).
This is especially handy for creating direct bindings for all the weapons
on the (limited number of) joystick buttons.
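A minimal sketch of the fallback logic; the tables and constants below are
illustrative stand-ins, not the actual names in the code:

    #include <stddef.h>

    /* Hypothetical stand-ins for the real key constants and binding tables. */
    enum { NUM_JOY_KEYS = 32 };
    static const char *joyBindings[NUM_JOY_KEYS];    /* bindings for K_JOYx */
    static const char *joyAltBindings[NUM_JOY_KEYS]; /* bindings for K_JOYx_ALT */

    /* With the altselector pressed, prefer the K_JOYx_ALT binding and fall
     * back to the plain K_JOYx binding if no alt binding exists. */
    static const char *
    GetJoyBinding(int joyKey, int altPressed)
    {
        if (altPressed && joyAltBindings[joyKey] != NULL)
        {
            return joyAltBindings[joyKey];
        }

        return joyBindings[joyKey];
    }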
Seems like AMD's Windows driver doesn't like it when we call
glBufferData() *a lot* (other drivers, including Intel's, don't seem to
care as much).
Even on an i7-4771 with a Radeon RX 580 I couldn't get stable 60fps
on Windows without this workaround (the open source Linux driver is ok).
This workaround can be enabled/disabled with the gl3_usebigvbo cvar;
by default it's -1 which means "enable if AMD driver is detected".
Enabling it when using an NVIDIA GPU with their proprietary drivers
reduces the performance to 1/3 of the fps we get without it, so it
indeed needs to be conditional...
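A sketch of how the -1 default might be resolved; the function and parameter
names are made up, only the cvar semantics come from above:

    #include <string.h>

    /* Sketch: resolve gl3_usebigvbo's "auto" default (-1) by checking the
     * GL vendor string for AMD's drivers. Names are illustrative. */
    static int
    UseBigVBO(int cvarValue, const char *glVendor)
    {
        if (cvarValue != -1)
        {
            return cvarValue != 0; /* user forced it on or off */
        }

        /* -1 ("auto"): enable the workaround only for AMD's drivers */
        return strstr(glVendor, "ATI") != NULL || strstr(glVendor, "AMD") != NULL;
    }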
Use GL3_BufferAndDraw3D() instead of glBufferData() and glDrawArrays()
in each place they're needed.
This by itself doesn't make anything faster, but it will make trying out
different ways to upload data easier.
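The wrapper presumably boils down to something like this; the signature is
an assumption, not the one in the tree:

    #include <GL/gl.h> /* in the real code: the renderer's GL loader header */

    /* Sketch of the helper: one place that uploads the vertex data and
     * issues the draw call, so the upload strategy can be changed later
     * without touching every call site. Signature is assumed. */
    static void
    GL3_BufferAndDraw3D(const void *verts, int numVerts, GLenum drawMode,
                        GLsizei vertSize)
    {
        glBufferData(GL_ARRAY_BUFFER, (GLsizeiptr)numVerts * vertSize, verts,
                     GL_STREAM_DRAW);
        glDrawArrays(drawMode, 0, numVerts);
    }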
The developers tested their maps without the fix and decided that they
looked good. Add a new cvar gl_fixsurfsky, defaulting to 0, that enables
the fix if someone really wants it.
The software renderer already did this, but not the GL renderers. Maybe
the logic was lost somewhere along the way... Without this change a
fullbright lightmap is generated for SURF_SKY surfaces and without the
SURF_DRAWSKY flags the surfaces aren't skipped in RecursiveLightPoint()
and GL3_LM_CreateSurfaceLightmap(). This isn't a problem under real
skyboxes, but it is in cases where SURF_SKY is abused for interior
lighting. rmine2.bsp in rogue is a good place to see the problem.
Reported by @m-x-d, fixes #393.
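Conceptually the change is small; a simplified sketch, with stand-in flag
values and the gl_fixsurfsky check reduced to a plain int:

    /* Illustrative flag values; the real ones come from the game/renderer
     * headers (SURF_SKY is a texinfo flag, SURF_DRAWSKY a runtime surface
     * flag). */
    #define SURF_SKY      0x4
    #define SURF_DRAWSKY  0x4

    struct texinfo_sketch { int flags; };
    struct surface_sketch { int flags; struct texinfo_sketch *texinfo; };

    /* When loading a face: propagate SURF_SKY to the runtime flags so the
     * lighting code skips the surface - but only if gl_fixsurfsky is set. */
    static void
    MarkSkySurface(struct surface_sketch *surf, int fixSurfSky)
    {
        if (fixSurfSky && (surf->texinfo->flags & SURF_SKY))
        {
            surf->flags |= SURF_DRAWSKY;
        }
    }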
At least with MinGW on Windows vsnprintf() treats a buffer that is too
small for the output as an error, returning -1 instead of the number of
characters that would have been printed without the size restriction.
Therefore msgLen may be wrong, leading to all kinds of funny mistakes
further down below... Buffer
overflow included. Work around this by handling the msgLen < 0 case and
adding an explicit terminating \0.
This is another case of "I wonder why nobody has ever noticed this":
the GL1 renderer's extension string triggered the buffer overflow each
time the game started.
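A sketch of the workaround (not the exact code in the tree):

    #include <stdarg.h>
    #include <stdio.h>

    /* Sketch: MinGW's vsnprintf() may return -1 when the buffer is too
     * small, so clamp msgLen and terminate the buffer explicitly. */
    static int
    SafeFormat(char *buf, size_t bufSize, const char *fmt, ...)
    {
        va_list argptr;
        int msgLen;

        if (bufSize == 0)
        {
            return -1;
        }

        va_start(argptr, fmt);
        msgLen = vsnprintf(buf, bufSize, fmt, argptr);
        va_end(argptr);

        if (msgLen < 0 || msgLen >= (int)bufSize)
        {
            /* truncated output or vsnprintf() error: treat the buffer as
             * full and make sure it's \0-terminated */
            msgLen = (int)bufSize - 1;
            buf[msgLen] = '\0';
        }

        return msgLen; /* now safe to use for further processing */
    }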
I guess it makes sense to apply gamma to the color; we do the same
for the standard round particles.
Also, this way the fragment shader for square particles references the
uniCommon UBO (gamma is part of it) - apparently the Intel HD4000
(from Ivy Bridge) GPU driver for Windows has a bug where uniform blocks
that exist in the shader source but aren't actually used can't be found
(with glGetUniformBlockIndex(prog, name)), which we treat as an error
in gl3_shaders.c initShader3D().
fixes #391
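For reference, a simplified version of the lookup that trips on that driver
(the real error handling is in gl3_shaders.c; the helper below is made up):

    #include <glad/glad.h> /* or whichever loader provides the GL3 API */

    /* Sketch: on the buggy driver glGetUniformBlockIndex() returns
     * GL_INVALID_INDEX for blocks that are declared but never used, which
     * initShader3D() treats as a setup error. Referencing uniCommon (for
     * gamma) in the square-particle shader avoids hitting that path. */
    static int
    BindCommonUBO(GLuint prog, GLuint bindingPoint)
    {
        GLuint blockIndex = glGetUniformBlockIndex(prog, "uniCommon");

        if (blockIndex == GL_INVALID_INDEX)
        {
            return 0; /* treated as an error by the caller */
        }

        glUniformBlockBinding(prog, blockIndex, bindingPoint);
        return 1;
    }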
If vsync is enabled, misuse it to slow the client down: calculate the
target framerate, add a safety margin of 20% and let the vsync handle
the rest. This hopefully solves some problems with frametime spikes.
This is an idea by @DanielGibson.
If vsync is disabled, use a simple 1s / fps calculation.
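As a sketch (names made up), the per-frame budget could be computed like this:

    /* Sketch of the frame budget: with vsync enabled, aim 20% above the
     * target framerate and let the vsync do the real limiting; otherwise
     * use a plain 1s / fps budget. */
    static float
    TargetFrametime(float maxFps, int vsyncEnabled)
    {
        if (vsyncEnabled)
        {
            return 1.0f / (maxFps * 1.2f); /* 20% safety margin */
        }

        return 1.0f / maxFps;
    }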
GetSystemTimeAsFileTime() is okay as long as the game runs fullscreen.
For some reason its resolution degrades to ~16ms as soon as the game
runs windowed... Better use QueryPerformanceCounter(); it's more
reliable and the recommended API for high-resolution timers.
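A minimal sketch of a QueryPerformanceCounter() based timer; names are
illustrative, not the game's actual implementation:

    #include <windows.h>
    #include <stdint.h>

    /* Sketch: microsecond timer built on QueryPerformanceCounter(). */
    static int64_t
    Sys_MicrosecondsSketch(void)
    {
        static LARGE_INTEGER freq, base; /* freq: ticks/s, constant at runtime */
        LARGE_INTEGER now;

        if (freq.QuadPart == 0)
        {
            QueryPerformanceFrequency(&freq);
            QueryPerformanceCounter(&base);
        }

        QueryPerformanceCounter(&now);

        return ((now.QuadPart - base.QuadPart) * 1000000LL) / freq.QuadPart;
    }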
Apparently the light source for exploding rockets/grenades is very close
to the surface, so the dot product between the surface normal and the
vector between the light and the pixel is about 0, basically disabling
the dynamic light for that surface.
As a workaround, move the light position (only for that dot product)
a bit above the surface; 32*surfaceNormal looks good.
fixes #386
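The real change is in the fragment shader; a C-style sketch of the adjusted
term, with invented names:

    #include <math.h>

    /* Sketch of the workaround: offset the light position 32 units along
     * the surface normal before the N.L dot product, so a light sitting
     * directly on the surface still lights it. */
    static float
    DiffuseFactor(const float lightPos[3], const float pixelPos[3],
                  const float normal[3])
    {
        float dir[3];
        float len = 0.0f, dot = 0.0f;
        int i;

        for (i = 0; i < 3; i++)
        {
            /* only this term uses the shifted light position */
            dir[i] = (lightPos[i] + 32.0f * normal[i]) - pixelPos[i];
            len += dir[i] * dir[i];
        }

        len = sqrtf(len);

        if (len == 0.0f)
        {
            return 0.0f;
        }

        for (i = 0; i < 3; i++)
        {
            dot += normal[i] * (dir[i] / len);
        }

        return (dot > 0.0f) ? dot : 0.0f;
    }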