In the savegame from that bug report, "monster_zsec_shotgun_12" was
lying dead pretty much at its spawn point.
script/map_alphalabs4.script moves those "ride_of_death" platforms
around, and at the end of a cycle teleports "ride_of_death*_parent" back
to its starting position - and the "ride_of_death*" bound to it, which
is a pushing mover, just gets dragged along by the physics code. It can
thus collide with that monster's cadaver and try to push it along,
causing that assertion in TestHugeTranslation().
This is a map bug - Doom3 even prints a warning:
WARNING: script/map_alphalabs4.script(722): Thread
'map_alphalabs4::RideOfDeathPath': teleported 'ride_of_death2_parent'
which has the pushing mover 'ride_of_death2' bound to it
So I just disable that assertion for this specific case..
Also moved the assertion behind the corresponding warning, so it gets
printed before the assertion kills the game..
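Roughly, the shape of the change inside TestHugeTranslation() looks like
this (a simplified sketch with a hypothetical flag for the known-bad map
case, not the exact dhewm3 code):

```cpp
if ( results.fraction < 1.0f && ( end - start ).LengthSqr() > Square( CM_MAX_TRACE_DIST ) ) {
	// warn *before* asserting, so the message makes it into the log
	common->Warning( "huge translation for clip model %d on entity %d",
	                 mdl->GetId(), mdl->GetEntity()->entityNumber );
	if ( !pushingMoverTeleportHack ) { // hypothetical flag for this specific case
		assert( false ); // still trips for all other (real) bugs
	}
}
```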
Also a small change to CMakeLists.txt, so that enabling
LINUX_RELEASE_BINS after CMake has already been run without it should
now work.
For some reason Wayland thought it would be clever to be the only
windowing system that (non-optionally) uses the alpha channel of the
window's default OpenGL framebuffer for window transparency.
This has always caused glitches with dhewm3, as Doom3 uses that alpha
channel for blending tricks (with GL_DST_ALPHA) - especially visible in
the main menu or when the flashlight is on.
So far the workaround has been r_waylandcompat, which requests an OpenGL
context/visual without an alpha channel (0 alpha bits), but that also
causes glitches.
There's an EGL extension that's supposed to fix this issue
(EGL_EXT_present_opaque), and newer SDL2 versions use it (when using
the Wayland backend) - but unfortunately the Mesa implementation is
broken (it seems to provide a visual without an alpha channel even if
one was requested), see https://gitlab.freedesktop.org/mesa/mesa/-/issues/5886
and https://github.com/libsdl-org/SDL/pull/4306#issuecomment-1014770600
for the corresponding SDL2 discussion.
To work around this issue, dhewm3 now disables the use of that EGL
extension and (optionally) makes sure the alpha channel is opaque at
the end of the frame.
This behavior is controlled with the r_fillWindowAlphaChan CVar:
- 1: always fill the alpha channel (regardless of whether Wayland is used)
- 0: never fill it (even on Wayland)
- -1 (the default): only fill it if the SDL "video driver" is wayland
  (this could easily be enhanced later in case other windowing systems
  have the same issue)
r_waylandcompat has been removed (it never worked properly anyway),
so now the window always has an alpha channel.
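"Making sure the alpha channel is opaque" boils down to something like
the following at the end of a frame (the GL calls are real; treat this
as a sketch, not necessarily dhewm3's exact code):

```cpp
// only write to the alpha channel, then clear it to 1.0 (fully opaque),
// so the Wayland compositor won't blend the window with what's behind it
glColorMask( GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE );
glClearColor( 0.0f, 0.0f, 0.0f, 1.0f );
glClear( GL_COLOR_BUFFER_BIT );
glColorMask( GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE ); // restore the mask
```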
So if someone configured 16x AA on a system that doesn't support it
(like when using AMD's open source driver), 8x will be tried next, then
4x, then 2x, then no AA at all; only then will reducing the color depth
start, and only as a last resort will it fall back to a small 640x480
window.
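A minimal sketch of that fallback chain, assuming SDL2 window creation
(the structure is simplified and TryCreateWindow() is a hypothetical
helper, not dhewm3's actual code):

```cpp
#include <SDL.h>

static SDL_Window* TryCreateWindow( int msaaSamples ) {
	SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, msaaSamples > 0 ? 1 : 0 );
	SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, msaaSamples );
	return SDL_CreateWindow( "dhewm3", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
	                         1024, 768, SDL_WINDOW_OPENGL );
}

static SDL_Window* CreateWindowWithAAFallback() {
	// halve the sample count until creation succeeds: 16x -> 8x -> 4x -> 2x -> no AA
	const int sampleCounts[] = { 16, 8, 4, 2, 0 };
	SDL_Window* win = NULL;
	for ( int i = 0; win == NULL && i < 5; ++i ) {
		win = TryCreateWindow( sampleCounts[i] );
	}
	return win;
}
```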
I don't have such hardware, so I can't test; I also didn't actually build
dhewm3 (I'd have to build the dependencies for ARM first..), but when
configuring ARM or ARM64 targets for Visual C++, CMake now detects them
correctly.
Also, D3_ARCH should now always be "arm" for 32bit ARM and "arm64" for
64bit ARM, also when building with GCC or clang.
I was lazy and just renamed SDL_win32_main's stdout.txt - but I still
added the dhewm3log-old.txt backup function.
I also renamed Sys_GetHomeDir() to Win_GetHomeDir(), as it's Win32-only.
On Windows it's in Documents\My Games\dhewm3\dhewm3log.txt.
This also required some refactoring: I want to create the log right at
the beginning, before much else has been initialized, so using things
like idStr or Sys_GetPath() isn't safe there.
The log file is written to save_path, save_path being
$XDG_DATA_HOME/dhewm3/ (usually ~/.local/share/dhewm3/) on most POSIX
systems and $HOME/Library/Application Support/dhewm3/ on Mac.
If the log file already exists, it's renamed to dhewm3log-old.txt first,
so both the most recent and the previous log are always available.
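The rotation itself is simple; a minimal sketch (error handling omitted,
the path variables stand for the locations described above):

```cpp
#include <stdio.h>

static FILE* OpenLogFile( const char* logPath, const char* oldLogPath ) {
	remove( oldLogPath );          // on Windows, rename() fails if the target exists
	rename( logPath, oldLogPath ); // silently fails if there's no previous log - fine
	return fopen( logPath, "w" );
}
```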
These are now used by idStr::(v)snPrintf(), and in the future they can
be used whenever a (v)snprintf() that's guaranteed not to call
common->Warning() or similar is needed (e.g. during early startup).
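For illustration, such a C99-conformant wrapper could look like this
(the name is made up, and this is only a sketch, not necessarily the
actual implementation); the point is that old MSVC versions of
_vsnprintf() neither guarantee NUL-termination nor return the needed
length:

```cpp
#include <stdio.h>
#include <stdarg.h>

int vsnPrintfC99( char* dst, size_t size, const char* fmt, va_list ap ) {
#ifdef _MSC_VER
	va_list apCopy;
	va_copy( apCopy, ap );
	int ret = _vsnprintf( dst, size, fmt, apCopy );
	va_end( apCopy );
	if ( size > 0 )
		dst[size - 1] = '\0';        // always NUL-terminate, unlike _vsnprintf()
	if ( ret < 0 )
		ret = _vscprintf( fmt, ap ); // C99: return the length that would've been needed
	return ret;
#else
	return vsnprintf( dst, size, fmt, ap ); // already C99-conformant
#endif
}
```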
There were lots of places in the code that called Sys_GrabInput(),
some of them each frame.
Most of this is unified in events.cpp now, in handleMouseGrab(), which
is called once per frame by Sys_GenerateEvents() - this makes it a lot
easier to reason about when the mouse is grabbed and when it isn't.
Sys_GrabInput(false) is still called in a few places, before operations
that tend to take long (like loading a map or vid_restart), but
(hopefully) not regularly anymore.
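The per-frame logic can be pictured like this (the conditions here are
illustrative assumptions, not dhewm3's exact rules; in_nograb and
Sys_GrabInput() are the real names used elsewhere in this text):

```cpp
// illustrative state - in dhewm3 this comes from the engine/event handling
static bool windowHasFocus = true, consoleOpen = false, inFullscreenMenu = false;
static bool mouseGrabbed = false;

static void handleMouseGrab() {
	bool grab = windowHasFocus && !consoleOpen && !inFullscreenMenu;
	if ( in_nograb.GetBool() ) // existing CVar to disable grabbing for debugging
		grab = false;
	if ( grab != mouseGrabbed ) {
		Sys_GrabInput( grab ); // only touch SDL when the state actually changes
		mouseGrabbed = grab;
	}
}
```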
The other big change is that the game now uses SDL's absolute mouse mode
for fullscreen menus (except for the PDA, which is an ugly hack), so the
ingame cursor is at the same position as the system cursor. That
especially helps when debugging with `in_nograb 1`, and should also help
if someone wants to integrate an additional GUI toolkit like Dear ImGui.
https://github.com/microsoft/vcpkg/issues/18098 has been resolved
Apparently Homebrew (a package manager for macOS) has changed its
install directories; mention that
(thanks to @rullinoiz for pointing that out in #433).
glStencilOpSeparateATI() should behave exactly the same as
glStencilOpSeparate(), so supporting it is easy enough and might help
some people with hardware or drivers that don't support OpenGL 2.0,
like the Mac OS X versions for PPC.
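"Supporting it" essentially means trying the older entry point when the
core one is missing; a sketch (SDL_GL_GetProcAddress() is real, the
surrounding code and pointer name follow Doom3's qgl convention but are
assumed):

```cpp
qglStencilOpSeparate = (PFNGLSTENCILOPSEPARATEPROC)
	SDL_GL_GetProcAddress( "glStencilOpSeparate" );        // core since OpenGL 2.0
if ( qglStencilOpSeparate == NULL ) {
	qglStencilOpSeparate = (PFNGLSTENCILOPSEPARATEPROC)
		SDL_GL_GetProcAddress( "glStencilOpSeparateATI" ); // same semantics
}
```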
- Fix build with SDL <=2.0.3
SDL_GetGlobalMouseState was introduced in 2.0.4
(which doesn't support OSX 10.5 or older)
- Don't include execinfo.h on Mac OS X 10.4
This file isn't included in the 10.4 SDK
- Use custom typedef for PFNGLSTENCILOPSEPARATEPROC on OSX 10.4/10.5
because the system OpenGL headers for those versions don't have it
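That custom typedef boils down to this (signature as in the OpenGL 2.0
spec):

```cpp
typedef void (APIENTRY *PFNGLSTENCILOPSEPARATEPROC)( GLenum face, GLenum sfail,
                                                     GLenum dpfail, GLenum dppass );
```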
CMAKE_SYSTEM_PROCESSOR used to be broken, and CMake "fixed" it by
redefining its meaning (from "target CPU" to "host CPU except when
cross-compiling").
On Windows it always contains the host CPU; on Linux it at least caused
trouble in chroots and when running 64bit kernels with 32bit userlands
(this used to be not totally uncommon on x86 before distros completely
switched to amd64, and apparently Raspbian/Raspberry Pi OS does this on
the RPi4, see #267).
Thankfully GCC and clang support "-dumpmachine" to print their (default)
target system, so use that instead (MSVC already had a special case).
As an upside, this allows getting rid of the MinGW special case.
I hope this also works with Apple Clang..
The problem was that negative values (from dhewm3tmpres.xyz) were passed
to POW, and POW doesn't have to support negative bases according to
ARB_fragment_program.txt - and Intel's Linux driver apparently doesn't,
see also https://gitlab.freedesktop.org/mesa/mesa/-/issues/5131
Using MUL_SAT instead of MUL, which clamps the value that gets passed to
POW afterwards to [0, 1], fixes the problem without any disadvantages.
So far the code assumed that "result.color" is always used directly,
but ARB shaders allow creating an alias for it with the aforementioned
syntax. So turn such an alias into a variable alias for dhewm3tmpres.
As long as it's chars Doom3 supports, i.e. they can be converted to
ISO-8859-1.
Also renamed kbdNames to _in_kbdNames to reduce the likelihood of
clashes (as it can't be static).
And scale the breakpoint dots accordingly - now they don't look all
squashed anymore.
I think ResizeImageList() is more correct now; at least this helped with
the breakpoint dots.
The `const char* filename` arg is passed from idProgram::CompileText(),
where it comes from idProgram::filename - and that filename can get
modified in idCompiler::NextToken() when it calls
gameLocal.program.GetFilenum(); if the idStr grows and reallocates for
that modification, the filename pointer becomes invalid.
So store `filename` in an idStr and use that when logging the
compile time.
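The shape of the fix, as a sketch (signature as in the Doom3 SDK;
everything else trimmed down to the relevant part):

```cpp
void idCompiler::CompileFile( const char *text, const char *filename, bool console ) {
	idStr origFileName = filename; // stable copy - `filename` may dangle later

	// ... parsing happens here; NextToken() may call gameLocal.program.GetFilenum(),
	// which can grow/reallocate the idStr that `filename` pointed into ...

	gameLocal.Printf( "Compiled '%s'\n", origFileName.c_str() ); // use the copy
}
```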
If in_ignoreConsoleKey is set, the console can only be opened with
Shift+Esc, not with ` or ^ or whatever, so you can easily type whatever
character is on your "console key" into the game, or even bind that key.
Otherwise, with SDL2, that key (SDL_SCANCODE_GRAVE) always generates the
newly added K_CONSOLE.
in_kbd has a new (SDL2-only) "auto" mode which tries to detect the
keyboard layout based on SDL_GetKeyFromScancode( SDL_SCANCODE_GRAVE ).
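The detection idea, sketched (the SDL call is real; the concrete layout
guesses below are illustrative, not dhewm3's exact table):

```cpp
// what character does the key at the physical "grave" position produce right now?
SDL_Keycode k = SDL_GetKeyFromScancode( SDL_SCANCODE_GRAVE );
const char* layout = "english";
if ( k == SDLK_CARET ) {      // '^' sits there on German QWERTZ layouts
	layout = "german";
} else if ( k == 0xB2 ) {     // '²' sits there on French AZERTY layouts
	layout = "french";
}
```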
Wherever Sys_GetConsoleKey() is called, I now take the current state of
Shift into account, so we don't discard more chars than necessary, esp.
when the keyboard layout (in_kbd) is *not* correctly set.
(TBH, the only reason besides SDL1.2 to keep in_kbd around is to ignore
the char generated by the "console key" in the console..)
It's set to 0 by default (which is the original behavior); if set to 1,
SDL2 will grab the keyboard, so Alt-Tab or the Windows key etc. will not
be handled by the operating system but by dhewm3 (=> you can bind the
Windows key like any normal key and it won't open the start menu).
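Under the hood this maps to a real SDL2 hint; a sketch (the hint is
real, the surrounding code and the grabKeyboard variable are assumed):

```cpp
bool grabKeyboard = true; // would come from the CVar described above
// tell SDL2 whether grabbing the window should also grab the keyboard
SDL_SetHint( SDL_HINT_GRAB_KEYBOARD, grabKeyboard ? "1" : "0" );
SDL_SetWindowGrab( window, SDL_TRUE ); // with the hint set, Alt-Tab etc. go to dhewm3
```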
If a key is pressed whose SDL_Keycode isn't known to Doom3 (i.e. has no
corresponding K_* constant), its SDL_Scancode is mapped to the
corresponding newly added K_SC_* scancode constant.
I think I have K_SC_* constants for all keys that differ between
keyboard layouts (which is mostly printable characters; F1-F12, Ctrl,
Shift, ... should be the same on all layouts, which means that e.g.
SDL_SCANCODE_F1 always belongs to SDLK_F1 which the old code already
maps to Doom3's K_F1).
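In pseudo-ish C++, the mapping works like this (the helper names are
made up):

```cpp
static int MapSDLKeyToDoom3( const SDL_Keysym& sym ) {
	int key = KeycodeToDoom3Key( sym.sym );       // hypothetical: SDLK_* -> K_*
	if ( key == 0 ) {
		// no K_* constant for this keycode: fall back to the layout-independent
		// scancode, for which the K_SC_* constants were added
		key = ScancodeToDoom3Key( sym.scancode ); // hypothetical: SDL_SCANCODE_* -> K_SC_*
	}
	return key;
}
```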
What's extra nice (IMO) is that when Doom3 requests a *localized* name
of the key (like for showing it in the bindings menu), we actually use
the name of the SDL_Keycode that *currently* belongs to the scancode, so
esp. the "Western High-ASCII characters" (ISO-8859-1) supported by
Doom3, like Ä or Ñ, are displayed correctly.
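The localized-name lookup, sketched (both SDL calls are real; the
ISO-8859-1 conversion helper is assumed):

```cpp
static const char* ScancodeDisplayName( SDL_Scancode sc ) {
	SDL_Keycode k = SDL_GetKeyFromScancode( sc ); // what the key produces *currently*
	const char* utf8Name = SDL_GetKeyName( k );   // localized name, UTF-8 encoded
	return UTF8toISO8859_1( utf8Name );           // hypothetical conversion for Doom3
}
```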
(I already implemented a very similar hack in Yamagi Quake II and
reused the list of scancodes)
This should fix most of the problems reported in #323