it has a member with a vtable that would get overwritten then; that's bad
(even though I've never seen a crash caused by this?!).
Instead, set the members to NULL/zero manually.
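A minimal sketch of the problem (with a made-up class, not the one from this commit):

```cpp
#include <cstring>

class idExample {
public:
	virtual ~idExample() {}   // any virtual function gives the object a hidden vtable pointer
	void *owner;
	int count;

	void BadClear() {
		// clobbers the vtable pointer along with the members;
		// the next virtual call then jumps through garbage
		memset( this, 0, sizeof( *this ) );
	}

	void GoodClear() {
		// what the commit does instead: zero the members manually
		owner = NULL;
		count = 0;
	}
};
```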
After loading a texture, Doom3 calculates an MD4-sum of it.. this is
mostly pointless and only used for the "reportImageDuplication" console
command, but whatever.
The problem here was that the image was 32000x2000 pixels (due to some
error when creating it) which dhewm3 wanted to convert to the next
bigger power-of-two values with R_ResampleTexture(), but that function
clamps to 4096x4096, so the actually used pixeldata was for 2048x4096.
However, R_ResampleTexture() didn't communicate this to its caller,
so too big a size was used for calculating the MD4-sum and it
crashed.
That's fixed now and also a warning is printed about this.
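The shape of the fix, sketched (assuming R_ResampleTexture() now reports the dimensions it actually produced via reference parameters; the real signature may differ):

```cpp
int usedWidth = scaledWidth, usedHeight = scaledHeight;
byte *resampled = R_ResampleTexture( pic, width, height, usedWidth, usedHeight );
if ( usedWidth != scaledWidth || usedHeight != scaledHeight ) {
	common->Warning( "R_ResampleTexture() clamped image to %d x %d", usedWidth, usedHeight );
}
// use the clamped size, not the requested one, for the checksum
imageHash = MD4_BlockChecksum( resampled, usedWidth * usedHeight * 4 );
```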
BUILD_CPU has been replaced by D3_ARCH, which is also set by CMake on
most platforms, except on Windows, where it's set in neo/sys/platform.h,
because CMake is not able to tell us what CPU platform it's targeting
(for other platforms we parse the output of gcc/clang's -dumpmachine
option, but for MSVC that's not an option, of course).
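On Windows this boils down to preprocessor checks on MSVC's predefined macros, roughly like this (a sketch, not the exact list from neo/sys/platform.h):

```cpp
// neo/sys/platform.h (sketch): derive D3_ARCH from compiler macros,
// because CMake can't tell us the target CPU when using MSVC
#ifdef _MSC_VER
  #if defined(_M_X64)
    #define D3_ARCH "x86_64"
  #elif defined(_M_IX86)
    #define D3_ARCH "x86"
  #elif defined(_M_ARM64)
    #define D3_ARCH "arm64"
  #else
    #define D3_ARCH "unknown"
  #endif
#endif
```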
In SkinDeep, regs not being initialized caused random crashes
(in dhewm3 I haven't seen that so far, but fixing this won't hurt).
From SkinDeep commit message:
In idRegister::SetToRegs() at `registers[ regs[ i ] ] = v[i];`
regs[i] contained values like 21845 or 22010 or 32272, even though
the static registers array that's written to there only holds 4096
elements (it's `static float regs[MAX_EXPRESSION_REGISTERS];`
from `idWindow::EvalRegs()`).
So it overwrites other data, likely other global variables, like
`gameLocal.entities[4967]`, that now contain garbage and next time
someone tries to use them, bad things happen.
In this case, if someone tries to dereference gameLocal.entities[i]
and the pointer at i contains garbage, there's a segfault (crash).
462404af67
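A sketch of the defensive idea (not the exact SkinDeep diff):

```cpp
// make sure the index array doesn't contain garbage to begin with,
// e.g. in idRegister's constructor:
memset( regs, 0, sizeof( regs ) );

// and/or refuse to write through an out-of-range index in SetToRegs():
for ( int i = 0; i < regCount; i++ ) {
	if ( regs[ i ] >= MAX_EXPRESSION_REGISTERS ) {
		continue; // would stomp unrelated globals like gameLocal.entities[]
	}
	registers[ regs[ i ] ] = v[ i ];
}
```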
somehow the collision code managed to spread NaNs on Win32, which caused
a horrible framerate, "GetPointOutsideObstacles: no valid point found"
warnings in the console and assertions in debug builds.
Didn't happen in Vanilla Doom3 though.
At the location I changed the code in, I saw the following values in the
debugger:
normal: {x=0.00610326231 y=5.58793545e-09 z=1.19209290e-07 }
trmEdge->start: {x=-1358.00000 y=913.948975 z=25.2637405 }
start: {x=-1358.00000 y=916.000000 z=34.0000000 }
end: {x=-1358.00000 y=810.000000 z=34.0000000 }
dist (normal*trmEdge->start): -8.28822231
d1: 9.53674316e-07
d2: 9.53674316e-07
f1 (d1/(d1-d2)): inf
"normal" isn't normalized and also very small (in all directions),
"start" and "end" have quite different y values, but still doing scalar
multiplications of each with "normal" gave the same result..
No idea what this all means exactly, but checking if d1 - d2 is (almost)
0 to prevent INF solved the problems. In the end it will be some tiny
differences in floating point calculations between different platforms
and compilers..
In my test d1-d2 was exactly 0, but I compare with FLT_EPSILON to be
on the safer side.
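The fix boils down to something like this (sketched from the values above; the surrounding collision loop is omitted and FLT_EPSILON comes from <float.h>):

```cpp
float d1 = normal * start - dist;   // idVec3's operator* is the dot product
float d2 = normal * end - dist;
// if d1 and d2 are (almost) equal, d1 / (d1 - d2) becomes INF
if ( idMath::Fabs( d1 - d2 ) < FLT_EPSILON ) {
	continue; // skip instead of spreading NaNs/INFs through the collision code
}
float f1 = d1 / ( d1 - d2 );
```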
incl. backwards compat for older savegames.
only partly useful: old savegames only work if you didn't change the
gamedata; with the CstDoom3 .gui files, loading them crashes. I don't
think that can be avoided, apparently Doom3 has no way to detect that
the GUIs have changed?
In idWindow::Redraw(), I had to make sure the menu scale fix (which,
if enabled for a window, renders it in 4:3 with empty or black bars
on the sides if needed for widescreen etc., instead of stretching it)
is disabled if a window uses CST anchors, because the CST anchor code
also adjusts for the display aspect ratio, and if we do both, things
get distorted the other way.
The biggest change is that idDeviceContext::DrawStretchPic(Rotated) now
has code to adjust the coordinates for both CST and the menu scale fix,
so idDeviceContext::AdjustCoords() is mostly obsolete - it's only still
used by idRenderWindow.
Unlike in CstDoom3, the extra adjustCoords argument to those Draw
functions now indicates that any coordinate adjustment should be done,
so wherever a caller sets it, it sets it to true.
I removed idDeviceContext::AdjustCursorCoords() because it was only used
in one place anyway.
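In idWindow::Redraw() the mutual exclusion looks roughly like this (a sketch; the names for the CST anchor state are assumptions):

```cpp
// only one of the two aspect-ratio corrections may be active at a time
bool useMenuScaleFix = menuScaleFixEnabled; // hypothetical flag
if ( UsesCstAnchors() ) {                   // hypothetical query
	// the CST anchor code already adjusts for the display aspect ratio;
	// applying the menu scale fix on top would distort the other way
	useMenuScaleFix = false;
}
```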
By writing that info into the demo when recording it (when demos are
played back, mylevel.map isn't read, only mylevel.proc, so the
worldspawn can't be accessed to get allow_nospecular from there)
D3::ImGuiHooks::NewFrame() was still called every frame, but EndFrame()
wasn't because idSessionLocal::UpdateScreen() exited early.
This caused an assertion in Dear ImGui, because it doesn't like
NewFrame() being called again without EndFrame() in between.
based on https://github.com/dhewm/dhewm3/pull/254
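The fix amounts to keeping the calls paired, roughly like this (a sketch, assuming an EndFrame() wrapper analogous to D3::ImGuiHooks::NewFrame()):

```cpp
// sketch of the idea in idSessionLocal::UpdateScreen():
if ( exitEarly ) { // whatever condition made UpdateScreen() return before rendering
	D3::ImGuiHooks::EndFrame(); // keep NewFrame()/EndFrame() paired
	return;
}
```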
The "nospecular" parm will only be used if either
r_supportNoSpecular is set to 1
or r_supportNoSpecular is set to -1 (the default) and the map's spawnargs
contain "allow_nospecular" "1"
This probably doesn't work with (time)demos yet, because I think when
they're being played I can't access the worldspawn entity
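The decision is something like this (a sketch; how the worldspawn's idDict is obtained may differ):

```cpp
bool UseNoSpecular( const idDict &worldSpawnArgs ) {
	int v = r_supportNoSpecular.GetInteger();
	if ( v == 1 )
		return true;
	if ( v == -1 ) // the default: let the map's spawnargs decide
		return worldSpawnArgs.GetBool( "allow_nospecular", "0" );
	return false;  // 0: never use the "nospecular" parm
}
```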
If it (or Documents/My Games/dhewm3/) can't be created, show a Windows
MessageBox with an error message and exit.
Would've made #544 easier to figure out
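Roughly like this on the Win32 side (a sketch; savePath and the message text are placeholders):

```cpp
if ( !CreateDirectoryA( savePath, NULL ) && GetLastError() != ERROR_ALREADY_EXISTS ) {
	// bail out loudly instead of failing in some non-obvious way later
	MessageBoxA( NULL, "Couldn't create the save directory!",
	             "dhewm3: Fatal Error", MB_OK | MB_ICONERROR );
	exit( 1 );
}
```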
At least VS2017 doesn't like the big string literal of
proggyvector_font_base85.h (its limit is 64KB, Error C1091), so go back to
using proggyvector_font.h (which contains an int32 array) for MSVC..
Keep the base85 version around for proper compilers, because (unlike
the non-base85 version of the font) it works on Big Endian machines.
It seems like VS2022, or maybe even some point release of VS2019, removed
this limitation (our CI build succeeds), but I couldn't find any details
about that change.
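The header selection is a compile-time switch along these lines (sketch):

```cpp
#ifdef _MSC_VER
  // VS2017's 64KB string literal limit (Error C1091) chokes on the base85 header
  #include "proggyvector_font.h"        // int32 array version
#else
  // proper compilers get the base85 version, which (unlike the int32 array
  // version) also works on Big Endian machines
  #include "proggyvector_font_base85.h"
#endif
```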
Based on whether handleMouseGrab() in events.cpp sets
GRAB_ENABLETEXTINPUT or not.
Should prevent the issue that on macOS holding down a key while
playing (as one does, e.g. to run forward) opens a popup menu
with alternative characters (like "è", "é", "ê", etc. for "e")
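Sketched with SDL2 names (the flag handling around handleMouseGrab() is simplified):

```cpp
// toggle text input based on the grab state computed in handleMouseGrab()
if ( grabFlags & GRAB_ENABLETEXTINPUT ) {
	if ( !SDL_IsTextInputActive() )
		SDL_StartTextInput();
} else if ( SDL_IsTextInputActive() ) {
	SDL_StopTextInput(); // avoids macOS's press-and-hold accent popup in-game
}
```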
like in Doom3 BFG: If it's set to 1, no autosaves are created when
entering a level. Defaults to 0 (autosaves enabled)
While at it, I also documented com_numQuicksaves in Configuration.md
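The new CVar is declared like any other (a sketch; the exact flags are an assumption):

```cpp
idCVar com_noAutoSaves( "com_noAutoSaves", "0",
                        CVAR_SYSTEM | CVAR_BOOL | CVAR_ARCHIVE,
                        "if set to 1, no autosaves are created when entering a level" );
```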
based on a fix from @dezo2 from the >60Hz support branch
(TBH I don't know why the crosshair must be scaled to 4:3 but the
grabber cursor not, but this works..)
Modern mice support ridiculously high DPI values, >20'000.
Not sure what that's actually good for, but people who use that ran
into the "idUsercmdGenLocal::MouseMove: Ignoring ridiculous
mouse delta" case, which just threw away the mouse input values, so the
game didn't respond to mouse input anymore or at least felt choppy.
I'm not sure what that code was originally good for, or under which
(undesired) circumstances it was supposed to trigger, but for now it's
disabled; only the warning is still logged, and only once.
For these high DPI values to still be usable (camera not moving way
too fast), it probably makes sense if the mouse sensitivity can be set
to values < 1.0. The CVar always supported that, but I adjusted the
Dhewm3SettingsMenu so the sensitivity can also be set to values between
0.01 and 1 there (still going up to 30, like before).
fixes #616
fixes #632
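Sketch of the changed logic in idUsercmdGenLocal::MouseMove() (variable names and the threshold are assumed):

```cpp
if ( idMath::Fabs( mx ) > 1000.0f || idMath::Fabs( my ) > 1000.0f ) {
	static bool warned = false;
	if ( !warned ) { // warn once instead of spamming the console
		common->Warning( "idUsercmdGenLocal::MouseMove: huge mouse delta" );
		warned = true;
	}
	// no early return anymore: the delta is used as-is, so high-DPI
	// mice keep working instead of their input being thrown away
}
```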
The bug was most probably not caused by D3_SDL_X11 but by
GetDefaultDPI() returning -1.0 which GetDefaultScale() then divided by
96 and rounded to 0.0, which is not a good scaling factor.
I decided to kick the D3_SDL_X11 special case anyway.
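So the actual fix is to guard against the bogus DPI value, roughly:

```cpp
// sketch: don't derive a scaling factor of 0.0 from a failed DPI query
// (roundf() from <math.h>; the exact rounding here is an assumption)
static float GetDefaultScale() {
	float dpi = GetDefaultDPI();   // can return -1.0 if the query fails
	if ( dpi <= 0.0f ) {
		dpi = 96.0f;               // fall back to the standard DPI
	}
	return roundf( ( dpi / 96.0f ) * 2.0f ) * 0.5f; // round to steps of 0.5
}
```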
also changed that logic a bit so FormatMessage() is only called when
actually used
and while at it, fixed the build with mingw-w64 on my system
(somehow an SDL header used strcmp() and that didn't work with
`#define strcmp idStr::Cmp` from Str.h)
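For context, the conflict looks like this (sketch):

```cpp
// neo/idlib/Str.h remaps several libc string functions, among them:
#define strcmp idStr::Cmp
// so any header included afterwards that mentions strcmp() gets it
// rewritten to idStr::Cmp by the preprocessor, which doesn't compile there
```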
When requesting < 1 MB, _alloca16() is used, otherwise Mem_Alloc16().
Furthermore, you must pass a bool that gets set to true if the
memory has been allocated on the stack, else false.
At the end of the function you must call Mem_FreeA( ptr, onStack )
(where onStack is the aforementioned bool), so Mem_Free16() can be
called if it was allocated on the heap.
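Usage looks like this (a sketch; I'm assuming the allocating macro is called Mem_MallocA(), its name isn't mentioned in this note):

```cpp
bool onStack = false;
// < 1 MB: _alloca16() on the stack (onStack = true),
// otherwise Mem_Alloc16() on the heap (onStack = false)
byte *buffer = (byte *)Mem_MallocA( bufferSize, onStack );

// ... use buffer ...

// calls Mem_Free16() only if the memory came from the heap
Mem_FreeA( buffer, onStack );
```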
idInterpreter::Push() is used only for int and (reinterpreted) float
values, not pointers (as far as I can tell), so 32bit values on all
relevant platforms.
It stored its value as intptr_t at `&localstack[ localstackUsed ]` - on
64bit platforms intptr_t is 64bit.
Unfortunately, all code reading from the stack just got a pointer
to `&localstack[ localstackUsed ]` in the type they want to read
(like `int*` or `float*`) and read that. On Little Endian that happens
to work, on 64bit Big Endian it reads the wrong 4 bytes of the intptr_t,
so it doesn't work.
fixes #625, #472
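So the fix is to store the value with the same width the readers use, something like:

```cpp
// before: writes an 8-byte intptr_t on 64bit platforms, but readers fetch
// 4 bytes from the start of the slot - the wrong ones on Big Endian
*(intptr_t *)&localstack[ localstackUsed ] = value;

// after (sketch): store exactly the 32bit value that gets read back
*(int *)&localstack[ localstackUsed ] = value;
```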
All that code is kinda obfuscated, but the integer passing was plain
wrong (if sizeof(int) != sizeof(intptr_t), esp. noticeable on
Big Endian).
data[i] is used by Callbacks.cpp, and for everything but floats it's
passed directly as an argument (interpreted as either an integer or
a pointer to idVec3 or whatever).
So storing an int in there with `( *( int * )&data[ i ] ) = int(...)`
only sets the first 4 bytes of that intptr_t, which is 8 bytes on 64bit
machines. On Little Endian that just happens to work, on Big Endian
it's the wrong 4 bytes.
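Sketched (intValue stands for whatever int(...) expression is stored there):

```cpp
// before: only sets the first 4 bytes of data[i], which is an 8-byte
// intptr_t on 64bit machines - the wrong 4 bytes on Big Endian
( *( int * )&data[ i ] ) = intValue;

// after (sketch): assign the whole slot, so reading it back as an
// integer works regardless of endianness
data[ i ] = intValue;
```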
r_fillWindowAlphaChan is a hack to work around an older issue with
Wayland/Mesa, which has been fixed in Mesa 24.1 (and also seems to work
with current NVIDIA drivers). Additionally, in SDL3 the EGL-specific
(and thus mostly only affecting Wayland)
SDL_HINT_VIDEO_EGL_ALLOW_TRANSPARENCY has been replaced with the generic
SDL_PROP_WINDOW_CREATE_TRANSPARENT_BOOLEAN (that could also affect
other platforms), so it's harder to enable this only for Wayland.
I think most people using SDL3 will use recent Mesa/driver versions,
so I don't enable it by default for SDL3 (SDL2 behaves like before).
However, with `r_fillWindowAlphaChan 1` the hack can be enabled anyway
(r_fillWindowAlphaChan defaults to "-1" which means "let dhewm3 decide
whether to enable this")
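The default logic is roughly (a sketch; runningOnWayland is a stand-in for however the SDL2 code detects that case):

```cpp
int v = r_fillWindowAlphaChan.GetInteger();
#if SDL_VERSION_ATLEAST(3, 0, 0)
bool fillAlpha = ( v == 1 );                                    // only if explicitly enabled
#else
bool fillAlpha = ( v == 1 ) || ( v == -1 && runningOnWayland ); // SDL2: like before
#endif
```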
checking if Z_LARGE64 is defined doesn't make much sense because
that is from zlib, which this helps replace..
so on non-Windows we always ran into the `#define z_off64_t z_off_t`
case, which doesn't give us a 64bit offset on 32bit systems..
Should work better now.
fixes #622
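The idea, sketched (the typedef name here is made up):

```cpp
// always provide a real 64bit offset type instead of relying on zlib's
// Z_LARGE64/z_off64_t logic, which falls back to the possibly-32bit z_off_t
#ifdef _WIN32
  typedef __int64 my_off64_t;
#else
  #include <stdint.h>
  typedef int64_t my_off64_t;
#endif
```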
.. mostly by not making it use SDL_main.h, because it implements its
own SDL main functionality anyway.
However, I have no way to test this code, and as long as SDL3 is not in
Homebrew, testing it in the CI build isn't easy either.
Refactored the pseudo-custom SDL_main code a bit: SDL_win32_main.c
is now only used for SDL1.2, SDL2 and SDL3 have a WinMain() function
in win_main.cpp that works pretty much like the SDL2 SDL_main or SDL3
SDL_RunApp() code - except that the argv[] strings passed to the Doom3
main() function (now renamed to SDL_main()) are encoded in ANSI instead
of UTF-8, so paths passed as commandline arguments, like
dhewm3 +set fs_basepath C:\SüperGämes\Doom3
work with the Win32 ANSI function used by Doom3 to handle paths and files.
For this I also moved the stdout/stderr redirection code from
SDL_win32_main.c to win_main.cpp and cleaned it up a bit
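A rough sketch of that WinMain() (buildAnsiArgv() is a hypothetical stand-in for the commandline-splitting code):

```cpp
int WINAPI WinMain( HINSTANCE hInst, HINSTANCE hPrev, LPSTR szCmdLine, int sw ) {
	// GetCommandLineA() yields the ANSI commandline, so argv[] ends up
	// ANSI-encoded, matching the Win32 ANSI functions Doom3 uses for paths
	int argc = 0;
	char **argv = buildAnsiArgv( GetCommandLineA(), &argc );
	return SDL_main( argc, argv ); // Doom3's main(), renamed
}
```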
- use SDL_SetHint() to set the video driver to "dummy" for the
  dedicated server (see the sketch after this list)
- adjustments for some more functions that now return bool instead
of int. I hope I found all cases of that now, at least in the generic
and Linux code, may have to take a closer look at Windows- and Mac-
specific code
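For the first point, that's a one-liner (SDL3 hint name; SDL2 spells it differently):

```cpp
SDL_SetHint( SDL_HINT_VIDEO_DRIVER, "dummy" );
```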