On Windows, ID_INLINE does __forceinline, which is bad if the function
calls alloca() and is called in a loop..
So use just __inline there so the compiler can choose not to inline
(if called in a loop).
This didn't cause actual stack overflows as far as I know, but it could
(and MSVC warns about it).
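The change boils down to roughly this (a minimal sketch; the real #define in the platform header is surrounded by more ifdefs):

    #ifdef _MSC_VER
      // old: #define ID_INLINE  __forceinline
      // __forceinline inlines even functions that call alloca(), so calling
      // such a function in a loop can grow the stack by the alloca()'d size
      // on every iteration.
      #define ID_INLINE  __inline   // let the compiler decide
    #endif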
turns out tri->numIndexes can be quite big so _alloca16() would be
called with >1MB - and the Win32 stack is only 1MB, so that overflows.
As a workaround, use Mem_Alloc16() if we need >600KB
was reported in #265
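The workaround looks roughly like this (a sketch; the element type and the cleanup are simplified):

    const size_t size = tri->numIndexes * sizeof( glIndex_t );
    bool usedHeapAlloc = false;
    glIndex_t* indexes;
    if ( size > 600 * 1024 ) {
        // too big for the 1MB Win32 stack => allocate on the heap instead
        indexes = (glIndex_t*)Mem_Alloc16( size );
        usedHeapAlloc = true;
    } else {
        indexes = (glIndex_t*)_alloca16( size );
    }
    // ... fill and use indexes as before ...
    if ( usedHeapAlloc ) {
        Mem_Free16( indexes );
    }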
The Doom3: Lost Mission (aka D3LE) mod uses string table entry keys that
don't follow the proper Doom3 scheme of "#str_01234", but look like
"#str_adil_exis_pda_01_audio_info" - the "hash" algorithm of idLangDict,
which basically just converts the number after "#str_" into an int,
probably doesn't work very well with this and there was an assertion
to prevent this..
However, it seems to work well enough, so now I only print one warning
for the first "invalid" key and otherwise accept those keys.
The stringtable (strings/*.lang) also has font-related entries
(like "#font" "Chainlink_Semi-Bold") that triggered another assertion,
now everything that starts with "#font" is silently skipped.
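For reference, the relaxed hashing now behaves about like this (a simplified sketch of idLangDict::GetHashKey(), not the literal code):

    int idLangDict::GetHashKey( const char *str ) const {
        str += strlen( STRTABLE_ID );   // skip the "#str_" prefix
        int hashKey = 0;
        for ( ; str[0] != '\0'; str++ ) {
            if ( str[0] < '0' || str[0] > '9' ) {
                // this used to be an assertion - now warn once and accept the key
                static bool warned = false;
                if ( !warned ) {
                    idLib::common->Warning( "Invalid key in string table, doesn't follow the \"#str_01234\" scheme" );
                    warned = true;
                }
                break;
            }
            hashKey = hashKey * 10 + ( str[0] - '0' );
        }
        return hashKey;
    }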
can be enabled/disabled with the r_useStencilOpSeparate CVar for comparisons
(like Z-Fail, this doesn't really seem to make a difference on my main
machine, nor on my RPi4)
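The idea, in plain OpenGL terms (not the exact dhewm3 code; DrawShadowVolume() is just a stand-in for the actual draw call):

    if ( useStencilOpSeparate ) {
        // one pass: different stencil ops for front- and back-facing triangles
        glDisable( GL_CULL_FACE );
        glStencilOpSeparate( GL_BACK,  GL_KEEP, GL_INCR_WRAP, GL_KEEP ); // Z-fail: increment
        glStencilOpSeparate( GL_FRONT, GL_KEEP, GL_DECR_WRAP, GL_KEEP ); // Z-fail: decrement
        DrawShadowVolume();
    } else {
        // classic two-pass variant with face culling, like the original code
        glEnable( GL_CULL_FACE );
        glCullFace( GL_FRONT );
        glStencilOp( GL_KEEP, GL_INCR_WRAP, GL_KEEP );
        DrawShadowVolume();
        glCullFace( GL_BACK );
        glStencilOp( GL_KEEP, GL_DECR_WRAP, GL_KEEP );
        DrawShadowVolume();
    }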
Partly based on Pat Raynor's Code:
2933cb5545/neo/renderer/draw_stencilshadow.cpp
the tables contain character constants like ('ä') that are supposed to
be interpreted as ISO8859-1 or WINDOWS-1252 or sth, but that doesn't
seem to work with MinGW (anymore) - seems like it assumes UTF-8 by
default, and for some reason -finput-charset=ISO8859-1 doesn't help
either, it complains about multichar character constants then..
Anyway, now the table entries are represented as the corresponding
integer constants which seems to work as intended.
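To illustrate the change to the tables (the values are the ISO8859-1 code points; the table name is made up):

    static const unsigned char someCharTable[] = {
        // before: 'Ä', 'ä', ... - MinGW treats the source as UTF-8, so these
        //         become multichar constants and don't fit into a char
        // after:  the corresponding ISO8859-1/Windows-1252 integer values
        0xC4,   // 'Ä'
        0xE4,   // 'ä'
        0xD6,   // 'Ö'
        0xF6,   // 'ö'
    };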
Fixes #238
dhewm3 uses source_group(TREE ...) which is only supported in
CMake 3.8 and newer.
As it's only cosmetic (so files show up in the correct path in IDEs like
Visual Studio), I just replace the source_group() function with a dummy
for affected CMake versions. At least people can still compile then..
fixes #232
Bringing back support for the (MFC-based Windows-only) editing tools.
Only tested 32bit (not sure if the code is 64bit-clean), needs a
"proper" (non-Express) version of Visual Studio *including* MFC support,
tested the free "Community" Editions of VS 2013 and 2017.
(For VS2013 I needed to install the "Visual C++ MFC MBCS Library for
Visual Studio 2013" because it either didn't include MFC support at all
or only the wrong version - I don't remember, but you'll need that one anyway)
In short, it uses an idGLDrawableMaterial widget that calls
renderSystem->BeginFrame(w, h); (with w and h being small for the texture
preview window) and BeginFrame() sets glConfig.vidWidth and vidHeight to
w/h and that never got reset to the original value (window width/height).
This breaks everything because for some reason
renderSystem->GetScreenWidth()/Height() return glConfig.vidWidth/Height
so it will just continue to render everything at that resolution (in a
small rectangle on the lower left corner of the window).
This bug already existed in Doom3 1.3.1 (but was less noticeable because
apparently when switching away from Doom3 and back to the window it reset
vidWidth/Height to the window resolution)
I only implemented a workaround (restore glConfig.vid* after rendering the
texture preview), it's possible that the same issue exists in other
(probably editor-) code - but a "proper fix" might also break code (and I'm
not super-familiar with the editor code or even just using them)
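The workaround is essentially this (a sketch; the exact method it lives in may differ):

    void idGLDrawableMaterial::draw( int x, int y, int w, int h ) {
        // remember the real resolution ...
        const int oldVidWidth  = glConfig.vidWidth;
        const int oldVidHeight = glConfig.vidHeight;

        renderSystem->BeginFrame( w, h );   // overwrites glConfig.vidWidth/vidHeight
        // ... render the small material/texture preview ...
        renderSystem->EndFrame( NULL, NULL );

        // ... and restore it, so the rest of the engine doesn't keep rendering
        // everything at the tiny preview resolution
        glConfig.vidWidth  = oldVidWidth;
        glConfig.vidHeight = oldVidHeight;
    }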
When selecting a texture in Inspectors -> Media -> Textures that doesn't
really exist (.mtr defines it, but referenced image files are missing),
the game/editor used to crash.
The problem was that somehow people thought the best way to communicate
that a file wasn't found was by setting the timestamp to 0xFFFFFFFF
and then checking for that - sometimes (incorrectly) by comparing it to -1.
That worked for 32bit ID_TIME_T (typedefed to time_t), but not with 64bit
time_t, which now seems to be the default for Win32 in VS.
So I replaced a few -1 and 0xFF... with FILE_NOT_FOUND_TIMESTAMP and now it
works.
FILE_NOT_FOUND_TIMESTAMP is still defined as 0xFFFFFFFF, maybe I should
change that, unsure if that'd break anything though..
When changing it one should keep in mind that time_t might still be 32bit
on some platforms (Linux x86?) so that should still work.. (-1 could work)
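To illustrate why the -1 comparisons broke (a self-contained sketch; in the engine ID_TIME_T comes from the platform headers):

    #include <ctime>
    typedef time_t ID_TIME_T;                    // as in the engine
    #define FILE_NOT_FOUND_TIMESTAMP 0xFFFFFFFF  // as described above

    ID_TIME_T timestamp = 0xFFFFFFFF;  // what gets set for missing files
    // with 32bit time_t, (ID_TIME_T)-1 == 0xFFFFFFFF, so this used to work:
    bool missingOld = ( timestamp == (ID_TIME_T)-1 );
    // with 64bit time_t, (ID_TIME_T)-1 is 0xFFFFFFFFFFFFFFFF, so it silently
    // fails - comparing against the named constant works either way:
    bool missingNew = ( timestamp == FILE_NOT_FOUND_TIMESTAMP );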
...by including `vecLib/vecLib.h` only on `__APPLE__`, and not just any `__ppc__`
From https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=235668:
> On powerpc* platforms vecLib/vecLib.h is included by default.
> This header is only available on Mac OS X, so remove it. It
> doesn't seem to be actually needed to compile this port.
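The resulting include guard is basically this (the actual condition in the source might use a MACOS_X define instead of __APPLE__):

    // only pull in Apple's vecLib header on actual Mac OS X,
    // not on every __ppc__ platform (e.g. FreeBSD/powerpc)
    #if defined(__ppc__) && defined(__APPLE__)
    #include <vecLib/vecLib.h>
    #endif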
Upgrade the minimum macOS SDK version in CMakeLists.txt to 10.9 "Mavericks" for 64-bit builds.
Until now, make failed with this error:
warning: include path for libstdc++ headers not found; pass '-stdlib=libc++' on the command line to use the libc++ standard library instead [-Wstdlibcxx-not-found]
In file included from /Juegos/dhewm3/neo/idlib/bv/Bounds.cpp:29:
/Juegos/dhewm3/neo/sys/platform.h:185:10: fatal error: 'cstddef' file not found
#include <cstddef>
^~~~~~~~~
1 warning and 1 error generated.
Editor also seems to start, didn't test much further.
Only tested 32bit Windows, I fear the editor code isn't 64bit clean..
I hope I haven't broken anything elsewhere..
causes trouble on macOS, and we shouldn't interact with X11 directly
anyway, because SDL does it for us.
OpenBSD apparently needed it (at least it was added for OpenBSD
support), but the only place I can imagine it being needed is the
superfluous #include <SDL_syswm.h> in neo/sys/glimp.cpp - which I now
removed. In case it's needed after all please tell me, then I'll add it
again - but guarded by if(os STREQUALS "OpenBSD") or however one checks
for OpenBSD in CMake.
handling SIGTTIN/OU allows running Doom3 in the background (or even
sending it to the background with Ctrl-Z + bg) by disabling TTY input
(before, it would get stuck when run in the background without +set in_tty 0,
see #215)
While at it, I also added signal handlers for some common crash signals
(SIGILL, SIGABRT, SIGFPE, SIGSEGV) to print a backtrace before exiting
the game (partly based on Yamagi Quake II code).
For some reason CMake thinks it's a great default to display all the source
files of a project in Visual Studio as a flat list (instead of in the
directory structure they're physically in).
source_group(TREE ...) *per project/"target"* helps.
Also, so far there was no reason to list or otherwise use header files
in CMakeLists.txt - but for them to turn up in VS they must be added to
the source lists. I've done that, but I'm sloppily globbing them instead
of adding each header manually like it's done for the source files
(because it doesn't matter if a header that's not really used turns up
in those lists)
Changing idSession would break the interface for Game DLLs, making the
existing ports of Mods incompatible.
Luckily we have idCommon::GetAdditionalFunction() for cases like this..
Also added a check for WEAPON_NETFIRING in idWeapon::EnterCinematic(),
I got an Assert() there in Debug builds.
Running the Demo seems to work now, at least I could finish it without
any problems (ignoring some warnings in the console)
it could happen that i is 1 but numPlanes is still 0
(=> if, for i = 0, ( p[j] - p[i] ).LengthSqr() < 0.01f ),
so planes[-1] would be accessed, which of course is invalid and can crash
WIN_DESKTOP means that this can currently only be set for the top-level
window in a .gui (all its subwindows/widgets will be scaled implicitly)
There are two ways to make a GUI use this:
1. in the .gui add a window variable "scaleto43 1", like
     windowDef Desktop {
         rect       0 ,0 ,640 ,480
         nocursor   1
         float      talk   0
         scaleto43  1
         // .. etc rest of windowDef
2. When creating the GUI from C++ code, you can afterwards make the
UserInterface scale to 4:3 like this:
     idUserInterface* ui = Whatever(); // create it
     ui->SetStateBool("scaleto43", true);
     ui->StateChanged(gameLocal.time);
   Both lines are important!
As you can see in my changes to Player.cpp, my primary usecase for this
is the cursor/crosshair GUI.
So stuff doesn't look so distorted in widescreen resolutions.
Implies that there are black bars on the left/right then..
Can be disabled with "r_scaleMenusTo43 0"
Does *not* affect the HUD (incl. crosshair) - scaling it automagically
would be very hard (or impossible), because it doesn't only render
the crosshair, healthpoints etc, but also fullscreen effects like the
screen turning red when the player is hit - and fullscreen effects
would look very shitty if they didn't cover the whole screen but had
"empty" bars on left/right.
(Mostly) fixes #188 and #189
Mods that have their own video settings menu can tell dhewm3 to replace
the "choices" and "values" entries in their choiceDef with the
resolutions supported by dhewm3 (and corresponding modes).
So if we add new video modes to dhewm3, they'll automatically appear in
the menu without changing the .gui
To enable this, the mod authors only need to add an "injectResolutions 1"
entry to their resolution choiceDef. By default, the first entry will
be "r_custom*" for r_mode -1, which means "custom resolution, use
r_customWidth and r_customHeight".
If that entry should be disabled for the mod, just add another entry:
"injectCustomResolutionMode 0"
This is an ugly hack that allows both exporting additional functions
(incl. methods via static function + void* userArg) to Game DLLs
and setting callback functions from the Game DLL that the Engine will
call, without breaking the Game API (again after this change).
This is mostly meant for replacing ugly hacks with SourceHook and
similar in mods (yes, this is still an ugly hack, but less ugly).
See the huge comment in Common.h for more information.
Right now the only thing implemented is a Callback for when images
are reloaded (via reloadImages or vid_restart) - Ruiner needs that.
Also increased GAME_API_VERSION to 9, because this breaks the A[PB]I
(hopefully after the next release it won't be broken in the foreseeable
future)
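From the Game DLL side, using it looks about like this (a sketch; see the actual Common.h for the real enum and typedef names, which I'm partly guessing here):

    // called by the engine after reloadImages or vid_restart
    static void OnImagesReloaded( void* userArg ) {
        // e.g. recreate dynamically generated textures
    }

    void RegisterEngineCallbacks( void ) {
        common->SetCallback( idCommon::CB_ReloadImages, OnImagesReloaded,
                             NULL /* userArg passed back to the callback */ );
    }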
if --help or -h or -help or -? or /? is passed as commandline argument,
the help is printed (to stdout) and then the game quits.
The help shows some helpful commandline arguments, including how to
tell the engine where to find the game data, how to set the resolution
and more
If an OpenAL source runs out of samples it transitions into state
AL_STOPPED. That happens if we're entering the menu (which switches
to another soundworld) and when saving the game (because the game
blocks for some milliseconds). Work around this by adding a new
field 'stopped' to the channel state and use that to determine if
a sound was stopped, instead of checking for AL_STOPPED like before.
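Roughly the difference (a sketch, member names approximated):

    ALint state = AL_INITIAL;
    alGetSourcei( chan->openalSource, AL_SOURCE_STATE, &state );

    // old: a source that merely ran out of queued samples also reports
    //      AL_STOPPED, e.g. after the game blocked while saving
    bool wasStoppedOld = ( state == AL_STOPPED );

    // new: only trust our own bookkeeping, which is set when the channel
    //      really gets stopped on purpose
    bool wasStoppedNew = chan->stopped;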
In the last few weeks I've played through the game again and kept an eye
on small oddities. And there are a lot of them. For example some GUIs and
videos getting stuck after the first frame (issue #192) or being unable
to protect the guy with the lamp in Alpha Labs 3. Some digging proved
that most - if not all - of these problems are caused by the compiler's
optimization level. When built with -O2, both g++ 8.1 and clang 6.0.0
produce working code. g++ 8.1 with -O3 has some small, hard to notice
oddities, clang 6.0.0 with -O3 shows a lot of them. Since there's no
measurable difference between -O3 and -O2, just go down to the latter:
x doom_o3.txt
+ doom_o2.txt
(ministat distribution plot omitted)
        N     Min     Max  Median     Avg     Stddev
    x   5     173     178     177   176.4  2.0736441
    +   5     176     178     178   177.2  1.0954451
so even with many loud sounds the overall volume isn't reduced by
OpenAL - apparently OpenAL scales down all sounds temporarily if the
mixed result would be too loud or sth like that.
Just sending all sounds to OpenAL with a lower volume prevents that from
happening (just set your system speaker volume a bit higher if needed).
This problem was especially noticeable when shooting at metal walls with
the shotgun (each pellet produces an impact sound so it gets kinda loud)
Sometimes memory was allocated with new[] but freed with delete instead
of delete[], which is wrong.
And there were some small memory leaks, too.
Fortunately clang's AddressSanitizer detected all that so I could easily
fix it.
(There seem to be some more small memory leaks which are harder to fix,
though)
On FreeBSD, the game used to crash when loading the last level of RoE
(d3xp), while loading models/david/hell_h7.ma.
The problem could be reproduced on Linux with #define USE_LIBC_MALLOC 1
and clang's AddressSanitizer.
Turns out that this file specifies a vertex transform for a non-existent
vertex (index 31, while we only have 0-30) and thus the bounds of
pMesh->vertexes[] are violated.
I added a check to ensure the index is within the bounds and a Warning
if it isn't.
It should work now. If, however, it turns out that more files have this
problem, maybe .ma is parsed incorrectly and we need a different fix.
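The added check is essentially this (a sketch; the surrounding Maya import code is simplified and the variable names are approximations):

    if ( vertexIndex < 0 || vertexIndex >= pMesh->numVertexes ) {
        common->Warning( "%s has out-of-bounds vertex index %d (only %d vertexes)",
                         fileName, vertexIndex, pMesh->numVertexes );
        continue;   // skip this transform instead of writing past pMesh->vertexes[]
    }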
(Should) fix #138
as we do int buttonIndex = ev.button.button - SDL_BUTTON_LEFT;
it's only consistent to do if(ev.button.button < SDL_BUTTON_LEFT + 8)
it doesn't really make any difference as long as SDL_BUTTON_LEFT is 1,
but this way it's safe for SDL3 or whatever future version might break
the ABI.
__builtin_trap() causes an illegal instruction and thus the process
can't resume afterwards.
raise(SIGTRAP) only breaks into the debugger and thus allows one to
"ignore" the assertion while debugging.
The assertion in idBounds::operator-(const idBounds&) was triggered
from idWeapon::Event_LaunchProjectiles() (ownerBounds - projBounds)
It only happened when using the BFG.
So I added a check to make sure calling operator- is legal.
I guess this also caused #122
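The added check looks roughly like this (simplified; the real code in idWeapon::Event_LaunchProjectiles() differs a bit). idBounds::operator-() asserts that the second bounds fit into the first, so verify exactly that before calling it:

    bool boundsSubLegal =
           ownerBounds[1][0] - ownerBounds[0][0] > projBounds[1][0] - projBounds[0][0]
        && ownerBounds[1][1] - ownerBounds[0][1] > projBounds[1][1] - projBounds[0][1]
        && ownerBounds[1][2] - ownerBounds[0][2] > projBounds[1][2] - projBounds[0][2];
    if ( boundsSubLegal ) {
        // same as before, just guarded now
        idBounds diff = ownerBounds - projBounds;
        // ... pick the random launch position inside diff ...
    }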
It's an SDL_TEXTEDITING event which we seem to get on Windows whenever
the Window gains focus (or is created). I think it can be safely
ignored, so that's what I do.
I also changed how those warnings are printed - as a hex number now,
because they're defined as hex numbers in the SDL source and it's easier
to find out what kind of event it is this way.
When ingame, Shift-Esc would open the menu and another Shift-Esc the
console. Now it immediately opens the console and only Esc without
Shift opens the menu.
The fullscreen guis pretend to be 640x480 internally, also for the mouse
cursor position. So adding the actually moved pixels (when playing the
game at a higher resolution) to the GUI's cursor position makes it move
too fast.
To fix that I detect (hopefully that check is reliable!) if the
idUserInterfaceLocal instance is a fullscreen GUI and if so scale the
reported mouse moved pixels with 640/actual_window_width and
480/actual_window_height.
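The scaling itself is trivial (names made up for the sketch):

    // fullscreen GUIs use a virtual 640x480 coordinate system, so scale the
    // real pixel deltas down before adding them to the GUI cursor position
    void ScaleMouseDelta( bool isFullscreenGUI, int windowWidth, int windowHeight,
                          float& dx, float& dy ) {
        if ( isFullscreenGUI ) {
            dx *= 640.0f / windowWidth;
            dy *= 480.0f / windowHeight;
        }
    }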
If res_none (event with .evType == EV_NONE) is returned,
idEventLoop::RunEventLoop() will assume there are no more events this
frame => pending events will be delayed until the next frame (or later if
res_none is returned again in the meantime).
So res_none shouldn't be returned just because there was an SDL event
we didn't care about, or one we did care about but don't generate a Doom3
event for (because it e.g. just toggles fullscreen or something).
Instead we should just fetch and handle the next SDL event.
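So the SDL polling loop now follows this pattern (a heavily simplified sketch of Sys_GetEvent(); HandleKeyEvent() is a made-up helper):

    sysEvent_t Sys_GetEvent( void ) {
        SDL_Event ev;
        while ( SDL_PollEvent( &ev ) ) {
            switch ( ev.type ) {
                case SDL_KEYDOWN:
                case SDL_KEYUP:
                    return HandleKeyEvent( ev );   // maps to a Doom3 event
                // ... other cases that produce Doom3 events ...
                default:
                    // an event we don't care about (or one that's handled right
                    // here, like toggling fullscreen): do NOT return res_none,
                    // just look at the next SDL event
                    continue;
            }
        }
        return res_none;   // really no more events this frame
    }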
Before checking there I look for gamedata next to the executable, but
the check was broken: I got the directory the executable is in and
checked if it exists.. well.. of course it does, but that doesn't mean
there's game data in it..
So now I check if that directory actually has a "base/" subdirectory
(or whatever is #defined in BASE_GAMEDIR) and if that fails
/usr/local/games/doom3/ is tried instead.
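The fixed check boils down to this (a sketch; Sys_EXEPath() and BASE_GAMEDIR exist in the source, the directory test is shown with plain stat() here):

    #include <sys/stat.h>

    static bool ExeDirHasGameData( void ) {
        idStr dir = Sys_EXEPath();       // full path of the executable
        dir.StripFilename();             // now: the directory it's in
        dir.AppendPath( BASE_GAMEDIR );  // ".../base"
        struct stat st;
        // only use this directory if base/ actually exists in it
        return ( stat( dir.c_str(), &st ) == 0 ) && S_ISDIR( st.st_mode );
    }
    // if this returns false, /usr/local/games/doom3/ is tried instead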
Thanks chungy for pointing the bug out in #97!
* r_mode defaults to 5 (1024x768), I think that's more sane than 640x480
* r_fullscreen defaults to 0 (=> windowed mode) because fullscreen in
the wrong resolution sucks.. let people do their initial configuration
in windowed mode
* r_swapInterval defaults to 1 (=> VSync active by default) because that
  makes the game feel smoother and most PCs should be able to do 60fps
  in this 11-year-old game anyway
* s_useEAXReverb defaults to 1 (=> use EAX/EFX effects by default),
because OpenAL-soft supports them on all platforms/hardware and if
for some reason the used OpenAL implementation doesn't support it,
it's deactivated automatically anyway.
All these things can be configured in the Options Menu.
the resolutions are really hardcoded in an ugly combination of the
values r_mode supports, a string in strings/*.lang ("#str_04222")
describing the resolutions r_mode supports
("640x480;800x600;1024x768;1152x864;1280x1024;1600x1200")
and a string in mainmenu.gui with the corresponding r_mode values
("3;4;5;6;7;8").. as neither the strings nor mainmenu.gui are GPL'ed
I can't really redistribute a changed version of them.
So I added lots of resolutions to r_vidModes and wrote two functions
that generate the resolutions list string and r_mode value
string for the GUI.
Then I added a hack in the code that detects when the "window" for the
system options ("choiceDef OS2Primary") is created and overwrites the
hardcoded strings with custom ones from my new functions.
This is tested with both the main game and the official d3xp
(Resurrection of Evil) Addon.
No idea if it works with other mods, depends on whether they just copied
that part of the menu or wrote their own.
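The two generated strings are built about like this (a hypothetical sketch; the real functions have different names and read from r_vidModes directly):

    idStr resolutionsStr;   // e.g. "640x480;800x600;1024x768;..."
    idStr modesStr;         // e.g. "3;4;5;..."
    for ( int mode = 3; mode < NumVidModes(); mode++ ) {   // NumVidModes() is made up
        int w = 0, h = 0;
        GetVidModeSize( mode, w, h );   // made-up accessor for r_vidModes[]
        if ( mode > 3 ) {
            resolutionsStr += ";";
            modesStr += ";";
        }
        resolutionsStr += va( "%dx%d", w, h );
        modesStr += va( "%d", mode );
    }
    // these then replace the hardcoded "choices"/"values" strings of the
    // resolution choiceDef when the system options window is created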
for some reason neo/tools/compilers/dmap/optimize.cpp included windows.h
and GL/gl.h before including dmap.h, which indirectly includes qgl.h.
This made things in qgl.h explode - seems like APIENTRYP in the
QGLPROC() macro expanded to bullshit because of some APIENTRYP or
APIENTRY definition in windows.h or GL/gl.h
Those includes are totally unnecessary, dmap.h -> qgl.h already includes
GL/gl.h, indirectly via SDL_opengl.h and in that setup things somehow
are fine.
the problem was that the CVar was initialized from the commandline
*after* Posix_InitConsoleInput() is called, so it was too late.
common->StartupVariable() seems to be the right way to initialize a
CVar early.
While I couldn't reproduce the crash, according to the bugreport it
happens if renderSystem->GetScreenWidth()/Height() returned 0 - and
that is indeed the only plausible reason I can imagine for it.
So I check for that case and handle it gracefully by defaulting to
4:3 FOV values.
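The added fallback is simply this (a sketch of the relevant part of the FOV calculation):

    float ratio_x = 4.0f;   // default to 4:3 if we can't get a real resolution
    float ratio_y = 3.0f;
    const int w = renderSystem->GetScreenWidth();
    const int h = renderSystem->GetScreenHeight();
    if ( w > 0 && h > 0 ) {
        ratio_x = (float)w;
        ratio_y = (float)h;
    }
    // fov_x/fov_y are then derived from ratio_x/ratio_y as before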
The version will be 1.4.0 because it's not compatible with
Doom3 1.3.1 mod DLLs.
(Note that this commit doesn't mean 1.4.0 is done, I might do some
minor changes before tagging the Release!)
Because Debian Squeeze's libjpeg6 didn't have jpeg_mem_src(), we added
jpeg_memory_src() to provide the functionality.
This shouldn't be needed anymore and without it we can drop libjpeg code
from our repo.
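With libjpeg 8+ (or libjpeg-turbo) the stock function does the job (a sketch of a memory-based decode; error handling omitted):

    #include <cstdio>
    #include <jpeglib.h>

    void DecodeJpegFromMemory( unsigned char* data, unsigned long size ) {
        jpeg_decompress_struct cinfo;
        jpeg_error_mgr jerr;
        cinfo.err = jpeg_std_error( &jerr );
        jpeg_create_decompress( &cinfo );
        // old: jpeg_memory_src( &cinfo, data, size );   // our local helper
        jpeg_mem_src( &cinfo, data, size );              // standard libjpeg function
        jpeg_read_header( &cinfo, TRUE );
        // ... jpeg_start_decompress(), read the scanlines, etc. ...
        jpeg_destroy_decompress( &cinfo );
    }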
Fixes #110
For some reason Gentoo renamed zlib's OF() macro to _Z_OF
Our minizip uses OF(), so add a #define for it in case
_Z_OF is defined.
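The workaround is just a small compat define in our minizip headers (a sketch):

    /* Gentoo's zlib renames OF() to _Z_OF() in zconf.h, but our minizip
       code still uses OF() for its prototypes - map one to the other: */
    #if !defined(OF) && defined(_Z_OF)
      #define OF(args) _Z_OF(args)
    #endif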
Thanks to salamanderrake for the fix
The implementation is now in framework/minizip/*
instead of framework/Unzip.cpp
This was version 0.15beta, now we use 1.1 from
zlib 1.2.7/contrib/minizip
Some code had to be adjusted for this, but it got
cleaner on the way
idAnimator::GetJointLocalTransform() miscompiles with gcc 4.5 and
-ftree-vrp (implied by -O2).
Reorder code to avoid the compiler bug, no functional change.
The original commit was for game/ only, but d3xp/ will have the same
issues..
Added r_aspectratio -1 which means "auto" (as new default).
This mode sets fov_x and fov_y according to screen-width/height.
=> No need to set r_aspectratio manually anymore (assuming your display's
pixels are about square).
The standard aspect ratios can still be enforced as before, though.
When one tries to quit the game (via quit in console) while a timedemo
is running, the game freezes indefinitely while displaying the timedemo
results.
This happens because the MessageBox used for that waits for an event
normally triggered by a timer - but it seems like that doesn't happen
anymore during Shutdown()
The fix makes sure that this message box isn't displayed.
Commit 9e158470 set the SDL OpenGL attribute SDL_GL_ALPHA_SIZE to 0 since
the alpha channel is used by Wayland. But for X11 the GLX 1.4 specification
clearly states: "If the requested number of bits in attrib_list for a
particular color component is 0 or GLX_DONT_CARE, then the number of
bits for that component is not considered." So if SDL_GL_ALPHA_SIZE is
0 a framebuffer without an alpha channel is created. This is no problem
with the default GLX module due to a non-standard implementation, but
manifests with Nvidia's GLX module. The consequences are rendering mistakes
like the in-game displays showing static or the flashlight looking weird.
Every time List.h is included in a new file (and sys/platform.h isn't)
there are confusing compiler-errors..
So just #include sys/platform.h in List.h directly, because it uses
ID_INLINE which is defined there
GCC had shitloads of superfluous warnings wherever List.h and Str.h were
included.. get rid of them by using #pragma GCC diagnostic at some places
in List.h and Str.h.
Also add some casts and initialize some variables to fix other warnings
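The pattern used for that looks like this (a sketch; which concrete warnings get disabled differs from place to place, -Wunused-variable is just an example):

    #pragma GCC diagnostic push
    #pragma GCC diagnostic ignored "-Wunused-variable"
    // ... the code in List.h / Str.h that triggers the superfluous warning ...
    #pragma GCC diagnostic pop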