While the scheme of using our own allocated texture ids did work just
fine, fisheye rendering uses glGenTextures, which caused a texture id
clash and thus invalid operations (the cube map texture id happened to
be the same as the console background texture id). Sure, I could have
just "fixed" the fisheye init code, but this brings gl closer in line
with glsl (which makes extensive use of glGenTextures and
glDeleteTextures). This doesn't fix any texture leaks gl has (plenty, I
imagine), but it's a step in the right direction.
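
For illustration, here's the shape of the change; the names are
hypothetical, not the actual fisheye/gl code:

    #include <GL/gl.h>

    static GLuint cube_map_tex;     // hypothetical texture object

    void
    fisheye_init_textures (void)
    {
        // let the driver allocate the id instead of picking one
        // ourselves, so it can never collide with ids handed out to
        // other subsystems via glGenTextures
        glGenTextures (1, &cube_map_tex);
        glBindTexture (GL_TEXTURE_2D, cube_map_tex);
    }

    void
    fisheye_shutdown_textures (void)
    {
        // and give the id back, which is what makes fixing the leaks
        // possible later
        glDeleteTextures (1, &cube_map_tex);
        cube_map_tex = 0;
    }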
My changes:
Emit normals if truform is enabled.
Attempt to avoid 0,0,0 normals (they make lighting & tessellation
unhappy; see the sketch after this list).
Fix some (ancient) apparent bugs in GetAliasFrameVerts16().
Clamp minlight, instead of adding it (also sketched after this list).
Apply colormod as glColor, rather than adding it to emission, to prevent QSG2
issues with fullbrights.
Rearrange init code, and don't go quite as wild with responsiveness to lights.
Material & lightmode settings will need tweaking & testing to work well on all
cards. Feedback needed there.
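
Two of the items above are easiest to show in code. A minimal sketch
with hypothetical names, not the actual renderer code:

    // 0,0,0 normals upset both lighting and truform's tessellation, so
    // substitute an arbitrary valid direction ("up") when one shows up
    static void
    fixup_normal (float normal[3])
    {
        if (!normal[0] && !normal[1] && !normal[2])
            normal[2] = 1.0;
    }

    // clamp to minlight instead of adding it: only surfaces darker
    // than the floor get lifted, everything else is left alone
    static float
    apply_minlight (float light, float minlight)
    {
        return light < minlight ? minlight : light;
    }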
support for BSP models, until they can be fixed. gl_multitexture should now
actually be a speedup!
NOTE: Some OpenGL implementations have trouble with the texture
function used; 3Dfx Voodoo 1/2 cards are known to be affected. I don't
know how to fix this, or even if it can be fixed. :/
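
Regarding the gl_multitexture speedup: with GL_ARB_multitexture the
base texture and lightmap can be drawn in one pass instead of two.
A rough sketch, assuming the extension's entry points were resolved at
init (qglActiveTextureARB etc. are hypothetical names, not the
engine's actual wrappers):

    #include <GL/gl.h>
    #include <GL/glext.h>

    // entry points would be fetched with glXGetProcAddressARB at init;
    // assumed already resolved here
    extern PFNGLACTIVETEXTUREARBPROC   qglActiveTextureARB;
    extern PFNGLMULTITEXCOORD2FARBPROC qglMultiTexCoord2fARB;

    static void
    draw_surf_multitexture (GLuint base_tex, GLuint lightmap_tex,
                            int numverts, const float (*xyz)[3],
                            const float (*st)[2], const float (*lm_st)[2])
    {
        int i;

        // base texture on unit 0, lightmap on unit 1: one pass instead
        // of drawing the surface twice with blending
        qglActiveTextureARB (GL_TEXTURE0_ARB);
        glBindTexture (GL_TEXTURE_2D, base_tex);
        qglActiveTextureARB (GL_TEXTURE1_ARB);
        glBindTexture (GL_TEXTURE_2D, lightmap_tex);

        glBegin (GL_POLYGON);
        for (i = 0; i < numverts; i++) {
            qglMultiTexCoord2fARB (GL_TEXTURE0_ARB, st[i][0], st[i][1]);
            qglMultiTexCoord2fARB (GL_TEXTURE1_ARB, lm_st[i][0],
                                   lm_st[i][1]);
            glVertex3fv (xyz[i]);
        }
        glEnd ();
    }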
The major change is that we no longer require libGL to even exist on
the system at compile time for the GL targets; we dynamically link to
the libGL of choice at run time. (This probably breaks most non-Linux
systems, and all GL targets except -glx; some fixup will be needed.)
(This also kills glquake, dead dead DEAD! GONE FOREVER! WHEE!)
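
A minimal sketch of the run-time loading scheme, with hypothetical
names (the real loader has to resolve the entire GL API, and the
library name would come from user configuration):

    #include <dlfcn.h>
    #include <stdio.h>
    #include <stdlib.h>

    typedef void (*glclearcolor_t) (float red, float green, float blue,
                                    float alpha);
    static glclearcolor_t qfglClearColor;

    void
    GL_LoadLibrary (const char *name)
    {
        // dlopen replaces the compile-time -lGL dependency; "name" is
        // the libGL of choice, eg "libGL.so.1"
        void *handle = dlopen (name, RTLD_NOW);

        if (!handle) {
            fprintf (stderr, "couldn't load %s: %s\n", name, dlerror ());
            exit (1);
        }
        // every GL function the renderer uses gets looked up this way
        qfglClearColor = (glclearcolor_t) dlsym (handle, "glClearColor");
        if (!qfglClearColor) {
            fprintf (stderr, "%s is missing glClearColor\n", name);
            exit (1);
        }
    }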
Some gl_draw cleanup.
Commented out the equake alias model occlusion test stuff; it's very
experimental.
Added the .lo and .la patterns to the .gitignore files.
Some minor sbar cleanup. (We don't use the disc-in-use symbol for
anything.)