Created attachment 75858 [details]
When using glPolygonOffsetEXT(1.0, 2.0), GL_POLYGON_OFFSET_BIAS_EXT should be set to 2.0. Instead, it is set to an enormously large value (always the same number, 3.35544e+07).
It looks like when PolygonOffsetEXT() is used, Mesa multiplies the bias value by _DepthMaxF and stores the result as the offset units (so it ends up the same as calling PolygonOffset()). But the get then just returns the offset units instead of the bias.
I guess to really support this correctly we'd need to store the bias separately. After all, _DepthMaxF of the bound draw buffer could change.
I don't know, though, whether anyone is interested in that or whether we should just drop it...
I wonder if we should just drop the GL_EXT_polygon_offset extension. I see that NVIDIA doesn't advertise it.
I guess if NVIDIA doesn't advertise it (and by the looks of it they never did) it should be safe to remove (and AMD doesn't seem to advertise it in their closed-source drivers either). After all, PolygonOffset became core in OpenGL 1.1, so you'd be looking at OpenGL 1.0 apps potentially using it. Some of the Mesa demos can actually use it, though all of them use the ordinary PolygonOffset by default.
Yeah, it seems to be some obscure, ancient extension. The only reason I "used" it is that I took over the work on the OpenGL bindings for Ruby, and the library is over 6 years old, so some tests exercise really old pieces of OpenGL (https://github.com/archSeer/opengl/blob/master/test/test_gl_ext_ext.rb#L29), which is how I was able to report the past few bugs. More to come!
Extension dropped with:
Author: Timothy Arceri <email@example.com>
Date: Fri May 11 15:33:22 2018 +1000
mesa: drop GL_EXT_polygon_offset support
glPolygonOffset() has been part of the GL standard since 1.1. Also
neither AMD nor Nvidia support this in their binary drivers.
Reviewed-by: Marek Olšák <firstname.lastname@example.org>