System Environment:
--------------------------
--Mesa: e54329233522591bbe8aad8a3fd6bcdc1e430f03
--Xserver: 66b00029e587cec628d0041179a301e888277f8e

Bug detailed description:
--------------------------
glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &i) does not return the right value. In fact, it appears to leave the value already in 'i' unchanged.

Reproduce steps:
----------------
1. start X
2. compile and run the attached test case

Current result:
----------------
glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &i) does not return the right value; it appears to leave the value already in 'i' unchanged.

Expected result:
----------------
glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &i) should return the right value.
Created attachment 13347 [details] test case
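The attachment itself is not reproduced here; a minimal reproducer along these lines would exercise the bug. This is a sketch, assuming GLUT for context creation; the sentinel value -1 makes it visible when glGetIntegerv leaves 'i' untouched:

```c
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    GLint i = -1;  /* sentinel: stays -1 if glGetIntegerv writes nothing */

    /* A current GL context is required before any glGet* call. */
    glutInit(&argc, argv);
    glutCreateWindow("max-3d-texture-size");

    glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &i);

    /* On the buggy indirect path, i is still -1 here; a correct
     * implementation must report at least 64 (the GL-spec minimum). */
    printf("GL_MAX_3D_TEXTURE_SIZE = %d\n", i);
    return 0;
}
```

Running this against an indirect (software) rendering context is what triggers the failure, since the missing case is in the GLX indirect size tables rather than in the renderer itself.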
What hardware are you running the test program on?
I ran it on q965, but it is using software rendering.
The case for GL_MAX_3D_TEXTURE_SIZE was missing in indirect_size_get.c. Fixed in Mesa's gl_API.xml file, and the regenerated file has been pushed to the xserver.
This issue is gone, thanks. But the test case still has another issue; I'll file another bug.