Created attachment 131194 [details]
Glx information

Ubuntu 16.04.2
Mesa 17.0.5 or 17.2.0-devel (a5f82db3805f glsl: rename image_* qualifiers to memory_*)
Kernel: 4.10 / 4.11-rc4
Chrome browser: R58.0.3029.96

Steps:
1. Go to https://www.google.com/earth/
2. Press "Launch Google Earth"
3. Zoom in on an area of ocean. You will see non-smooth rendering with no detail.
Created attachment 131195 [details]
Reference picture for the symptom

Please check the attached picture for the symptom.
To check whether this was a regression, I reproduced it on an IVB laptop with older Mesa (13.0.4). It seems the issue has always been there, or at least for some time.
It looks to me like the application is serving different images in square regions of the map, which makes the ocean look pixelated. Can you please repeat this on Windows or non-Intel hardware to verify that Mesa is causing this rendering artifact?
With Chrome 58.0.3029.96 using the Nvidia blob driver, I don't see the blotchy tiles. This seems likely to be an actual Mesa rendering bug.
The texture that is being misrendered is GL_COMPRESSED_RGB_S3TC_DXT1_EXT, and it is uploaded in that format. The same texture data is rendered properly by Windows on the same hardware, so this is a Mesa bug. You can see this clearly at frame 600 of the trace at: https://drive.google.com/open?id=0B2S0H03dkl2CTS0zQlhJeEdsOW8
The Chrome R58 browser doesn't encounter this issue on Windows 7. Thanks.
Created attachment 131221 [details]
Software rendering via Mesa 17.2.0-devel

I also tested with software rendering; the issue can still be observed.

...
OpenGL vendor string: Mesa Project
OpenGL renderer string: Software Rasterizer
OpenGL version string: 2.1 Mesa 17.2.0-devel (git-a5f82db)
OpenGL shading language version string: 1.20
...
Created attachment 131222 [details]
Symptom for comment #7

Picture of the symptom described in https://bugs.freedesktop.org/show_bug.cgi?id=100925#c7
Created attachment 131258 [details] [review]
workaround

I saw these same/similar artifacts (they look like compression errors) on nvidia as well, but very faintly, so I started to check whether this was about alpha, and that seems to be the case. Not sure what the real fix is yet, though.
I saw the internal format was 0x8058 (GL_RGBA8_EXT) rather than GL_RGB8_EXT:

...
glPixelStorei(pname = GL_UNPACK_ALIGNMENT, param = 1)
glCompressedTexImage2D(target = GL_TEXTURE_CUBE_MAP_NEGATIVE_X, level = 0, internalformat = GL_COMPRESSED_RGB_S3TC_DXT1_EXT, width = 256, height = 256, border = 0, imageSize = 32768, data = blob(32768))
glPixelStorei(pname = GL_UNPACK_ALIGNMENT, param = 1)
glGetGraphicsResetStatus() = GL_ZERO
glCompressedTexImage2D(target = GL_TEXTURE_CUBE_MAP_NEGATIVE_Y, level = 0, internalformat = GL_COMPRESSED_RGB_S3TC_DXT1_EXT, width = 256, height = 256, border = 0, imageSize = 32768, data = blob(32768))
...
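For reference, the upload the trace shows boils down to the following minimal sketch (assuming a current GL context; upload_dxt1_face and data are made-up names for illustration):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>

/* Upload one 256x256 cube-map face exactly as the trace shows: an opaque
 * DXT1 payload tagged with the three-channel enum. A 256x256 image is
 * 64x64 BC1 blocks at 8 bytes each = 32768 bytes, matching imageSize. */
static void
upload_dxt1_face(GLenum face, const void *data)
{
   glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
   glCompressedTexImage2D(face, 0,
                          GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                          256, 256, 0,
                          32768, data);
}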
(In reply to Gary Wang from comment #10)
> I saw the internal format was 0x8058 (GL_RGBA8_EXT) rather than GL_RGB8_EXT

Where did you see internal format GL_RGBA8_EXT used? To me this looks like the application wants alpha but is passing the wrong enum, GL_COMPRESSED_RGB_S3TC_DXT1_EXT, instead of GL_COMPRESSED_RGBA_S3TC_DXT1_EXT.
Hi Tapani, I was looking at the wrong rendering task in the log; sorry for comment #10. :)

...
Mesa: glBindTexture GL_TEXTURE_2D 1304
Mesa: FLUSH_VERTICES in bind_texture
Mesa: FLUSH_VERTICES in flush
Mesa: FLUSH_VERTICES in flush
Mesa: FLUSH_VERTICES in flush
Mesa: glBindTexture GL_TEXTURE_2D 1304
Mesa: FLUSH_VERTICES in bind_texture
Mesa: glTexStorage2D GL_TEXTURE_2D 1 GL_RGBA8 256 256 1
Mesa: _mesa_make_current()
Mesa: FLUSH_VERTICES in _mesa_flush
Mesa: FLUSH_CURRENT in _mesa_flush
Mesa: _mesa_make_current()
Mesa: glBindTexture GL_TEXTURE_2D 1256
Mesa: FLUSH_VERTICES in bind_texture
Mesa: glGetError <-- GL_NO_ERROR
Mesa: glGetError <-- GL_NO_ERROR
Mesa: glBindTexture GL_TEXTURE_2D 0
...
(In reply to Gary Wang from comment #7)
> Created attachment 131221 [details]
> Software rendering via Mesa 17.2.0-devel
>
> I also tested with software rendering; the issue can still be observed.
>
> ...
> OpenGL vendor string: Mesa Project
> OpenGL renderer string: Software Rasterizer
> OpenGL version string: 2.1 Mesa 17.2.0-devel (git-a5f82db)
> OpenGL shading language version string: 1.20
> ...

It turns out the Google Earth web page always refuses to run Chrome in non-accelerated mode (glx_accel). I re-tested software rendering for this issue today, and it appears comment #7 above was wrong: it was caused by some modified code in an incorrect state in my local Mesa repository. Sorry, please ignore comment #7.
(In reply to Tapani Pälli from comment #9)
> Created attachment 131258 [details] [review]
> workaround
>
> I saw these same/similar artifacts (they look like compression errors) on
> nvidia as well, but very faintly, so I started to check whether this was
> about alpha, and that seems to be the case. Not sure what the real fix is
> yet, though.

Yes, you are right. I also tested it by forcing GL_COMPRESSED_RGBA_S3TC_DXT1_EXT for the compressed texture in glCompressedTexImage2D():

diff --git a/src/mesa/main/teximage.c b/src/mesa/main/teximage.c
index 1a00d25..9f9ffd6 100644
--- a/src/mesa/main/teximage.c
+++ b/src/mesa/main/teximage.c
@@ -2724,6 +2724,8 @@ override_internal_format(GLenum internalFormat, GLint width, GLint height)
       return internalFormat;
    }
 #else
+   if (internalFormat == GL_COMPRESSED_RGB_S3TC_DXT1_EXT)
+      internalFormat = GL_COMPRESSED_RGBA_S3TC_DXT1_EXT;
    return internalFormat;
 #endif
 }
Created attachment 131287 [details] [review]
s3tc hack-patch

The attached patch attempts to fix the issue with a texture swizzle, but unfortunately it did not work. I confirmed that hard-coding the texture format to include alpha renders properly.
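For the record, the swizzle part of the attempt looks roughly like this (a sketch assuming Mesa's swizzle macros from src/mesa/main/macros.h; the exact hunk is in the attached patch, and dxt1_rgb_swizzle is a made-up helper name):

#include "main/macros.h"   /* MAKE_SWIZZLE4, SWIZZLE_* */

/* Make sampling of the three-channel DXT1 formats return a constant 1.0
 * in the alpha channel via the texture swizzle, regardless of what the
 * hardware decode produces. */
static unsigned
dxt1_rgb_swizzle(void)
{
   return MAKE_SWIZZLE4(SWIZZLE_X, SWIZZLE_Y, SWIZZLE_Z, SWIZZLE_ONE);
}

A swizzle alone cannot repair the RGB channels, though: as the PRM excerpt in the next comment shows, the hardware DXT1_RGB format decodes the color table itself incorrectly, so wrong colors, not just wrong alpha, reach the shader.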
I found some text in the SKL PRM, Vol 5: Memory Views, that explains the errors we're seeing:

Example of Opaque Color Encoding
[...]
00 → color_0
01 → color_1
10 → 2/3 color_0 + 1/3 color_1
11 → 1/3 color_0 + 2/3 color_1
[...]

Programming Note
Context: Opaque Textures (DXT1_RGB)
The behavior of this format is not compliant with the OGL spec.

Here's the OGL spec behaviour for reference:

The RGB color for a texel at location (x,y) in the block is given by:

RGB0,             if color0 > color1 and code(x,y) == 0
RGB1,             if color0 > color1 and code(x,y) == 1
(2*RGB0+RGB1)/3,  if color0 > color1 and code(x,y) == 2
(RGB0+2*RGB1)/3,  if color0 > color1 and code(x,y) == 3

RGB0,             if color0 <= color1 and code(x,y) == 0
RGB1,             if color0 <= color1 and code(x,y) == 1
(RGB0+RGB1)/2,    if color0 <= color1 and code(x,y) == 2
BLACK,            if color0 <= color1 and code(x,y) == 3

Arithmetic operations are done per component, and BLACK refers to an RGB color where red, green, and blue are all zero.

http://developer.download.nvidia.com/opengl/specs/GL_EXT_texture_compression_s3tc.txt
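In code form, the spec-mandated selection looks like this (a reference sketch of one block's color table, not Mesa code; the type and function names are made up):

#include <stdint.h>

typedef struct { int r, g, b; } rgb_t;

/* Expand a packed RGB565 endpoint to 8-bit-per-channel RGB. */
static rgb_t
expand565(uint16_t c)
{
   rgb_t out = {
      ((c >> 11) & 0x1f) << 3,   /* 5-bit red   */
      ((c >>  5) & 0x3f) << 2,   /* 6-bit green */
      ( c        & 0x1f) << 3,   /* 5-bit blue  */
   };
   return out;
}

/* Build the four-entry color table for one 8-byte DXT1 block, following
 * the EXT_texture_compression_s3tc rules quoted above. The comparison of
 * the *packed* endpoint values selects the mode. */
static void
dxt1_color_table(uint16_t color0, uint16_t color1, rgb_t table[4])
{
   const rgb_t c0 = expand565(color0), c1 = expand565(color1);

   table[0] = c0;
   table[1] = c1;
   if (color0 > color1) {
      /* Four-color mode: two interpolated entries. */
      table[2] = (rgb_t){ (2*c0.r + c1.r) / 3, (2*c0.g + c1.g) / 3,
                          (2*c0.b + c1.b) / 3 };
      table[3] = (rgb_t){ (c0.r + 2*c1.r) / 3, (c0.g + 2*c1.g) / 3,
                          (c0.b + 2*c1.b) / 3 };
   } else {
      /* Three-color mode: one midpoint plus BLACK. Per the PRM note, the
       * Intel DXT1_RGB format skips the endpoint comparison entirely and
       * always behaves as if in four-color mode, which is what produces
       * the artifacts. */
      table[2] = (rgb_t){ (c0.r + c1.r) / 2, (c0.g + c1.g) / 2,
                          (c0.b + c1.b) / 2 };
      table[3] = (rgb_t){ 0, 0, 0 };   /* BLACK */
   }
}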
Patches to fix this issue are on the list: https://patchwork.freedesktop.org/series/24323/
Thank you for taking the time to report this issue. This bug has been fixed upstream with the following commit:

commit 688ddb85c8c3357d8e1e9d360c74cd728b128d98
Author: Nanley Chery <nanley.g.chery@intel.com>
Date:   Thu May 11 15:57:59 2017 -0700

    i965/formats: Update the three-channel DXT1 mappings

    The procedure for decompressing an opaque DXT1 OpenGL format is
    dependant on the comparison of two colors stored in the first 32 bits
    of the compressed block. Here's the specified OpenGL behavior for
    reference:

    The RGB color for a texel at location (x,y) in the block is given by:

        RGB0,             if color0 > color1 and code(x,y) == 0
        RGB1,             if color0 > color1 and code(x,y) == 1
        (2*RGB0+RGB1)/3,  if color0 > color1 and code(x,y) == 2
        (RGB0+2*RGB1)/3,  if color0 > color1 and code(x,y) == 3

        RGB0,             if color0 <= color1 and code(x,y) == 0
        RGB1,             if color0 <= color1 and code(x,y) == 1
        (RGB0+RGB1)/2,    if color0 <= color1 and code(x,y) == 2
        BLACK,            if color0 <= color1 and code(x,y) == 3

    The sampling operation performed on an opaque DXT1 Intel format
    essentially hard-codes the comparison result of the two colors as
    color0 > color1. This means that the behavior is incompatible with
    OpenGL. This is stated in the SKL PRM, Vol 5: Memory Views:

        Opaque Textures (DXT1_RGB)
        Texture format DXT1_RGB is identical to DXT1, with the exception
        that the One-bit Alpha encoding is removed. Color 0 and Color 1
        are not compared, and the resulting texel color is derived
        strictly from the Opaque Color Encoding. The alpha channel
        defaults to 1.0.

        Programming Note
        Context: Opaque Textures (DXT1_RGB)
        The behavior of this format is not compliant with the OGL spec.

    The opaque and non-opaque DXT1 OpenGL formats are specified to be
    decoded in exactly the same way except the BLACK value must have a
    transparent alpha channel in the latter. Use the four-channel BC1
    Intel formats with the alpha set to 1 to provide the behavior
    required by the spec. Note that the alpha is already set to 1 for
    RGB formats in brw_get_texture_swizzle().

    v2: Provide a more detailed commit message (Kenneth Graunke).
    v3: Ensure the alpha channel is set to 1 for DXT1 formats.

    Bugzilla: https://bugs.freedesktop.org/show_bug.cgi?id=100925
    Cc: <mesa-stable@lists.freedesktop.org>
    Acked-by: Tapani Pälli <tapani.palli@intel.com> (v1)
    Reviewed-by: Kenneth Graunke <kenneth@whitecape.org>
    Signed-off-by: Nanley Chery <nanley.g.chery@intel.com>
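The gist of the fix, in sketch form (the HW_FORMAT_* names and the table below are placeholders for illustration; the real change lives in the i965 surface-format mappings the commit title refers to):

#include <stdint.h>

/* Placeholder stand-ins so the sketch is self-contained; the real code
 * uses Mesa's MESA_FORMAT_* enums and the i965 hardware format values. */
enum { MESA_FORMAT_RGB_DXT1, MESA_FORMAT_SRGB_DXT1, MESA_FORMAT_COUNT };
enum { HW_FORMAT_BC1_UNORM = 1, HW_FORMAT_BC1_UNORM_SRGB };

/* Route the three-channel Mesa DXT1 formats to the four-channel BC1
 * hardware formats, whose decode honors the color0/color1 comparison.
 * The existing RGB-format swizzle then forces the sampled alpha to 1.0. */
static const uint32_t hw_format_for_mesa_format[MESA_FORMAT_COUNT] = {
   [MESA_FORMAT_RGB_DXT1]  = HW_FORMAT_BC1_UNORM,       /* was the DXT1_RGB entry */
   [MESA_FORMAT_SRGB_DXT1] = HW_FORMAT_BC1_UNORM_SRGB,  /* sRGB variant likewise  */
};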