Created attachment 142725 [details]
The apitraces from when using i965 and llvmpipe

The textures in the game Cat Girl Without Salad, which runs through Wine, are significantly aliased when using hardware rendering. The following message constantly appears in the terminal:

0032:fixme:d3d:texture2d_blt_fbo Unsupported filter mode WINED3D_TEXF_ANISOTROPIC (0x3).

When setting LIBGL_ALWAYS_SOFTWARE=1, llvmpipe is used and the textures appear correctly, as they do when playing the game on Windows. The following two images show the difference at the main menu:

i965: https://i.imgur.com/K02TJ0v.png
llvmpipe: https://i.imgur.com/lk8rpMn.png

The attached file contains the compressed apitraces from both i965 and llvmpipe. The i965 apitrace is rather strange: it did not replay correctly here, opening four windows and showing only a black screen.

glxinfo -B:

name of display: :0.0
display: :0  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Intel Open Source Technology Center (0x8086)
    Device: Mesa DRI Intel(R) HD Graphics 520 (Skylake GT2) (0x1916)
    Version: 19.0.0
    Accelerated: yes
    Video memory: 3072MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 4.5
    Max compat profile version: 3.0
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.2
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 520 (Skylake GT2)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 19.0.0-devel (git-4b218984d8)
OpenGL core profile shading language version string: 4.50
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile

OpenGL version string: 3.0 Mesa 19.0.0-devel (git-4b218984d8)
OpenGL shading language version string: 1.30
OpenGL context flags: (none)

OpenGL ES profile version string: OpenGL ES 3.2 Mesa 19.0.0-devel (git-4b218984d8)
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20

Processor: Intel Core i3-6100U
Video: Intel HD Graphics 520
Architecture: amd64
Mesa: 19.0.0-devel (git-4b218984d8)
Kernel: drm-tip (ff3d336ae5b3ef75e12ed400fd4ccb55701c212a)
Distribution: Xubuntu 18.04.1 amd64
Looking at the draw calls with the awesome frameretrace, I can see that the application renders into a multisampled framebuffer and then does a glBlitFramebuffer into another framebuffer the size of the display. The main difference is that in the llvmpipe trace the blit is done with filter = GL_LINEAR, whereas on i965 it is done with filter = GL_NEAREST. I think that explains the difference, and it is purely the application's choice; there is no driver issue.
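For reference, the blit in question boils down to something like the sketch below. The function and framebuffer names (present_frame, render_fbo, display_fbo) and the sizes are placeholders I made up; only the filter argument reflects what actually differs between the two traces.

    #include <GL/gl.h>  /* GL 3.0+; a real program would resolve glBlitFramebuffer via a loader */

    /* Blit the rendered frame into the display-sized framebuffer.
     * llvmpipe trace: filter == GL_LINEAR  (smooth scaling)
     * i965 trace:     filter == GL_NEAREST (point sampling, hence the aliasing) */
    static void present_frame(GLuint render_fbo, GLuint display_fbo,
                              GLint src_w, GLint src_h,
                              GLint dst_w, GLint dst_h,
                              GLenum filter)
    {
        glBindFramebuffer(GL_READ_FRAMEBUFFER, render_fbo);
        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, display_fbo);
        glBlitFramebuffer(0, 0, src_w, src_h,
                          0, 0, dst_w, dst_h,
                          GL_COLOR_BUFFER_BIT, filter);
    }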
I think the application decides not to use anisotropic filtering on llvmpipe because it is not supported there, whereas it is on i965. But anisotropic blitting doesn't make sense and is not supported in OpenGL, which is why Wine falls back to nearest. Someone needs to investigate how that filter gets selected in Wine or in the application.
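To make the fallback concrete, here is a hypothetical sketch of the kind of mapping involved. This is not Wine's actual code; the enum values simply follow the fixme message quoted above (WINED3D_TEXF_ANISOTROPIC = 0x3).

    /* Purely illustrative; not taken from the wined3d sources. */
    enum wined3d_texture_filter_type
    {
        WINED3D_TEXF_NONE        = 0,
        WINED3D_TEXF_POINT       = 1,
        WINED3D_TEXF_LINEAR      = 2,
        WINED3D_TEXF_ANISOTROPIC = 3,
    };

    static GLenum gl_filter_for_blit(enum wined3d_texture_filter_type f)
    {
        switch (f)
        {
            case WINED3D_TEXF_LINEAR:
                return GL_LINEAR;   /* the path the llvmpipe run ends up on */
            case WINED3D_TEXF_ANISOTROPIC:
                /* glBlitFramebuffer only accepts GL_NEAREST or GL_LINEAR,
                 * so there is no anisotropic option; the behaviour observed
                 * here is a fallback to GL_NEAREST plus the fixme message,
                 * which is what causes the aliasing on i965. */
                return GL_NEAREST;
            default:
                return GL_NEAREST;  /* point and unknown filters */
        }
    }

Treating the anisotropic case as GL_LINEAR for blits would presumably be the fix on the Wine side.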
I have reported this at the WineHQ Bugzilla as https://bugs.winehq.org/show_bug.cgi?id=46246, which was marked as a duplicate of https://bugs.winehq.org/show_bug.cgi?id=41929. The patch is already in Wine Staging 4.0-rc3. In comment 9 of the second bug report I analyzed the results, and the experience on Windows and Linux is now nearly the same to me: the result is nearly perfect, as the comparison image shows. Thank you; it was probably thanks to your answer that the problem in Wine was found so quickly.