Hello, CS:GO is not rendered correctly on an Intel HD 5500 (Intel(R) Core(TM) i7-5600U) using Mesa git (as of 02a4fe22b137d4bc8378bedd8319109fd23a50e3). The issue was also present in Mesa 10.6 and 10.6.1; I haven't tested older versions.

libdrm version is 2.4.62
kernel version is 4.1.4
xf86-video-intel version is 1:2.99.917+381+g5772556

Here are some screenshots comparing the output of an Intel HD 4000 (Mesa 10.6.1) on the left with the HD 5500 on the right. It looks like the HDR/bloom post-processing filters are missing, causing the game to render much darker in a lot of areas, which makes it unplayable there.

http://0x5c.me/intel/01.png
http://0x5c.me/intel/02.png
http://0x5c.me/intel/03.png
http://0x5c.me/intel/04.png

The issue is most visible in 01.png and 03.png. Let me know if you need more debug information.
How should I proceed to debug this issue further?
Here are some screenshots that highlight the issue a bit more (Ivy Bridge is on the left, Broadwell on the right):

http://0x5c.me/intel2/02.png
http://0x5c.me/intel2/03.png
http://0x5c.me/intel2/04.png
http://0x5c.me/intel2/05.png

Some information about the setup: the machine is a Lenovo T450s with an Intel(R) Core(TM) i7-5600U @ 2.60GHz and 12GB of RAM running Arch Linux. It is installed/booted using the legacy BIOS method (could that have an influence?).

Mesa version is git as of df97126731a745c1797c783414a44652be039d84.
libdrm version is 2.4.63
xf86-video-intel version is based on git master as of last week.

I also get better framerates with the Ivy Bridge setup (Intel(R) Core(TM) i7-3520M CPU @ 2.90GHz). At the same resolution, using the same video settings, on the following benchmark map http://steamcommunity.com/sharedfiles/filedetails/?id=500334237:
* the Ivy Bridge setup gives 42fps average.
* the Broadwell setup gives 31fps average.

The same performance difference appears running glxgears fullscreen (same resolution) with vblank_mode=0 (a minimal example is shown below). There is no trace of CPU throttling in dmesg during the tests. The GPU frequency seems to be stuck at 950MHz during the test most of the time, occasionally dropping to 900MHz (according to the intel_gpu_frequency tool).
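For reference, this is roughly the glxgears invocation used for that comparison (vblank_mode=0 disables vsync for the Mesa DRI drivers; -fullscreen is a standard glxgears option):

  vblank_mode=0 glxgears -fullscreen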
Dylan: Can you see if you can reproduce this on one of our local systems? Matthieu: Can you try running with the environment variable "INTEL_DEBUG=vec4vs". I'm not 100% sure how to do that in Steam, but Dylan or Google may be able to help. This will remove one difference between the IVB and BDW systems.
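For reference, two common ways to set the variable (a sketch assuming a typical Steam setup; the second form goes into the game's "Set Launch Options..." dialog and applies per-game):

  # from a terminal, before launching Steam
  export INTEL_DEBUG=vec4vs
  steam

  # or per-game, via Steam's launch options
  INTEL_DEBUG=vec4vs %command%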
> Matthieu: Can you try running with the environment variable
> "INTEL_DEBUG=vec4vs". I'm not 100% sure how to do that in Steam, but Dylan
> or Google may be able to help. This will remove one difference between the
> IVB and BDW systems.

Exporting the variable (with export) before launching Steam seems to do the trick; however, the issue is still there. Would the vec4vs output be useful to you?
(In reply to Matthieu Bouron from comment #4)
> Exporting the variable (with export) before launching Steam seems to do the
> trick; however, the issue is still there. Would the vec4vs output be useful
> to you?

Thanks for trying. If that doesn't work around the problem, output from that configuration won't help. I knew it was a bit of a long shot.
I can reproduce the issue on an Intel(R) Core(TM) i5-5250U CPU @ 1.60GHz (HD 6000) running Arch Linux (installed/booted in legacy BIOS mode, not UEFI).

libdrm: 2.4.64
mesa: 10.6.4
xf86-video-intel: 2.99.917+381+g5772556-1
linux: 4.1.5
If there is anything I can do to help fix the issue, please let me know.
I can confirm having the same issue on a T450s with an i7-5600U CPU, running Arch Linux.

libdrm: 2.4.64
mesa: 10.6.7
xf86-video-intel: 1:2.99.917+381+g5772556
linux: 4.1.6

Please let me know if I can help in any way.
Same issue here, on the same system (Arch Linux), so the library versions mentioned above are the same. The hardware is a NUC with Intel HD Graphics 6000 (http://www.intel.com/content/www/us/en/nuc/nuc-kit-nuc5i5ryh.html). The game is almost unplayable because of this.
(In reply to Ian Romanick from comment #5)
> Thanks for trying. If that doesn't work around the problem, output from
> that configuration won't help. I knew it was a bit of a long shot.

I think Matthieu's reply has confused the issue. I think he meant that INTEL_DEBUG=vec4vs indeed works around the issue, but that the underlying issue still exists, not that using INTEL_DEBUG=vec4vs did not do anything differently.
(In reply to Matt Turner from comment #10)
> I think Matthieu's reply has confused the issue. I think he meant that
> INTEL_DEBUG=vec4vs indeed works around the issue, but that the underlying
> issue still exists, not that using INTEL_DEBUG=vec4vs did not do anything
> differently.

Sorry, my initial reply was not very clear. INTEL_DEBUG=vec4vs did *not* work around the issue.
Could you run Steam from a terminal, start the game, and report whether there are e.g. any Mesa shader compiler errors?

(In reply to Matthieu Bouron from comment #2)
> I also get better framerates with the Ivy Bridge setup (Intel(R) Core(TM)
> i7-3520M CPU @ 2.90GHz). At the same resolution, using the same video
> settings, on the following benchmark map
> http://steamcommunity.com/sharedfiles/filedetails/?id=500334237:
> * the Ivy Bridge setup gives 42fps average.

16 EUs @ 1250 MHz

> * the Broadwell setup gives 31fps average.

24 EUs @ 950 MHz -> more ALU power

> The same performance difference appears running glxgears fullscreen (same
> resolution) with vblank_mode=0. There is no trace of CPU throttling in
> dmesg during the tests. The GPU frequency seems to be stuck at 950MHz
> during the test most of the time, occasionally dropping to 900MHz
> (according to the intel_gpu_frequency tool).

I think that tool just reports (or sets) kernel GPU frequency requests; it doesn't report at what speed the GPU actually runs (that can be limited by the firmware due to TDP or temperature).

You need to poll the actual GPU frequency, e.g. like this (as root):
while true; do cat /sys/class/drm/card0/gt_act_freq_mhz; sleep 1; done

If your GPU frequency doesn't get TDP limited [1], you either have slower memory in the BDW machine, or you are using only one memory channel. Check "sudo dmidecode" output; it will tell you in which channels your memory sits and what its speed is (see the example below).

[1] Your BDW has a 15W limit (while your IVB has a 35W limit):
http://ark.intel.com/products/85215/Intel-Core-i7-5600U-Processor-4M-Cache-up-to-3_20-GHz
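For example, something like this prints the populated slots, sizes, and speeds (a sketch; the exact DMI field names vary between BIOS vendors):

  sudo dmidecode -t memory | grep -E 'Locator|Size|Speed'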
(In reply to Eero Tamminen from comment #12)
> Could you run Steam from a terminal, start the game, and report whether
> there are e.g. any Mesa shader compiler errors?

It looks like there are no Mesa compiler errors, according to http://pastie.org/private/tx8oids2l8zzw5dafthbcg.

> You need to poll the actual GPU frequency, e.g. like this (as root):
> while true; do cat /sys/class/drm/card0/gt_act_freq_mhz; sleep 1; done

It stays at 950MHz most of the time, sometimes dropping to 900MHz, even though the framerate is erratic, constantly going from 60+ to 15fps while looking at the same spot.
Same with a NUC5i7RYH (Intel® Iris™ Graphics 6100, http://www.intel.com/content/www/us/en/nuc/nuc-kit-nuc5i7ryh.html → http://ark.intel.com/products/84993/Intel-Core-i7-5557U-Processor-4M-Cache-up-to-3_40-GHz?q=i7-5557U):

http://imgur.com/a/bydpS
The issue is still there with mesa 11.1.2.
This seems fixed for me with 11.2.0 on the Intel(R) Iris 6100 (Broadwell GT3).
The issue is still there with mesa 11.2.0 for the HD Graphics 5500 (Broadwell GT2) (0x1616).

Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Intel Open Source Technology Center (0x8086)
    Device: Mesa DRI Intel(R) HD Graphics 5500 (Broadwell GT2) (0x1616)
    Version: 11.2.0
    Accelerated: yes
    Video memory: 3072MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 3.3
    Max compat profile version: 3.0
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.1
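For reference, this summary can be regenerated with the glxinfo utility from mesa-demos (assuming a reasonably recent build; -B prints just this brief block):

  glxinfo -B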
Could you try a recent snapshot of master? We've recently overhauled our Gen8+ blitting code to follow the approach we took on Gen6-7, which could have an effect on this.
(In reply to Kenneth Graunke from comment #18) > Could you try a recent snapshot of master? We've recently overhauled our > Gen8+ blitting code to follow the approach we took on Gen6-7, which could > have an effect on this. I'm currently rebuilding a snapshot from today. I'll post the results when possible.
(In reply to Matthieu Bouron from comment #19)
> (In reply to Kenneth Graunke from comment #18)
> > Could you try a recent snapshot of master? We've recently overhauled our
> > Gen8+ blitting code to follow the approach we took on Gen6-7, which could
> > have an effect on this.
>
> I'm currently rebuilding a snapshot from today. I'll post the results when
> possible.

Late reply, but I was not able to rebuild a snapshot that day because the LLVM dependency failed to build.

I see the same issue with an Intel(R) Core(TM) i7-6770HQ CPU @ 2.60GHz / Iris Pro 580 (from the Intel NUC Skull Canyon) with mesa 12.0.3.
Same here as in the previous comment. It's still a nightmare: http://imgur.com/a/Tq49L
Problem still present in 13.0.3. Allow me to add a new compilation of pictures: http://imgur.com/a/xyzT6
Issue still there in mesa 17.0.0.
Hello. I can confirm the issue on KBL as well (checked with mesa 17.2.8 and 18.1.0). At the same time the game looks fine on the Radeon GPU in the same laptop. I also found that on SNB (mesa 18.0.1, the default from openSUSE) the issue doesn't exist. As a side note, in Source games the brightness setting doesn't work at all.
Created attachment 139342 [details] screenshots from radeon and intel
The game was tested on Kaby Lake and the textures were dark. The in-game brightness setting doesn't work on any OS or GPU. Brightness on KBL can be fixed either by changing it globally in the system, or by changing the Steam settings: add the line "setting.mat_tonemapping_occlusion_use_stencil" "1" to ~/.steam/steam/userdata/844520631/730/local/cfg/video.txt. This approach increases the brightness on Kaby Lake but does nothing on Ivy Bridge. Changing the default brightness in videodefaults.txt doesn't work. Initially video.txt contains the line "setting.mat_monitorgamma" "2.200000", but after changing the brightness in game that line disappears from the file, and Mesa can't be what deletes a brightness line. By changing these settings the user can obtain textures like with the Radeon. So the issues don't look connected to Mesa. A sketch of the relevant file entries follows below.
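For anyone trying that workaround, the relevant entries sit inside video.txt and look roughly like this (a sketch; the userdata ID in the path is account-specific, and the gamma line is the default that the game later removes):

  // in ~/.steam/steam/userdata/<your-id>/730/local/cfg/video.txt
  "setting.mat_monitorgamma"                        "2.200000"
  "setting.mat_tonemapping_occlusion_use_stencil"   "1"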
-- GitLab Migration Automatic Message -- This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity. You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/mesa/mesa/issues/1488.