Bug 91617

Summary: [BDW] Incorrect rendering in CS:GO
Product: Mesa
Component: Drivers/DRI/i965
Version: git
Hardware: Other
OS: All
Status: RESOLVED MOVED
Severity: normal
Priority: medium
Assignee: Ian Romanick <idr>
Reporter: Matthieu Bouron <matthieu.bouron>
QA Contact: Intel 3D Bugs Mailing List <intel-3d-bugs>
CC: baker.dylan.c, eero.t.tamminen, freedesktop, mattst88, remi
Bug Blocks: 93185
Attachments: screenshots from radeon and intel

Description Matthieu Bouron 2015-08-12 16:06:03 UTC
Hello,

CS:GO is not rendered correctly on an Intel HD 5500 (Intel(R) Core(TM) i7-5600U) using mesa git (as of 02a4fe22b137d4bc8378bedd8319109fd23a50e3). The issue was also present in mesa 10.6 and 10.6.1; I haven't tested older versions.

libdrm version is 2.4.62
kernel version is 4.1.4
xf86-video-intel version is 1:2.99.917+381+g5772556

Here are some screenshots comparing the output from an Intel HD 4000 (mesa 10.6.1) on the left of each screenshot with the HD 5500 on the right.

It looks like the HDR/bloom postprocessing filters are missing, causing the game to render darker in a lot of areas, which makes it unplayable in those areas.

http://0x5c.me/intel/01.png
http://0x5c.me/intel/02.png
http://0x5c.me/intel/03.png
http://0x5c.me/intel/04.png

The issue is most visible in screenshots 01.png and 03.png.

Let me know if you need more debug information.
Comment 1 Matthieu Bouron 2015-08-15 10:05:12 UTC
How should I proceed to further debug this issue?
Comment 2 Matthieu Bouron 2015-08-18 10:03:54 UTC
Here are some screenshots that highlight the issue a bit more (ivy bridge is on the left and broadwell on the right):

http://0x5c.me/intel2/02.png
http://0x5c.me/intel2/03.png
http://0x5c.me/intel2/04.png
http://0x5c.me/intel2/05.png

Here is some information about the setup:

The machine is a Lenovo T450s with an Intel(R) Core(TM) i7-5600U @ 2.60GHz and 12GB of RAM, running Arch Linux. It is installed/booted using the legacy BIOS method (could that have an influence?).

Mesa version is git as of df97126731a745c1797c783414a44652be039d84.
libdrm version is 2.4.63
xf86-video-intel version is based on git master as of last week.

I also experience better framerates with the ivy bridge setup (Intel(R) Core(TM) i7-3520M CPU @ 2.90GHz) at the same resolution, using the same video settings, on the following benchmark test (http://steamcommunity.com/sharedfiles/filedetails/?id=500334237):
  * the ivy bridge setup gives 42fps average.
  * the broadwell setup gives 31fps average.

The same performance difference appears when running glxgears fullscreen (same resolution) with vblank_mode=0.
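
For reference, a minimal sketch of such a run (assuming the mesa-demos glxgears, which accepts a -fullscreen flag; the vblank_mode=0 environment variable disables vsync in Mesa's DRI drivers):

  vblank_mode=0 glxgears -fullscreen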

There is no trace of CPU throttling in dmesg during the tests.
The GPU frequency seems to be stuck at 950MHz during the test most of the time, occasionally dropping to 900MHz (according to the intel_gpu_frequency tool).
Comment 3 Ian Romanick 2015-08-19 03:00:53 UTC
Dylan: Can you see if you can reproduce this on one of our local systems?

Matthieu: Can you try running with the environment variable "INTEL_DEBUG=vec4vs".  I'm not 100% sure how to do that in Steam, but Dylan or Google may be able to help.  This will remove one difference between the IVB and BDW systems.
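
For reference, a sketch of two common ways to set it, neither verified here (the second uses Steam's per-game launch options and its %command% placeholder):

  # from a terminal, before launching Steam:
  export INTEL_DEBUG=vec4vs
  steam

  # or via the game's Properties > Set Launch Options in Steam:
  INTEL_DEBUG=vec4vs %command%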
Comment 4 Matthieu Bouron 2015-08-19 07:46:15 UTC
> Matthieu: Can you try running with the environment variable
> "INTEL_DEBUG=vec4vs".  I'm not 100% sure how to do that in Steam, but Dylan
> or Google may be able to help.  This will remove one difference between the
> IVB and BDW systems.

Exporting the variable (with export) before launching steam seems to do the trick; however, the issue is still there. Would the vec4vs output be useful to you?
Comment 5 Ian Romanick 2015-08-20 18:36:56 UTC
(In reply to Matthieu Bouron from comment #4)
> > Matthieu: Can you try running with the environment variable
> > "INTEL_DEBUG=vec4vs".  I'm not 100% sure how to do that in Steam, but Dylan
> > or Google may be able to help.  This will remove one difference between the
> > IVB and BDW systems.
> 
> Exporting the variable (with export) before launching steam seems to do the
> trick; however, the issue is still there. Would the vec4vs output be useful
> to you?

Thanks for trying.  If that doesn't work around the problem, output from that configuration won't help.  I knew it was a bit of a long shot.
Comment 6 Matthieu Bouron 2015-08-22 12:20:49 UTC
I can reproduce the issue on an Intel(R) Core(TM) i5-5250U CPU @ 1.60GHz (HD 6000) running Arch Linux (installed/booted using legacy BIOS mode, not UEFI).

libdrm: 2.4.64
mesa: 10.6.4
xf86-video-intel: 2.99.917+381+g5772556-1
linux: 4.1.5
Comment 7 Matthieu Bouron 2015-08-31 16:41:25 UTC
If there is anything I can do to help fix the issue, please let me know.
Comment 8 Nofun 2015-09-14 04:22:23 UTC
I can confirm the same issue on a T450s with an i7-5600U CPU, running Arch Linux.

libdrm: 2.4.64
mesa: 10.6.7
xf86-video-intel: 1:2.99.917+381+g5772556
linux: 4.1.6

Please let me know if I can help in any way.
Comment 9 ubitux 2015-09-23 08:11:51 UTC
Same issue here. Same system (Arch Linux), so the library versions mentioned are the same. The hardware is a NUC with Intel HD Graphics 6000 (http://www.intel.com/content/www/us/en/nuc/nuc-kit-nuc5i5ryh.html).

Game is almost unplayable because of this.
Comment 10 Matt Turner 2015-12-03 19:21:44 UTC
(In reply to Ian Romanick from comment #5)
> (In reply to Matthieu Bouron from comment #4)
> > > Matthieu: Can you try running with the environment variable
> > > "INTEL_DEBUG=vec4vs".  I'm not 100% sure how to do that in Steam, but Dylan
> > > or Google may be able to help.  This will remove one difference between the
> > > IVB and BDW systems.
> > 
> > Exporting the variable (with export) before launching steam seems to do the
> > trick; however, the issue is still there. Would the vec4vs output be useful
> > to you?
> 
> Thanks for trying.  If that doesn't work around the problem, output from that
> configuration won't help.  I knew it was a bit of a long shot.

I think Matthieu's reply has confused the issue. I think he meant that INTEL_DEBUG=vec4vs indeed works around the issue, but that the underlying issue still exists, not that using INTEL_DEBUG=vec4vs did not do anything differently.
Comment 11 Matthieu Bouron 2015-12-04 08:40:04 UTC
(In reply to Matt Turner from comment #10)
> (In reply to Ian Romanick from comment #5)
> > (In reply to Matthieu Bouron from comment #4)
> > > > Matthieu: Can you try running with the environment variable
> > > > "INTEL_DEBUG=vec4vs".  I'm not 100% sure how to do that in Steam, but Dylan
> > > > or Google may be able to help.  This will remove one difference between the
> > > > IVB and BDW systems.
> > > 
> > > Exporting the variable (with export) before launching steam seems to do the
> > > trick; however, the issue is still there. Would the vec4vs output be useful
> > > to you?
> > 
> > Thanks for trying.  If that doesn't work around the problem, output from that
> > configuration won't help.  I knew it was a bit of a long shot.
> 
> I think Matthieu's reply has confused the issue. I think he meant that
> INTEL_DEBUG=vec4vs indeed works around the issue, but that the underlying
> issue still exists, not that using INTEL_DEBUG=vec4vs did not do anything
> differently.

Sorry, my initial reply was not very clear. INTEL_DEBUG=vec4vs did *not* work around the issue.
Comment 12 Eero Tamminen 2015-12-08 10:20:09 UTC
Could you run Steam from a terminal, start the game, and report whether there are e.g. any Mesa shader compiler errors?
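
For reference, a minimal sketch of one way to capture that output from a shell (steam.log is just an example filename):

  steam 2>&1 | tee steam.log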

(In reply to Matthieu Bouron from comment #2)
> I also experience better framerates with the ivy bridge setup (Intel(R)
> Core(TM) i7-3520M CPU @ 2.90GHz) at the same resolution, using the same
> video settings, on the following benchmark test
> (http://steamcommunity.com/sharedfiles/filedetails/?id=500334237):
>   * the ivy bridge setup gives 42fps average.

16 EUs @ 1250 MHz

>   * the broadwell setup gives 31fps average.

24 EUs @ 950 MHz -> more ALU power


> The same performance difference appears when running glxgears fullscreen
> (same resolution) with vblank_mode=0.
>
> There is no trace of CPU throttling in dmesg during the tests.
> The GPU frequency seems to be stuck at 950MHz during the test most of the
> time, occasionally dropping to 900MHz (according to the intel_gpu_frequency
> tool).

I think that just reads (or sets) the kernel's GPU frequency request; it doesn't report the speed the GPU actually runs at (that can be limited by firmware due to TDP or temperature).

You need to poll actual GPU frequency, e.g. like this (as root):
  while true; do cat /sys/class/drm/card0/gt_act_freq_mhz; sleep 1; done

If your GPU freq doesn't get TDP limited [1], you either have slower memory in the BDW machine, or you are using only one memory channel.  Check "sudo dmidecode" output; it will tell you which channels your memory is in and what its speed is.
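
For example, something like the following should list the populated memory slots with their channel locators and speeds (exact field names vary between dmidecode versions):

  sudo dmidecode -t memory | grep -E 'Locator|Size|Speed'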

[1] Your BDW seems to have a 15W limit (while your IVB has a 35W limit):
http://ark.intel.com/products/85215/Intel-Core-i7-5600U-Processor-4M-Cache-up-to-3_20-GHz
Comment 13 Matthieu Bouron 2015-12-15 22:25:42 UTC
(In reply to Eero Tamminen from comment #12)
> Could you run Steam from a terminal, start the game, and report whether
> there are e.g. any Mesa shader compiler errors?

It looks like there are no Mesa compiler errors, according to http://pastie.org/private/tx8oids2l8zzw5dafthbcg.

> 
> (In reply to Matthieu Bouron from comment #2)
> > I also experience better framerates with the ivy bridge setup (Intel(R)
> > Core(TM) i7-3520M CPU @ 2.90GHz) at the same resolution, using the same
> > video settings, on the following benchmark test
> > (http://steamcommunity.com/sharedfiles/filedetails/?id=500334237):
> >   * the ivy bridge setup gives 42fps average.
> 
> 16 EUs @ 1250 MHz
> 
> >   * the broadwell setup gives 31fps average.
> 
> 24 EUs @ 950 MHz -> more ALU power
> 
> 
> > The same performance difference appears when running glxgears fullscreen
> > (same resolution) with vblank_mode=0.
> >
> > There is no trace of CPU throttling in dmesg during the tests.
> > The GPU frequency seems to be stuck at 950MHz during the test most of the
> > time, occasionally dropping to 900MHz (according to the
> > intel_gpu_frequency tool).
> 
> I think that just reads (or sets) the kernel's GPU frequency request; it
> doesn't report the speed the GPU actually runs at (that can be limited by
> firmware due to TDP or temperature).
> 
> You need to poll actual GPU frequency, e.g. like this (as root):
>   while true; do cat /sys/class/drm/card0/gt_act_freq_mhz; sleep 1; done
> 

It stays at 950MHz most of the time, sometimes dropping to 900MHz, even though the framerate is erratic, constantly going from 60+ to 15fps while looking at the same place.

> If your GPU freq doesn't get TDP limited [1], you either have slower memory
> in the BDW machine, or you are using only one memory channel.  Check "sudo
> dmidecode" output; it will tell you which channels your memory is in and
> what its speed is.
> 
> [1] Your BDW seems to have a 15W limit (while your IVB has a 35W limit):
> http://ark.intel.com/products/85215/Intel-Core-i7-5600U-Processor-4M-Cache-
> up-to-3_20-GHz
Comment 15 Matthieu Bouron 2016-03-22 09:20:39 UTC
The issue is still there with mesa 11.1.2.
Comment 16 ubitux 2016-04-11 08:30:43 UTC
This seems fixed for me with 11.2.0 for the Intel(R) Iris 6100 (Broadwell GT3).
Comment 17 Matthieu Bouron 2016-04-11 08:33:36 UTC
The issue is still there with mesa 11.2.0 for the HD Graphics 5500 (Broadwell GT2) (0x1616).

Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Intel Open Source Technology Center (0x8086)
    Device: Mesa DRI Intel(R) HD Graphics 5500 (Broadwell GT2)  (0x1616)
    Version: 11.2.0
    Accelerated: yes
    Video memory: 3072MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 3.3
    Max compat profile version: 3.0
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.1
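
For reference, this block can be reproduced with glxinfo from mesa-demos (assuming a version recent enough to support the -B flag):

  glxinfo -B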
Comment 18 Kenneth Graunke 2016-05-08 04:35:19 UTC
Could you try a recent snapshot of master?  We've recently overhauled our Gen8+ blitting code to follow the approach we took on Gen6-7, which could have an effect on this.
Comment 19 Matthieu Bouron 2016-05-10 10:47:44 UTC
(In reply to Kenneth Graunke from comment #18)
> Could you try a recent snapshot of master?  We've recently overhauled our
> Gen8+ blitting code to follow the approach we took on Gen6-7, which could
> have an effect on this.

I'm currently rebuilding a snapshot from today. I'll post the results when possible.
Comment 20 Matthieu Bouron 2016-09-21 08:26:51 UTC
(In reply to Matthieu Bouron from comment #19)
> (In reply to Kenneth Graunke from comment #18)
> > Could you try a recent snapshot of master?  We've recently overhauled our
> > Gen8+ blitting code to follow the approach we took on Gen6-7, which could
> > have an effect on this.
> 
> I'm currently rebuilding a snapshot from today. I'll post the results when
> possible.

Late reply, but I was not able to build a snapshot that day because the llvm dependency failed to build.

I have the same issue with the Intel(R) Core(TM) i7-6770HQ CPU @ 2.60GHz / Iris Pro 580 (from the Intel NUC Skull Canyon) with mesa 12.0.3.
Comment 21 ubitux 2016-09-21 08:30:12 UTC
Same here as in that comment. It's still a nightmare: http://imgur.com/a/Tq49L
Comment 22 ubitux 2017-01-31 11:56:45 UTC
Problem still present in 13.0.3.

Allow me to add a new compilation of pictures: http://imgur.com/a/xyzT6
Comment 23 ubitux 2017-02-20 09:45:21 UTC
Issue still there in mesa 17.0.0.
Comment 24 Denis 2018-05-04 10:05:26 UTC
Hello. I can also confirm the issue on KBL (checked with mesa 17.2.8 and 18.1.0).
(At the same time, the game looks fine on the radeon GPU in the same laptop.)

I also found out that on SNB (mesa 18.0.1, the default from openSUSE) the issue doesn't exist.

A side note: in Source games, the brightness setting doesn't work at all.
Comment 25 Denis 2018-05-04 10:25:21 UTC
Created attachment 139342 [details]
screenshots from radeon and intel
Comment 26 oleksandr.nikitin 2018-05-18 07:44:32 UTC
The game was tested on Kaby Lake and the textures were dark.
The in-game brightness setting doesn't work on any OS or GPU.
Brightness on KBL can be fixed by changing it globally in the system, or by changing settings in Steam: add the line "setting.mat_tonemapping_occlusion_use_stencil" "1" to ~/.steam/steam/userdata/844520631/730/local/cfg/video.txt (see the sketch below).
This approach increases brightness on Kaby Lake but does nothing on Ivy Bridge.
Changing the default brightness in videodefaults.txt doesn't work.
Initially video.txt contains the line "setting.mat_monitorgamma" "2.200000", but after changing the brightness in game this line disappears from the file.

Mesa can't be what deletes this brightness-setting line. By changing settings in Steam, a user can obtain textures like those on radeon. It looks like these issues are not connected with Mesa.
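
For reference, a minimal sketch of the relevant video.txt entries, assuming the usual Source-engine KeyValues layout (the userdata ID in the path and the surrounding keys vary per install; only the two lines discussed above are taken from this report):

  "VideoConfig"
  {
      // hypothetical minimal excerpt; a real video.txt carries many more settings
      "setting.mat_monitorgamma"                        "2.200000"
      "setting.mat_tonemapping_occlusion_use_stencil"   "1"
  }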
Comment 27 GitLab Migration User 2019-09-25 18:54:18 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/mesa/mesa/issues/1488.

Use of freedesktop.org services, including Bugzilla, is subject to our Code of Conduct. How we collect and use information is described in our Privacy Policy.