Bug 92059 - [radeonsi, apitrace] Missing textures and geometry in "Middle-earth: Shadow of Mordor"
Summary: [radeonsi, apitrace] Missing textures and geometry in "Middle-earth: Shadow o...
Status: RESOLVED FIXED
Alias: None
Product: Mesa
Classification: Unclassified
Component: Drivers/Gallium/radeonsi
Version: git
Hardware: x86-64 (AMD64) Linux (All)
Importance: medium normal
Assignee: Default DRI bug account
QA Contact: Default DRI bug account
URL:
Whiteboard: apitrace
Keywords:
Depends on:
Blocks: 77449
Reported: 2015-09-20 17:30 UTC by Kai
Modified: 2016-06-20 19:12 UTC (History)
9 users

See Also:


Attachments
Screenshot from the benchmark mode showing missing geometry and textures (730.56 KB, image/jpeg)
2015-09-20 17:30 UTC, Kai
Details
Screenshot from the first moments in the game (900.49 KB, image/jpeg)
2015-09-20 17:32 UTC, Kai
Details
Compressed shader that fails during compilation (1.31 KB, application/x-xz)
2015-09-21 17:00 UTC, Kai
Details
shader that fails during compilation (7.97 KB, text/plain)
2015-09-21 17:02 UTC, Ilia Mirkin
Details
Setting environment variables yields visible bodies. (1.02 MB, image/jpeg)
2015-09-22 14:03 UTC, Kai
Details
som.log (219.46 KB, text/plain)
2015-11-17 20:11 UTC, Lorenzo Bona
Details
Screenshot on Tonga 380X with Mesa git-59156b2 (275.35 KB, image/jpeg)
2016-05-14 20:02 UTC, Vedran Miletić
Details

Description Kai 2015-09-20 17:30:55 UTC
Created attachment 118375 [details]
Screenshot from the benchmark mode showing missing geometry and textures

After launching "Middle-earth: Shadow of Mordor" I noticed that large parts of the player character's geometry – everything except the head – were missing; the character is shown in the background of the main menu. It became even more obvious when running the benchmark, where the entire floor is not rendered (see the attached screenshot) and, again, the player character has no body, with only a head and weapons rendered. I haven't played with the graphics settings extensively yet, but when I first noticed the missing geometry in the main menu, the settings were all on "low", since the game officially supports only the proprietary Nvidia driver and falls back to "bare minimum" with everything else. The benchmark was run with "high" settings.

I'll also attach a screenshot from the first moments in the game, where you can see that this issue seems to affect other characters as well. While a search for a similar bug with Shadow of Mordor didn't yield a result, I did find bug 91110, which also mentions missing character geometry/textures with the radeonsi driver, albeit with a different game.

The game was run on the following stack (Debian testing as a base):
GPU: Hawaii PRO [Radeon R9 290] (ChipID = 0x67b1)
Mesa: Git:master/9ffc1049ca
libdrm: 2.4.64-1
LLVM: SVN:trunk/r247876 (3.8 devel)
X.Org: 2:1.17.2-1.1
Linux: 4.1.6
Firmware: <https://secure.freedesktop.org/~agd5f/radeon_ucode/hawaii/>
> 286640da3d90d7b51bdb038b65addc47  hawaii_ce.bin
> 161105a73f7dfb2fca513327491c32d6  hawaii_mc.bin
> d6195059ea724981c9acd3abd6ee5166  hawaii_me.bin
> ad511d31a4fe3147c8d80b8f6770b8d5  hawaii_mec.bin
> 63eae3f33c77aadbc6ed1a09a2aed81e  hawaii_pfp.bin
> 5b72c73acf0cbd0cbb639302f65bc7dc  hawaii_rlc.bin
> f00de91c24b3520197e1ddb85d99c34a  hawaii_sdma1.bin
> 8e16f749d62b150d0d1f580d71bc4348  hawaii_sdma.bin
> 7b6ca5302b56bd35bf52804919d57e63  hawaii_smc.bin
> 9f2ba7e720e2af4d7605a9a4fd903513  hawaii_uvd.bin
> b0f2a043e72fbf265b2f858b8ddbdb09  hawaii_vce.bin
libclc: Git:master/4346c30bae
DDX: Git:master/0288a4b87b

Let me know if you need anything else. I'm guessing an apitrace might be high on your list?
Comment 1 Kai 2015-09-20 17:32:41 UTC
Created attachment 118376 [details]
Screenshot from the first moments in the game

As you can see from this screenshot, the geometry/textures seem to be missing for all characters, not just the player character. Interestingly enough, the shadows are drawn correctly for both characters.
Comment 2 smoki 2015-09-20 23:38:43 UTC
Good that you can start it. I've heard the game uses compute shaders with atomic counters, so there are a bunch of extensions we're missing.
Comment 3 Kenneth Graunke 2015-09-21 15:19:45 UTC
It looks pretty much the same on i965/broadwell as well.  FWIW, I've overheard that the game requires tessellation shaders for terrain and character rendering, and if your driver doesn't support them, the game simply skips the associated drawing.

But on radeonsi, you should have tessellation, so maybe there's something else going on...
Comment 4 Kai 2015-09-21 15:33:57 UTC
(In reply to Kenneth Graunke from comment #3)
> It looks pretty much the same on i965/broadwell as well.  FWIW, I've
> overheard that the game requires tessellation shaders for terrain and
> character rendering, and if your driver doesn't support them, the game
> simply skips the associated drawing.

Nice... shouldn't the game check the available features and then either fall back to some legacy code path or refuse to run with a sensible error message? Ah well.
I also find this a bit curious, since there *is* a checkbox in the graphics menu letting me toggle tessellation.

> But on radeonsi, you should have tessellation, so maybe there's something
> else going on...

Yes, I do have working tessellation – at least it works with Unigine Heaven, for example. Of course, tessellation is only supported in core contexts; maybe the game fails miserably there and creates a compatibility context? (By the way, since I'm seeing many games/porters getting context creation wrong, I was wondering what Mac OS X hands out by default. AFAIK they also only support the higher OpenGL feature levels in a core context, yet there the games seem to work most of the time, from what I'm hearing. This leads me to wonder whether Apple just defaults to core contexts and, if so, whether Mesa might want to do the same.)
Is there a way I can force a core context through some environment variable?
Comment 5 Kai 2015-09-21 17:00:06 UTC
Created attachment 118391 [details]
Compressed shader that fails during compilation

OK, I'm now pretty sure that the game actually creates a core context; at least the apitrace I've made (it will be uploaded shortly, as soon as the compression is done) shows a 4.1 context when I do "lookup state".

Also, the game seems to choke on the missing AoA functionality or at least doesn't check whether it can use AoA:
> 0:9(23): error: GL_ARB_arrays_of_arrays required for defining arrays of arrays

The shader triggering this is, however, a tessellation control shader, so maybe Kenneth is right anyway (i.e. I don't get the geometry with radeonsi because the tessellation control shader fails to compile due to another, as-of-yet missing extension, while i965-supported GPUs don't get the geometry because they lack tessellation to begin with).

The attached file is the shader triggering this error in the following call sequence:
glCreateShader(GL_TESS_CONTROL_SHADER); # = 594
glShaderSource(594, 1, <SOURCE FROM ATTACHED FILE>, NULL);
glCompileShader(594); # compile error
Comment 6 Ilia Mirkin 2015-09-21 17:02:22 UTC
Created attachment 118392 [details]
shader that fails during compilation

[reattaching in a non-compressed format so that people can actually see it without performing gymnastics]
Comment 7 Ilia Mirkin 2015-09-21 17:07:33 UTC
(In reply to Kai from comment #5)
> Also, the game seems to choke on the missing AoA functionality or at least
> doesn't check whether it can use AoA:
> > 0:9(23): error: GL_ARB_arrays_of_arrays required for defining arrays of arrays

I guess line 9 is: out vec4 vControlPoint[][2];

Which should work without AoA. I wonder if this was recently broken by the AoA support patches... Or maybe it started out broken.
Comment 8 Kai 2015-09-21 18:22:00 UTC
I've just uploaded two trace files to a password protected area – to prevent unnecessary downloads – on my server:
- <http://dev.carbon-project.org/debian/mesa.bugs/92059/shadowofmordor_tessellation.trace.xz>
- <http://dev.carbon-project.org/debian/mesa.bugs/92059/shadowofmordor_no-tessellation.trace.xz>

They both display the issues described in comment #0 as well as in attachment 118375 [details] and attachment 118376 [details]. shadowofmordor_tessellation.trace was produced with tessellation enabled in the graphics settings, while shadowofmordor_no-tessellation.trace was created with it disabled. All other settings were identical.

Known Mesa developers can ask me for password access, the password is merely there to prevent unnecessary traffic. If you already have login credentials from other bugs for which I provided trace files, those should continue to work.


(In reply to Ilia Mirkin from comment #7)
> (In reply to Kai from comment #5)
> > Also, the game seems to choke on the missing AoA functionality or at least
> > doesn't check whether it can use AoA:
> > > 0:9(23): error: GL_ARB_arrays_of_arrays required for defining arrays of arrays
> 
> I guess line 9 is: out vec4 vControlPoint[][2];
> 
> Which should work without AoA. I wonder if this was recently broken by the
> AoA support patches... Or maybe it started out broken.

I might have missed something, but I'm not seeing GL_ARB_arrays_of_arrays being exposed on radeonsi?
Comment 9 Timothy Arceri 2015-09-22 05:35:52 UTC
(In reply to Ilia Mirkin from comment #7)
> (In reply to Kai from comment #5)
> > Also, the game seems to choke on the missing AoA functionality or at least
> > doesn't check whether it can use AoA:
> > > 0:9(23): error: GL_ARB_arrays_of_arrays required for defining arrays of arrays
> 
> I guess line 9 is: out vec4 vControlPoint[][2];
> 
> Which should work without AoA. I wonder if this was recently broken by the
> AoA support patches... Or maybe it started out broken.

It seems to me that this should fail, and is correctly doing so. From the tessellation spec:

    A tessellation control shader may also declare user-defined per-vertex
    output variables.  User-defined per-vertex output variables are declared
    with the qualifier "out" and have a value for each vertex in the output
    patch.  Such variables must be declared as arrays or inside output blocks
    declared as arrays.  Declaring an array size is optional.  If no size is
    specified, it will be taken from the output patch size (gl_VerticesOut)
    declared in the shader.  If a size is specified, it must match the maximum
    patch size; otherwise, a compile or link error will occur.  The OpenGL Shading
    Language doesn't support multi-dimensional arrays; therefore, user-defined
    per-vertex tessellation control shader outputs with multiple elements per
    vertex must be declared as array members of an output block that is itself
    declared as an array.

Is there something I'm missing?
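To illustrate the spec language quoted above, here is a minimal tessellation control shader sketch (not the game's shader – only the commented-out declaration comes from the attached file; everything else is illustrative) contrasting the conformant output-block form with the arrays-of-arrays form:

```glsl
#version 410 core
layout(vertices = 4) out;

// Conformant without AoA: wrap the per-vertex data in an output block
// that is itself declared as a (per-vertex) array.
out ControlPoint {
    vec4 cp[2];
} vControlPoint[];

// The game's declaration (from the attached shader) is an array of
// arrays and therefore needs the extension enabled explicitly:
//   #extension GL_ARB_arrays_of_arrays : enable
//   out vec4 vControlPoint[][2];

void main() {
    // Per-vertex outputs may only be written at gl_InvocationID.
    vControlPoint[gl_InvocationID].cp[0] = vec4(0.0);
    vControlPoint[gl_InvocationID].cp[1] = vec4(0.0);
    gl_TessLevelOuter[0] = 1.0;
}
```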
Comment 10 Ilia Mirkin 2015-09-22 05:46:49 UTC
(In reply to Timothy Arceri from comment #9)
> (In reply to Ilia Mirkin from comment #7)
> > (In reply to Kai from comment #5)
> > > Also, the game seems to choke on the missing AoA functionality or at least
> > > doesn't check whether it can use AoA:
> > > > 0:9(23): error: GL_ARB_arrays_of_arrays required for defining arrays of arrays
> > 
> > I guess line 9 is: out vec4 vControlPoint[][2];
> > 
> > Which should work without AoA. I wonder if this was recently broken by the
> > AoA support patches... Or maybe it started out broken.
> 
> It seems to me that this should fail, and is correctly doing so. From the
> tessellation spec:
>
> [...]
> 
> Is there something I'm missing?

Quite right. I forgot about that little bit in the spec. So the issue here is that (a) AoA isn't supported in mesa, (b) even if it was, the shader doesn't enable it. Without that, you can't have plain per-vertex array outputs in TCS.

You could force-enable it by setting force_glsl_extensions_warn=1 and MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays ... I think.
Comment 11 Kai 2015-09-22 14:03:04 UTC
Created attachment 118397 [details]
Setting environment variables yields visible bodies.

(In reply to Ilia Mirkin from comment #10)
> (In reply to Timothy Arceri from comment #9)
> > (In reply to Ilia Mirkin from comment #7)
> > > (In reply to Kai from comment #5)
> > > > Also, the game seems to choke on the missing AoA functionality or at least
> > > > doesn't check whether it can use AoA:
> > > > > 0:9(23): error: GL_ARB_arrays_of_arrays required for defining arrays of arrays
> > > 
> > > I guess line 9 is: out vec4 vControlPoint[][2];
> > > 
> > > Which should work without AoA. I wonder if this was recently broken by the
> > > AoA support patches... Or maybe it started out broken.
> > 
> > It seems to me that this should fail, and is correctly doing so. From the
> > tessellation spec:
> >
> > [...]
> > 
> > Is there something I'm missing?
> 
> Quite right. I forgot about that little bit in the spec. So the issue here
> is that (a) AoA isn't supported in mesa, (b) even if it was, the shader
> doesn't enable it. Without that, you can't have plain per-vertex array
> outputs in TCS.

So, I should probably report this bug to Feral Interactive (the studio responsible for the Linux port), right?

> You could force-enable it by setting force_glsl_extensions_warn=1 and
> MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays ... I think.

The correct override is:
# force_glsl_extensions_warn=true MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays
Setting force_glsl_extensions_warn=1 leads to an error. And indeed, setting those two environment variables yields visible characters in the game; see the attached screenshot.
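For reference, the full launch environment might look like the following shell sketch (the game binary name is a placeholder, not the actual install path):

```shell
# Force warn-only GLSL extension checking and advertise AoA to the game
# (binary name below is a placeholder).
export force_glsl_extensions_warn=true
export MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays
env | grep -E 'force_glsl|MESA_EXTENSION'  # confirm both variables are set
# ./ShadowOfMordor
```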

Should this bug be renamed to »[radeonsi] Implement GL_ARB_arrays_of_arrays for "Middle-earth: Shadow of Mordor"« then?
Comment 12 Ilia Mirkin 2015-09-22 14:53:33 UTC
(In reply to Kai from comment #11)
> Created attachment 118397 [details]
> Setting environment variables yields visible bodies.
> 
> (In reply to Ilia Mirkin from comment #10)
> > (In reply to Timothy Arceri from comment #9)
> > > (In reply to Ilia Mirkin from comment #7)
> > > > (In reply to Kai from comment #5)
> > > > > Also, the game seems to choke on the missing AoA functionality or at least
> > > > > doesn't check whether it can use AoA:
> > > > > > 0:9(23): error: GL_ARB_arrays_of_arrays required for defining arrays of arrays
> > > > 
> > > > I guess line 9 is: out vec4 vControlPoint[][2];
> > > > 
> > > > Which should work without AoA. I wonder if this was recently broken by the
> > > > AoA support patches... Or maybe it started out broken.
> > > 
> > > It seems to me that this should fail, and is correctly doing so. From the
> > > tessellation spec:
> > >
> > > [...]
> > > 
> > > Is there something I'm missing?
> > 
> > Quite right. I forgot about that little bit in the spec. So the issue here
> > is that (a) AoA isn't supported in mesa, (b) even if it was, the shader
> > doesn't enable it. Without that, you can't have plain per-vertex array
> > outputs in TCS.
> 
> So, I should probably report this bug to Feral Interactive (the studio
> responsible for the Linux port), right?

That would be ideal.

> 
> > You could force-enable it by setting force_glsl_extensions_warn=1 and
> > MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays ... I think.
> 
> The correct override is:
> # force_glsl_extensions_warn=true
> MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays
> Setting force_glsl_extension_warn=1 leads to an error. And indeed, setting
> those two environment variables leads to visible characters in the game, see
> the attached screenshot.
> 
> Should this bug be renamed to »[radeonsi] Implement GL_ARB_arrays_of_arrays
> for "Middle-earth: Shadow of Mordor"« then?

As I mentioned, merely having the ext available wouldn't make that shader compile. The ext would also have to be enabled in the shader.

However perhaps the game would detect the availability of the ext and stick a "#extension GL_ARB_arrays_of_arrays: enable" into that shader, which would make it work -- no way of knowing that.
Comment 13 Kai 2015-09-22 15:41:46 UTC
(In reply to Ilia Mirkin from comment #12)
> (In reply to Kai from comment #11)
> > (In reply to Ilia Mirkin from comment #10)
> > > (In reply to Timothy Arceri from comment #9)
> > > > (In reply to Ilia Mirkin from comment #7)
> > > > > (In reply to Kai from comment #5)
> > > > > > Also, the game seems to choke on the missing AoA functionality or at least
> > > > > > doesn't check whether it can use AoA:
> > > > > > > 0:9(23): error: GL_ARB_arrays_of_arrays required for defining arrays of arrays
> > > > > 
> > > > > I guess line 9 is: out vec4 vControlPoint[][2];
> > > > > 
> > > > > Which should work without AoA. I wonder if this was recently broken by the
> > > > > AoA support patches... Or maybe it started out broken.
> > > > 
> > > > It seems to me that this should fail, and is correctly doing so. From the
> > > > tessellation spec:
> > > >
> > > > [...]
> > > > 
> > > > Is there something I'm missing?
> > > 
> > > Quite right. I forgot about that little bit in the spec. So the issue here
> > > is that (a) AoA isn't supported in mesa, (b) even if it was, the shader
> > > doesn't enable it. Without that, you can't have plain per-vertex array
> > > outputs in TCS.
> > 
> > So, I should probably report this bug to Feral Interactive (the studio
> > responsible for the Linux port), right?
> 
> That would be ideal.

Done (by e-mail). Let's see if I hear back from them, or if they fix it.

> > > You could force-enable it by setting force_glsl_extensions_warn=1 and
> > > MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays ... I think.
> > 
> > The correct override is:
> > # force_glsl_extensions_warn=true
> > MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays
> > Setting force_glsl_extension_warn=1 leads to an error. And indeed, setting
> > those two environment variables leads to visible characters in the game, see
> > the attached screenshot.
> > 
> > Should this bug be renamed to »[radeonsi] Implement GL_ARB_arrays_of_arrays
> > for "Middle-earth: Shadow of Mordor"« then?
> 
> As I mentioned, merely having the ext available wouldn't make that shader
> compile. The ext would also have to be enabled in the shader.

Yes, I understand. But this would still leave radeonsi without the required functionality. So making this bug about enabling GL_ARB_arrays_of_arrays for an application that (pretends*) to need it seemed reasonable and more descriptive.

* I've now played about 30 minutes with those environment variables set and haven't noticed any visual corruption on radeonsi, which doesn't expose GL_ARB_arrays_of_arrays. So I'm not sure the game actually *needs* the extension?
Comment 14 Ilia Mirkin 2015-09-22 15:51:04 UTC
(In reply to Kai from comment #13)
> So I'm not sure if the game actually *needs* the extension?

It does. Without the ext, the shader in question is invalid.

Mesa has partial support for the ext which gets enabled (along with every other available ext) by the force_glsl_bla thing, and that partial support is apparently sufficient for the needs of this game.
Comment 15 Kai 2015-09-26 14:40:45 UTC
(In reply to Ilia Mirkin from comment #14)
> Mesa has partial support for the ext which gets enabled (along with every
> other available ext) by the force_glsl_bla thing, and that partial support
> is apparently sufficient for the needs of this game.

Thanks for the explanation!


I had the chance to see the benchmark mode run on fglrx (not on my system) and noticed that heavy rain was visible there. As you can see from attachment 118375 [details], no rain is rendered for me. I just verified with the stack detailed below and »force_glsl_extensions_warn=true MESA_EXTENSION_OVERRIDE=GL_ARB_arrays_of_arrays« set that this is still the case (no visibly missing geometry, but no rain). Currently I'm only seeing the big drops cascading from other geometry (as you can see in the screenshot) and no rain. Not sure if this is due to the partial AoA support or another missing feature.

There are also some hives hanging from ropes (you can see one far off in the screenshot, attachment 118375 [details]) which should have insects buzzing around them. With fglrx those insects are rendered; on my radeonsi they're not.

On the upside: the FPS graph shown with radeonsi looks much more stable than the one I've seen with fglrx (which looked like saw teeth), though that might just be due to some shaders not being executed so far. With the stack detailed below, I also managed to get > 60 FPS in some parts of the benchmark, while I barely saw 40 with the stack from comment #0. On the other hand, I'm still seeing drops to single-digit frame rates (just above 3 FPS).

The game was run on the following stack (Debian testing as a base):
GPU: Hawaii PRO [Radeon R9 290] (ChipID = 0x67b1)
Mesa: Git:master/abdab88b30
libdrm: 2.4.64-1
LLVM: SVN:trunk/r248664 (3.8 devel)
X.Org: 2:1.17.2-1.1
Linux: 4.2.1
Firmware: <https://secure.freedesktop.org/~agd5f/radeon_ucode/hawaii/>
> 286640da3d90d7b51bdb038b65addc47  hawaii_ce.bin
> 161105a73f7dfb2fca513327491c32d6  hawaii_mc.bin
> d6195059ea724981c9acd3abd6ee5166  hawaii_me.bin
> ad511d31a4fe3147c8d80b8f6770b8d5  hawaii_mec.bin
> 63eae3f33c77aadbc6ed1a09a2aed81e  hawaii_pfp.bin
> 5b72c73acf0cbd0cbb639302f65bc7dc  hawaii_rlc.bin
> f00de91c24b3520197e1ddb85d99c34a  hawaii_sdma1.bin
> 8e16f749d62b150d0d1f580d71bc4348  hawaii_sdma.bin
> 7b6ca5302b56bd35bf52804919d57e63  hawaii_smc.bin
> 9f2ba7e720e2af4d7605a9a4fd903513  hawaii_uvd.bin
> b0f2a043e72fbf265b2f858b8ddbdb09  hawaii_vce.bin
libclc: Git:master/4346c30bae
DDX: Git:master/0288a4b87b

Let me know if you need anything else.
Comment 16 Kai 2015-09-26 15:32:50 UTC
(In reply to Kai from comment #15)
> On the upside: the FPS graph shown with radeonsi looks way more stable than
> the one I've seen with fglrx (like saw teeth), though that might just be due
> to the shaders not executed so far. With the stack detailed below, I also
> managed to get to > 60 FPS in some parts of the benchmark, while I barely
> saw 40 with the stack from comment #0. On the other hand, I'm still seeing
> drops to single digit frame rates (just above 3 FPS).

Sorry about the reported FPS increase to 60+: the game settings had dropped (I have no idea why; I didn't change them). It's still better than before with the current stack, but not by enough that I would have mentioned it.
Comment 17 Lorenzo Bona 2015-11-17 20:10:50 UTC
This is a log of Shadow of Mordor running on an R7 265 with git mesa/drm/xf86-video-ati/xorg/kernel/llvm.

I'm missing textures, as Kai reported, and the log is full of: "Mesa: User error: GL_INVALID_OPERATION in GL_PATCHES only valid with tessellation".

If you need more tests or data, let me know.
Comment 18 Lorenzo Bona 2015-11-17 20:11:37 UTC
Created attachment 119749 [details]
som.log
Comment 19 Edwin Smith (Feral Interactive) 2016-04-07 11:45:10 UTC
We are working on a beta build of Mordor that includes a few fixes for upcoming driver changes on Nvidia. At the same time, we have made some changes in anticipation of RadeonSi gaining compute shader support; this means the rain and beehives should start rendering correctly.

Engineers working on this issue can contact Feral for access, if that would help in their work to fix bugs and missing features in Mesa/RadeonSi.
Comment 20 Ernst Sjöstrand 2016-04-23 16:55:07 UTC
I think the missing geometry is back, but perhaps we should just wait for the build Edwin mentioned.
Comment 21 Edwin Smith (Feral Interactive) 2016-04-25 09:07:10 UTC
Ernst Sjöstrand, the build is unlikely to be released publicly until the compute shaders are checked into git and available for slightly wider consumption, so we can make sure the patch is set up correctly to support the feature when it is released in an official driver update.

However, we will expose the beta to developers working on compute via a beta branch, to aid testing of the compute feature.
Comment 22 Christoph Haag 2016-04-25 13:43:46 UTC
(In reply to Edwin Smith (Feral Interactive) from comment #21)
> Ernst Sjöstrand, the build is unlikely to be released publicly until the
> compute shaders are checked into git

FYI (and for the people who are testing and are not aware):

1. Compute shaders for radeonsi were checked into git 6 days ago:
https://cgit.freedesktop.org/mesa/mesa/commit/?id=464cef5b06e65aa740704e4adac68b7f5fee1b88

2. Compute Shaders for radeonsi require a recent llvm 3.9 svn build.

3. For some (SI only?) GPUs, compute shaders will be disabled even with Linus' current git master of the kernel, but will work with the drm-next-4.7-wip branch from this repository: https://cgit.freedesktop.org/~agd5f/linux/log/?h=drm-next-4.7-wip
(I heard it's because this here is needed: https://cgit.freedesktop.org/~agd5f/linux/commit/?h=drm-next-4.7-wip&id=974cf6cc403d881b9fa939cb2afed19d53afab21)

4. OpenGL 4.3 is not enabled yet because the Mesa devs first want to solve some problems with Unreal Engine 4, so if you want to test Shadow of Mordor, Alien Isolation, etc., you still need to set
MESA_GL_VERSION_OVERRIDE=4.3 MESA_GLSL_VERSION_OVERRIDE=430.
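In shell form, point 4 amounts to something like the following sketch (the launch command is a placeholder; under Steam the two variables go in front of %command% in the game's launch options instead):

```shell
# Override the advertised GL and GLSL versions to 4.3 / 430.
export MESA_GL_VERSION_OVERRIDE=4.3
export MESA_GLSL_VERSION_OVERRIDE=430
env | grep MESA_  # confirm the overrides are in the environment
# ./ShadowOfMordor   # placeholder for the actual game binary
```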
Comment 23 Vedran Miletić 2016-05-14 20:02:22 UTC
Created attachment 123752 [details]
Screenshot on Tonga 380X with Mesa git-59156b2

Great progress, but there are still some glitches. Game is running with MESA_GL_VERSION_OVERRIDE=4.3 MESA_GLSL_VERSION_OVERRIDE=430 %command%
Comment 24 Jan Ziak 2016-06-06 09:31:19 UTC
(In reply to Vedran Miletić from comment #23)
> Created attachment 123752 [details]
> Screenshot on Tonga 380X with Mesa git-59156b2
> 
> Great progress, but there are still some glitches. Game is running with
> MESA_GL_VERSION_OVERRIDE=4.3 MESA_GLSL_VERSION_OVERRIDE=430 %command%

Hello. Are the glitches still reproducible with current mesa-git?
Comment 25 Kai 2016-06-20 19:12:15 UTC
As far as I'm concerned, everything looks good on my R9 290 with the stack detailed below and no special environment variables (not needed since radeonsi started supporting 4.3/4.30 natively).

The only "glitch" I'm seeing is that the player character shown during load screens gets a wet look right before the image switches over to the actual game. But that might also be a game bug, and since it doesn't affect gameplay, I don't really care.

Thanks to everyone involved who made > 30 FPS with (almost) full details at 2560×1440 possible.


The current stack I'm using:
GPU: Hawaii PRO [Radeon R9 290] (ChipID = 0x67b1)
Mesa: Git:master/5a64549f54
libdrm: 2.4.68-1
LLVM: SVN:trunk/r272995 (3.9 devel)
X.Org: 2:1.18.3-1
Linux: 4.6.2
Firmware: firmware-amd-graphics/20160110-1
libclc: Git:master/20d977a3e6
DDX: 1:7.7.0-1

