Bug 89148 - r300g: Kernel rejected CS in Wine d3d multisample test
Status: RESOLVED FIXED
Alias: None
Product: Mesa
Classification: Unclassified
Component: Drivers/Gallium/r300
Version: git
Hardware: Other All
Importance: medium normal
Assignee: Default DRI bug account
QA Contact: Default DRI bug account
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2015-02-14 16:26 UTC by Stefan Dösinger
Modified: 2015-03-10 10:36 UTC

See Also:


Attachments
Test program (3.21 KB, text/plain)
2015-02-17 11:43 UTC, Stefan Dösinger
add some debugging output (2.71 KB, patch)
2015-02-18 15:19 UTC, Alex Deucher
workaround (1.93 KB, patch)
2015-02-21 11:06 UTC, Marek Olšák
fix (1.55 KB, patch)
2015-02-24 22:23 UTC, Marek Olšák

Description Stefan Dösinger 2015-02-14 16:26:23 UTC
When running Wine's d3d8 and d3d9 tests on r300g, an invalid command stream is sent to the kernel:

[  916.508352] [drm:radeon_cs_packet_parse] *ERROR* Unknown packet type 1 at 2451 !
[  916.508358] [drm:radeon_cs_ioctl] *ERROR* Invalid command stream !
[  916.508902] [drm:radeon_cs_packet_parse] *ERROR* Unknown packet type 1 at 69 !
[  916.508905] [drm:radeon_cs_ioctl] *ERROR* Invalid command stream !

The user space driver notices the error and writes a message to stderr. The test that triggered the invalid command stream subsequently fails.

The failing line in the tests is http://source.winehq.org/git/wine.git/blob/f75d1b0c2f77d8c85f7c2a9bcc3545f14e271a86:/dlls/d3d8/tests/visual.c#l3704 . The test performs a copy from a multisampled color buffer to system memory. Wined3d first resolves the multisampled renderbuffer to a non-multisampled texture and calls glGetTexImage. Interestingly it is the glGetTexImage step that fails. The texture has format GL_BGRA, type GL_UNSIGNED_INT_8_8_8_8_REV and internal format GL_SRGB8_ALPHA8.

We call glGetTexImage in plenty of places in this configuration, and this is the only case where this fails. I'll try to pin this down a bit further.

The bug can be reproduced by running make visual.ok in dlls/d3d8/tests in a Wine build tree.

System information:
Wine commit ID: f75d1b0c2f77d8c85f7c2a9bcc3545f14e271a86
Mesa commit ID: e333035c47a6a4cc88f0f9ca2bced500538bebae
Kernel: 3.19
libdrm: 2.4.59
X server: 1.16.3
Distribution: Gentoo
Comment 1 Stefan Dösinger 2015-02-14 16:32:00 UTC
Note that the texture is never used as a source or destination in a regular draw. It is specified with glTexImage2D, but the data pointer is NULL, and glTexSubImage2D is never used. It is only filled with data through the FBO_blit call that resolves the multisampled renderbuffer. As far as I can see no PBO is used.
Comment 2 Alex Deucher 2015-02-16 19:18:33 UTC
Type 1 packets shouldn't be emitted at all, and none of the user mode drivers emit them. I suspect either the command stream is getting corrupted somewhere or a prior packet count is getting set wrong.
Comment 3 Stefan Dösinger 2015-02-17 09:36:42 UTC
My semi-educated guess would be that the multisample resolve blit sets the size wrong. That would explain why, out of the many glGetTexImage calls we make, this is the only one that fails.

Time to write a stand-alone test case I guess.
Comment 4 Stefan Dösinger 2015-02-17 11:43:12 UTC
Created attachment 113562
Test program

This program reproduces the problem. As a 32-bit binary it generates the same type-1 packet that is rejected; as a 64-bit binary it crashes with the following backtrace:

(gdb) bt
#0  0x00007ffff75c8c90 in ?? () from /lib64/libc.so.6
#1  0x00007ffff2e0bcd3 in memcpy (__len=32, __src=<optimized out>, __dest=<optimized out>) at /usr/include/bits/string3.h:51
#2  r300_emit_blend_state (r300=<optimized out>, size=8, state=<optimized out>) at r300_emit.c:57
#3  0x00007ffff2e0f150 in r300_emit_dirty_state (r300=r300@entry=0x649950) at r300_emit.c:1450
#4  0x00007ffff2e12018 in r300_emit_states (instance_id=-1, index_bias=0, buffer_offset=0, index_buffer=0x0, flags=<optimized out>, r300=0x649950) at r300_render.c:259
#5  r300_prepare_for_rendering (r300=r300@entry=0x649950, flags=<optimized out>, flags@entry=PREP_EMIT_STATES, index_buffer=index_buffer@entry=0x0, cs_dwords=cs_dwords@entry=21, buffer_offset=buffer_offset@entry=0, index_bias=index_bias@entry=0, 
    instance_id=instance_id@entry=-1) at r300_render.c:311
#6  0x00007ffff2e13258 in r300_blitter_draw_rectangle (blitter=<optimized out>, x1=0, y1=0, x2=256, y2=256, depth=0, type=UTIL_BLITTER_ATTRIB_NONE, attrib=0x0) at r300_render.c:1141
#7  0x00007ffff2d7d3df in util_blitter_custom_color (blitter=0x61c7f0, dstsurf=dstsurf@entry=0x79e030, custom_blend=custom_blend@entry=0x0) at util/u_blitter.c:2146
#8  0x00007ffff2e06b91 in r300_simple_msaa_resolve (pipe=pipe@entry=0x649950, dst=dst@entry=0x79dcd0, dst_level=dst_level@entry=0, dst_layer=dst_layer@entry=0, src=<optimized out>, format=PIPE_FORMAT_B8G8R8A8_SRGB) at r300_blit.c:737
#9  0x00007ffff2e08396 in r300_msaa_resolve (info=0x7ffffffbdc70, pipe=0x649950) at r300_blit.c:783
#10 r300_blit (pipe=0x649950, blit=<optimized out>) at r300_blit.c:809
#11 0x00007ffff2c2ad97 in st_BlitFramebuffer (ctx=<optimized out>, readFB=0x796f30, drawFB=0x79d6b0, srcX0=<optimized out>, srcY0=<optimized out>, srcX1=<optimized out>, srcY1=256, dstX0=0, dstY0=0, dstX1=256, dstY1=256, mask=16384, filter=9728)
    at state_tracker/st_cb_blit.c:263
#12 0x00007ffff2af2ff2 in _mesa_BlitFramebuffer (srcX0=<optimized out>, srcY0=0, srcX1=<optimized out>, srcY1=256, dstX0=<optimized out>, dstY0=<optimized out>, dstX1=256, dstY1=256, mask=16384, filter=9728) at main/blit.c:509
#13 0x000000000040132f in init ()
#14 0x000000000040148a in main ()

Further testing shows that the GL_SRGB8_ALPHA8 internal format of the destination texture is the problem. Replacing this with GL_RGBA8 makes the test work fine. Note that when GL_EXT_sRGB_decode is available Wine always creates sRGB textures and sets GL_TEXTURE_SRGB_DECODE_EXT to GL_SKIP_DECODE_EXT to get d3d-style sRGB read correction toggling.
Comment 5 Alex Deucher 2015-02-18 15:19:47 UTC
Created attachment 113620
add some debugging output

Can you apply this kernel patch and attach your kernel log output when you hit the error?
Comment 6 Marek Olšák 2015-02-21 11:06:40 UTC
Created attachment 113719
workaround

Can you test this patch? Note that GL_ARB_framebuffer_sRGB is unsupported.
Comment 7 Stefan Dösinger 2015-02-21 12:01:10 UTC
I'll try it in the next two days. If the workaround doesn't work I'll try to get the log Alex asked for.
Comment 8 Marek Olšák 2015-02-21 12:10:59 UTC
I'm pretty sure the CS parser errors were caused by random garbage emitted by r300g. The driver doesn't expect an sRGB format in the framebuffer, which only seems to happen with glBlitFramebuffer.
Comment 9 Stefan Dösinger 2015-02-24 08:59:01 UTC
No more segfaults, but it still has a rejected CS:

[129870.605179] [drm:r100_cs_track_check] *ERROR* [drm] Buffer too small for color buffer 0 (need 8387584 have 524288) !
[129870.605186] [drm:r100_cs_track_check] *ERROR* [drm] color buffer 0 (16382 2 0 256)
[129870.605189] [drm:radeon_cs_ioctl] *ERROR* Invalid command stream !
Comment 10 Marek Olšák 2015-02-24 22:23:05 UTC
Created attachment 113802
fix

This patch should fix it.
Comment 11 Stefan Dösinger 2015-02-26 09:48:56 UTC
Attachment 113802 fixes the invalid CS / crash and my test program now reads a color. However, sRGB color correction seems to be applied at some stage. Note that the test program is not using GL_ARB_framebuffer_sRGB (which isn't supported on r500 anyway), and that GL_TEXTURE_SRGB_DECODE_EXT is set to GL_SKIP_DECODE_EXT.

It seems that the correction is applied during blitting. If I clear the 2D texture directly the expected result (0x0000ff80) is returned. If I use the renderbuffer and blit I get 0x0000ff37.

(Wine expects that glBlitFramebuffer never applies sRGB correction when blitting from an sRGB texture, but this codepath is only used if GL_EXT_sRGB_decode is not supported. In this case we load two copies of the texture, one sRGB and one RGB, and use glBlitFramebuffer to blit between them if the application toggles sRGB on and off.)
Comment 12 Marek Olšák 2015-02-26 11:17:16 UTC
r300g doesn't do sRGB conversion for MSAA resolves. It always interprets the textures as linear and only averages the samples.
Comment 13 Stefan Dösinger 2015-02-26 12:00:58 UTC
The thing that seems to trigger the sRGB correction is the fact that the destination texture has an sRGB internal format. If I change it to GL_RGBA8 I get the expected result.

The format of the source RB doesn't seem to matter here.
Comment 14 Stefan Dösinger 2015-03-10 10:36:18 UTC
Fixed by 9953586af2254f83a610d4cd284f52f37fa18b98 and c939231e7223510408a446400ad23b8b5ce2922e. My test program returns the correct color and the Wine test passes. Thanks!

