I use Arch Linux with the latest updates. KMS is enabled.

My glxinfo output:

name of display: :0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.4
server glx extensions:
    GLX_ARB_multisample, GLX_EXT_import_context, GLX_EXT_texture_from_pixmap,
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_MESA_copy_sub_buffer,
    GLX_OML_swap_method, GLX_SGI_make_current_read, GLX_SGI_swap_control,
    GLX_SGIS_multisample, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
    GLX_SGIX_visual_select_group, GLX_INTEL_swap_event
client glx vendor string: Mesa Project and SGI
client glx version string: 1.4
client glx extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_import_context,
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_MESA_copy_sub_buffer,
    GLX_MESA_swap_control, GLX_OML_swap_method, GLX_OML_sync_control,
    GLX_SGI_make_current_read, GLX_SGI_swap_control, GLX_SGI_video_sync,
    GLX_SGIS_multisample, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
    GLX_SGIX_visual_select_group, GLX_EXT_texture_from_pixmap, GLX_INTEL_swap_event
GLX version: 1.4
GLX extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_import_context,
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_MESA_copy_sub_buffer,
    GLX_MESA_swap_control, GLX_OML_swap_method, GLX_OML_sync_control,
    GLX_SGI_make_current_read, GLX_SGI_swap_control, GLX_SGI_video_sync,
    GLX_SGIS_multisample, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
    GLX_SGIX_visual_select_group, GLX_EXT_texture_from_pixmap, GLX_INTEL_swap_event
OpenGL vendor string: Tungsten Graphics, Inc.
OpenGL renderer string: Mesa DRI R100 (RV200 4C57) 20090101 x86/MMX/SSE2 TCL DRI2
OpenGL version string: 1.3 Mesa 7.10.2

My video card:

01:00.0 VGA compatible controller: ATI Technologies Inc Radeon Mobility M7 LW [Radeon Mobility 7500]

When I start WarCraft 3 under Wine, it starts successfully but exits after 3 seconds on the main screen with the error

drmRadeonCmdBuffer: -22. Kernel failed to parse or rejected command stream. See dmesg for more info.

In dmesg I found:

[ 7455.848226] [drm:radeon_cs_parser_init] *ERROR* cs IB too big: 16469
[ 7455.848231] [drm:radeon_cs_ioctl] *ERROR* Failed to initialize parser !

If I switch to software TCL, the game works fine. Here is a bug report in the Arch Linux bug tracker: https://bugs.archlinux.org/task/23155?project=1&opened=4027

Please tell me if you need any additional information.
When I increase the command buffer to 32 KB via driconf, the game starts perfectly with hardware TCL, but then it exits with the error

warcraft\war3.exe: radeon_common.c:1250: rcommonEnsureCmdBufSpace: Assertion `rmesa->cmdbuf.cs->cdw' failed.

Then I manually edited ~/.drirc and changed the command buffer to 34 KB, and the game runs without errors. So this bug has a workaround, but I think it still needs to be fixed.
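For the record, the workaround goes through ~/.drirc. A minimal sketch of such an entry, assuming the classic radeon driver's command_buffer_size driconf option (value in KB); the exact option name and file layout here are from memory, so double-check against what driconf actually writes:

    <driconf>
        <device screen="0" driver="radeon">
            <application name="Default">
                <!-- illustrative only: raise the command buffer from the default to 34 KB -->
                <option name="command_buffer_size" value="34" />
            </application>
        </device>
    </driconf>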
Please attach your xorg log, dmesg output, and glxinfo output. Most likely some of the commands added for the TCL-related registers are not being properly accounted for. I'd start with r200EnsureEmitSize() in r200_tcl.c and make sure it's properly accounting for all the state that needs to be emitted. Print out what size it expects vs. what size gets emitted for each bit of state and track down where the count is off (see the sketch below).
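Something like the following self-contained toy (all names and dword counts invented, not actual driver code) shows the kind of "expected vs. emitted per bit of state" logging meant here; in the real driver you would log the prediction and the command-stream write count around each piece of state instead:

    #include <stdio.h>

    /* Toy model: each "state atom" predicts how many dwords it will emit;
     * after emitting it, compare the prediction against what actually
     * landed in the command stream.  All numbers are made up. */
    struct atom { const char *name; unsigned predicted; unsigned actual; };

    int main(void)
    {
        struct atom atoms[] = {
            { "tcl matrices", 18, 18 },
            { "tcl lights",   30, 34 },   /* pretend this one is under-counted */
            { "vertex fmt",    4,  4 },
        };
        unsigned cdw = 0;   /* dwords written so far */

        for (unsigned i = 0; i < sizeof(atoms) / sizeof(atoms[0]); i++) {
            unsigned before = cdw;
            cdw += atoms[i].actual;                          /* "emit" the atom */
            fprintf(stderr, "%-14s expected %2u dw, emitted %2u dw%s\n",
                    atoms[i].name, atoms[i].predicted, cdw - before,
                    cdw - before > atoms[i].predicted ? "  <-- count is off" : "");
        }
        return 0;
    }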
Created attachment 46481 [details] dmesg output
Created attachment 46482 [details] glxinfo output
Created attachment 46483 [details] xorg log
Can you tell me how I can find what size it expects and what size gets emitted for each bit of state, and track down where the count is off?
(In reply to comment #2)
> Please attach your xorg log, dmesg output, and glxinfo output. Most likely
> some of the commands added for the TCL-related registers are not being
> properly accounted for. I'd start with r200EnsureEmitSize() in r200_tcl.c and
> make sure it's properly accounting for all the state that needs to be emitted.
> Print out what size it expects vs. what size gets emitted for each bit of
> state and track down where the count is off.

One more note about "I'd start with r200EnsureEmitSize() in r200_tcl.c": the RV200 is an R100-based chipset, so the driver in use is r100 (radeon), not r200.
(In reply to comment #7)
> One more note about "I'd start with r200EnsureEmitSize() in r200_tcl.c": the
> RV200 is an R100-based chipset, so the driver in use is r100 (radeon), not r200.

radeonEnsureEmitSize() in radeon_tcl.c is called from radeon_run_tcl_render(). radeonEnsureEmitSize() walks through the pending state to make sure it will all fit in the current command buffer, but it appears to be missing some additional state that's being emitted. If you run the game from a terminal, it should print out a warning:

    if (emit_end < rmesa->radeon.cmdbuf.cs->cdw)
        WARN_ONCE("Rendering was %d commands larger than predicted size."
                  " We might overflow command buffer.\n",
                  rmesa->radeon.cmdbuf.cs->cdw - emit_end);

Most likely one of

    radeonEmitArrays()
    radeonEmitEltPrimitive()
    radeonEmitPrimitive()

is emitting more state than radeonEnsureEmitSize() thinks it should.
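To make the bookkeeping concrete, here is a self-contained toy (invented numbers, not the driver's code) of the pattern described above: compute a predicted end-of-emission mark up front, emit, then check whether emission overshot the prediction the way the WARN_ONCE above would report:

    #include <stdio.h>

    /* Toy command stream: cdw counts dwords written so far.  The driver
     * predicts how much room a draw needs (radeonEnsureEmitSize-style);
     * if the actual emission exceeds that, the buffer can overrun and the
     * kernel later rejects the stream ("cs IB too big").  Made-up numbers. */
    struct cs { unsigned cdw; };

    static unsigned predict_size(void)           { return 120; } /* what gets accounted for   */
    static void     emit_arrays(struct cs *cs)   { cs->cdw += 64; }
    static void     emit_elt_prim(struct cs *cs) { cs->cdw += 72; } /* more than was predicted */

    int main(void)
    {
        struct cs cs = { 0 };
        unsigned emit_end = cs.cdw + predict_size();

        emit_arrays(&cs);
        emit_elt_prim(&cs);

        if (emit_end < cs.cdw)
            fprintf(stderr, "Rendering was %u commands larger than predicted size."
                            " We might overflow command buffer.\n", cs.cdw - emit_end);
        return 0;
    }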
No, there are no warnings. With the default settings I see

drmRadeonCmdBuffer: -22. Kernel failed to parse or rejected command stream. See dmesg for more info.

and in dmesg:

[63763.072946] [drm:radeon_cs_parser_init] *ERROR* cs IB too big: 16403
[63763.072950] [drm:radeon_cs_ioctl] *ERROR* Failed to initialize parser !

With the command buffer set to 32 KB I see

radeon_common.c:1250: rcommonEnsureCmdBufSpace: Assertion `rmesa->cmdbuf.cs->cdw' failed.

when I try to start the game.
-- GitLab Migration Automatic Message -- This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity. You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/mesa/mesa/issues/279.