Most of these problems were observed using the GTK interface of vbam, but the compositing problem affects the Qt4 one too. Most testing was done with Option "AccelMethod" "EXA", because "XAA" was too slow.

First, compositing: with xcompmgr running, GTK menus are drawn underneath the active game screen when OpenGL output is selected; however, it works correctly if DRI is disabled.

Second, XVideo: it works correctly in xine, but in gvbam the emulator screen is split into two areas: a black strip at the bottom, and two side-by-side (doubled) copies of the game screen; all of the pixels have a green tint. (This works correctly with fglrx.) If it matters: as you can check, gvbam uses YUY2 for XVideo.
> with xcompmgr, gtk menus are drawn underneath activated game screen, when
> OpenGL output is selected, however it works correctly if dri is disabled

That's bug 8732, which is being addressed by DRI2.

For the XVideo problem, please attach (as opposed to paste) the full Xorg.0.log file. Also, which version of xf86-video-ati are you using?
Created attachment 16449 [details]
my Xorg log

The version is 6.8.0; I should have mentioned that in the first place.
Created attachment 16450 [details]
the correct log

Sorry, this one is with EXA.
(In reply to comment #3)
> Sorry, this one is with EXA.

Are you saying this problem is specific to EXA or XAA?

Where is 'gvbam' available? YUY2 seems to work fine here with mplayer.
vba-m is a SourceForge project (http://sourceforge.net/projects/vbam/). gvbam is the name of the GTK executable of the project.
It uses CMake, and the GTK GUI only builds with portaudio, gtkglextmm and a few other gtkmm-related dependencies installed. I use it with OpenAL instead of portaudio.
I haven't tested XVideo with XAA; I just remember both cairo and OpenGL output being ridiculously slow with EXA. (With EXA it's just cairo that is slow, and cairo works fine with fglrx, BTW - it uses the image backend.)
A minor correction: OpenGL output is slow with EXA too; it was disabling DRI that gave the speedup (I failed to notice that at first).
BTW, you said the DRI problem is being addressed by DRI2, but unless I misunderstood one of the mailing list posts, even when it arrives, some memory management support still has to be implemented in the graphics driver. Did I miss something, or is it already implemented?
And this is probably the wrong place to ask, but is there a way to disable DRI for a specific program?
(In reply to comment #10)
> And this is probably wrong place to ask

Indeed.

> it but is there a way to disable dri for a specific program ?

Assuming you mean a way to force software rendering: when AIGLX is disabled, you can try setting the environment variable LIBGL_ALWAYS_INDIRECT=1 for running the program. Otherwise, you'd probably need to point LD_LIBRARY_PATH to a software-only Mesa build of libGL.so.1.

Also, of course DRI2 has requirements; the point is that once the drivers support it, OpenGL rendering and the Composite extension will interact properly.
No; first of all, AIGLX is enabled. When I run Xorg on a second display while one is already running on the first, one of the messages in the log is:

    (EE) RADEON(0): [dri] DRIScreenInit failed. Disabling DRI.

I'm asking whether this can be simulated for a single app.

And I'll add a bit more info on the XVideo issue. The game screen is actually split into 4 equal parts. When I try to move the window while the game is running, there's a funny effect: the game screen seems to show some offscreen layer, because most of the areas shown while moving are blue-black, while what was originally shown stays in a fixed position (where it was before I started moving the window) until I release the window, at which point it's redrawn in the new place. It's actually the same effect I see in gxine when no movie is loaded and just the initial screen is shown, but gxine doesn't have the colour problem.
(In reply to comment #12)
> (EE) RADEON(0): [dri] DRIScreenInit failed. Disabling DRI.
> I'm asking, can this be simulated for a single app ?

No.

> The game screen seems to show some offscreen layer, cause most of the areas
> shown while moving are blue-black, while what was originally shown is in the
> fixed position (were it was, before I started moving the window), till I
> release the window, when it's redrawn in the new place.

That's expected with an XVideo overlay adaptor port. The blue-black is the overlay colour key.
Created attachment 16647 [details]
output of xvinfo

OK, major update. The key is in my xvinfo output. The question is why it worked with fglrx. gvbam does the following:

    m_iFormat = FOURCC_YUY2;
    for (int i = 0; i < iNumFormats; i++) {
        if (pFormats[i].id == 0x3 || pFormats[i].type == XvRGB) {
            /* Try to find a 32-bit mode */
            if (pFormats[i].bits_per_pixel == 32) {
                m_iFormat = pFormats[i].id;
            }
        }
    }

(where pFormats is the result of XvListImageFormats). That code was working with fglrx; with this driver it seems to choose an incorrect format. Is the code incorrect, and if so, why did it work with fglrx? (Maybe fglrx simply doesn't expose a format matching those conditions, so the format switch is never triggered.)

And as a side note: yesterday I fetched this driver's git. This actually seems to be a bug in the driver. If Adaptor #0: "ATI Radeon Video Overlay" is chosen, the program works; if Adaptor #1: "Radeon Textured Video" is chosen, the program exits with:

    The error was 'BadMatch (invalid parameter attributes)'.
    (Details: serial 1667 error_code 8 request_code 141 minor_code 13)
    (Note to programmers: normally, X errors are reported asynchronously;
    that is, you will receive the error a while after causing it.
    To debug your program, run it with the --sync command line
    option to change this behavior. You can then get a meaningful
    backtrace from your debugger if you break on the gdk_x_error() function.)

But there's still the question whether anything can be done about the major slowdown with (OpenGL/cairo) + Composite + DRI. (Actually, for cairo it's there even when DRI is disabled, but then it's only a major slowdown instead of the near-freeze seen with DRI; for OpenGL, while the slowdown with DRI is major, the system doesn't seem to freeze.)
(In reply to comment #14)
> And as a side note: yesterday I fetched this driver's git.
> Well, this actually seems to be a bug in the driver:
> if Adaptor #0: "ATI Radeon Video Overlay" is chosen, program works,

Yes, there were some fixes in this area recently. Marking as fixed then.

> if Adaptor #1: "Radeon Textured Video" is chosen, program exits with:
> The error was 'BadMatch (invalid parameter attributes)'.
> (Details: serial 1667 error_code 8 request_code 141 minor_code 13)

Looks like an app bug - as you can see in the xvinfo output, the textured adaptor doesn't define any port attributes, so the BadMatch error is the expected result according to the XvSetPortAttribute manpage. That is assuming

    xdpyinfo -queryExtensions | grep 141

gives

    XVideo (opcode: 141, base event: xx, base error: xxx)

otherwise please provide the output of that command.

> But there's still the question, if anything can be done about that major
> slowdown with (OpenGl/Cairo)+Composite+dri

I'm afraid your descriptions of these problems haven't been very clear. Please post detailed descriptions of the symptoms and corresponding configurations to the xorg-driver-ati@lists.x.org mailing list, and we can decide which of those are really bugs that need to be reported separately.