I am trying to get DRI PRIME working on my AMD dual-GPU laptop.

Hardware:
- HP dv6z-7000
- AMD A10-4600 APU with Radeon HD 7660G (ARUBA) integrated graphics
- AMD Radeon HD 7730M (VERDE) discrete GPU, 2GB VRAM

Software:
- Ubuntu GNOME 13.10
- Custom-built kernel 3.11-rc7 from the drm-next-3.13-wip branch (for DPM)

I'm using the Ubuntu-provided xorg, mesa, glamor, and libdrm packages, but I had to compile my own xserver-xorg-video-ati/radeon packages with glamor acceleration support enabled. With this I can confirm that glamor acceleration is being used with the 7660G on r600, and it works well.

Now I want to enable DRI PRIME so that games can use the 7730M:

adam@Adam-dv6z-7000:~$ xrandr --listproviders
Providers: number : 3
Provider 0: id: 0x6c cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 3 associated providers: 2 name:radeon
Provider 1: id: 0x45 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 6 outputs: 0 associated providers: 2 name:radeon
Provider 2: id: 0x45 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 6 outputs: 0 associated providers: 2 name:radeon
adam@Adam-dv6z-7000:~$ xrandr --setprovideroffloadsink 0x45 0x6C
adam@Adam-dv6z-7000:~$ DRI_PRIME=1 glxinfo | grep renderer

When I run that third command, X crashes immediately with a segfault and GDM automatically restarts to the login screen. I will attach an Xorg log to this bug report after I post it and rerun the commands.
Created attachment 87528 [details] Xorg log from DRI_PRIME=1 crash
OK, I've done more research into the issue.

I started with a clean install of Ubuntu GNOME 13.10 on my laptop, then upgraded to the most up-to-date packages in Ubuntu's repository; I have not built any custom packages. With this setup, X will not start at all. Running startx fails with a segmentation fault and the error:

r600: Unknown chipset 0x9900

That ChipID corresponds to the 7660G GPU according to this line of the Xorg log:

[ 2200.372] (--) RADEON(0): Chipset: "ARUBA" (ChipID = 0x9900)

The Xorg log shows that the 7660G (ARUBA) initializes acceleration via EXA while the 7730M (VERDE) initializes acceleration via glamor. Both cards have acceleration enabled, and then X segfaults before anything appears on screen.

I want to use glamor acceleration on both cards, as I've found it performs better than EXA, so I added a simple xorg.conf.d file:

Section "Device"
    Identifier "Radeon"
    Option "AccelMethod" "glamor"
EndSection

Now X starts, but it's back to my original problem. With xorg.conf forcing glamor acceleration, the 7660G initializes EGL and glamor properly, which is reflected by working desktop acceleration, but the 7730M fails to initialize EGL and is unusable for DRI PRIME (though it still shows up in --listproviders, twice for some reason).

I have a thread on Ubuntu Forums with Xorg.log both without and with glamor forced: http://ubuntuforums.org/showthread.php?t=2180826

In addition, when I tried to replicate the issue on my desktop, I got graphical corruption; I'll file a separate bug report for that.

The conversation that led to this report started here: http://phoronix.com/forums/showthread.php?84382-RadeonSI-GLAMOR-Support-Still-M-I-A-From-Ubuntu-13-10&p=363236#post363236
Is this still an issue with current Mesa Git?
I also experience this issue on a Samsung 880Z5E laptop with an integrated Intel HD Graphics 4000 (Ivy Bridge) GPU and a dedicated Radeon 8770M. The kernel is 3.14.7-1 from Debian Sid, xserver-xorg-video-radeon is 1:7.4.0-1, mesa is 10.2.1-2, and libdrm is 2.4.54-1. As far as I know, glamor is disabled in the Debian build. The Xorg log after the crash is the same as the one Adam attached.

What I'm trying is:

xrandr --setprovideroffloadsink 0x41 0x68
DRI_PRIME=1 glxinfo

After these two commands, X immediately crashes. Is there any progress on this issue?
Detailed backtrace:

#0  GetScreenPrime (master=0x7f1e37f39e20, master@entry=0x1, prime_id=prime_id@entry=1) at ../../../../hw/xfree86/dri2/dri2.c:158
#1  0x00007f1e36f9a47d in GetScreenPrime (prime_id=prime_id@entry=1, master=master@entry=0x1) at ../../../../hw/xfree86/dri2/dri2.c:151
#2  DRI2GetScreenPrime (master=<optimized out>, prime_id=prime_id@entry=1) at ../../../../hw/xfree86/dri2/dri2.c:169
#3  0x00007f1e36f9c447 in DRI2Connect (client=client@entry=0x7f1e38854de0, pScreen=<optimized out>, driverType=65536, fd=fd@entry=0x7fff2ef0bb70, driverName=driverName@entry=0x7fff2ef0bb80, deviceName=deviceName@entry=0x7fff2ef0bb88) at ../../../../hw/xfree86/dri2/dri2.c:1312
#4  0x00007f1e36f9d09c in ProcDRI2Connect (client=0x7f1e38854de0) at ../../../../hw/xfree86/dri2/dri2ext.c:122
#5  ProcDRI2Dispatch (client=0x7f1e38854de0) at ../../../../hw/xfree86/dri2/dri2ext.c:596
#6  0x00007f1e36e7786e in Dispatch () at ../../dix/dispatch.c:433
#7  0x00007f1e36e7b68a in dix_main (argc=10, argv=0x7fff2ef0bd98, envp=<optimized out>) at ../../dix/main.c:294
#8  0x00007f1e34718b45 in __libc_start_main () from /lib/x86_64-linux-gnu/libc.so.6
#9  0x00007f1e36e66c2e in _start ()
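For reference, a backtrace like the one above can be captured by attaching gdb to the running X server from a text VT or over SSH before triggering the crash. This is only a sketch, assuming gdb and X server debug symbols are installed; the process name (Xorg) and required privileges may differ per distro:

```shell
# Attach to the X server, let it keep running, and print a full backtrace
# to a file once it hits the segfault
sudo gdb --batch -p "$(pidof Xorg)" \
    -ex continue \
    -ex 'bt full' > /tmp/xorg-backtrace.txt 2>&1
```

Then trigger the crash (here, DRI_PRIME=1 glxinfo) from the graphical session; the backtrace ends up in /tmp/xorg-backtrace.txt.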
I have the exact same problem with the latest stuff from Git (mesa, xorg-server, the ddx for intel and radeon, llvm-3.5 from SVN, all relevant libs, etc.). Glamor is always enabled for radeon. Kernel 3.15.2. My dGPU is an HD 8000M-series (Oland) chip paired with an integrated Ivy Bridge GPU.
Please be careful not to mix up several similar, but potentially different, issues.

First of all, any PRIME functionality requires that acceleration is successfully initialized for all GPUs involved; otherwise the X server will crash when trying to use it (which is unfortunate, but that's the way it is right now).

In Adam's case, acceleration failed to initialize for the discrete GPU:

(EE) RADEON(G0): Failed to create EGL context
(EE) RADEON(G0): glamor detected, failed to initialize EGL.

If this still happens with current Mesa Git, setting the environment variable EGL_LOG_LEVEL=debug for the Xorg process should give more information about the failure.

In Vitaliy's case, the problem is the lack of glamor support in Debian sid. This is fixed in the X packages in Debian experimental.

dimitris says glamor initializes successfully for him, so it's probably a different problem which should be tracked in a separate report.
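A sketch of one way to get EGL_LOG_LEVEL=debug into the Xorg process environment, assuming the display manager is GDM and a test server can be started by hand on a spare display; the log path and display number are arbitrary examples, not taken from this report:

```shell
# Stop the display manager so a test X server can be run by hand
sudo service gdm stop

# sudo accepts VAR=value arguments, so the variable reaches the Xorg process;
# log to a separate file so the output is easy to find
sudo EGL_LOG_LEVEL=debug Xorg :1 -logfile /tmp/Xorg.egl-debug.log

# Afterwards, look for the EGL/glamor initialization messages
grep -i -e egl -e glamor /tmp/Xorg.egl-debug.log
```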
(In reply to comment #7) > In Vitaliy's case, the problem is the lack of glamor support in Debian sid. > This is fixed in the X packages in Debian experimental. Ah, ok... but it seems experimental has no actual xserver-xorg-video-radeon package... so I should probably rebuild it by hand :-(
Just built the packages - you're right, X doesn't crash with glamor enabled. But I get a black picture when I try to run DRI_PRIME=1 glxgears. I understand it's not related to this bug, but maybe you have some thoughts about it? What could it be?
Also, glxgears says "Running synchronized to vertical refresh. The frame rate should be approximately the same as the monitor refresh rate", but then reports "46864 frames in 5.0 seconds = 9372.675 FPS", which means the sync isn't working.
(In reply to comment #9) > But I get black picture when I'm trying to run DRI_PRIME=1 glxgears... PRIME currently requires a compositing manager to work. > I understand it's not related to this bug, [...] Indeed, please take your remaining issues elsewhere.
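For anyone else hitting the black window: a quick way to test the compositing requirement is to start a standalone compositor before running the offloaded client. A sketch assuming xcompmgr is installed; any compositing manager (e.g. GNOME Shell or KWin) also satisfies the requirement:

```shell
# Start a minimal compositing manager in the background
xcompmgr &
COMPOSITOR_PID=$!

# With compositing active, the PRIME-offloaded client should render visibly
DRI_PRIME=1 glxgears

# Stop the test compositor when done
kill "$COMPOSITOR_PID"
```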
OK, thanks. Everything works now, I'll post new bugs if I experience more problems :)
Adam Honse, Ubuntu 13.10 reached EOL on July 17, 2014; for more on this, please see https://wiki.ubuntu.com/Releases. If this is reproducible with a supported release, it would help immensely if you filed a new report with Ubuntu: ensure the xdiagnose package is installed, run the following from a terminal, and click the Yes button when asked to attach additional debugging information:

ubuntu-bug xorg

Also, please feel free to subscribe me to it. For more on why this is helpful, please see https://wiki.ubuntu.com/ReportingBugs.