[This bug has been confirmed with the XBMC devs, and XBMC works as expected on non-Intel hardware. It appears to have been introduced some time after version 2.21.9 of the Intel Xorg driver. Later versions are locked to running XBMC at coordinates 0,0 (the left output), and the functionality is thus severely impaired.]

The scenario: running a dual-head setup with XBMC on the second display causes it to output on the main screen regardless of which graphics port it is set to output to.

I've done extensive testing on four different systems to try to get to the bottom of which component introduced this. I have ruled out window managers, libva, mesa and libdrm. That leaves the Xorg Intel driver (intel_drv.so) from xserver-xorg-video-intel / xf86-video-intel, or the X server itself. However, since I witnessed the same issue after compiling recent versions of libdrm2, libva and xserver-xorg-video-intel without upgrading the X server, Xorg itself does not seem to be at fault.

Current test system #1 (Haswell, i7-4765T):
- xorg-server 1.15.2-1
- lib32-libdrm 2.4.56-1
- libdrm 2.4.56-1
- xf86-video-intel 2.99.912-2
- lib32-mesa 10.2.4-1
- lib32-mesa-demos 8.2.0-1
- mesa 10.2.4-1
- mesa-demos 8.2.0-1

vainfo from the same system:

vainfo: VA-API version: 0.35 (libva 1.3.1)
vainfo: Driver version: Intel i965 driver for Intel(R) Haswell Desktop - 1.3.2
vainfo: Supported profile and entrypoints
      VAProfileMPEG2Simple            : VAEntrypointVLD
      VAProfileMPEG2Simple            : VAEntrypointEncSlice
      VAProfileMPEG2Main              : VAEntrypointVLD
      VAProfileMPEG2Main              : VAEntrypointEncSlice
      VAProfileH264ConstrainedBaseline: VAEntrypointVLD
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointEncSlice
      VAProfileH264High               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointEncSlice
      VAProfileVC1Simple              : VAEntrypointVLD
      VAProfileVC1Main                : VAEntrypointVLD
      VAProfileVC1Advanced            : VAEntrypointVLD
      VAProfileNone                   : VAEntrypointVideoProc
      VAProfileJPEGBaseline           : VAEntrypointVLD

I've also tested this on Ivy Bridge, Haswell and Sandy Bridge systems using various combinations of HDMI, VGA, DVI & DisplayPort outputs. They all behave the same, regardless of distro (Ubuntu/Arch Linux) and kernel version.

The 'last known good' config is as follows (from an Ubuntu 13.04 system running a 3.16.0 kernel):
- i965-va-driver_1.2.1-1
- libdrm2_2.4.46
- libva1_1.2.1-1
- xserver-xorg-core_1.13.3
- xserver-xorg-video-intel_2.21.9

I would be more than happy to provide any extra input, or test patches for this issue. It should, however, be easily reproducible on any multi-headed system using the built-in/on-CPU graphics.
Are you able to reproduce the problem on a bare X (no window manager)? (Just trying to minimise the number of factors at play.) Can you then compile xf86-video-intel by hand with ./configure --enable-debug=full and attach the log file for the first few seconds of output? That should be enough to figure out why it appears at that location.
Created attachment 104659 [details] debug info from xf86-video-intel while starting and changing output ports in XBMC
That logfile is missing a large chunk in the middle, so I can't see how the CRTCs were set up, nor where the 1920x1080 window was created. From that excerpt, the driver is behaving correctly and xbmc is simply rendering to (0, 0), (1920, 1080).
Hello Chris,

I've compiled xf86-video-intel (2.99.912) with the --enable-debug=full option and have included the log above. Let me know if you need more output from it (I limited it to the first 200 and last 2000 lines).

Not sure I know what you mean by reproducing the problem outside a window manager. I can (and have) tried running it as an XBMC session, but it starts up on both monitors regardless, i.e. no second desktop is being used. The problem is of course to have the regular desktop on the primary output, and XBMC running in the background on the other, which isn't possible running it outside a WM/DE. I have tested this with several different DEs to eliminate issues with one single package though:
- IceWM
- Cinnamon (which works perfectly on Ubuntu 13.04 btw., incl. 23.976 on IVB)
- Fluxbox
- XFCE4
- Unity
- Mate
- Gnome
- etc.

So I am fairly sure it is not a problem with the specific WM I am using. All the above have been tested on the current (Haswell i7-4765T) test system.

When I start XBMC from within a WM (DE), it pops up on the first output (which is a VGA). It is set to use the DisplayPort, however, and changing it between the two has no effect. (It actually stated it was running off the DP when it in fact was on the VGA.)

EDIT: I just saw your comment on the logfile. I'll include the full output, xz compressed.
Created attachment 104660 [details] full Xorg.0.log
Ok, that all looks sane. It just looks like the window is positioned incorrectly. Can you install xtrace, and use that to capture the calls made by xbmc as you tell it to switch outputs?
All I am getting with xtrace is this:

$ xtrace /usr/bin/xbmc -fs | tee -a xbmc-xtrace.txt
Function File Line
--------------------------------------------------------------------------------
addr2line: /usr/bin/xbmc: File format not recognized
addr2line: /usr/bin/xbmc: File format not recognized
addr2line: /usr/bin/xbmc: File format not recognized
addr2line: /usr/bin/xbmc: File format not recognized
addr2line: /usr/bin/xbmc: File format not recognized
addr2line: /usr/bin/xbmc: File format not recognized
addr2line: /usr/bin/xbmc: File format not recognized
addr2line: /usr/bin/xbmc: File format not recognized
Press return here to close xterm(/usr/bin/xbmc).

$ ls -al *txt
-rw-r--r-- 1 martin users 1035 14.08.2014 16:01 vainfo.txt
-rw-r--r-- 1 martin users  486 15.08.2014 10:57 xbmc-xtrace.txt

$ cat xbmc-xtrace.txt
Function File Line
--------------------------------------------------------------------------------
Function File Line
--------------------------------------------------------------------------------
Function File Line
--------------------------------------------------------------------------------

It is probably not being run correctly. Actually, it had a dependency on 'xterm' (which also did not print anything in it). I tried running it on xbmc.bin as it appeared to need a binary, but it still did not print any functions. However, I ran an strace instead, which I'll include here.
Created attachment 104663 [details] strace of XBMC process
Hmm, I expected Ubuntu to package the right xtrace. See http://xtrace.alioth.debian.org/
Knew something was fishy here. The /usr/bin/xtrace command is bundled with glibc(!). The one which I compiled is 'x11trace'. Sorry about that. I'm running a new trace now.
Created attachment 104672 [details]
_proper_ x11trace of xbmc

Proper xtrace/x11trace. :P

What is being performed after starting it:
- System -> Settings -> System -> Video Output -> Monitor (states DP2, although the active output is VGA)
- Change monitor output to VGA (reverts to same monitor; technically correct, although all outputs are treated as one)
- Change output to 'Default' (reverts to same monitor)
- Main menu -> Exit XBMC
Looked through the xtrace. I can see that it queries the screen configuration through RANDR, but I couldn't spot if it took any action. So for whatever reason it is blissfully ignorant. Before I try stepping through xbmc, can you grab an xtrace against a working xbmc setup?
That was easier said than done, actually. I spent quite a bit of time trying to get xtrace to actually bind to the correct X server instance, despite specifying it as an argument:

/usr/local/bin/xtrace --display :0.0 xlogo
No display name to create specified, trying :9

Until I noticed that you need the '-n' option to bypass xauth(!). Not sure why this was not needed on the other test system, but it is obviously using a newer X.org.

Anyway, I have now created the output and am attaching it above. The same basic steps were taken, i.e.:
- System -> Settings -> System -> Video Output -> Monitor (set to 'Default'/'HDMI1')
- Change monitor output to HDMI2, which changes successfully to my PJ. (HDMI1 is my LCD screen.)
- Main menu -> Exit XBMC
Created attachment 104774 [details] xtrace from working xbmc setup (using driver from 2.21.9-0ubuntu0~raring)
Ok, figure this one out. This is the movement of the DRI2 (GL) window for xbmc:

working:
005:>:0cfd: Event DRI2-BufferSwapComplete(103) drawable=0x00000002 ust_hi=35651596 ust_lo=102 msc_hi=533911098 msc_lo=0 sbc_hi=26050340 sbc_lo=557
005:>:0cfd: Event PropertyNotify(28) window=0x0220000b atom=0x1b6(unrecognized atom) time=0x1a24d24e state=NewValue(0x00)
005:<:0cfe: 16: Request(2): ChangeWindowAttributes window=0x0220000b value-list={cursor=None(0x00000000)}
005:<:0cff: 8: Request(95): FreeCursor cursor=0x0220000f
005:<:0d00: 8: Request(10): UnmapWindow window=0x0220000b
005:<:0d01: 4: Request(43): GetInputFocus
005:>:0d01: Event UnmapNotify(18) event=0x0220000b window=0x0220000b from-configure=false(0x00)
005:>:0d01:32: Reply to GetInputFocus: revert-to=Parent(0x02) focus=0x01800aef
005:>:0d01: Event PropertyNotify(28) window=0x0220000b atom=0x127("_NET_WM_STATE") time=0x1a24d334 state=NewValue(0x00)
005:>:0d01: Event PropertyNotify(28) window=0x0220000b atom=0x1cf(unrecognized atom) time=0x1a24d334 state=NewValue(0x00)
005:>:0d01: Event (generated) ConfigureNotify(22) event=0x0220000b window=0x0220000b above-sibling=None(0x00000000) x=1920 y=0 width=1914 height=745 border-width=0 override-redirect=false(0x00)

broken:
004:>:101d: Event DRI2-BufferSwapComplete(102) drawable=0x00000002 ust_hi=44040231 ust_lo=2 msc_hi=643714655 msc_lo=0 sbc_hi=44367 sbc_lo=290
004:>:101d: Event KeyRelease(3) keycode=0x24 time=0x008ce534 root=0x000000a0 event=0x02a00026 child=0x02a00027 root-x=888 root-y=679 event-x=888 event-y=679 state=Mod2 same-screen=true(0x01)
004:<:101e: 16: Request(2): ChangeWindowAttributes window=0x02a00026 value-list={cursor=None(0x00000000)}
004:<:101f: 8: Request(95): FreeCursor cursor=0x02a0002a
004:<:1020: 8: Request(10): UnmapWindow window=0x02a00026
004:<:1021: 4: Request(43): GetInputFocus
004:>:1021: Event UnmapNotify(18) event=0x02a00026 window=0x02a00026 from-configure=false(0x00)
004:>:1021: Event FocusOut(10) detail=Nonlinear(0x03) event=0x02a00026 mode=Normal(0x00)
004:>:1021: Event FocusIn(9) detail=Pointer(0x05) event=0x02a00026 mode=Normal(0x00)
004:>:1021: Event KeymapNotify(11) keys(0-7 omitted)=0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00;
004:>:1021: Event LeaveNotify(8) detail=Virtual(0x01) mode=Normal(0x00) flags=focus,same-screen time=0x008ce8b4 root=0x000000a0 event=0x02a00026 child=0x02a00027 root-x=888 root-y=679 event-x=888 event-y=679 state=Mod2
004:>:1021:32: Reply to GetInputFocus: revert-to=PointerRoot(0x01) focus=PointerRoot(0x00000001)
004:>:1021: Event PropertyNotify(28) window=0x02a00026 atom=0x136("_NET_WM_STATE") time=0x008ce8be state=NewValue(0x00)
004:>:1021: Event PropertyNotify(28) window=0x02a00026 atom=0x1e2(unrecognized atom) time=0x008ce8be state=NewValue(0x00)
004:>:1021: Event (generated) ConfigureNotify(22) event=0x02a00026 window=0x02a00026 above-sibling=None(0x00000000) x=0 y=0 width=1918 height=791 border-width=0 override-redirect=false(0x00)

The window is moved externally to that request stream. :|
I see. Interesting. I notice the non-working setup is trying to set the focus to the root-window, instead of inheriting the parent: 004:>:1021:32: Reply to GetInputFocus: revert-to=PointerRoot(0x01) focus=PointerRoot(0x00000001) 004:>:1021: Event PropertyNotify(28) window=0x02a00026 So in terms of a possible fix, I guess this will need some deeper digging to see why the apparent disconnect between the application calls and the incorrect DRI2 window placements occur? (Especially if more layers in the gfx-stack are involved.)
(In reply to comment #16)
> I see. Interesting. I notice the non-working setup is trying to set the
> focus to the root-window, instead of inheriting the parent:
>
> 004:>:1021:32: Reply to GetInputFocus: revert-to=PointerRoot(0x01)
> focus=PointerRoot(0x00000001)

Don't worry, this is just noise from XSync() (which generates a round trip; querying the focus on the root window is a request that should be impossible to error, and so is safest to use).

> 004:>:1021: Event PropertyNotify(28) window=0x02a00026
>
> So in terms of a possible fix, I guess this will need some deeper digging to
> see why the apparent disconnect between the application calls and the
> incorrect DRI2 window placements occur? (Especially if more layers in the
> gfx-stack are involved.)

Yeah, it seems simple enough that it can no longer find something it was looking for, but I can't see what. I'll try and get xbmc set up this week on a multimonitor system and see if I can trace the problem.
Ok, sounds good. If you need me for any additional testing, let me know. (I've had quite good success running XBMC on Intel so far, with the exception of this issue and proper 23.976Hz output on Ivy Bridge, which is apparently a hardware limitation...)
@Chris: this is technically a workaround, but one of the XBMC (I refuse to refer to it by its new baby-toy name) devs just made a patch where it uses XMoveWindow after mapping to force XBMC to the desired position. I tested it moments ago and it worked brilliantly. I am unsure whether the behaviour we are seeing potentially affects other software trying to bind to a specific output port, but I thought I'd let you know in case you check out this branch (FernetMenta's) - or the master branch after the PR has been approved - and find that it works. :-)
Still deep in other work atm, so I haven't had a chance to work on this yet. I have pulled down xbmc.git though... If you have a pointer to the patch that makes it work for you (and yes, I was expecting to see an XMoveWindow), that would be useful to highlight the code that is meant to work before the patch.
I understand that completely. Here is the aforementioned patch which utilizes XMoveWindow:
https://github.com/FernetMenta/xbmc/commit/d6eb9c3bd6945948d6ae3bb0fa00196b27c3aa05
A glimmer of understanding. That decidedly points to the issue being the _NET_WM_STATE fullscreen=true request of a window at (x_output, y_output) going wrong.

To clarify, it switched between desktops with Cinnamon and failed with everything else?

Of course, the puzzling mystery remains what actually changed of significance in the XRandR response. Just in case, can you run

$ xrandr --verbose

on the working / broken setups?
Created attachment 105025 [details] xrandr --verbose from working setup
Created attachment 105026 [details] xrandr --verbose from nonworking setup
Try xrandr --noprimary
> To clarify, it switched between desktops with Cinnamon and failed with
> everything else?

Sorry for the late reply; I had to pull the xrandr output from a system at a different location (and obviously xrandr requires a local X session).

I have not tested the 'XMoveWindow patch' with WMs other than Cinnamon. I did, however, test XBMC with numerous other WMs before (both with the older i965 driver and current ones), and found Cinnamon to be the one which behaved most correctly/seamlessly. For instance, fluxbox (which the XBMC devs insist on using), as well as IceWM and XFCE4, never acknowledged the xrandr settings at all, despite my setting these manually before and during an XBMC run. It always output on the first available port, at 0,0 coordinates. (This was on the working setup, i.e. the 1.2.1 driver.) With 1.2.1, Unity did appear to work, although it output everything at 60Hz (same as the GUI), and not 23.97Hz (from my custom Xorg.conf modeline) as it is supposed to.

The only things not working properly (either as a result of the drivers or the dual-head config) are proper 50Hz output, which is juddery, and the "pure" 23.976Hz (or 23.976023976 to be finicky) output that I mentioned before on Ivy Bridge, which I am unsure actually _can_ be fixed at the driver level. (It still drops a frame once every few minutes.)

> $ xrandr --verbose
> on working / broken setups?

I've included this output as attachments.
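As an aside, that "pure" 23.976023976 figure is the NTSC-derived 24000/1001 rate; a quick illustrative check (Python, just arithmetic, not XBMC code):

```python
# 24 fps film content slowed by the NTSC factor of 1000/1001.
ntsc_film = 24 * 1000 / 1001
print(round(ntsc_film, 9))  # 23.976023976
```

If the output instead ran at exactly 24Hz, the ~0.024Hz mismatch would accumulate to roughly one frame of drift every 42 seconds, so an occasional dropped or repeated frame is the expected symptom whenever the exact rate can't be achieved.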
$ xrandr --noprimary

does not print a thing (with or without --verbose). This is on the non-working setup (the other is not available atm.)
Look at xrandr --verbose after xrandr --noprimary, and see if that changes behaviour with xbmc.
Ah, gotcha. :) I've tested this now, and it does not appear to have an impact on XBMC. It still reports it is running on 'DP2' (despite being on VGA1), and changing the output ports does nothing. I've included the xrandr output (after setting --noprimary) anyway in case it is of interest.
Created attachment 105027 [details] xrandr --verbose from nonworking setup after setting --noprimary
Well, I just did a diff and the only difference between those was the timestamps, so not very useful... ;)
Right, it is still reporting VGA1 as primary. Try changing it with 'xrandr --output DP1 --primary'.
One of the final questions is where is Primary coming from....
Created attachment 105028 [details]
xrandr --verbose from nonworking setup after setting --primary to DP2

Still no dice, unfortunately. It does report the correct output as primary (DP2 - don't ask me where DP1 has gone), but XBMC still starts up on the wrong screen, and cycling through the outputs again does nothing.
I'm baffled by how you select which output to run xbmc on (looking at system/settings/video_output)
Ok, ubuntu + xbmc-13.1 doesn't have the monitor option as far as I can see.
Built xbmc 14.0 from commit 60faade477359e06b4e8be74425c63bf56f927ba

Merge: 1d56a78 fe0726c
Author: Tobias Arrskog <topfs2@xbmc.org>
Date: Fri Aug 22 19:37:53 2014 +0200

and it switched between hdmi and lvds correctly.
There have been issues previously (before SDL was dropped) in selecting monitor outputs on Linux, but I assumed this would have made it into 13.1 (I've been using this functionality for over a year). Looks like perhaps 13.2 or a nightly build (ppa:team-xbmc/xbmc-nightly) would be needed.

Recently I've always compiled XBMC from source after checking it out from git, and am currently running mainline 14.0-ALPHA1 Git:2014-07-07-dae6f76; before that, the Xvba branch using their repos. Most of the Xvba stuff is merged into mainline, but there seems to be some delay in getting it into current release versions, if what you are stating is correct. If the version of XBMC you're running is indeed relying on SDL (which isn't dual-head aware), you should only be able to see the 'Default' output in the list.

EDIT: just saw that you tested a 14.0 version from source. It probably already has the XMoveWindow patch included.
I've checked out commit 303d857f98601967a4ff5ddb2592c3d8963a3081

Merge: 0ad5b5d 0136c28
Author: Memphiz <memphis@machzwo.de>
Date: Wed Aug 20 11:16:52 2014 +0200

which is the point before commit 6e7bc91d28c232676ae41abf4d819e598c019bdd

Author: Rainer Hochecker <fernetmenta@online.de>
Date: Wed Aug 20 14:57:07 2014 +0200

    X11: fix multi-monitor setups after recent changes of X servers

and that commit doesn't work on any version of the ddx here.
(In reply to comment #39)
> and that commit doesn't work on any version of the ddx here.

Ah. No, it behaves differently depending on whether the monitor is actually switched on. If the monitor is switched off, xrandr still has the connection (and the monitor registered as an active part of the fb), but xbmc doesn't move.
And now, it is failing again with everything. Starting to look like a race condition with xbmc and the window manager.
Not sure what is going on here. Which window manager are you testing with?

XBMC has behaved consistently on all three systems I've tested. The first was an Ubuntu Raring (13.04) IVB system upgraded to Saucy (13.10) with the provided i965, libdrm and libva packages. That system was later manually upgraded (I packaged the debs myself) and tested with:
- xserver-xorg-video-intel_2.99.911
- libva-intel-driver_1.3.1
- libva_1.3.1
- libva-x11-1_1.3.1
- vainfo_1.3.1-1
- libva-drm1_1.3.1
- libva-glx1_1.3.1
- libdrm2_2.4.54
- libdrm-intel1_2.4.54

And the problem re-appeared. Downgrading those to the known-good versions mentioned earlier resolved the issue (using the exact same XBMC version).

I also tested it on a fresh Mint 16 installation (also on IVB, using packages from Saucy/13.10 on which it is based), and reproduced it.

The latest test system is an ArchLinux derivative (Manjaro) (on HWE), which runs the following:
- xf86-video-intel 2.99.912-2
- libva 1.3.1-2
- libva-intel-driver 1.3.2-1
- libdrm 2.4.56-1

The Mint 16 / Ubuntu 13.10 test systems used the provided packages for 13.10 from 01.org, which at the time were:
- libva 1.2.1
- libdrm 2.4.46
- xserver-xorg-video-intel 2.99.904

All my latest tests were done with the Cinnamon WM (2.2.13); Mint 16 uses the older 2.0.14. However, I did test a range of WMs earlier, and could not get a single one of them to successfully change the output. One of the specific versions I used (apart from the ones compiled from git) was:

https://launchpad.net/~wsnipex/+archive/ubuntu/xbmc-fernetmenta-master/+build/6167089/+files/xbmc_14.0%7Egit20140709.0500-607950d-0saucy_all.deb
https://launchpad.net/~wsnipex/+archive/ubuntu/xbmc-fernetmenta-master/+build/6167088/+files/xbmc-bin_14.0%7Egit20140709.0500-607950d-0saucy_amd64.deb

Unsure if that is of any help though. The behaviour you are seeing is not something I've encountered. Are you testing on a laptop? Maybe there is an issue with LVDS that I have not witnessed before.
I'm using awesome -- I thought it would be a small enough ewmh compliant window manager to figure out what is going on. So far, I can see the request for the fullscreen window on the HDMI1 output, I can see awesome place it there, and then, for some reason I haven't fathomed yet, it moves it (when processing the MapRequest) to only occupy the LVDS.

By learning a little bit of Lua (I am starting to like that little embedded interpreter!), I found that awesome was moving the newly mapped windows to the same screen as the mouse. So the race appears to be between the XWarpPointer performed by xbmc and the processing of the MapRequest by the window manager. (The dependence upon the ddx is simply whether or not we block doing some update or not, changing the relative timings of event processing and client priorities.)

The correct fix is then:

diff --git a/xbmc/windowing/X11/WinSystemX11.cpp b/xbmc/windowing/X11/WinSystemX11.cpp
index 018a4d2..76f33c1 100644
--- a/xbmc/windowing/X11/WinSystemX11.cpp
+++ b/xbmc/windowing/X11/WinSystemX11.cpp
@@ -151,7 +151,6 @@ bool CWinSystemX11::DestroyWindow()
     CWinEventsX11Imp::Quit();
 
   XUnmapWindow(m_dpy, m_mainWindow);
-  XSync(m_dpy,TRUE);
   XDestroyWindow(m_dpy, m_glWindow);
   XDestroyWindow(m_dpy, m_mainWindow);
   m_glWindow = 0;
@@ -659,13 +658,13 @@ bool CWinSystemX11::Restore()
 bool CWinSystemX11::Hide()
 {
   XUnmapWindow(m_dpy, m_mainWindow);
-  XSync(m_dpy, False);
+  XFlush(m_dpy);
   return true;
 }
 bool CWinSystemX11::Show(bool raise)
 {
   XMapWindow(m_dpy, m_mainWindow);
-  XSync(m_dpy, False);
+  XFlush(m_dpy);
   m_minimized = false;
   return true;
 }
@@ -931,7 +930,6 @@ bool CWinSystemX11::SetWindow(int width, int height, bool fullscreen, const std:
   class_hints->res_class = (char*)classString.c_str();
   class_hints->res_name = (char*)classString.c_str();
 
-  XSync(m_dpy,False);
   XSetWMProperties(m_dpy, m_mainWindow, &windowName, &iconName,
                    NULL, 0, NULL, wm_hints, class_hints);
@@ -942,14 +940,12 @@ bool CWinSystemX11::SetWindow(int width, int height, bool fullscreen, const std:
     Atom wmDeleteMessage = XInternAtom(m_dpy, "WM_DELETE_WINDOW", False);
     XSetWMProtocols(m_dpy, m_mainWindow, &wmDeleteMessage, 1);
   }
+
+  XWarpPointer(m_dpy, None, m_mainWindow, 0, 0, 0, 0, mouseX*width, mouseY*height);
+
   XMapRaised(m_dpy, m_glWindow);
   XMapRaised(m_dpy, m_mainWindow);
-  XSync(m_dpy,TRUE);
-
-  if (changeWindow && mouseActive)
-  {
-    XWarpPointer(m_dpy, None, m_mainWindow, 0, 0, 0, 0, mouseX*width, mouseY*height);
-  }
+  XFlush(m_dpy);
 
   CDirtyRegionList dr;
   RefreshGlxContext(m_currentOutput.compare(output) != 0);
Interesting. Are you saying I should drop XMoveWindow and use the change shown in your last post?
(In reply to comment #44)
> Interesting. Are you saying I should drop XMoveWindow and use the change
> shown in your last post?

I think your window creation is correct without the addition of the XMoveWindow, and that the final placement is being messed up by the WM. Judging by awesome, doing the WarpPointer before the MapWindow should prevent that class of "place new windows on the same screen as the mouse" issues. Eliminating the extra XSync (and especially that spurious discard-all-events!) is just extra polish. Not sure about the XSync around the GL context manipulation though - they shouldn't be required, but I expect they are there due to experiments.
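For illustration only (not XBMC or awesome code, names are made up), the race can be modelled as a window manager that places each newly mapped window on whichever monitor contains the pointer when it handles the MapRequest; warping the pointer before mapping makes the outcome independent of event timing:

```python
# Toy model: the WM places a newly mapped window on the monitor that
# contains the pointer at the moment it processes the MapRequest.
def final_monitor(requests, pointer_monitor):
    window_monitor = None
    for req in requests:
        if req[0] == "warp":            # XWarpPointer moves the pointer
            pointer_monitor = req[1]
        elif req[0] == "map":           # WM places the window at the pointer
            window_monitor = pointer_monitor
    return window_monitor

# Old order: map first, warp afterwards -> window lands on the old monitor.
print(final_monitor([("map",), ("warp", "HDMI1")], "LVDS1"))  # LVDS1
# Fixed order: warp first, then map -> window follows the warp target.
print(final_monitor([("warp", "HDMI1"), ("map",)], "LVDS1"))  # HDMI1
```

In the real system the "old order" outcome is also timing-dependent (the warp races the WM's MapRequest handling), which is why the bug looked like a race rather than a deterministic failure.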
Thank you very much for your effort. I will test with your proposed changes and update XBMC.
Your proposed changes do not work here with LXDE. I am testing with an HDMI monitor and a VGA one. If I place HDMI right-of VGA, it works even without the XMoveWindow call. If I place VGA right-of HDMI, it fails: the application window is placed on VGA instead of HDMI and has the wrong resolution (it has the HDMI resolution).

I get the same failing behavior regardless of how I open XBMC. I tried from a terminal opened on HDMI (mouse also on HDMI) and via a remote terminal.

This odd behavior shows for all applications: every app I open via the LXDE menu is placed on the VGA. Fluxbox and Openbox show the same issue, which makes me think it is not a WM problem.
I can confidently state that the placement of the window is due to window manager. I can try lxde/fluxbox later today, but since there is no internal translation of the window inside X or the ddx, it has to be from the window manager.
I agree, the window is placed by the WM, and if I set the override-redirect flag, the window is positioned correctly. But it is hard to believe that all WMs suddenly start failing. Maybe they get some incorrect information from the X server?
In my awesome example, no. I ran the entire session under xtrace and watched the configure requests go into the WM and be transformed there. In that case, it is quite easy to see how there would be a race between moving the mouse and the position of a new window. (I am pretty sure mutter and unity do the same placement on the screen next to the mouse as well.) However, I'll need to dig into fluxbox to see what is going on there.
Openbox and LXDE have a configuration file where you can influence the placement of windows:

<placement>
  <policy>Smart</policy>
  <!-- 'Smart' or 'UnderMouse' -->
  <center>yes</center>
  <!-- whether to place windows in the center of the free area found or the top left corner -->
  <monitor>Mouse</monitor>
  <!-- with Smart placement on a multi-monitor system, try to place new windows on:
       'Any' - any monitor, 'Mouse' - where the mouse is, 'Active' - where the active window is -->
</placement>

If I set monitor to Mouse, your changes work. The default "Any" shows this odd behavior. I am still wondering what in the underlying layers caused this changed behavior. Anyway, at least we know how to make it work.
For the record: removing the XSync calls breaks refresh rate switching, i.e. playing a 23.976 video that triggers a change of refresh rate. Without XSync, the screen starts flashing: black - half the WM wallpaper - black - and so on.
(In reply to comment #52)
> for the record:
> removing the XSync calls breaks refresh rate switching, i.e. playing a
> 23.976 video that triggers a change of refresh rate. Without XSync screen
> starts flashing: black - half the WM wallpaper - black - and so on.

Now that strikes me as a driver bug...
Didn't have problems with Openbox. Interestingly, it calls its default "place new windows on the primary monitor". How do you make xbmc change the refresh rate? Just play a movie and allow it to change modes to the one closest to the desired rate?
(In reply to comment #54)
> Didn't have problems with Openbox. Interestingly it calls its default "place
> new windows on the primary monitor". How do you make xbmc change refresh
> rate? Just play a movie and allow it change modes to one closest the desired?

Note that Openbox (default config) worked here too if HDMI was configured right-of VGA; it failed with HDMI left-of VGA.

You need to turn on "Adjust display refresh rate to match video":
http://wiki.xbmc.org/index.php?title=Settings/Videos

The refresh rate will switch if it finds an acceptable match (XBMC needs to be in full screen mode).
xbmc-xrandr, you have to be kidding me.

At least do

diff --git a/xbmc/windowing/X11/XRandR.cpp b/xbmc/windowing/X11/XRandR.cpp
index c94f2e3..4fdaeeb 100644
--- a/xbmc/windowing/X11/XRandR.cpp
+++ b/xbmc/windowing/X11/XRandR.cpp
@@ -74,7 +74,7 @@ bool CXRandR::Query(bool force, int screennum, bool ignoreoff)
   CStdString cmd;
   cmd  = getenv("XBMC_BIN_HOME");
   cmd += "/xbmc-xrandr";
-  cmd = StringUtils::Format("%s -q --screen %d", cmd.c_str(), screennum);
+  cmd = StringUtils::Format("%s -q --screen %d --current", cmd.c_str(), screennum);
 
   FILE* file = popen(cmd.c_str(),"r");
   if (!file)
Why? that's not what we want. We want to query data for a particular screen because we support multi-screen setups.
Disregard my last comment. Why should we avoid polling for hw changes? This should support hotplugging a monitor.
XRandR has two calls: XRRGetScreenResources() and XRRGetScreenResourcesCurrent(). The first forces the X server and kernel to do a laborious and expensive hardware probe to see exactly what is attached. The second, added later when it was realised just how stupid that was given hardware and kernels that do autodetection of new hardware, just queries the kernel for the current set of outputs and their configuration.

xrandr --current uses XRRGetScreenResourcesCurrent(), and does not imply only querying the current output. It is the right call to use after receiving a hotplug notification via an XRRNotifyEvent (which is X telling you that the current configuration has indeed changed). Without --current, the call can take up to a few seconds to complete and cause the displays to flash (though typically, with good hardware, only 0.2-0.5s). Do not use this unless you know exactly what you are doing at the hardware level.
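The policy described here (one expensive probe up front, cheap current-state queries on every hotplug) can be sketched as follows. This is an illustrative Python model, not XBMC code: full_probe() stands in for XRRGetScreenResources() and on_hotplug() for a handler that calls XRRGetScreenResourcesCurrent():

```python
# Illustrative model only: count how often each kind of query happens
# under the "probe once, then query current state" policy.
class OutputConfig:
    def __init__(self):
        self.full_probes = 0
        self.cheap_queries = 0
        self.full_probe()          # one expensive hardware probe at startup

    def full_probe(self):
        self.full_probes += 1      # XRRGetScreenResources(): slow, may flash

    def on_hotplug(self):
        self.cheap_queries += 1    # XRRGetScreenResourcesCurrent(): cheap;
                                   # the kernel already detected the change

cfg = OutputConfig()
for _ in range(3):                 # three hotplug notifications arrive later
    cfg.on_hotplug()
print(cfg.full_probes, cfg.cheap_queries)  # 1 3
```

The point is that hotplug notifications already imply the kernel knows the new state, so re-probing the hardware on each one buys nothing and costs seconds.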
Also note that X will generate multiple RRScreenChangeNotify events for a "single" update. It is better to batch them up and process them in a single pass.
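The batching Chris suggests can be sketched independently of Xlib: treat notifications that arrive in a tight burst as one logical change and run a single configuration pass for the whole burst. A minimal illustration (names and the coalescing window are made up for the example):

```python
# Coalesce bursts of change notifications: events arriving within `window`
# seconds of the previous one are treated as part of the same update, so a
# burst of RRScreenChangeNotify-style events triggers only one rescan.
def count_rescans(event_times, window=0.1):
    rescans = 0
    last = None
    for t in sorted(event_times):
        if last is None or t - last > window:
            rescans += 1       # process the whole batch in a single pass
        last = t
    return rescans

# One hotplug producing three rapid events, then an unrelated later event:
print(count_rescans([0.00, 0.01, 0.02, 5.00]))  # 2
```

In real code the equivalent is draining all pending XRR events (e.g. while more are queued) before acting once, rather than recreating the output window per event.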
Oh fun. The flashing is due to setting m_windowDirty on every XRR event, which causes it to recreate the output window every time. I haven't yet explained how you triggered so many XRR events though, but I suspect xbmc-xrandr...
Thanks for all your input. I will do some investigation based on this new information. This can take a few days, I have a busy week.
No worries, if you come across any oddities let me know. We are skirting around some tricky code, there is likely a bug or two lurking in the driver.
The problem is that creation of XBMC's window triggers an RRScreenChangeNotify event. In order to be safe in all environments, we destroy/recreate the window in those situations. Why does window creation trigger this event?
I get incorrect results from XQueryPointer. It works on the working configuration, which is HDMI right-of VGA1, but with HDMI left-of VGA1 the x position is always 1920. The y position is correct.
(In reply to comment #56)
> xbmc-xrandr, you have to be kidding me.
>
> At least do
>
> diff --git a/xbmc/windowing/X11/XRandR.cpp b/xbmc/windowing/X11/XRandR.cpp
> index c94f2e3..4fdaeeb 100644
> --- a/xbmc/windowing/X11/XRandR.cpp
> +++ b/xbmc/windowing/X11/XRandR.cpp
> @@ -74,7 +74,7 @@ bool CXRandR::Query(bool force, int screennum, bool ignoreoff)
>    CStdString cmd;
>    cmd  = getenv("XBMC_BIN_HOME");
>    cmd += "/xbmc-xrandr";
> -  cmd = StringUtils::Format("%s -q --screen %d", cmd.c_str(), screennum);
> +  cmd = StringUtils::Format("%s -q --screen %d --current", cmd.c_str(), screennum);
>
>    FILE* file = popen(cmd.c_str(),"r");
>    if (!file)

I merged this change today and already got the first complaint. Only a single mode is available on startup without probing.

Log with this change: http://sprunge.us/aVIe

17:33:14 T:140080651007808 INFO: Output 'HDMI1' has 1 modes
17:33:14 T:140080651007808 INFO: ID:0x48 Name:4096x2160 Refresh:24.000000 Width:4096 Height:2160
17:33:14 T:140080651007808 INFO: Pixel Ratio: 0.938362

Log with the change reverted: http://sprunge.us/eFgV
51 modes available.
(In reply to comment #66)
> (In reply to comment #56)
> > xbmc-xrandr, you have to be kidding me.
> >
> > At least do
> >
> > diff --git a/xbmc/windowing/X11/XRandR.cpp b/xbmc/windowing/X11/XRandR.cpp
> > index c94f2e3..4fdaeeb 100644
> > --- a/xbmc/windowing/X11/XRandR.cpp
> > +++ b/xbmc/windowing/X11/XRandR.cpp
> > @@ -74,7 +74,7 @@ bool CXRandR::Query(bool force, int screennum, bool ignoreoff)
> >    CStdString cmd;
> >    cmd  = getenv("XBMC_BIN_HOME");
> >    cmd += "/xbmc-xrandr";
> > -  cmd = StringUtils::Format("%s -q --screen %d", cmd.c_str(), screennum);
> > +  cmd = StringUtils::Format("%s -q --screen %d --current", cmd.c_str(), screennum);
> >
> >    FILE* file = popen(cmd.c_str(),"r");
> >    if (!file)
>
> I merged this change today and already got the first complaint. Only a
> single mode is available on startup without probing:

Exactly. Nothing is probed until someone explicitly requests it. Do it once, preferably outside of xbmc since it doesn't do multimonitor configuration, and then anytime you receive a hotplug notification.
Some additional information, which may or may not be relevant to this bug. There are still issues with Xorg being able to set the correct modes for dual-head, specifically the refresh rate.

With 13.04, setting 24Hz & 23.976Hz on HDMI2 was not an issue (it worked well: 24Hz was set for true 24Hz content, and 23.976Hz (= 24*1000/1001) for U.S.-hobbled 24Hz content). With the newer drivers from 14.04 (using the latest 01.org-provided ones), xrandr tries to set this (seemingly successfully), but the output always runs at 60Hz regardless. The behaviour is the same for 50Hz (also stuck at 60Hz). The main LCD monitor runs at 60Hz, but even turning that monitor off (or using XBMC's 'blank other displays' setting) has zero effect. However, this occurs independently of whether XBMC is running. (The problem can also be verified from the desktop.)

Driver version rehash:
- i965-va-driver:amd64 1.3.2-1
- libva1:amd64 1.3.1-3
- libdrm2:amd64 2.4.54
- xserver-xorg-video-intel 2.99.911

Modes seen by xrandr:

Screen 0: minimum 8 x 8, current 3840 x 1200, maximum 32767 x 32767
VGA1 disconnected (normal left inverted right x axis y axis)
HDMI1 connected primary 1920x1200+0+0 (normal left inverted right x axis y axis) 518mm x 324mm
   1920x1200        59.6*+  60.0    59.6*
   1024x768         60.0 +  75.1    75.0    70.1    60.0
   1920x1080        60.0    60.2
   1600x1200        60.0    60.1
   1280x1024        76.0    76.0    75.0    60.0    60.0    60.0
   1152x921         66.0    66.0
   1152x864         75.0
   832x624          74.6    74.5
   800x600          72.2    75.0    60.3    56.2
   640x480          75.0    72.8    75.0    72.8    66.7    60.0    60.0
   720x400          70.1
DP1 disconnected (normal left inverted right x axis y axis)
HDMI2 connected 1920x1080+1920+0 (normal left inverted right x axis y axis) 1600mm x 900mm
   1920x1080        60.0*+  50.0    59.9    24.0    24.0
   1920x1080@60     60.0
   1920x1080@50     50.0
   1920x1080@59.94p 59.9
   1920x1080i@60    60.1
   1920x1080i@50    50.0
   1920x1080@24     24.0    24.0
   1920x1080i       60.1    50.0    60.0
   1280x720@60      60.0
   1280x720@50      50.0
   1280x720         60.0    50.0    59.9
   1440x576@50      50.0
   1440x576         50.0
   1440x576i@50     50.1
   1440x480         60.0    59.9
   1440x480@59.9    59.9
   1440x480i@59.9   60.1
   720x576@50       50.0
   720x576          50.0
   720x576i         50.1
   720x480          60.0    59.9
   720x480@59.9     59.9
   720x480i         60.1    60.1
   640x480@60       60.0    60.0
   640x480          60.0    59.9
   640x480@59.9     60.0
DP2 disconnected (normal left inverted right x axis y axis)
VIRTUAL1 disconnected (normal left inverted right x axis y axis)

Running 'xrandr --output HDMI2 --mode 1920x1080@24' from within the desktop tries to set 24Hz, but the output still runs at 60Hz.

Surely I can't be the only one experiencing this issue? (Are there really that few people running XBMC on Intel with dual-head?)

btw. this is my fourth upgrade (and subsequent immediate downgrade) on this system from 13.04 to 13.10 & 14.04 (assumed to be the final one), as XBMC was worked around to correctly start on the second display with the latest patches. But obviously the picture is juddery as hell when trying to output 23.976Hz on a 60Hz display (especially when the video renderer _thinks_ it is on the correct refresh rate).