Bug 81476 - three monitors on two radeon cards works with some layouts not others
Summary: three monitors on two radeon cards works with some layouts not others
Status: RESOLVED MOVED
Alias: None
Product: DRI
Classification: Unclassified
Component: DRM/Radeon (show other bugs)
Version: unspecified
Hardware: x86-64 (AMD64) Linux (All)
Priority: medium
Severity: major
Assignee: Default DRI bug account
QA Contact:
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2014-07-18 02:28 UTC by Ian! D. Allen
Modified: 2019-11-19 08:53 UTC (History)
0 users

See Also:
i915 platform:
i915 features:


Attachments
Xorg.0.log (304.15 KB, text/plain)
2014-07-18 02:28 UTC, Ian! D. Allen
no flags Details
Xorg.0.log with Segmentation fault at address 0x10 (71.16 KB, text/plain)
2014-07-18 17:21 UTC, Ian! D. Allen
no flags Details
display scanout limit is 4k on avivo boards (1.11 KB, patch)
2014-07-18 21:53 UTC, Alex Deucher
no flags Details | Splinter Review

Description Ian! D. Allen 2014-07-18 02:28:15 UTC
Created attachment 103009 [details]
Xorg.0.log

Summary: Three monitors on two cards work fine in some layouts but not others.

Ubuntu 12.04.4 LTS with updated kernel and Xorg from Ubuntu "Trusty".
Two ATI FireMV 2250 cards with three 1600x1200 Dell LCD monitors.
(More details below.)
No xorg.conf file - using defaults.

Works fine out-of-the-box when configured with two screens adjacent on one card and the third screen (on the second card) anywhere above or below either of the two adjacent screens.  All three monitors display their screens nicely and I can drag windows to all three screens on all three monitors.

Relevant xrandr for the above working configuration:
Screen 0: minimum 320 x 200, current 3200 x 2400, maximum 8192 x 8192
DVI-0 connected primary 1600x1200+0+0 (normal left inverted right x axis y axis) 367mm x 275mm
DVI-1 connected 1600x1200+1600+0 (normal left inverted right x axis y axis) 367mm x 275mm
DVI-1-2 connected 1600x1200+0+1200 367mm x 275mm
DVI-1-3 disconnected

If I use either xrandr or the Ubuntu "Screen Display" panel to move the third screen to be in-line with the other two, either to the right or to the left, it causes the video on the other two to fail and only the third screen to display correctly.  The failed screens have video garbage on them and don't change when I move windows onto them.  Switching to the console (CTRL+ALT+F1) and back causes the third monitor to re-display correctly but the other two monitors are now black.  Using the "Apply" button in "Screen Display" refreshes the two black monitors with more video garbage that changes with every "Apply".

If I move all three screens to be in-line and use the "Screen Display" "Rotation" option to set the two non-working screens to use any rotation or 180 flip, they clean up and display correctly on their monitors (in rotated or flipped mode, of course)!

When I restore the "Rotation" back to "Normal" on either screen, the video fails again with garbage.  Adding back rotation or 180 flip, the video cleans up again and displays correctly (in rotated or flipped mode, of course).

Things would probably work fine across all three monitors if I could flip 180 all three monitors and stand on my head, but for some reason the "Screen Display" configuration options allow rotation only on the two adjacent screens on the one card and not on the third screen on the other card.

It's odd that the video is fine when rotated or flipped, but is garbage when left as "Normal".  Alas, I can't easily use the screens when rotated or flipped.

Ubuntu 12.04 Package: xserver-xorg-lts-trusty  1:7.7+1ubuntu8~precise1

Hardware (from lspci):
04:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] RV516 GL [FireMV 2250]
04:00.1 Display controller: Advanced Micro Devices, Inc. [AMD/ATI] RV516 GL [FireMV 2250] (Secondary)
05:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] RV516 GL [FireMV 2250]
05:00.1 Display controller: Advanced Micro Devices, Inc. [AMD/ATI] RV516 GL [FireMV 2250] (Secondary)

Excerpts from /var/log/Xorg.0.log (full log attached):
X.Org X Server 1.15.1
Release Date: 2014-04-13
X Protocol Version 11, Revision 0
Build Operating System: Linux 2.6.42-54-generic x86_64 Ubuntu
Current Operating System: Linux home 3.13.0-32-generic #57~precise1-Ubuntu SMP Tue Jul 15 03:51:20 UTC 2014 x86_64
Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.13.0-32-generic root=UUID=49af0ade-c356-429f-a723-85d7358f0484 ro
Build Date: 29 May 2014  10:29:10AM
xorg-server 2:1.15.1-0ubuntu2~precise1 (For technical support please see http://www.ubuntu.com/support) 
Current version of pixman: 0.30.2
...
(II) LoadModule: "radeon"
(II) Loading /usr/lib/xorg/modules/drivers/radeon_drv.so
(II) Module radeon: vendor="X.Org Foundation"
  compiled for 1.15.1, module version = 7.3.0
  Module class: X.Org Video Driver
  ABI class: X.Org Video Driver, version 15.0
Comment 1 Alex Deucher 2014-07-18 03:16:04 UTC
Your desktop is too large for the 3D engine.  The max surface size supported by the r5xx 3D engine is 4096x4096.  1600*3 = 4800 pixels.  That's why it only works above and below.
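As a rough sketch of the arithmetic (assuming the 4096x4096 figure above), the constraint can be modeled as a bounding-box check over the configured outputs:

```python
# Rough model of the constraint described above: the desktop's bounding
# box must fit within the r5xx 3D engine's maximum surface size.
R5XX_MAX_SURFACE = 4096

def desktop_size(outputs):
    """outputs: iterable of (width, height, x, y) screen rectangles."""
    w = max(x + width for width, height, x, y in outputs)
    h = max(y + height for width, height, x, y in outputs)
    return w, h

def fits_3d_engine(outputs, limit=R5XX_MAX_SURFACE):
    w, h = desktop_size(outputs)
    return w <= limit and h <= limit

# Three 1600x1200 monitors in a row: 4800 wide, over the limit.
row = [(1600, 1200, 0, 0), (1600, 1200, 1600, 0), (1600, 1200, 3200, 0)]
# Two in a row plus one below (the working layout): 3200x2400, within it.
stacked = [(1600, 1200, 0, 0), (1600, 1200, 1600, 0), (1600, 1200, 0, 1200)]
```

This matches the observed behavior: the side-by-side row exceeds the limit while the stacked layout does not.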
Comment 2 Ian! D. Allen 2014-07-18 04:32:29 UTC
(In reply to comment #1)
> Your desktop is too large for the 3D engine.  The max surface size
> supported by the r5xx 3D engine is 4096x4096.  1600*3 = 4800 pixels.
> That's why it only works above and below.

I don't think that is the problem:

1. I'm not asking any one FireMV card to do more than two 1600x1200 monitors; that's only 3200x1200 pixels per card, well within 4096x4096.

2. Three adjacent monitors work fine (no video garbage) if I flip 180 the two affected screens.  (But then of course I can't use those screens unless I physically turn those two monitors upside-down!)  Here's the relevant xrandr output for this working configuration, with the first two screens inverted:

Screen 0: minimum 320 x 200, current 4800 x 1200, maximum 8192 x 8192
DVI-0 connected primary 1600x1200+0+0 inverted (normal left inverted right x axis y axis) 367mm x 275mm
DVI-1 connected 1600x1200+1600+0 inverted (normal left inverted right x axis y axis) 367mm x 275mm
DVI-1-2 connected 1600x1200+3200+0 367mm x 275mm
DVI-1-3 disconnected

3. The failure is always on the card with the two monitors, even if I configure the third monitor to the left of those two.  If I was going out-of-bounds, surely moving the screen from one side to the other side would affect where the error appeared?

4. Is the software really so badly written that it can't tell me that I'm going out-of-bounds?  I get no error from xrandr for the three-adjacent layout.
Comment 3 Alex Deucher 2014-07-18 13:21:24 UTC
(In reply to comment #2)
> (In reply to comment #1)
> > Your desktop is too large for the 3D engine.  The max surface size
> > supported by the r5xx 3D engine is 4096x4096.  1600*3 = 4800 pixels.
> > That's why it only works above and below.
> 
> I don't think that is the problem:
> 
> 1. I'm not asking any one FireMV card to do more than two 1600x1200
> monitors; that's only 3200x1200 pixels per card, well within 4096x4096.
> 

The way the core provider stuff in the xserver works, the rendering is done on one GPU and the respective updated regions are sent to the other displays.  If you want to use a dedicated GPU per display, you'll have to use zaphod mode and set up a static configuration in your xorg.conf.  If you want 3D, you'll have to use the displays independently.  If you enable xinerama, you won't get 3D.  There have been several proposals to extend the xserver to support splitting rendering across GPUs, but it is a huge amount of work and so far no one has finished.  See: http://bz.bzflag.bz/~Ch3ck/ for the latest proposal or google for "xorg shatter" for more information.
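A zaphod-mode setup of the kind mentioned above might be sketched like this (the BusIDs are taken from the lspci listing in the report; the identifiers and head split are illustrative placeholders, not a tested configuration):

```
Section "Device"
    Identifier "Card0"
    Driver     "radeon"
    BusID      "PCI:4:0:0"      # first FireMV (from lspci 04:00.0)
EndSection

Section "Device"
    Identifier "Card1"
    Driver     "radeon"
    BusID      "PCI:5:0:0"      # second FireMV (from lspci 05:00.0)
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "Card1"
EndSection

Section "ServerLayout"
    Identifier "Layout0"
    Screen  0 "Screen0" 0 0
    Screen  1 "Screen1" RightOf "Screen0"
EndSection
```

Each screen then renders on its own GPU, at the cost of a static layout and no single unified 3D desktop.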

> 2. Three adjacent monitors works fine (no video garbage) if I flip 180 the
> two affected screens.  (But then of course I can't use those screens unless
> I physically turn those two monitors upside-down!)  Here's the relevant
> xrandr output for this working configuration, with the first two screens
> inverted:
> 
> Screen 0: minimum 320 x 200, current 4800 x 1200, maximum 8192 x 8192
> DVI-0 connected primary 1600x1200+0+0 inverted (normal left inverted right x
> axis y axis) 367mm x 275mm
> DVI-1 connected 1600x1200+1600+0 inverted (normal left inverted right x axis
> y axis) 367mm x 275mm
> DVI-1-2 connected 1600x1200+3200+0 367mm x 275mm
> DVI-1-3 disconnected
> 
> 3. The failure is always on the card with the two monitors, even if I
> configure the third monitor to the left of those two.  If I was going
> out-of-bounds, surely moving the screen from one side to the other side
> would affect where the error appeared?

It depends on which card is the one doing the rendering and which is the one that is displaying only.

> 
> 4. Is the software really so badly written that it can't tell me that I'm
> going out-of-bounds?  I get no error from xrandr for the three-adjacent
> layout.

The OpenGL compositing manager you are running should check for the max limits of the GL driver before using it for surfaces which may exceed that.  If you disable the OpenGL compositor, your desktop should work fine.
Comment 4 Ian! D. Allen 2014-07-18 17:11:51 UTC
(In reply to comment #3)
> The way the core provider stuff in the xserver works the rendering
> is done on one GPU and the respective updated regions are sent to the
> other displays.

If the above limits are true, why do I get a working 4800x1200 desktop when I invert two of the three screens?  Why can it show my 4800x1200 desktop if two screens are upside-down but not if they are right-side-up?  How is this possible if 4800x1200 can't be done? [*]

> If you disable the OpenGL compositor, your desktop should work fine.

I don't know enough about the graphics chain to know how to "disable the OpenGL compositor".  All I do is start Xorg, a window manager, and some clients.  There is nothing mentioned about "composit" actually being used anywhere in the Xorg.0.log.  The net searches I do all find stuff that's years old.  Please point me at some current documentation on how to disable this?

> The OpenGL compositing manager you are running should check for the max
> limits of the GL driver before using it for surfaces which may exceed that. 

Yes, it should check instead of quietly screwing up.  Can it not query the GPU for the max limit?  (The max limit you say of 4096 is not obvious in the Xorg.0.log output anywhere.  Why does xrandr tell me "maximum 8192 x 8192"?)  To whom should I submit this "failure to check limit" bug report?

[*] The working 4800x1200 desktop with two screens inverted is unstable, since if I CTRL-ALT-F1 to a console, it dies with:
...
reporting 4 6 22 172
reporting 4 6 22 172
reporting 4 6 22 172
reporting 4 6 22 172
reporting 4 6 22 172
reporting 4 6 22 172
reporting 4 6 22 172
(II) AIGLX: Suspending AIGLX clients for VT switch
(EE)
(EE) Backtrace:
(EE) 0: /usr/bin/X (xorg_backtrace+0x55) [0x7f6e13e71b05]
(EE) 1: /usr/bin/X (0x7f6e13cba000+0x1bb9c9) [0x7f6e13e759c9]
(EE) 2: /lib/x86_64-linux-gnu/libpthread.so.0 (0x7f6e12fe0000+0xfcb0) [0x7f6e12fefcb0]
(EE) 3: /usr/bin/X (CreatePicture+0x3e) [0x7f6e13de897e]
(EE) 4: /usr/bin/X (0x7f6e13cba000+0xdc15e) [0x7f6e13d9615e]
(EE) 5: /usr/bin/X (0x7f6e13cba000+0x1a8758) [0x7f6e13e62758]
(EE) 6: /usr/bin/X (0x7f6e13cba000+0xea898) [0x7f6e13da4898]
(EE) 7: /usr/bin/X (BlockHandler+0x44) [0x7f6e13d144f4]
(EE) 8: /usr/bin/X (WaitForSomething+0x11d) [0x7f6e13e6ee4d]
(EE) 9: /usr/bin/X (0x7f6e13cba000+0x55954) [0x7f6e13d0f954]
(EE) 10: /usr/bin/X (0x7f6e13cba000+0x59b8a) [0x7f6e13d13b8a]
(EE) 11: /lib/x86_64-linux-gnu/libc.so.6 (__libc_start_main+0xed) [0x7f6e11c5576d]
(EE) 12: /usr/bin/X (0x7f6e13cba000+0x44101) [0x7f6e13cfe101]
(EE)
(EE) Segmentation fault at address 0x10
(EE)
Fatal server error:
(EE) Caught signal 11 (Segmentation fault). Server aborting
(EE)
(EE)
Please consult the The X.Org Foundation support
         at http://wiki.x.org
 for help.
(EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
(EE)
(EE) Server terminated with error (1). Closing log file.
xinit: connection to X server lost
Comment 5 Ian! D. Allen 2014-07-18 17:21:41 UTC
Created attachment 103052 [details]
Xorg.0.log with Segmentation fault at address 0x10

Brought up X11 using startx.  System used two of three 1600x1200 monitors (on one card) and left third monitor disabled.  Enabled third monitor (for 4800x1200 desktop) and inverted two of the monitors on the first card to avoid video garbage.  Desktop worked fine at 4800x1200 (except first two monitors were inverted). Typed CTRL-ALT-F1 and server died with above segfault.  Re-running startx brings up server on all three monitors but it immediately segfaults again.

Apparently I can bring up X11 on two monitors and then dynamically move to all three, with two inverted, but once I've saved that config it won't bring it up on its own and dies.
Comment 6 Ian! D. Allen 2014-07-18 17:53:21 UTC
The above segfault only happened when I ran the startx as root.  Running it as an ordinary user, it works fine, bringing up the 4800x1200 desktop with three monitors on two cards, with the two screens on the first card inverted.

I can CTRL-ALT-F1 to console and then back; the whole 4800x1200 desktop redraws just fine (with two monitors inverted).  I can CTRL-ALT-F7 to switch to my 4800x1200 Xinerama desktop that uses a fixed xorg.conf and then CTRL-ALT-F8 to switch to the above RANDR desktop that runs 4800x1200 and uses no xorg.conf.

So how do I get the RANDR desktop working without inverting the two monitors?
Comment 7 Alex Deucher 2014-07-18 18:25:09 UTC
(In reply to comment #4)
> (In reply to comment #3)
> > The way the core provider stuff in the xserver works the rendering
> > is done on one GPU and the respective updated regions are sent to the
> > other displays.
> 
> If the above limits are true, why do I get a working 4800x1200 desktop when
> I invert two of the three screens?  Why can it show my 4800x1200 desktop if
> two screens are upside-down but not if they are right-side-up?  How is this
> possible if 4800x1200 can't be done? [*]

Rotation uses an additional buffer for each display.  The contents of the respective rotated displays are copied out of the primary surface over to the rotated shadow buffer.

If you want to check what coordinate limit is causing a problem, try configuring the displays in a 3x1 configuration totaling less than 4096 pixels wide, e.g., 1600x1200+1600x1200+800x600.  If that works, then try bumping the size of the last monitor above the limit, e.g., 1600x1200+1600x1200+1024x768.

> 
> > If you disable the OpenGL compositor, your desktop should work fine.
> 
> I don't know enough about the graphics chain to know how to "disable the
> OpenGL compositor".  All I do is start Xorg, a window manager, and some
> clients.  There is nothing mentioned about "composit" actually being used
> anywhere in the Xorg.0.log.  The net searches I do all find stuff that's
> years old.  Please point me at some current documentation on how to disable
> this?

It depends on your desktop environment (GNOME, KDE, etc.) and your window manager, e.g., KWin or GNOME Shell.

> 
> > The OpenGL compositing manager you are running should check for the max
> > limits of the GL driver before using it for surfaces which may exceed that. 
> 
> Yes, it should check instead of quietly screwing up.  Can it not query the
> GPU for the max limit?  (The max limit you say of 4096 is not obvious in the
> Xorg.0.log output anywhere.  Why does xrandr tell me "maximum 8192 x 8192"?)
> To whom should I submit this "failure to check limit" bug report?

8192x8192 is the max surface size the display hardware can handle.  It's independent of the 3D engine.  The limits of the 3D engine are exposed via OpenGL and any OpenGL app can query them.

Try starting X without a window manager, or with a simple one like twm.  It should work correctly because no OpenGL operations would be used in that case, so the 3D engine limits are not hit.
Comment 8 Ian! D. Allen 2014-07-18 20:20:55 UTC
(In reply to comment #7)
> Try starting X without a window manager or a simple one like twm.
> It should work correctly because no OpenGL operations would be used
> in that case so the 3D engine limits are not getting hit.

All along I've been using the vtwm window manager, not the fancy compiz.

I just tried using twm instead of vtwm; no improvement.

The desktop doesn't work across two cards with three monitors; the two monitors on the first card have video crap on them.   The video crap changes a little bit right at the top if I drag windows onto them from the working third monitor, but they are otherwise useless (unless I invert them).
Comment 9 Alex Deucher 2014-07-18 20:54:21 UTC
(In reply to comment #8)
> All along I've been using the vtwm window manager, not the fancy compiz.
> 
> I just tried using twm instead of vtwm; no improvement.
> 
> The desktop doesn't work across two cards with three monitors; the two
> monitors on the first card have video crap on them.   The video crap changes
> a little bit right at the top if I drag windows onto them from the working
> third monitor, but they are otherwise useless (unless I invert them).

To help narrow this down, try configuring the displays in a 3x1 configuration totaling less than 4096 pixels wide, e.g., 1600x1200+1600x1200+800x600.  If that works, then try bumping the size of the last monitor above the limit, e.g., 1600x1200+1600x1200+1024x768.
Comment 10 Ian! D. Allen 2014-07-18 21:25:55 UTC
(In reply to comment #9)
> To help narrow this down try configuring the displays for a 3x1
> configuration less than 4096 pixels

This +2432 offset works fine across three monitors:

    xrandr --output DVI-1-2 --pos 2432x0 --auto

    DVI-0 connected 1600x1200+0+0
    DVI-1 connected 1600x1200+1600+0
    DVI-1-2 connected 1600x1200+2432+0

This +2433 offset causes video crap on the first two monitors:

    xrandr --output DVI-1-2 --pos 2433x0 --auto

    DVI-0 connected 1600x1200+0+0
    DVI-1 connected 1600x1200+1600+0
    DVI-1-2 connected 1600x1200+2433+0

As before: using Xorg with twm and a few terminal clients.  No compiz.
Comment 11 Alex Deucher 2014-07-18 21:53:12 UTC
Created attachment 103065 [details] [review]
display scanout limit is 4k on avivo boards

Does this kernel patch help?
Comment 12 Ian! D. Allen 2014-07-18 22:46:46 UTC
(In reply to comment #11)
> Does this kernel patch help?

I haven't built a kernel from source in years.
Let's see if I still remember how to do this.

I've fetched the Ubuntu kernel source.
Found /usr/src/linux-lts-trusty-3.13.0/drivers/gpu/drm/radeon/radeon_display.c
I've made the edit. (Was line 1422 in my file, not 1547 as in yours.)
Now "make oldconfig" and "make all" and wait...

I'll let you know.
Thanks for the quick action on this.
I see you're the co-author on this file; nice to meet you!
Comment 13 Ian! D. Allen 2014-07-19 04:35:03 UTC
(In reply to comment #11)
> Does this kernel patch help?

Not with my problem.

1. It doesn't fix the bug and make my desktop work across three monitors.
2. It doesn't prevent all the configurations that can trigger the bug.
3. It does prevent some valid configurations that used to work before.

With your patch, xrandr shows the new smaller screen maximum:

    Screen 0: minimum 320 x 200, current 4096 x 1200, maximum 4096 x 4096
    DVI-0 connected 1600x1200+0+0 
    DVI-1 connected 1600x1200+1600+0 
    DVI-1-2 connected 1600x1200+2496+0 
    DVI-1-3 disconnected

The +2432 offset still works fine across three monitors, as it did before:

    xrandr --output DVI-1-2 --pos 2432x0 --auto

Offsets from +2433 to +2496 are still allowed and still trigger the bug and cause video crap on the first two monitors, just as they did before.

Offsets above +2496 are now prohibited due to the new smaller screen maximum, even though I know they would work if I rotated or inverted the first two monitors.

Your patch doesn't fix the bug.  It does prevent me from using the actual available screen resolution, should I want to invert or rotate the first two monitors.  Functionality is lost.

I gather you don't know how to fix the bug, so you're instead trying to limit the inputs so that people can't trigger it?

You said "Rotation uses an additional buffer for each display".  I guess I'll have to go through the code and find out how to make the non-inverted outputs use the same type of buffering as the inverted outputs for widths above 4064.  That should work.
Comment 14 Ian! D. Allen 2014-07-19 10:45:17 UTC
(In reply to comment #13)
> I guess I'll have to go through the code and find out how to make the
> non-inverted outputs use the same type of buffering as the inverted
> outputs for widths above 4064.

This looks like too much work; I'm not a graphics card programmer.  It would be faster for me to give away these two old FireMV cards and buy a single modern card that Linux X11 supports with three 1600x1200 DVI monitors and full 3D (no Xinerama).  Alex - does AMD have anything like this?
Comment 15 Ian! D. Allen 2014-07-19 13:50:17 UTC
(In reply to comment #14)
> a single modern card that Linux X11 supports with three 1600x1200 DVI
> monitors and full 3D (no Xinerama).

Phoronix suggests that the open source drivers work well on Radeon HD 6000 cards: http://www.phoronix.com/vr.php?view=20509

"If you find a good deal on a Radeon HD 6000 series GPU (outside of the HD 6900 series), that's where open-source fans will best likely be served".
Comment 16 Alex Deucher 2014-07-21 14:48:13 UTC
(In reply to comment #14)
> 
> This looks like too much work; I'm not a graphics card programmer.  It would
> be faster for me to give away these two old FireMV cards and buy a single
> modern card that Linux X11 supports with three 1600x1200 DVI monitors and
> full 3D (no Xinerama).  Alex - does AMD have anything like this?

All evergreen (hd 5000 series) and newer dGPUs can support up to 6 monitors depending on the board.
Comment 17 Ian! D. Allen 2014-07-21 16:06:53 UTC
(In reply to comment #16)
> All evergreen (hd 5000 series) and newer dGPUs can support up to 6
> monitors depending on the board.

Phoronix suggests they don't all work well with open source drivers, those that do work often don't do well multi-monitor, and most don't do triple DVI output unless one buys one or more active adapters.

I don't want to pick a card that becomes obsolete and where the driver writers have moved on and aren't interested in fixing the old bugs (e.g. the above bug in the radeon driver).

Are the open source drivers for the evergreen cards (and newer) under active development?
Comment 18 Alex Deucher 2014-07-21 16:28:37 UTC
(In reply to comment #17)
> 
> Phoronix suggests they don't all work well with open source drivers, those
> that do work often don't do well multi-monitor, and most don't do triple DVI
> output unless one buys one or more active adapters.

Displays should generally work fine.  Note that Evergreen, NI, and SI parts only have two non-DP PLLs, so they are limited to two independent non-DP clocks.  This means you can only use two non-DP monitors with independent timings.  If you want to use more than two non-DP monitors, you have to use the exact same pixel clock on several of the monitors so you don't use more than two different clocks.  E.g., if all three of your monitors are identical and use the same modeline and pixel clock, you'd only need one PLL, so you should be fine.
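The PLL constraint above can be modeled simply (a sketch of the rule as stated, not of the driver's actual clock-assignment code; 162000 kHz is a common pixel clock for 1600x1200@60 and is used here only as an example):

```python
# Evergreen/NI/SI boards have two non-DP PLLs, so at most two distinct
# pixel clocks can be active across all non-DP outputs at once.
MAX_NON_DP_PLLS = 2

def non_dp_clocks_ok(pixel_clocks_khz):
    """pixel_clocks_khz: one entry per active non-DP output."""
    return len(set(pixel_clocks_khz)) <= MAX_NON_DP_PLLS

# Three identical monitors sharing one modeline use a single clock: OK.
identical = [162000, 162000, 162000]
# Three different modes would need three distinct clocks: not allowed.
mixed = [162000, 148500, 108000]
```

So three identical 1600x1200 panels on the same modeline consume only one of the two PLLs.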

> 
> I don't want to pick a card that becomes obsolete and where the driver
> writers have moved on and aren't interested in fixing the old bugs (e.g. the
> above bug in the radeon driver).

TBH, I think it's a hw limitation on r5xx hardware that's not properly handled.  Unfortunately, X didn't support multi-card xrandr when r5xx was still being actively developed.  At that time, the only way to support multi-card multi-head was zaphod mode, which uses separate buffers for each display so you never hit the hw limits.

> 
> Are the open source drivers for the evergreen cards (and newer) under active
> development?

Yes.
Comment 19 Ian! D. Allen 2014-07-21 23:17:18 UTC
(In reply to comment #18)
> if all three of your monitors are identical and use the same modeline and
> pixel clock, you'd only need one PLL, so you should be fine.

Yes: Three identical 1600x1200.  I just have to find an available card that can handle three DVI connectors, ideally without having to buy an active adapter.  (Sapphire makes a "Flex" card that claims to do this; my initial attempts to find one to buy have turned up nothing useful.)

> TBH, I think it's a hw limitation on r5xx hardware that's not properly handled.

I'd believe it as a hardware limitation if inverting the first two monitors didn't fix it.  If I were willing to physically turn my first two monitors upside-down, it would work fine with the given hardware.  The hardware can display across my three monitors at 4800x1200 as long as two of the screens are inverted.  We just need to make the software put those screens into the hardware the other way up!

Thanks for your comments.
Comment 20 Alex Deucher 2014-07-22 00:04:16 UTC
(In reply to comment #19)
> 
> I'd believe it as a hardware limitation if inverting the first two monitors
> didn't fix it.  If I were willing to physically turn my first two monitors
> upside-down, it would work fine with the given hardware.  The hardware can
> display across my three monitors at 4800x1200 as long as two of the screens
> are inverted.  We just need to make the software put those screens into the
> hardware the other way up!

As I said before, it only works with rotation because it's scanning out of different buffers when rotation is enabled.  When you enable rotation, a shadow buffer is created which is a rotated duplicate of that sub-region of the entire desktop.  So for the rotated case, the display hw is scanning out of a 1600x1200 surface rather than the full 4800x1200 surface.  If you wanted to, you could do something similar for the non-rotated case in the xserver, but that would incur an extra copy for every update of the desktop in the non-rotated case.
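The difference between the two cases can be sketched as follows (a model of the explanation above, not of the real scanout code; the 4096 limit is the figure from the attached patch):

```python
# Without rotation, a CRTC scans a window of the single shared desktop
# surface, so the full desktop width must fit the scanout limit.  With
# rotation, the CRTC scans its own per-display shadow buffer instead,
# so only that display's own size matters.
SCANOUT_LIMIT = 4096  # figure from the attached patch

def scanned_surface_width(desktop_w, display_w, rotated):
    """Width of the surface this CRTC actually scans out of."""
    return display_w if rotated else desktop_w

def crtc_ok(desktop_w, display_w, rotated, limit=SCANOUT_LIMIT):
    return scanned_surface_width(desktop_w, display_w, rotated) <= limit
```

On a 4800-wide desktop, an unrotated 1600-wide CRTC fails the check while a rotated one passes, which matches the behavior reported here.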

If you want to test further, try disabling acceleration (Option "NoAccel" "true" in the device section of your xorg.conf) or use the modesetting driver (xf86-video-modesetting).  You won't be able to use rotation since that requires acceleration to do the rotated blit, but that will take all non-display hw factors (3D, etc.) out of the equation.
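For the first suggestion, the xorg.conf fragment would be along these lines (the Identifier is a placeholder; the `NoAccel` option is the relevant part):

```
Section "Device"
    Identifier "Radeon0"          # placeholder name
    Driver     "radeon"
    Option     "NoAccel" "true"   # disable acceleration for this device
EndSection
```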
Comment 21 Martin Peres 2019-11-19 08:53:58 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further in the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/drm/amd/issues/511.

