Bug 97186 - Support intel-virtual-output with modesetting
Summary: Support intel-virtual-output with modesetting
Status: RESOLVED MOVED
Alias: None
Product: xorg
Classification: Unclassified
Component: Server/DDX/Xorg
Version: unspecified
Hardware: Other All
Priority: medium
Severity: normal
Assignee: Xorg Project Team
QA Contact: Xorg Project Team
URL:
Whiteboard:
Keywords:
Duplicates: 97837
Depends on:
Blocks:
 
Reported: 2016-08-02 13:06 UTC by main.haarp
Modified: 2018-12-13 18:30 UTC
CC List: 5 users

See Also:
i915 platform:
i915 features:


Attachments

Description main.haarp 2016-08-02 13:06:18 UTC
Many laptops come with Intel graphics and a dedicated GPU that can be switched on on demand; this is called Optimus. Some laptop makers wire the DisplayPort, HDMI, etc. outputs to the dGPU. As a result, the dGPU needs to be running to use these outputs.

The only way to make Optimus work dynamically on a non-Windows OS is to use Bumblebee. With it, X runs on the Intel GPU, and the tool switches on the dGPU and starts a separate X server with the nvidia driver when necessary, then transfers images back to the Intel X server.
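For illustration, typical Bumblebee usage looks something like this (the application and display number are just examples; Bumblebee commonly runs the nvidia server on :8):

    optirun glxgears    # powers on the dGPU, renders on the nvidia X server,
                        # and copies the frames back to the Intel X server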

Displays hooked up to the dGPU's outputs result in a sort of reversed situation. xf86-video-intel provides the intel-virtual-output tool, which acts as a display proxy between the X servers and transfers video data to the nvidia X server.

i-v-o won't work when modesetting is used instead of xf86-video-intel. The first problem I ran into is that modesetting does not provide any VIRTUAL outputs like the intel driver does. Can these be added somehow?
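For comparison, the VIRTUAL outputs that intel-virtual-output drives come from the intel DDX itself; depending on the setup they are enabled with something like this (identifier and count are only illustrative):

    Section "Device"
        Identifier "card0"
        Driver     "intel"
        Option     "VirtualHeads" "2"    # xrandr then lists VIRTUAL1/VIRTUAL2
    EndSection

With modesetting, xrandr only shows the physical connectors, so i-v-o has nothing to attach to.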
Comment 1 Tomas Pruzina 2016-08-02 14:53:00 UTC
(In reply to main.haarp from comment #0)
> Many laptops come with Intel graphics and a dedicated GPU that can be
> switched on on demand; this is called Optimus. Some laptop makers wire the
> DisplayPort, HDMI, etc. outputs to the dGPU. As a result, the dGPU needs to
> be running to use these outputs.

Sounds like something that RandR offloading should handle.
Generally people use it the other way, e.g. they pipe content rendered on the dGPU onto the iGPU's outputs. See the NVIDIA manual: http://us.download.nvidia.com/XFree86/Linux-x86/319.12/README/randr14.html
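Roughly, the sequence that doc describes is (the provider names depend on the drivers in use; these are just the usual ones):

    xrandr --listproviders
    xrandr --setprovideroutputsource modesetting NVIDIA-0
    xrandr --auto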

> The only way to make Optimus work dynamically on a non-Windows OS is to use
> Bumblebee.
See the link above. As a side note, some people use PCI passthrough with a VM.

> With it, X runs on the Intel GPU, and the tool switches on the
> dGPU and starts a separate X server with the nvidia driver when necessary,
> then transfers images back to the Intel X server.
> 
> Displays hooked up to the dGPU's outputs result in a sort of reversed
> situation. xf86-video-intel provides the intel-virtual-output tool, which
> acts as a display proxy between the X servers and transfers video data to
> the nvidia X server.
> 
> i-v-o won't work when modesetting is used instead of xf86-video-intel. The
> first problem I ran into is that modesetting does not provide any VIRTUAL
> outputs like the intel driver does. Can these be added somehow?

I'm under the impression that this works via xrandr already (the nvidia man page above suggests using it in a similar manner). I've never used the intel-virtual-output thingy, but I do believe it's a convenience wrapper for the RandR offloading functionality (could be wrong).
Comment 2 main.haarp 2016-08-02 19:05:35 UTC
(In reply to Tomas Pruzina from comment #1)
> 
> Sounds like something that RandR offloading should handle.
> Generally people use it the other way, e.g. they pipe content rendered on
> the dGPU onto the iGPU's outputs. See the NVIDIA manual:
> http://us.download.nvidia.com/XFree86/Linux-x86/319.12/README/randr14.html

Thanks, that's an interesting read. You are correct in saying that normally, dGPU renders are piped back to the iGPU. However, in the case of hardware outputs attached to the dGPU, the "reverse" is necessary too.


The doc does not quite describe what Optimus was intended to do. I quote:

"To use the NVIDIA driver as an RandR 1.4 output source provider, the X server needs to be configured to use the NVIDIA driver for its primary screen"

Meaning that the Nvidia card has to remain active to use this feature, which defeats the power-saving purpose. This is Nvidia's "solution" for Linux users.
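For reference, that setup boils down to an xorg.conf roughly like this (BusID and identifiers are only illustrative):

    Section "ServerLayout"
        Identifier "layout"
        Screen 0 "nvidia"
        Inactive "intel"
    EndSection

    Section "Device"
        Identifier "nvidia"
        Driver     "nvidia"
        BusID      "PCI:1:0:0"
    EndSection

    Section "Screen"
        Identifier "nvidia"
        Device     "nvidia"
    EndSection

    Section "Device"
        Identifier "intel"
        Driver     "modesetting"
    EndSection

    Section "Screen"
        Identifier "intel"
        Device     "intel"
    EndSection

i.e. the nvidia driver owns the primary screen the whole time.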

The Bumblebee solution, with its separate independent X servers, is much more elegant in this regard. But it requires a tool like i-v-o to use outputs attached to the dGPU.
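For completeness, the Bumblebee + i-v-o workflow is roughly the following (display numbers and output names vary between setups):

    # bumblebee's nvidia X server is running, typically on :8
    intel-virtual-output
    xrandr                      # VIRTUAL1 now reports the monitor on the dGPU port
    xrandr --output VIRTUAL1 --auto --right-of eDP1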


> I'm under the impression that this works via xrandr already (the nvidia man
> page above suggests using it in a similar manner). I've never used the
> intel-virtual-output thingy, but I do believe it's a convenience wrapper for
> the RandR offloading functionality (could be wrong).

i-v-o does proxy some xrandr events, but afaik the actual screen content is copied via buffers. It's quite inefficient, but it works. https://bugs.freedesktop.org/show_bug.cgi?id=96820#c3 provides some insights.
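Conceptually it does something like this per virtual output (a very rough sketch with made-up helper names, not the actual i-v-o code, which uses damage tracking and, where available, DRI3/PRIME buffer sharing instead of plain copies):

    /* rough sketch only */
    while (running) {
        wait_for_damage(intel_dpy, virtual_output_region);
        img = XGetImage(intel_dpy, intel_root, x, y, w, h, AllPlanes, ZPixmap);
        XPutImage(nvidia_dpy, nvidia_win, nvidia_gc, img, 0, 0, 0, 0, w, h);
        XDestroyImage(img);
        forward_randr_events(intel_dpy, nvidia_dpy);  /* mode/hotplug changes */
    }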
Comment 3 Pablo Cholaky 2017-06-03 04:45:28 UTC
Is someone looking into this? I would love to see this move forward, because the Intel xf86 driver is quite broken and the codebase is not exactly simple.

This would also let any other video card using modesetting work with Nvidia while remaining power efficient.
Comment 4 Pablo Cholaky 2017-06-03 04:46:56 UTC
*** Bug 97837 has been marked as a duplicate of this bug. ***
Comment 5 Chris Wilson 2017-06-03 08:57:30 UTC
(In reply to Pablo Cholaky from comment #3)
> Is someone looking into this? I would love to see this move forward, because
> the Intel xf86 driver is quite broken and the codebase is not exactly simple.

What bug have you encountered?
Comment 6 Pablo Cholaky 2017-06-03 21:50:51 UTC
Hi Chris,

I still have the problem mentioned in bug 98517. With modesetting it's OK: I can recover easily with a workaround (switch from X to a terminal and back to X). But under the intel xf86 driver it's madness and the workaround doesn't work; I need to blindly unlock the computer, open a terminal and run xrandr multiple times to get it working. I think this is a DRI problem, but the intel xf86 driver makes the bug even worse.

Also, IVO is kind of broken with DRI3: X crashes if I unplug the monitor. This doesn't happen with DRI2; not sure whether bug 96820 is related.

Also, when using IVO and running very demanding games, the external outputs behave very badly. Not sure if this is Nvidia's fault, but it seems like the refresh lags far behind. Do you need a separate ticket for that? I didn't see anything wrong in the verbose output, except some "forcing refresh" messages.
Comment 7 GitLab Migration User 2018-12-13 18:30:18 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/xorg/xserver/issues/179.

