Optimus and the recent nVidia beta drivers release

New nVidia beta drivers, 319.12, were released yesterday. Unfortunately, several web sites were quick to publish articles with titles suggesting that Optimus support is finally coming to Linux. I say "unfortunately" because I believe this is a case where inaccurate reporting hurts everyone: to non-technical users, the articles may give the false impression that the wait is over and that complete, proper support has arrived in the official binary drivers. As far as I can see, there is also some confusion and misinformation in user discussions, and that is not helping either. So, let's try to clarify things.

In short, the new beta is but a first user-visible step towards complete Optimus support. Remarkably, it covers use cases that Bumblebee has never supported well: driving external monitors attached to the nVidia GPU, and running all rendering on the nVidia GPU. On the other hand, Bumblebee provides power management and per-application render offloading, neither of which is offered by the new beta.

In a typical muxless Optimus laptop (or a muxed laptop in the "Optimus" configuration), the laptop LCD panel is connected to the Intel GPU only (so the nVidia card cannot display on it), and some external video ports may be connected to the nVidia GPU only (so the Intel card cannot display on them). Normally, you run the X server with the Intel driver, with the LCD panel as the only output, and everything works well. Let's now consider some fancier scenarios.

1. You want to run a graphically intensive game, so you want the heavy rendering performed by the nVidia card, but with the card powered on only for the duration of the game. You don't need to redirect rendering of any other applications to the nVidia card. This is called "render offloading".

2. You want to temporarily plug an external monitor into the nVidia-driven output port without disrupting the X session already running with the Intel driver. Since the Intel chip cannot access that external port, it needs the nVidia card to perform display ("scanout") for it (this assumes Intel does all the rendering; alternatively, nVidia could render its portion of the desktop).

3. You want to use the nVidia card to render the whole desktop, trading increased power consumption for improved acceleration of all graphical apps, including the compositor. Since the nVidia chip cannot access the laptop LCD panel, it needs the Intel card to perform scanout for it.

Cases 2 and 3 are called scanout offloading, and notice how it is needed in opposite directions in the two cases. The card performing the scanout is called the scanout sink, and the card rendering the content is called the scanout source.

With the new beta, nVidia supports scanout offloading, with the restriction that the nVidia chip can be the scanout source but not the sink. Thus, it supports use case 3.
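For the curious, the configuration for this mode looks roughly like the following sketch (the BusID and the provider names below are examples for a hypothetical machine, not something to copy verbatim; check lspci and xrandr --listproviders on yours, and the driver's README for the authoritative description):

```
Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    # PCI address of the discrete GPU; find yours with lspci
    BusID "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
EndSection

Section "Device"
    Identifier "intel"
    Driver "intel"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
EndSection
```

Then, once the server is up, the Intel outputs are told to display content rendered by the nVidia GPU:

```
# Provider names vary; list them with: xrandr --listproviders
xrandr --setprovideroutputsource Intel NVIDIA-0
xrandr --auto
```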

Use case 2 needs GPU hotplug support in the X server, because you want to power up the discrete GPU only when the external monitor is plugged in; on top of that, use case 1 needs a mechanism to route rendering between different drivers. For now, it's possible to "solve" case 2 with a combination of the virtual CRTC patch and hybrid-screenclone (yep, that's painful), and Bumblebee "solves" case 1. Proper support in the driver/server stack will be more efficient, of course.
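For completeness, the Bumblebee workaround for case 1 looks like this (assuming Bumblebee is installed and configured, ideally with bbswitch for power management):

```
# Run a single application with rendering offloaded to the nVidia GPU.
# Bumblebee powers the card up for the lifetime of the process and,
# with bbswitch, powers it back down when the program exits.
optirun glxgears
```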

Notably, it should be possible to use the new beta drivers to get better-accelerated rendering for gaming sessions by starting the game on a separate X display with the nVidia driver and scanout offloading. FSGamer should come in handy for that. Note that in this case you want to use the xf86-video-intel driver for the offload sink (not the modesetting driver that the readme currently suggests): it should work since version 2.21.5, and it is in fact required, because Xorg does not support using different drivers for the same PCI device.
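If you want to experiment without FSGamer, a rough manual equivalent might look like the sketch below. All paths and names here are hypothetical: xorg.nvidia.conf stands for a configuration of the kind described above, /usr/bin/mygame for your game, and the provider names are examples (check xrandr --listproviders):

```
#!/bin/sh
# game-session.sh (hypothetical): session script for the second server.
# Wire up scanout offloading, then run the game.
xrandr --setprovideroutputsource Intel NVIDIA-0
xrandr --auto
exec /usr/bin/mygame

# Launch a second X server on display :8 / VT 8 with the nVidia config:
#   xinit /path/to/game-session.sh -- :8 vt8 -config /etc/X11/xorg.nvidia.conf
```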

Hope that helps.

Alexander

PS. If you were wondering whether you should install the new drivers: if unsure, don't do it yet. You might clobber your GL/GLX libraries, leaving you without accelerated rendering on the Intel desktop, and accelerated nVidia rendering with scanout offloading is not easy to set up either.