Optimus and the recent nVidia beta drivers release

New nVidia beta drivers, 319.12, were released yesterday. Unfortunately, several web sites were quick to publish articles with titles suggesting that Optimus support has finally come to Linux, and I'm saying "unfortunately" because I believe this is a case where inaccurate reporting hurts everyone: the articles may give non-technical users the false impression that the wait is over and that complete, proper support has arrived in the official binary drivers. As far as I can see, there is some confusion and misinformation in users' discussions as well, and that is not helping either. So, let's try to clarify things.

In short, the new beta is but a first user-visible step towards complete Optimus support. Remarkably, it covers use cases that Bumblebee has never supported well: using external monitors attached to the nVidia GPU, and running all rendering on the nVidia GPU. On the other hand, Bumblebee provided power management and per-application render offloading, neither of which the new beta offers.

In a typical muxless Optimus laptop (or a mux'ed laptop in "Optimus" configuration), you have the laptop LCD panel connected to Intel GPU only (so that nVidia card cannot display on it), and you may also have some external video port connected to nVidia GPU only (so that Intel card cannot display on it). Normally, you run the X server with the Intel driver, with the only output being the LCD panel, and all works well. Let's now consider more fancy scenarios.

1. You want to run a graphically intensive game, so you wish that heavy rendering is performed by the nVidia card, but it's only powered on for the duration of the game. You don't need to redirect rendering of any other applications to the nVidia card. This is called "render offloading".

2. You want to temporarily plug in an external monitor into the nVidia-driven output port without disrupting your existing X session already running on the X server with the Intel driver. Since the Intel chip cannot access that external port, it will need the nVidia card to perform display ("scanout") for it (this is assuming Intel does all rendering; alternatively, nVidia could be performing rendering for its portion of the desktop).

3. You want to use the nVidia card for rendering the whole desktop, trading increased power consumption for improved acceleration of all graphical apps, including the compositor. Since the nVidia chip cannot access the laptop LCD panel, it will need the Intel card to perform scanout for it.

Cases 2 and 3 are called scanout offloading, and notice how it is needed in different directions for the two use cases. The card performing the scanout is called the scanout sink, and the other is called the scanout source.

With the new beta, nVidia supports scanout offloading, with the restriction that the nVidia chip can be the scanout source but not the sink. Thus, it supports use case 3.
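As a rough sketch of what this setup looks like, here is an xorg.conf in the spirit of the sample in nVidia's README, with the X screen running on the nVidia driver and the Intel device present as the scanout sink. The BusID is a placeholder (find yours with lspci), and the README itself uses the modesetting driver for the Intel device rather than xf86-video-intel:

```
Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "PCI:1:0:0"    # placeholder; check the actual bus ID with lspci
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
EndSection

Section "Device"
    Identifier "intel"
    Driver "intel"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
EndSection
```

With this in place, the server starts with the nVidia GPU rendering everything; the scanout connection to the Intel-driven panel still has to be enabled at runtime with xrandr.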

Use case 2 needs GPU hotplug in the X server, because you want to power up the discrete GPU only when the external monitor is plugged in, and on top of that use case 1 needs a mechanism to route rendering between different drivers. For now, it's possible to use a combination of virtual crtc patch and hybrid-screenclone to "solve" case 2 (yep, that's painful), and Bumblebee "solves" case 1. Proper support in the drivers/server stack will be more efficient, of course.
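For comparison, the open-source stack (nouveau plus Mesa) already exposes per-application render offloading through the DRI PRIME interfaces. This is a sketch of that mechanism only, not something the nVidia beta supports; the provider names are illustrative and vary per machine:

```shell
# List the providers the X server knows about (needs RandR 1.4, i.e. xrandr 1.4)
xrandr --listproviders

# Declare the discrete GPU as a render offload source for the integrated one
# ("nouveau" and "Intel" are placeholder provider names from --listproviders)
xrandr --setprovideroffloadsink nouveau Intel

# Then run an individual application on the discrete GPU
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
```

These commands configure a live X session, so there is nothing to run outside of one; the point is that per-application routing is a server-side capability the binary driver does not yet plug into.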

Notably, it should be possible to use the new beta drivers to get better accelerated rendering for gaming sessions by starting the game on a separate X display with the nVidia driver and scanout offloading. FSGamer should come in handy for that. Note that you want to be using the xf86-video-intel driver for the offload sink in this case (not the modesetting driver that the readme currently suggests); it has worked since version 2.21.5 and is required, since Xorg does not support loading different drivers for the same PCI device.
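The runtime half of the setup follows the pattern from nVidia's README, with the provider names ("Intel", "NVIDIA-0") being what xrandr --listproviders typically reports on such a machine; they may differ on yours:

```shell
# In the session started on the nVidia-driven X screen:
xrandr --setprovideroutputsource Intel NVIDIA-0   # make Intel scan out images produced by the nVidia GPU
xrandr --auto                                     # bring up the laptop panel with a default mode
```

If the first command just prints xrandr's usage text, you are running an xrandr older than 1.4, which does not know the provider options.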

Hope that helps.


PS. If you were wondering whether you should install the new drivers: if unsure, don't do it yet. You might clobber your GL/GLX libraries, leaving you without accelerated rendering on the Intel desktop, and accelerated nVidia+scanout is not easy to set up either.
I and at least a few other people have been trying to get this driver working. Nobody with a display-less Nvidia GPU connected only to the Intel GPU has been successful. For the moment, only displaying on Nvidia HDMI has been shown to work. Hopefully this will be figured out tonight.
I tried installing on my Nvidia optimus laptop without reading properly and my display got messed up. Ubuntu raring does not support it properly. I ended up getting 640x480 resolution with no unity launcher and panel. Reverting back to bumblebee didn't help either. Now compiz won't start. Have temporarily moved to gnome shell to get a working desktop. 
You need to have Linux 3.9, xorg-edgers or equivalent, xrandr 1.4, and a strong command of the terminal. Unfortunately, none come standard with Raring. Good luck though, this driver is the most beta I've ever seen.
Arrrgh, I am really keen for Use Case 2.  Progress is progress though.

Thanks for the clean explanation. :)
Thanks for the explanation. So the HDMI display still does not work! :-(
+Sayantan Das Deleting xorg.conf may help? I faced this issue once and got it resolved this way. Check 'inxi -Gx' and 'optirun inxi -Gx' output and see if direct rendering shows 'Yes' to check if both drivers are setup properly.
I'm good with the nvidia proprietary driver on ubuntu 12.10; it was much hassle to install but worth the effort. But if this bumblebee can offer more features and stability then trying it would be ok.
+Koko Widyatmoko Bumblebee is for when you have a dual graphics card setup, for instance a primary Intel card and a secondary Nvidia card. Optimus is a technology where the system intelligently decides which graphics card to use, and it is not supported yet on Linux. Bumblebee is a workaround to use the Nvidia card on demand in Linux.
+Anand Radhakrishnan oo i see, i do have dual graphic with intel gma from ivy bridge and an nvidia card, will it work? By the way, thanks for the info. 
Thank you for the very clean and informative article.
Looking forward to achieving the best bumblebee integration with the new drivers.
I'm using bumblebee on Ubuntu 12.10 on my Alienware m17x. I love it, I've had no issues with it. It works when I have a need for it. Thanks for the great work on it.
i managed to use the vga port on my w530 with activated optimus, but got no output on my laptop display, so i ended up deactivating optimus again and rendering everything over the nvidia card :(
+Chaitanya Chandel thank you! But when i try to:
xrandr --setprovideroutputsource Intel NVIDIA-0
i just get the "how to use xrandr" - output
any ideas?
You'll need to use a 3.9 series kernel. These are still at rc stage - so depending on your distribution - it could take some additional effort.
i tried to update my kernel once...ended up in reinstalling linux mint :(
Well, this is the bleeding edge - beta NVidia drivers with rc kernel :-)
As NVidia has mentioned in their documentation, they require PRIME interfaces for this driver to work. That requires a 3.9 series kernel.
I haven't used mint - but since it's derived from ubuntu - you could add a ppa for the newer kernel - if you're willing to risk it.
+Chaitanya Chandel i think i will wait for one or two weeks, because linux mint 15 should be released by then, and then i will try it again :) thank you for your post!
:D i couldn't resist and mint even booted and i am now on 3.9rc6 :)
Has anybody tried the FPS perf on one of the Steam games with this driver vs Bumblebee's primus?
can you port this to hackintosh please?
I think that will not be done. Unless you do it yourself.