Bumblebee Project
Bumblebee aims to provide support for nVidia Optimus laptops on GNU/Linux distributions. Using Bumblebee, you can use your nVidia card for rendering graphics which will be displayed using the Intel card.

Bumblebee Project's posts

The next milestone for Bumblebee will be version 4.0! Several people thought the project was dead, but that isn't true.
We will start around May 2016.

For more information, see Github:

Hopefully a new version will be released before long. Some major issues regarding the interaction with the NVIDIA driver & bbswitch will be fixed.
Thanks for your patience! In the meantime, here is a nice picture of Bumblebee.

If you're running Ubuntu 13.10 or later, you can install the packages from the repositories (this also applies to Linux Mint 17, where the Ubuntu Universe & Multiverse repos are enabled by default), so you no longer need to use PPAs!

1. (Ubuntu) Enable the Universe & Multiverse repos. This can be done via 'Software Sources' in the Software Center.
2. sudo apt-get install bumblebee bumblebee-nvidia primus linux-headers-generic
3. Reboot.

And have fun!

PSA for Gentoo users

Gentoo users who emerge nvidia-drivers versions 331.49-r2, 334.21-r2 or newer with USE=uvm will see a regression: the Bumblebee daemon is unable to unload the nvidia kernel module and power down the GPU.

We'll try to roll out a new Bumblebee release soon to resolve the issue. In the meantime, if you're affected please re-emerge nvidia-drivers with USE=-uvm, or if you need CUDA functionality use the live ebuild from the bumblebee overlay:

sudo EGIT_BRANCH=develop emerge -1 bumblebee-9999
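For the non-CUDA route, the per-package rebuild might look like the following sketch (the package atom and flag syntax are assumptions; match them to your installed slot and version):

```shell
# Sketch: re-emerge the driver once with UVM support disabled.
# "x11-drivers/nvidia-drivers" is the assumed package atom.
sudo USE="-uvm" emerge --oneshot x11-drivers/nvidia-drivers
```

To make the change persistent across world updates, the flag would normally go into /etc/portage/package.use instead of the environment.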

I apologize for the inconvenience. Please see the linked reports for additional details.

Bumblebee 3.2 is officially released. It includes many bug fixes and two new features. The new packages will be available very soon (PPA for Ubuntu, etc.).

One new feature is the ability to start optirun without any render bridge (a -b option was added), useful for the nvidia-settings command (optirun -b none nvidia-settings -c :8). The second feature is the --no-xorg option for optirun, which disables starting the secondary X server. This is handy for CUDA or OpenCL applications that do not need graphics rendering capabilities.
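As a quick command-line reference, the two new options look like this (the CUDA binary name is a placeholder, not a real program):

```shell
# No render bridge: useful for tools that talk to the secondary X server directly.
optirun -b none nvidia-settings -c :8

# No secondary X server at all: for CUDA/OpenCL programs that never render.
optirun --no-xorg ./my-cuda-app   # "./my-cuda-app" is a placeholder name
```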

Release notes can be found here:

How to upgrade, please visit:

You can try installing the new nVidia beta driver, 319.12, but this is currently not advised (it can cause lower performance). However, the new driver is supported and will not break Bumblebee; it will be auto-detected.

Have fun!

Optimus and the recent nVidia beta drivers release

New nVidia beta drivers, 319.12, were released yesterday. Unfortunately, several websites were quick to publish articles with titles suggesting that Optimus support is finally coming to Linux. I say "unfortunately" because I believe this is a case where inaccurate reporting hurts everyone: to non-technical users, the articles may give the false impression that the wait is over and that complete, proper support in the official binary drivers has arrived. As far as I can see, there is some confusion and misinformation in users' discussions, and that is not helping either. So, let's try to clarify things.

In short, the new beta is but a first user-visible step towards complete Optimus support. Remarkably, it covers use cases that Bumblebee has never supported well: using external monitors attached to the nVidia GPU, and running all rendering on the nVidia GPU. On the other hand, Bumblebee provided power management and render offloading on the basis of individual applications, neither of which is offered by the new beta.

In a typical muxless Optimus laptop (or a mux'ed laptop in "Optimus" configuration), you have the laptop LCD panel connected to Intel GPU only (so that nVidia card cannot display on it), and you may also have some external video port connected to nVidia GPU only (so that Intel card cannot display on it). Normally, you run the X server with the Intel driver, with the only output being the LCD panel, and all works well. Let's now consider more fancy scenarios.

1. You want to run a graphically intensive game, so you wish that heavy rendering is performed by the nVidia card, but it's only powered on for the duration of the game. You don't need to redirect rendering of any other applications to the nVidia card. This is called "render offloading".

2. You want to temporarily plug in an external monitor into the nVidia-driven output port without disrupting your existing X session already running on the X server with the Intel driver. Since the Intel chip cannot access that external port, it will need the nVidia card to perform display ("scanout") for it (this is assuming Intel does all rendering; alternatively, nVidia could be performing rendering for its portion of the desktop).

3. You want to use nVidia card for rendering the whole desktop, trading increased power consumption for improved acceleration of all graphical apps, including the compositor. Since the nVidia chip cannot access the laptop LCD panel, it will need the Intel card to perform scanout for it.

Cases 2 and 3 are called scanout offloading; notice how it is needed in different directions for the two use cases. The card performing the scanout is called the scanout sink, and the other is called the scanout source.

With the new beta, nVidia supports scanout offloading, with the restriction that the nVidia chip can be the scanout source but not the sink. Thus, it supports use case 3.

Use case 2 needs GPU hotplug in the X server, because you want to power up the discrete GPU only when the external monitor is plugged in, and on top of that use case 1 needs a mechanism to route rendering between different drivers. For now, it's possible to use a combination of virtual crtc patch and hybrid-screenclone to "solve" case 2 (yep, that's painful), and Bumblebee "solves" case 1. Proper support in the drivers/server stack will be more efficient, of course.

Notably, it should be possible to use the new beta drivers to get better accelerated rendering for gaming sessions by starting the game on a separate X display with the nVidia driver and scanout offloading. FSGamer should come in handy for that. Note that you want to be using the xf86-video-intel driver for the offload sink in this case (not the modesetting driver, as the readme currently suggests); this should work since version 2.21.5, and it is required because Xorg does not support different drivers for the same PCI device.

Hope that helps.


PS. If you were wondering whether you should install the new drivers: if unsure, don't do it yet. You might clobber your gl/glx libraries, leaving you without accelerated rendering on the Intel desktop, and accelerated nVidia rendering with scanout offloading is not easy to set up either.

Bumblebee 3.1 was released on 25 February 2013. That means you can upgrade your Bumblebee installation (via the Ubuntu PPA) if you haven't already: sudo apt-get update && sudo apt-get install bumblebee bumblebee-nvidia linux-headers-generic. Visit our website: (for trying the new primus functionality, see News).

Release notes v3.1:
Primus Github:

An update on Bumblebee and Steam compatibility

The Steam beta has been available for more than 10 days already, but unfortunately Bumblebee graphics offloading is not easily usable with Steam games, and there is some confusion about the current status, for example in the Reddit discussions. This lengthy and somewhat technical post is here to clear things up.

First, there was an issue where the Steam UI would not load if started under primus. That issue was actually due to OpenGL API misuse in Steam, and was worked around in primus on the same day. After that, there were no Steam bug reports on the issue trackers. However, that was not enough, because games started from the Steam UI would still run on the integrated card. This problem was pointed out to the primus developer just a day ago on IRC. Take note: bugs do not get developers' attention unless someone actually reports them.

This problem is due to Steam overwriting both LD_PRELOAD and LD_LIBRARY_PATH environment variables. VirtualGL uses the former, and primus the latter to perform OpenGL offloading. This issue was raised in private communications with Valve.
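A minimal sketch of the clobbering problem (the directory paths below are assumptions for illustration, not Steam's real values): primus works by prepending its own libGL directory to LD_LIBRARY_PATH, so anything that later assigns a fresh value to the variable wipes that entry out.

```shell
# primusrun prepends its libGL directory so the dynamic loader resolves it
# first (assumed example path).
export LD_LIBRARY_PATH="/usr/lib/primus${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

# Steam then assigns its own value for the game process, discarding the above:
export LD_LIBRARY_PATH="/usr/lib/steam-runtime"

# The primus entry is gone, so the game loads the stock libGL and renders
# on the integrated GPU.
echo "$LD_LIBRARY_PATH"
```

The same reasoning applies to VirtualGL, except with LD_PRELOAD as the overwritten variable.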

However, a really good use of hybrid graphics would involve running Steam UI off the integrated card, and only individual games off the discrete card. Today, people just edit game launch scripts to achieve that, but that is not exactly convenient (and the script will be overwritten when a game update arrives).

There is a feature request to Valve to allow prepending wrapper scripts when editing command-line arguments. When implemented, it will make it possible to add primusrun/optirun to games on a case-by-case basis, which is definitely better than the current situation.

There is also experimental software that aims to allow dynamic libGL switching based on user preference. It currently has a somewhat inefficient implementation based on FUSE and LD_LIBRARY_PATH ( = does not work with Steam) and a better implementation based on LD_AUDIT. However, as best laid schemes of mice and men go often awry, the LD_AUDIT implementation is unusable on most of the modern Linux distributions due to a bug in Glibc.

Given the above, at the moment libgl-switcheroo development is inactive, but I would like to take this opportunity to gauge public interest in such a feature. Please +1 the first two responses to the post to "vote".

For those who are not aware yet - +Alexander Monakov's primus solution is what most would call "done" now, and definitely ready to try out if you're still using optirun!

Simply download and compile, then use the included primusrun script instead of optirun. Keep your bumblebee daemon/settings as they have been for optirun, no need to change anything - just replace "optirun" by "primusrun". You'll have to set it up manually for now, but it only involves 2 (single-arch) or 3 (multiarch) files so is relatively easy.
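For reference, a manual setup might look like the following sketch (the repository URL and build steps are assumptions based on the project's README; check it for your distro's specifics, especially for multiarch):

```shell
# Fetch and build primus (URL assumed; see the post's link for the canonical repo).
git clone https://github.com/amonakov/primus.git
cd primus
make

# Then use primusrun exactly where you used optirun before:
./primusrun glxgears
```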

Main advantages this has over the optirun/VirtualGL solution:
- Less overhead (better framerates!) and cleaner solution (no networking or compression involved at all).
- Fixes the "bug" that causes bumblebee to shut down the GPU too early sometimes (no more need for the "optirun bash" workaround - ever!).
- Less buggy/glitchy, easier to debug.
- Only uses/starts secondary GPU for OpenGL parts of applications - everything else remains on your main GPU (power savings).

Give it a spin and make sure to report any bugs/problems you find on the github issue tracker for primus (see link -> issues).

Most of you are probably wondering: will we package this as part of Bumblebee in the future? Most likely, yes! First, we have to get the team together and make some decisions about this and about what will happen to optirun (on some distros, specifically those that use statically compiled , primus will not work (yet?), so the VirtualGL solution should remain available for that small group of people).

This is some really interesting stuff by +Alexander Monakov. It works quite nicely when combined with Bumblebee: simply run "optirun bash" to keep your card active in the proper configuration, then run this in a separate terminal. Not quite ready for mass consumption yet, but this has the potential to replace the VirtualGL part of Bumblebee in the future!

New Bumblebee version 3.0.1 released.

Bumblebee v3.0.1 fixes the "GPU has fallen off the bus" error, a buffer overflow, and several other bugs.

Upgrading is easy (Ubuntu):
sudo apt-get update && sudo apt-get install bumblebee

Have fun!