Profile

Bumblebee Project
1,411 followers|190,534 views

Stream

Bumblebee Project

Shared publicly  - 
 
The next milestone of Bumblebee will be version 4.0! Several people thought the project was dead, but that isn't true.
Work on it will start around May 2016.

For more information, see GitHub:
https://github.com/Bumblebee-Project/Bumblebee/milestones/Bumblebee%204.0

Hopefully a new version will be released before too long. Some major issues regarding the interaction with the NVIDIA driver & bbswitch will be fixed.
Thanks for your patience! In the meantime, here is a nice picture of Bumblebee.
 
Great news!

Bumblebee Project

Shared publicly  - 
 
Optimus and the recent nVidia beta drivers release

New nVidia beta drivers, 319.12, were released yesterday. Unfortunately, several web sites have been quick to publish articles with titles suggesting that Optimus support is finally coming to Linux, and I'm saying "unfortunately" because I believe this is a case where inaccurate reporting hurts everyone: for non-technical users, the articles may give the false impression that the wait is over and that complete, proper support in the official binary drivers has arrived. As far as I can see, there is some confusion and misinformation in users' discussions, and that is not helping either. So, let's try to clarify things.

In short, the new beta is but a first user-visible step towards complete Optimus support. Remarkably, it covers use cases that Bumblebee has never supported well: using external monitors attached to the nVidia GPU, and running all rendering on the nVidia GPU. On the other hand, Bumblebee provides power management and per-application render offloading, neither of which is offered by the new beta.

In a typical muxless Optimus laptop (or a muxed laptop in the "Optimus" configuration), the laptop LCD panel is connected to the Intel GPU only (so the nVidia card cannot display on it), and you may also have an external video port connected to the nVidia GPU only (so the Intel card cannot display on it). Normally, you run the X server with the Intel driver, with the LCD panel as the only output, and all works well. Let's now consider fancier scenarios.

1. You want to run a graphically intensive game, so you want the heavy rendering performed by the nVidia card, but you want the card powered on only for the duration of the game. You don't need to redirect the rendering of any other applications to the nVidia card. This is called "render offloading".

2. You want to temporarily plug an external monitor into the nVidia-driven output port without disrupting your existing X session running with the Intel driver. Since the Intel chip cannot access that external port, it needs the nVidia card to perform the display ("scanout") for it (this assumes Intel does all the rendering; alternatively, nVidia could be rendering its portion of the desktop).

3. You want to use the nVidia card to render the whole desktop, trading increased power consumption for improved acceleration of all graphical apps, including the compositor. Since the nVidia chip cannot access the laptop LCD panel, it needs the Intel card to perform scanout for it.

Cases 2 and 3 are called scanout offloading; notice how it is needed in different directions in the two cases. The card performing the scanout is called the scanout sink, and the other is called the scanout source.

With the new beta, nVidia supports scanout offloading, with the restriction that the nVidia chip can be the scanout source but not the sink. Thus, it supports use case 3.

Use case 2 needs GPU hotplug in the X server, because you want to power up the discrete GPU only when the external monitor is plugged in, and on top of that use case 1 needs a mechanism to route rendering between different drivers. For now, it's possible to use a combination of the virtual CRTC patch and hybrid-screenclone to "solve" case 2 (yep, that's painful), and Bumblebee "solves" case 1. Proper support in the drivers/server stack will be more efficient, of course.
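For reference, "solving" case 1 with Bumblebee today just means prefixing the command with optirun (the test program below is only an example):

optirun glxgears    # powers up the nVidia card, renders there, shows the result on the Intel desktop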

Notably, it should be possible to use the new beta drivers to get better accelerated rendering for gaming sessions by starting the game on a separate X display with the nVidia driver and scanout offloading. FSGamer should come in handy for that. Note that you want to use the xf86-video-intel driver for the offload sink in this case (not the modesetting driver as the readme currently suggests); this should work since version 2.21.5 and is required because Xorg does not support using different drivers for the same PCI device.
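For the curious, here is a rough sketch of the RandR plumbing involved when the Intel card acts as the scanout sink. The provider names below are placeholders, so check xrandr --listproviders on your system and the driver README for the authoritative setup:

xrandr --listproviders
# make the Intel provider scan out what the nVidia provider renders (sink first, source second)
xrandr --setprovideroutputsource <intel-provider> <nvidia-provider>
xrandr --auto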

Hope that helps.

Alexander

PS. If you were wondering whether you should install the new drivers: if unsure, don't do it yet. You might clobber your GL/GLX libraries, leaving you without accelerated rendering on the Intel desktop, and accelerated nVidia+scanout is not easy to set up either.
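If you do install them and want to check what your regular desktop session ended up with, a quick sanity check (assuming glxinfo from mesa-utils is installed) is:

glxinfo | grep "OpenGL renderer"
# an Intel renderer string means the integrated desktop is still accelerated;
# a software renderer such as llvmpipe means the GL libraries got clobbered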
28 comments
 
I think that will not be done, unless you do it yourself.

Bumblebee Project

Shared publicly  - 
 
Bumblebee 3.1 was released on 25 February 2013. This means you can upgrade your Bumblebee installation (via the Ubuntu PPA) if you haven't already: sudo apt-get update && sudo apt-get install bumblebee bumblebee-nvidia linux-headers-generic. Visit our website: http://bumblebee-project.org/ (for trying the new primus functionality, see News).

Release notes v3.1: https://raw.github.com/Bumblebee-Project/Bumblebee/master/doc/RELEASE_NOTES_3_1
Primus Github: https://github.com/amonakov/primus
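If you have never added the PPA before, the full upgrade looks roughly like this (ppa:bumblebee/stable is the usual repository; adjust if you follow the testing PPA):

sudo add-apt-repository ppa:bumblebee/stable
sudo apt-get update
sudo apt-get install bumblebee bumblebee-nvidia linux-headers-generic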
9 comments
 
I am wondering whether Bumblebee cuts the power to the GPU completely. If so, is there any need to modify ACPI interfaces? Also, I have a situation where a dozen GPU devices are plugged into a single board; is it possible to control the power of those devices with Bumblebee? Advice is welcome if this can be done!
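For context, the power switching on Optimus laptops is done by the bbswitch kernel module through its /proc interface, along these lines (a sketch; bbswitch targets single hybrid-graphics laptops rather than multi-GPU boards):

cat /proc/acpi/bbswitch                    # shows whether the discrete GPU is ON or OFF
echo OFF | sudo tee /proc/acpi/bbswitch    # power the card down
echo ON | sudo tee /proc/acpi/bbswitch     # power it back up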

Bumblebee Project

Shared publicly  - 
 
This is some really interesting stuff by +Alexander Monakov. It works quite nicely when combined with Bumblebee - simply run "optirun bash" to keep your card active in the proper configuration, then run this in a separate terminal. It's not quite ready for mass consumption yet, but it has the potential of replacing the VirtualGL part of Bumblebee in the future!
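As a rough sketch of the workflow described above (primusrun is the script shipped in the primus repository; glxgears is only a convenient test program):

# terminal 1: keep the discrete card powered and the secondary X server running
optirun bash
# terminal 2: run the application through primus
./primusrun glxgears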
14 comments
 
Well, you could also look at 'PRIME' in the nouveau driver: https://wiki.archlinux.org/index.php/PRIME
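For the nouveau/PRIME route the wiki describes, render offloading is driven by an environment variable rather than a wrapper script; a minimal check might look like this (provider names are placeholders, and the explicit offload-sink step may not be needed on newer stacks):

xrandr --listproviders                          # both GPUs should show up as providers
xrandr --setprovideroffloadsink <nouveau-provider> <intel-provider>
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"    # should now report the discrete GPU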

Bumblebee Project

Shared publicly  - 
 
Should there be native Optimus support in Linux Kernel 3.5?
4 comments
 
No, why should they?

Bumblebee Project

Shared publicly  - 
 
When you're running Ubuntu 13.10 or later, you can install the packages from the repositories (this includes Linux Mint 17, where the Ubuntu Universe & Multiverse repos are enabled by default); you don't need to use PPAs anymore!

1. (Ubuntu) Enable Universe & Multiverse repos. This can be done via the 'Software Sources' in Software Center.
2. sudo apt-get install bumblebee bumblebee-nvidia primus linux-headers-generic
3. Reboot.

And have fun!
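To verify the setup afterwards, a quick test (glxgears is just a convenient program from mesa-utils) is:

optirun glxgears      # VirtualGL bridge
primusrun glxgears    # primus bridge, usually faster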
12 comments
 
+Gašper Sedej
One reason: the main developer Lekensteyn doesn't commit anymore: https://github.com/Bumblebee-Project/Bumblebee/graphs/contributors

And a lot of forks are happening (bad habit)... See:
http://s32.postimg.org/8232p324l/lot_of_forks.png

Finally, I think there are currently too few moderators with write access to the GitHub repo. Overall, too few issues get solved, mainly due to a lack of repository administrators who can, for example, accept pull requests.

UPDATE: I received a message that ArchangeGabriel will be back soon to fix some stuff.

Read more:
https://github.com/Bumblebee-Project/Bumblebee/issues/710

Bumblebee Project

Shared publicly  - 
 
PSA for Gentoo users

Gentoo users who emerge nvidia-drivers versions 331.49-r2, 334.21-r2 or newer with USE=uvm will see a regression: the Bumblebee daemon is unable to unload the nvidia kernel module and power down the GPU.

We'll try to roll out a new Bumblebee release soon to resolve the issue. In the meantime, if you're affected please re-emerge nvidia-drivers with USE=-uvm, or if you need CUDA functionality use the live ebuild from the bumblebee overlay:

sudo EGIT_BRANCH=develop emerge -1 bumblebee-9999
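For the first option (dropping the uvm flag), a one-off re-emerge could look like the following sketch; for a permanent change, set the flag in /etc/portage/package.use instead:

sudo USE="-uvm" emerge -1 x11-drivers/nvidia-drivers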

I apologize for the inconvenience. Please see https://bugs.gentoo.org/show_bug.cgi?id=506168 and https://github.com/Bumblebee-Project/Bumblebee/issues/565 for additional details.
3 comments
 
It's an old post, so it should be solved by now... The issue is closed: https://github.com/Bumblebee-Project/Bumblebee/issues/565


Bumblebee Project

Shared publicly  - 
 
Bumblebee 3.2 is officially released. It has a lot of bug fixes and two new features. The new packages will be available very soon (PPA for Ubuntu, etc.).

One new feature is the ability to start optirun without any render bridge (a -b option was added), which is useful for the nvidia-settings command (optirun -b none nvidia-settings -c :8). The second feature is the --no-xorg option for optirun, which disables starting the secondary X server. This is handy for CUDA or OpenCL applications that do not need graphics rendering capabilities.
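For example (the CUDA binary name below is only a placeholder):

optirun --no-xorg ./my_cuda_app           # powers up the card without starting a secondary X server
optirun -b none nvidia-settings -c :8     # query the driver's settings on display :8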

Release notes can be found here: https://raw.github.com/Bumblebee-Project/Bumblebee/master/doc/RELEASE_NOTES_3_2

To find out how to upgrade, please visit:
(Ubuntu) https://github.com/Bumblebee-Project/Bumblebee/wiki/Upgrading-on-Ubuntu

PS.
You can try installing the new NVIDIA beta drivers, 319.12, but this is currently not advised (they can cause lower performance). However, the new driver is supported and will not break Bumblebee; it will be auto-detected.

Have fun!
7 comments
 
+Henrique Nunes da Silva 
When you are playing games using the discrete video card, it becomes hotter; this is normal (the discrete video card is working hard & is NOT in 'stand-by' mode). However, overheating is never the intention.

Bumblebee only makes it possible to use the card; it isn't overclocking any hardware or anything like that, so it doesn't matter whether you are using Windows, Linux, Mac or....

My first guess would be to clean your laptop and remove all the dust. There are special high-pressure sprayers on the market for cleaning laptops/computers. It is also possible that your laptop hardware is broken, causing the overheating problems.

Good luck!

Bumblebee Project

Shared publicly  - 
 
An update on Bumblebee and Steam compatibility

The Steam beta has been available for more than 10 days already, but unfortunately Bumblebee graphics offloading is not easily usable with Steam games, and there is some confusion about the current status, for example in the Reddit discussions. This lengthy and somewhat technical post is here to clear things up.

First, there was an issue where the Steam UI would not load if started under primus. That issue was actually due to OpenGL API misuse in Steam, and was worked around in primus on the same day. After that, there were no Steam bug reports on the issue trackers. However, that was not actually enough, because games started from the Steam UI would still run on the integrated card. This problem was pointed out to the primus developer just a day ago on IRC. Take note: bugs do not get developers' attention unless someone actually reports them.

This problem is due to Steam overwriting both the LD_PRELOAD and LD_LIBRARY_PATH environment variables. VirtualGL uses the former and primus uses the latter to perform OpenGL offloading. This issue was raised in private communications with Valve.

However, a really good use of hybrid graphics would involve running the Steam UI off the integrated card, and only individual games off the discrete card. Today, people simply edit game launch scripts to achieve that, but that is not exactly convenient (and the script will be overwritten when a game update arrives).
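To illustrate what editing a game launch script means in practice (the path and file names below are hypothetical and vary per game), the change usually amounts to wrapping the real binary:

# inside the game's start script, e.g. SomeGame/start.sh (hypothetical)
# before:  ./game-binary "$@"
# after:
primusrun ./game-binary "$@"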

There is a feature request to Valve to allow prepending wrapper scripts when editing command-line arguments. Once implemented, it will allow adding primusrun/optirun to games on a case-by-case basis, which is definitely better than the current situation.

There is also experimental software that aims to allow dynamic libGL switching based on user preference. It currently has a somewhat inefficient implementation based on FUSE and LD_LIBRARY_PATH (= does not work with Steam) and a better implementation based on LD_AUDIT. However, as the best-laid schemes of mice and men often go awry, the LD_AUDIT implementation is unusable on most modern Linux distributions due to a bug in Glibc.

Given the above, at the moment libgl-switcheroo development is inactive, but I would like to take this opportunity to gauge public interest in such a feature. Please +1 the first two responses to the post to "vote".
11 comments
 
Hi +Johan du Preez, what you should do is join the Steam group called "Linux Nvidia Optimus Users" (http://steamcommunity.com/groups/LinuxOptimus). It has everything you would need to know about running the 310 drivers, running Valve games, launching SS3, and how to launch all the other games. Also, if you have any questions about anything, like I have had, people there tend to help.

Bumblebee Project

Shared publicly  - 
 
For those who are not aware yet - +Alexander Monakov's primus solution is what most would call "done" now, and definitely ready to try out if you're still using optirun!

Simply download and compile, then use the included primusrun script instead of optirun. Keep your bumblebee daemon/settings as they have been for optirun; no need to change anything - just replace "optirun" with "primusrun". You'll have to set it up manually for now, but it only involves 2 (single-arch) or 3 (multiarch) files, so it is relatively easy.
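If you want to give it a go, the manual setup is roughly the following sketch (see the primus README for the authoritative instructions, especially for multiarch):

git clone https://github.com/amonakov/primus
cd primus && make
./primusrun glxgears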

Main advantages this has over the optirun/VirtualGL solution:
- Less overhead (better framerates!) and cleaner solution (no networking or compression involved at all).
- Fixes the "bug" that causes bumblebee to shut down the GPU too early sometimes (no more need for the "optirun bash" workaround - ever!).
- Less buggy/glitchy, easier to debug.
- Only uses/starts secondary GPU for OpenGL parts of applications - everything else remains on your main GPU (power savings).

Give it a spin and make sure to report any bugs/problems you find on the github issue tracker for primus (see link -> issues).

Most of you are probably wondering - will we package this as part of Bumblebee in the future? Most likely, yes! We first have to get the team together and make some decisions about this and about what will happen to optirun (on some distros - specifically those that use a statically compiled libglapi.so - primus will not work (yet?), so the VirtualGL solution should remain available for that small group of people...).
37 comments
 
+Alexander Monakov Thanks Alex. I'm only doing testing. Best regards!

Bumblebee Project

Shared publicly  - 
 
New Bumblebee version 3.0.1 released.

Bumblebee v3.0.1 fixes the "GPU has fallen off the bus" error, a buffer overflow error, and several other bugs.

Upgrading is easy (Ubuntu):
sudo apt-get update && sudo apt-get install bumblebee

Have fun!
18 comments
 
Can you please build packages for Debian?
There is http://suwako.nomanga.net/ but it is a bit outdated.

Bumblebee Project

Shared publicly  - 
2 comments
 
Ah, I already saw there are no options for a poll in Google+ ><
Thanks in advance
Story
Tagline
Bumblebee aims to provide support for nVidia Optimus laptops for GNU/Linux distributions. Using Bumblebee, you can use your nVidia card for rendering graphics which will be displayed using the Intel card.
Introduction
The Bumblebee Project proudly presents version 3.0 of Bumblebee, a project aiming to support NVIDIA Optimus technology under Linux.
Contact info
Address
http://bumblebee-project.org/