Jeremy Visser
334 followers
Geek. Sysadmin. Bassoonist.

Jeremy's posts

Post has attachment
Flags flying at half mast on the Anzac Bridge today.
Photo

I wonder if the character "Three Dog" from Fallout 3 is a reference to "3D0G", a command you can enter in the Apple II system monitor to return to BASIC.
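For context, the sequence looks something like this (assuming DOS is loaded, so $3D0 holds the warm-start vector back to BASIC):

    ]CALL -151      enter the system monitor from BASIC
    *3D0G           "G" executes from $3D0, which jumps back to BASIC
    ]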

Post has attachment
Finally Ubuntu decides to drop the turd that is Unity. I said it back in 2011, and it only took them six more years to spot the obvious.

Today I was pleasantly surprised to find that (unlike the last time I tried, ~1 year ago) I can use my NVIDIA GPU (GeForce 750 Ti) as an off-screen renderer with my monitor only plugged into my Intel GPU.

Unlike most people who care about this (users of laptops with NVIDIA Optimus), my machine is a regular desktop PC with Intel integrated graphics and an NVIDIA card in a PCIe slot.

The advantage of this approach is that I get to use the i915 driver to provide my console framebuffer, which is able to run at 1920x1080. Additionally, I can run a Wayland-based desktop on a second VT that only uses Intel graphics.
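For reference, the general shape of this setup with the proprietary driver is something like the following xorg.conf (the BusID values are examples only; check lspci for yours):

    Section "ServerLayout"
        Identifier "layout"
        Screen 0 "nvidia"
        Inactive "intel"
    EndSection

    Section "Device"
        Identifier "nvidia"
        Driver "nvidia"
        # Example BusID; find the real one with lspci
        BusID "PCI:1:0:0"
    EndSection

    Section "Screen"
        Identifier "nvidia"
        Device "nvidia"
        # Needed because no monitor is attached to the NVIDIA GPU
        Option "AllowEmptyInitialConfiguration" "on"
    EndSection

    Section "Device"
        Identifier "intel"
        Driver "modesetting"
        # Example BusID; find the real one with lspci
        BusID "PCI:0:2:0"
    EndSection

    Section "Screen"
        Identifier "intel"
        Device "intel"
    EndSection

Then, once X is up, the Intel outputs get attached to the NVIDIA renderer with xrandr (provider names vary per system; check xrandr --listproviders):

    xrandr --setprovideroutputsource modesetting NVIDIA-0
    xrandr --auto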

There are some hardware bugs with my desktop that cause the EFI-provided framebuffer to sometimes be low-resolution or high-resolution, depending on whether I press a key to dismiss the HP logo on boot. Yes, you read that right.

And another bug (not sure if HW or SW) means the console is completely invisible unless I specify video=efifb:off. Having efifb enabled leaves it invisible even past the point where inteldrmfb normally takes over. The only way I can get efifb to display a picture is by having only one GPU enabled at a time. Grr.
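(To make that parameter stick, on a Debian/Ubuntu-style GRUB setup it goes in /etc/default/grub, roughly like the following, then regenerate the config with update-grub; other distros use grub2-mkconfig or a different bootloader entirely.)

    # /etc/default/grub -- keep whatever options are already there
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash video=efifb:off"

    # then regenerate the GRUB config
    sudo update-grub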

There are some disadvantages to this approach, though. Notably, tearing occurs and vertical sync just isn't implemented properly. Also, being NVIDIA, I have to use Xorg, as Wayland support isn't there yet (this should be resolved in GNOME 3.24, which will have EGLStreams support).

And, worst of all, I get terrible flickering in CS:GO. Depending on my video settings, I get bizarre glitches like textures flickering every other frame, or the background brightness flickering every other frame. If I display the game on an output from my NVIDIA GPU, it works fine; it only happens when I display through an Intel GPU port.

In summary though, I like the direction this is heading. It's nice to be able to plug a whole lot of GPUs together and plug your monitor into any port and have it show a picture. After all, Windows has been able to do this for years. About time Linux started being able to do it too.

Post has shared content
Winter Warmth

This is my favorite soap bubble shot of the winter so far. We had a little snow on the ground and it was covering one of our landscaping timbers in the front garden. I like the way it helps form the backdrop.

Have a great day!

Website: http://georgefletcher.photography
Facebook: https://www.facebook.com/georgefletcherphotography
500px: https://500px.com/gffphotos

+HQSP Macro curated by +Stefanie Schächtel +Igor Schevchenko +Peter Marbaise +Evi Verstraeten +Robert Kubacki and +Andi Fritzsch #hqspmacro
#BTPMacroPro +BTP Macro Pro, founded by +Rinus Bakker, owned by +Nancy Dempsey, curated by +Kenny Jones
#promotephotography by +Promote Photography
#showyourbestwork and +ShowYourBestWork by +Britta Rogge
#canonusers +*** by +Gene Bowker +John Spade

#frozen #soapbubble #snow #macro #patterns #f
Photo

Post has attachment
I strongly support this. Symantec would be a laughing stock if the situation weren't so serious.
