Terry's Linux Tips
Tips for using Linux


The Intel NUC5i5RYH

I mentioned in another note that I bought an Intel NUC.  This is a very small, quiet PC, smaller than a Mac Mini and much in the same vein.  The 5i5RYH model comes with an Intel i5 processor (two cores that Hyper-Thread into four logical processors; I don't know why they couldn't just put that in the name somewhere).

The NUC comes with 3 USB 3.0 ports, of which only two seem to work.  At first I thought it was a driver problem, but now that I've updated the firmware and changed operating systems, I'm convinced it's a flaw.  It accommodates a slim SSD drive (I put in a 100 gig Samsung) and 16 gigs of RAM.  For the display, it has a mini-HDMI port.  Yes, you need to go out and order an adapter for your HDMI cable, or else get one of those mini-to-regular cables.  I don't see why they couldn't use a full size HDMI as the Mac Mini does.

I tried installing OpenSUSE 13.2, the latest and greatest version.  It ran into video problems.  Fedora didn't work either.  Finally I got Ubuntu 15.04 to install -- it comes with a very recent kernel version that the NUC was able to use.  Even then, there have been problems with the network, random pauses that make it very difficult to use.

I originally got this NUC to replace a home-built Linux server that was hard-crashing every few days.  Rather than replace the motherboard, then the RAM, then the cables, etc., I opted for the NUC as a drop-in replacement for this huge box.

Then two things happened.  First, I updated the firmware on my Linux server and voila! it stopped crashing.  Thank heavens.  I am leery of updating firmware unless necessary, but in this case it was a lifesaver.

Secondly, I realized I need a Windows computer for various tests and use cases.  Since the NUC wasn't very happy with Linux, I ordered an OEM Windows 7 DVD from Amazon for $79.  It came with a rather worn-looking license sticker on the outside.  "Oh boy," I thought, "this looks pirated."

But Windows installed like a dream, the license key was accepted without problem, and it was up and running within 30 minutes, albeit with no network connection.  Forewarned by some excellent reviews on Amazon, I had already downloaded a bunch of drivers from Intel's website including display and network.  I ran the driver installers right after installing Windows, and this got Windows to connect right away to the LAN.

Next I installed the Chrome browser, logged in, and allowed it to sync with my Mac Chrome bookmarks.  Finally I put TightVNC on the NUC so that I can easily window into it without having to switch the display cables around or keep a third keyboard on my already-crowded desk.

The result:  a pretty fast, quiet little Windows desktop.  OK, it's not Linux.  But it's necessary to have Windows in this day and age.  My aging work laptop does run XP, soon to be wiped and updated to Win7 by the fearless I.T. team, but now I have my own Windows machine.

Conclusions:  don't buy a recent NUC to use with Linux.  The i5 is still new enough that it doesn't have driver support, despite assurances on the Intel website to the contrary.  Do buy a NUC if you need a very quiet, low energy, tiny-footprint Windows PC.  There are older NUCs out there running Linux, but overall you're better off avoiding this one.

Probably my next (and final) acquisition for this bad boy will be a USB 3.0 expansion hub, to accommodate whatever gadgets I want to attach -- thumb drives, backup hard drives, etc.

Screen sharing

I finally got my Mac and Linux machines to cooperate!  In using my Mac as a front end to my various workstations, screen sharing obviously has come to the fore as an important tool, but it can be tricky to use.

I've been using TightVNC which works pretty well on the Windows laptop, but less well on Linux (as a destination).  Recently I discovered x11vnc which finally allows me to view the current session (as opposed to a new session that starts up specifically for the vnc viewer).

Just install x11vnc (zypper or apt-get) and if you don't have your .vnc/passwd already set up, set one up:

x11vnc -storepasswd

Then execute x11vnc as follows:

x11vnc -usepw -forever -noxdamage -display :0

And it will allow you to control your machine from elsewhere.  Don't use the ncache feature, because Mac screen sharing will display it as a huge vertical window -- this took me hours to figure out!

Now, finally, using my Mac as a viewer, I have full control of my Linux workstation, my Windows laptop, my work Linux workstation via VPN, and my new Linux NUC system (more on that in another note).

Screen sharing can be slightly glitchy but the rewards are substantial.
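On the Mac side, the built-in Screen Sharing app can even be launched straight from a terminal with a vnc:// URL; x11vnc serves display :0 on VNC's default port, 5900.  A tiny wrapper (connect_vnc and the hostname are made-up names for this sketch):

```shell
#!/bin/sh
# connect_vnc: point Apple's built-in Screen Sharing at a host running
# x11vnc.  Display :0 is served on VNC's default port, 5900.
# "connect_vnc" is a made-up helper name, not a standard command.
connect_vnc() {
  host="$1"
  open "vnc://${host}:5900"
}
# Example: connect_vnc linuxbox
```

Saving that as a one-line script beats hunting through the Finder's "Connect to Server" dialog every time.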

For full information on x11vnc and excellent step-by-step instructions, visit the author's page.

UPDATE 2015-08-25
The clipboard sharing is problematic.  I finally found this option that at least lets the Linux side send its clipboard to the Mac, i.e. if you are on a Mac and viewing your remote Linux session using Screen Sharing, then placing items on the Linux clipboard will also place them on the Mac clipboard.  You may not always want this feature, but if you're going between systems a lot, it's awfully handy for transferring data that you would otherwise have to retype, or copy into a file and scp to the other system.  Here is the option:
-input KMBCF,M
The letters tell x11vnc what connected clients may do: Keystrokes, Mouse motion, Button clicks, Clipboard, and File transfer; the "M" after the comma allows view-only clients mouse motion only.
Here is the full command line, with the clipboard option added, and forked as a "nohup" process that will continue after your terminal is closed:

  nohup x11vnc -input KMBCF,M -usepw -forever -noxdamage -display :0 &

How to remove photos and videos from your Android device

OK, this isn't exactly a Linux topic (though, of course, Android does run on Linux!).  It's just that Google doesn't make it obvious how to clear up space on your mobile device.

Both my wife and I take lots of stills and videos with our phones, and of course eventually the phone fills up.  Unfortunately, our Nexus phones lack a microSD slot, so we can't simply pop in a new card as you would with a regular point-and-shoot camera.

My usual procedure is to manually copy the files onto a PC hard drive, where they will get backed up by my regular cron job (described in another note down below).

Since you're a Linux power user, you can enjoy doing it this way.  Install the Google Android developer tools which include adb, a very powerful command line program for doing all sorts of things with your connected Android device.

Note that you must enable debugging on your device.  Go to Settings->About Phone, and tap several times on "Build Number" until it tells you developer mode is enabled.  You can then go to Settings->Developer Options and make sure the "USB debugging" option is turned on.

Once you have installed the SDK tools and set up your phone, and you know the path to adb (usually in {android-sdk}/platform-tools/), you can execute the following command:

adb shell

This puts you into a linux shell session on your Android device.  If you happen to have more than one Android device connected to your computer, you need to find the ID as follows:

adb devices

It will list the attached devices by their hexadecimal ID strings.  Next you have to identify which one is the device you wish to work with.  The easiest way is simply to unplug the device and see which ID disappears.  Once you have the right ID, you specify the device as follows:

adb -s 015d168951300e07 shell
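If you script this often, a small helper can pull the serial out of the adb devices listing automatically when exactly one device is attached.  Here's a sketch (adb_id is a made-up name; it just parses the output of adb devices):

```shell
#!/bin/sh
# adb_id: print the serial of the single attached Android device by
# parsing "adb devices" output; fail if there are zero or several.
# "adb_id" is a made-up helper name.
adb_id() {
  ids=$(adb devices | awk 'NR > 1 && $2 == "device" { print $1 }')
  count=$(printf '%s\n' "$ids" | grep -c .)
  if [ "$count" -ne 1 ]; then
    echo "expected exactly one device, found $count" >&2
    return 1
  fi
  printf '%s\n' "$ids"
}
# Example: adb -s "$(adb_id)" shell
```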

Now you are in a regular bash-like shell session. Execute these commands:

cd /sdcard/DCIM/Camera
ls -l

This will show you all your current photos and videos from the Camera app (you may have other files elsewhere in the system such as in Downloads or in another app data folder).

Now press Ctrl-D to get back to your regular shell on your computer.  cd to whatever folder you wish to deposit the images in, and execute this command:

adb -s 015d168951300e07 pull -p -a /sdcard/DCIM/Camera

This will copy the files into the current directory.  The -p means display progress, and -a means preserve file time/date.  This could take quite a while, depending on how much data you're copying.

Once this is done, you can rest easy that your pictures and movies are securely copied and will be backed up by whatever backup process you use.  I usually back it up immediately just to be on the safe side.
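To make this routine repeatable, the pull can be wrapped in a small script that drops each run into a dated folder.  A sketch along the lines of the steps above (pull_camera and the phone-backups path are made-up names; the device ID is the example one):

```shell
#!/bin/sh
# pull_camera: copy the phone's Camera folder into a dated backup
# directory under $HOME/phone-backups.  "pull_camera" and the backup
# path are made up for this sketch; the adb flags are the ones above
# (-p shows progress, -a preserves file timestamps).
pull_camera() {
  device_id="$1"
  dest="$HOME/phone-backups/$(date +%Y-%m-%d)"
  mkdir -p "$dest" || return 1
  ( cd "$dest" && adb -s "$device_id" pull -p -a /sdcard/DCIM/Camera )
}
# Example: pull_camera 015d168951300e07
```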

Now, on your Android device, make sure your Google Photos app is updated to the latest version.  Don't just select and delete photos, because this will delete them from the cloud!  I learned this the hard way (but fortunately I had already done the above manual backup, so it wasn't a disaster).

Run Photos, tap the upper left menu button, tap "Device folders", and long press the files you want to delete.  As far as I know, there's no "select all" unfortunately, so you have to manually select them.  Once this is done, tap the upper right menu and choose "Delete device copy".  This will delete the files off your device but preserve them in the cloud.

Now you have freed up gigabytes from your device, and the Photo app still displays the images because it's pulling them from the cloud.

If, like me, you deleted images from the main screen of Photos and it replicated the delete to the cloud, you can restore them easily from your desktop computer.  

Simply go to Google Photos in your web browser and click on the little cloud icon at the top that lets you upload.

Select all the files in the backup folder and it will (very slowly) upload them.  It seems to copy over existing images rather than create duplicates, so it's safe enough to do a mass upload if you're not sure what's missing.

That's it!  It used to be that Google Plus understood that you were making more space on your device, and would not replicate a delete to the cloud server, but now they've changed that.  I suspect that after a couple million people have complained that they lost their precious pictures, Google will change this and make it easier to back up pictures and clear local storage, but until then, we have to do it the manual way.  At least we know exactly what we're getting when we do it from the command line.

Update 2015-08-25

It appears that the Photo app has changed a bit.  Now you can select pictures and videos inside the Photo app, either individually or by the date, and remove them from the device while preserving them in the cloud.  Long-press one picture/video, and it goes into selection mode.  Then tap additional items to select them.  Then tap the upper right menu (3 vertical dots) which will offer to remove from device.  This clears up space while preserving the images in your Google account.

Unfortunately there's no indicator as to whether a picture has already been removed from your device.  If you select such a picture, you'll see a trash can icon in the upper right instead of a menu.  When you try to delete the picture, it will warn you that you are removing it from all devices and from your cloud account.

Again, the safest thing is to back up the Camera folder to your local hard disk first, then you can go about clearing the large video files off your device.  Hard disk space is cheap these days; you can get a 3-terabyte USB 3.0 disk for about $120, so just go for it and keep local copies of all those precious pics.

How to keep OpenConnect VPN running

If you work remotely, as I do, you probably need to keep a VPN (virtual private network) running on your workstation or laptop.  I use openconnect to emulate the Cisco AnyConnect client which, of course, my employer doesn't provide for Linux or MacOS.

On Linux, openconnect just works.  If you would like to learn the details of how to install and configure it, post a comment here and I'll provide more information.

My Mac, however, is a different situation.  For some reason, although openconnect works great, it disconnects from the network after a few hours.  I found myself periodically re-running the script, re-logging in, etc., several times a day.  Annoying.

Finally I came up with this solution to keep my VPN running smoothly on the Mac.  Create a script -- I call it "vpn" -- in your bin directory (e.g.,  /Users/johnsmith/bin/).

Put this in the "vpn" script (all one line):
echo myServerPassword | openconnect -u johnsmith --authgroup='MY_VPN_AuthGroup' myServerIPaddress --no-cert-check --reconnect-timeout 3000 --passwd-on-stdin

This passes your VPN password into the openconnect command.  You will obviously need to put in the correct authgroup and ip address for your system.  The reconnect-timeout is an attempt to maximize the time before openconnect gives up on a connection; I'm not sure it makes that much difference.
Save the script.  Open a terminal window and become superuser (sudo su).

Start your VPN script from the command line like this:

while true ; do sh /Users/johnsmith/bin/vpn ; done

This tells bash to execute vpn in a loop.  Whenever vpn times out and exits, the while loop will restart it, until you kill it with control-C.

Now you can minimize this terminal window and forget about it.  The VPN will constantly restart itself and will not need to prompt you for either your local superuser password or your server password.

You can also put this "while" command into a cron job if you prefer to have it always running without a terminal.  I like having the terminal so that I can see what's happening, in case there's a connectivity problem with the server.
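One refinement: if the server is unreachable, the bare while loop will respawn the script as fast as it can fail.  A short sleep between attempts keeps it calm.  Here's a sketch (keep_vpn is a made-up name; the script path is the example one from above):

```shell
#!/bin/sh
# keep_vpn: rerun the vpn script whenever it exits, pausing between
# attempts so a dead link doesn't turn into a tight respawn loop.
# The path is the example from this note; "keep_vpn" is a made-up name.
VPN_SCRIPT="${VPN_SCRIPT:-/Users/johnsmith/bin/vpn}"

keep_vpn() {
  while true; do
    sh "$VPN_SCRIPT"
    echo "vpn exited; retrying in 10 seconds (Ctrl-C to stop)"
    sleep 10
  done
}
# As before, become superuser first, then run: keep_vpn
```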

Backups are a pain but an absolute necessity.  We in the Linux world have many excellent options when it comes to backing up our data, though many of us probably don't do backups as often as we should.

I'm going to show you how to use rsync, a powerful and easy to use command that makes backing up your files a snap.

First, you need an external hard disk mounted on your system.  Typically, you plug one into a USB port and Linux will detect the hard disk and assign it a device name.  You can find this device name by running dmesg immediately after plugging in the drive.

EDIT:  If you know the label of the external drive, it's easier to mount by label rather than by device name, since device names can change.  To locate the label, look in /dev/disk/by-label.  For example, the "My Passport" 2TB external drive I plugged into my hub shows up there as My\x20Passport (udev encodes the space as \x20).

To mount this drive, use this type of command (all on one line):
sudo mount -o users,rw,uid=ttraub,gid=users /dev/disk/by-label/My\\x20Passport /mnt/bk

Replace "ttraub" with your user id, and replace /mnt/bk with whatever empty directory you have created for this purpose.  Now you have a permanent, fixed command to mount your drive, that will always work, and you can make a script or alias, or add it to your fstab so it will always mount at boot time.

Then, choose your folders that you want to back up.  Typically, you will want to back up your entire directory tree.  Once you have run one backup, rsync is smart enough to only update the changed files.  Here is a typical command, in this case to back up the user "ttraub" in /home/ttraub:

cd /home
# the following is all one line
rsync -arvp ttraub --exclude foo.bak --exclude ttraub/.cache /var/run/media/ttraub/My\ Passport | grep -v ".*/$"

Let's break this down.  Essentially we are saying, back up everything in /home/ttraub to the device mounted on /var/run/media/ttraub/My\ Passport

An alternative destination would be your own folder that you create, e.g.:
sudo mkdir /mnt
sudo mkdir /mnt/bk
Then mount the drive as in the above example and your command will look like this:
rsync -arvp ttraub --exclude foo.bak --exclude ttraub/.cache /mnt/bk | grep -v ".*/$"

Same thing as above, just  a shorter path!

Note that when you plug in a USB hard drive, OpenSuSE and KDE will pop up a notification offering to mount it for you, and the default location will be under /var/run/media/USER.

The hard drive I'm using is a two-terabyte Western Digital Passport USB 3.0, an inexpensive and reliable device that you can leave mounted or carry around; it's about the size of a cigarette package.

The source is the folder "ttraub".  The destination is the My\ Passport drive.  So far, so good?

Now to the options:
-a = archive mode, will recurse and preserve every file attribute.
-r = recursive mode, redundant with -a (left in out of habit)
-p = preserve permissions (also implied by -a)
-v = verbose, will list the files that it is backing up

--exclude = don't back up or update a pattern.  You could also use a file of exclude patterns and reference it with --exclude-from=FILE.

Lastly, I pipe through grep to trim the verbose output: with -v, rsync also prints the name of every directory it visits, and the grep -v ".*/$" filters out those directory lines (the ones ending in a slash), leaving just the files that were actually copied.

The first time you run such a command, of course, it will copy everything to the destination drive, much like this command:
cp -prv . /mnt/MyDestinationDrive

The next time you run it, rsync will compare each file and only back up the changed files.  Hence, it will do an incremental backup.

One more option to keep in mind:  
-n = dry run.  It shows you what it would do, but doesn't actually do anything.  Very handy when setting this command up.

Next, you can place this command into a shell script, e.g.:
cd /home
rsync -arpv ttraub --exclude foo.bak --exclude ttraub/.cache /var/run/media/ttraub/My\ Passport | grep -v ".*/$"

and name it something like "bk".  Make it executable so that whenever you feel like it, you can simply type "bk" and do a quick backup of whatever work you feel should be preserved.
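One hazard with a script like bk: if the external drive isn't mounted, rsync will happily copy everything into the empty mount-point directory on your root filesystem.  Here's a sketch of a more defensive version (same paths and excludes as the examples above; the mounted-ness test is just a directory check, and taking the destination as an argument is my own addition):

```shell
#!/bin/sh
# bk: incremental home backup with rsync, refusing to run when the
# destination directory is missing (i.e. the drive isn't mounted).
# Paths and excludes follow the examples in this note.
bk() {
  dest="${1:-/var/run/media/ttraub/My Passport}"
  if [ ! -d "$dest" ]; then
    echo "backup drive not mounted at: $dest" >&2
    return 1
  fi
  cd /home || return 1
  # grep -v drops rsync's directory-name lines (they end in a slash)
  rsync -arvp ttraub --exclude foo.bak --exclude ttraub/.cache "$dest" | grep -v '/$'
}
```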

When you are happy with how your rsync command is working, you can then make it a cron job to run daily or even more often.
You would edit your user crontab (no need to sudo):
crontab -e

It will put you into vi editor.  Type "i" to insert some text and enter something like this:
0 2 * * * sh /home/ttraub/bin/bk
(then press ESC, and type ZZ to save and exit.)

This tells the system to execute, as your user, the shell script "bk", every day at 2am.  The zero means zero minutes past the hour, "2" refers to 0200 hours or 2am, and three asterisks mean "every day of month", "every month", and "every day of the week".  Look up "crontab" for more information.

If your mail daemon is running, the system will email you every day after running this cron job, and you will see the files that were backed up.

Congratulations!  You now have an automatic incremental backup running every night and you can rest easy.  Of course, don't forget to leave your external drive mounted!

Note, by the way, that you can also run rsync over the internet, but that requires setting up SSH public keys first so that rsync can tunnel over ssh.

Elsewhere I will describe how to generate and share your public keys so that you can ssh to remote machines without the need to enter a password.

Post has attachment
I recently received a new backlit keyboard for my OpenSuSE Linux computer, an A-Jazz Cyborg Soldier gaming keyboard.  It's a full size, full travel keyboard with three different colors of LED back lighting, and at $34 (from an eBay seller) it was so cheap that I just had to try one.  My desk is a bit cramped, and this is the sparsest, most compact full keyboard I could find; it would shave an inch or so off the footprint my current Linux keyboard takes up.

Unfortunately, it doesn't work with Linux.  They did say in the specs that it's compatible with Windows and Mac operating systems, and I did find a testimonial in one of the Amazon user reviews that it fails in Linux, but I was determined to try anyway. 

The problem is that the Ctrl and Alt keys are differently mapped from standard keyboards and are interpreted as Shift keys by the OS.  I ran xev which reveals that the scan codes for Ctrl and Alt are indeed the same as for Shift.  That means that at the X Window layer, you can't remap these keys to do the right thing.  It's a USB keyboard driver problem.

I checked the logs (run dmesg right after plugging in the keyboard) and discovered that it identifies itself as a SoNIX USB Keyboard.  So I looked up SoNIX and found that it's a Taiwanese company that manufactures OEM items like USB parts and micro controllers.  I was unable to locate the part without dismantling the keyboard and studying the chip numbers, though it's still possible that they have a Linux driver that might work.

An alternative approach would be to run a low-level USB input/output scanning tool to determine exactly what scan codes are being sent to the computer from the keyboard, then take the source code of the standard keyboard driver and hack it to support those scan codes.  If I find the time, and can figure out how to do this without buying an expensive piece of lab equipment, I'll try to write a keyboard driver to support this peripheral.  Stay tuned!

By the way, as a keyboard, the A-Jazz is not bad at all, except for one flaw (in my opinion):  They placed the \| key directly to the left of the ENTER key, when normally it's positioned above the ENTER key on most keyboards.  Thus, I found myself frequently keying a backslash when I meant to hit ENTER.  It's something you can train your fingers to avoid, but it's annoying that they felt the need to tinker with it in the first place.

I'm still looking for a very compact backlit keyboard (obviously, one that is Linux compatible) so please do chime in if you have seen one.  The only other one that seems remotely acceptable is the Logitech K740, which is pretty good, priced at about $50, from a reputable manufacturer, and works with Linux.  However, it has a built-in wrist rest that I absolutely cannot fit on my desk (which has two keyboards stacked up, one Linux and one Mac).  Why they didn't make the wrist rest detachable is left as an exercise.  Most regrettable.

rpm intended for a different architecture

Have you ever seen that message while trying to install your 64 bit RPM to your 64 bit OpenSuse Linux OS?  I have seen it many times.  Inexplicably, the command line "rpm -Uvh some_application.rpm" fails with the above message, while YaST and Apper are happy to install the same file with no issues.

Even stranger, I have found very few comments about this problem.  You'd think it would be common, but it's really not.  At last, I found a solution thanks to another user (who shall unfortunately remain anonymous until I can find the web site and comment again).

I usually use xterm as a command line console, just out of habit.  But for some reason, this causes the 32-bit versus 64-bit confusion.  When I executed the same rpm command in Konsole, KDE's more sophisticated terminal emulator, it installed the software with no complaint.

I suppose that xterm contains some legacy 32bit code that imposes some kind of limitation on software executed from the shell, possibly a bug or malfunction inside of xterm or inside the X11 system.  At any rate, problem is solved.

If, like me, you like to create MP3 files from older CDs and cassettes, you may end up spending a lot of time editing your MP3s to add metadata (the ID3 tags) so that the album, artist, track name, and date appear properly in your player's display.

I have found kid3 to be a convenient and nicely designed tool for updating MP3 tags, but unfortunately it appears no longer to be supported in recent SuSE releases.

It is still possible to download the kid3 sources and build them yourself.  However, I found that kid3 won't build on SuSE 12.3 because of updates to ffmpeg's avutil.h.

The fix is a one-line addition right after #ifdef HAVE_AV_AUDIO_CONVERT in the affected source file; with that in place, the code builds properly and you can go back to using this great tool.

It's also possible to assign metadata from the command line with id3tag (part of the id3lib package obtainable from standard distributions).  For example, suppose you have ripped an old album to hard disk.  Use this command, all on one line:

id3tag --album="Led Zeppelin I" --artist="Led Zeppelin" --song="Good Times, Bad Times" --year="1969" --track=1 --total=9 track1.mp3

You can script-ify this pretty easily, and then just run the script to update all your mp3 files.  It's also helpful to rename the mp3 file to something more obvious, such as 01-Led_Zeppelin_1-Good_Times_Bad_Times.mp3 which is most helpful when manipulating files.
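For instance, the script-ified version could read track numbers and titles from a simple tab-separated list and loop over the id3tag calls.  A sketch (tag_album and the list format are inventions for illustration; the id3tag options are the ones shown above):

```shell
#!/bin/sh
# tag_album: run id3tag over trackN.mp3 files, reading "number<TAB>title"
# pairs from a list file.  "tag_album" and the list format are made up
# for this sketch; the id3tag options are the ones shown above.
tag_album() {
  album="$1" artist="$2" year="$3" total="$4" list="$5"
  tab="$(printf '\t')"
  while IFS="$tab" read -r num title; do
    id3tag --album="$album" --artist="$artist" --song="$title" \
           --year="$year" --track="$num" --total="$total" "track$num.mp3"
  done < "$list"
}
# Example: tag_album "Led Zeppelin I" "Led Zeppelin" 1969 9 tracks.txt
```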

As far as ripping tools go, my favorite has long been grip, but grip appears to have stopped being supported and it was necessary to manually and painfully build it last time I updated the OS.  If there's any interest, I'll post the process in another note.

In a separate article, I'll discuss the whole notion of ripping cassettes and tapes which is fun but can also be tedious without the right tools.

Do you wish to update a file's time/date stamp to match another file?  It's quite easy using touch:
touch -r FILE1 FILE2
sets the access and modification times of FILE2 to match FILE1.

Why would you need this?  If, for example, you use the tool mentioned elsewhere on this page, you'll end up with a file that has today's date and a backup file (*.bak) with the original date.  In my photos and videos folders, I like to preserve the original time/date of files, so this is a neat way to do so.

In fact, I recently added this line to the end of the script in order to make it automatic:
# set new avi file date/time to match original (backed up) file
touch -r $1.bak $1

Ever need to see a good old-fashioned ASCII chart?  In a console:
man ascii
