Fun talk at UW CS by CMU's Chris Harrison (https://www.cs.washington.edu/htbin-post/mvis/mvis?ID=1318) with some great ideas on touch HCI.  Particularly good was TapSense (http://www.chrisharrison.net/index.php/Research/TapSense), which can differentiate between, eg, your knuckle and your fingertip touching a screen (so a knuckle tap could be treated as a right-click and bring up a contextual menu).  OmniTouch (http://www.chrisharrison.net/index.php/Research/OmniTouch) is also cool, with a bunch of great ideas for virtual keyboards and number pads that might be one solution to the I/O problems of mobile devices and virtual displays like Google Glass.  The talk is long and won't be available on demand for a few days, but I highly recommend at least taking a peek at the web pages describing TapSense and OmniTouch.