Fun talk at UW CS by CMU's Chris Harrison, with some great ideas on touch HCI. Particularly good was TapSense, which can differentiate between, e.g., your knuckle and your fingertip touching a screen (so your knuckle could be treated as a right-click and bring up a contextual menu). OmniTouch is also cool, with a bunch of great ideas for virtual keyboards and number pads, which might be one solution to the I/O problems with mobile devices and virtual displays like Google Glass. The talk is long and won't be available on demand for a few days, but I highly recommend at least taking a peek at the web pages describing TapSense and OmniTouch.