A Wearable User’s Dashboard and Turn Signals
Walking with a wearable computer should be more like driving a car. With a car, much of my attention is dedicated to the environment around me: looking for pedestrians, watching for stop signs, and staying in my lane. Occasionally, when everything looks clear, I might look at the speedometer, fuel gauge, or radio station displayed on the dashboard. These displays are arranged and designed so that I can monitor the car’s state with a glance. Yet the displays don’t intrude; I determine when and how often I look at them.
I’d like the equivalent of my car’s dashboard and turn signals for my wearable. The system should give me constant, useful (and usable) information that I can refer to when I want and that lets me do the most common tasks simply. In some senses, a wristwatch already serves as a sort of wearable dashboard. While walking down the street, I can check that the area immediately before me is clear, turn my wrist, refocus my eyes, check the time, and then refocus on the world in front of me. The entire process takes approximately two seconds, and I haven’t gone further than I had already determined was safe to walk without looking.

A head-up display (HUD) can provide more advantages than a wristwatch. I can set the HUD’s physical focus to the depth at which my eyes are already focused on the environment. This focus matching eliminates the time my eyes would otherwise spend refocusing between the environment and the display. In addition, a HUD requires less eye and arm movement than a watch, and it can provide a higher-resolution image than a wristwatch can effectively display.

An interesting question is, what sort of information might be useful for fast access during a user’s daily tasks? What should be on the dashboard of everyday life? How should the data be displayed so as to be both useful and understandable at a glance? Depending on the user, we could imagine displays of calories burned while walking, time until the next meeting, top to-do list items, email status, or even a Google Earth-style view of the user’s surroundings.
Martin Frey, an interface designer in Munich, has combined several of these concepts to create the Just In Time Watch (www.freymartin.de/en/projects/jitwatch). The JIT-Watch helps its user get to appointments on time by visualizing the time and distance remaining to the appointment location. First, the system accesses the user’s calendar to determine the appointment time and location. Using Web-based travel and traffic information, it estimates when the user should arrive at certain checkpoints, such as a bus stop or subway station. Through GPS, the JIT-Watch tracks its user’s progress and visualizes the user’s current location versus an on-time travel rate on a completion bar. The watch achieves much of this functionality through Bluetooth connections to sensors and a mobile-phone Internet connection carried elsewhere on the body. Like a car’s dashboard, the JIT-Watch combines relatively complex data into a meaningful and immediately understandable display, letting the user determine his progress with a glance whenever he has a spare second.
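The essay doesn’t describe the JIT-Watch’s internals, but the completion-bar idea can be sketched as comparing actual progress (from GPS) against where an on-time traveler would be, interpolating between checkpoint deadlines. This is a minimal illustration, not Frey’s implementation; the checkpoint names, distances, and times are invented:

```python
from dataclasses import dataclass

@dataclass
class Checkpoint:
    name: str
    distance_m: float      # cumulative distance from the start of the trip
    target_time_s: float   # seconds after departure an on-time user passes here

def progress(checkpoints, distance_traveled_m, elapsed_s):
    """Return (actual, expected) as fractions of the total trip distance.

    actual:   fraction of the route covered so far (from GPS).
    expected: fraction an on-time traveler would have covered after
              elapsed_s, interpolating linearly between checkpoints.
    """
    total = checkpoints[-1].distance_m
    actual = min(distance_traveled_m / total, 1.0)

    prev_d, prev_t = 0.0, 0.0
    for cp in checkpoints:
        if elapsed_s <= cp.target_time_s:
            # Interpolate within the segment containing the current time.
            frac = (elapsed_s - prev_t) / (cp.target_time_s - prev_t)
            expected = (prev_d + frac * (cp.distance_m - prev_d)) / total
            return actual, expected
        prev_d, prev_t = cp.distance_m, cp.target_time_s
    return actual, 1.0  # past the final target time

# Hypothetical route: home -> bus stop -> subway station -> meeting.
route = [
    Checkpoint("bus stop", 400, 300),
    Checkpoint("subway station", 1600, 900),
    Checkpoint("meeting", 2000, 1200),
]
actual, expected = progress(route, 500, 600)
# If actual >= expected, the user is on pace; otherwise, running late.
```

The two fractions map directly onto a completion bar: one marker for where the user is, one for where they should be, readable at a glance.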
We could extend the JIT-Watch so that when its wearer is running late, she could compose an email or SMS to the other people at the upcoming appointment and give them an updated arrival time. We often use mobile messaging and mobile phones for such microcoordination already. However, by understanding the user's context, devices such as the JIT-Watch could significantly reduce the user's burden in signaling her intentions to others. The display could suggest predetermined messages ("I'm coming," "I'll be there shortly," "Where are you?"), and the user could dispatch one by confirming it with a flick of the wrist (sensed by accelerometers in the wristwatch).
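The flick-to-confirm interaction above can be sketched in a few lines: watch the accelerometer stream for a spike above a threshold, and only then dispatch the highlighted canned message. The threshold value and the `send` callback are hypothetical, chosen for illustration:

```python
import math

CANNED = ["I'm coming", "I'll be there shortly", "Where are you?"]
FLICK_THRESHOLD = 25.0  # m/s^2; hypothetical magnitude for a sharp wrist flick

def detect_flick(samples, threshold=FLICK_THRESHOLD):
    """True if any (x, y, z) accelerometer sample exceeds the threshold."""
    return any(math.sqrt(x * x + y * y + z * z) > threshold
               for x, y, z in samples)

def maybe_send(selected_index, samples, send):
    """Dispatch the highlighted canned message only on a detected flick."""
    if detect_flick(samples):
        send(CANNED[selected_index])
        return True
    return False

# Usage with a fake transport that just collects sent messages.
sent = []
quiet = [(0.1, 0.2, 9.8)] * 5          # resting wrist: roughly gravity only
flick = quiet + [(30.0, 5.0, 9.8)]     # sharp spike from a wrist flick
maybe_send(0, quiet, sent.append)      # no flick: nothing sent
maybe_send(0, flick, sent.append)      # flick detected: message dispatched
```

A real implementation would need debouncing and a pattern more specific than a single magnitude spike, but the point stands: confirmation costs one motion and no screen attention.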
This scenario reveals another goal for wearable computer interfaces that parallels current automobile interfaces: simplicity and low effort in commonly used communication functions. In the automobile, some actions (such as braking) automatically generate signals (brake lights) for others to see. Other common signaling interfaces, such as the turn signal, are designed so that they can be operated with a minimum of additional motion and attention. One obvious question, then, is what actions do we most often want to perform while moving; that is, what's the equivalent of the driver's need for turn signals? Microcoordination seems like one potential application, but many are possible. Perhaps a user, while walking, might make a gesture to trigger the reading of an email. Other gestures could speed or slow playback or skip to other messages.
From 2002 to 2007 I wrote a series of essays for Pervasive Computing magazine on wearable computing. The excerpt above is from the last, in 2007, and was requested by +Alex Bravo. Alex's timing is fortuitous: I just read the Wired essay on the Apple Watch (http://www.wired.com/2015/04/the-apple-watch/) in which Apple designers are quoted discussing many of the same concepts, in much of the same language, as in these essays. Over the next few weeks, I hope to revisit the essays here, preparing for the class Mark Billinghurst and I are teaching at CHI 2015 called "C20 The Glass Class: Designing Wearable Interfaces" (http://chi2015.acm.org/program/courses/).
I'm hoping y'all will help me iterate on and improve these essays by commenting here. For the full essay, see http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=4160599&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D4160599