#eyetracking #tobii #steelseries #sentry
Eye-tracking won’t have the precision of a mouse for some time.
However, you can combine eye-tracking with other inputs.
Eye-tracking companies offer features for using mouse control and eye control in conjunction.
E.g. eye-tracking is used to initially teleport your cursor near your target, and then you use the mouse to precisely place it. *Mouse-cursor-teleport user setting: time that the mouse-controlled cursor must be at rest before eye control is involved again (mouse precision still in use)*
Tobii has a time setting that determines how quickly a teleport-to-point-of-gaze will occur when you move the mouse: you set how long the mouse-controlled cursor has to sit still before eye control is involved again. The return of eye control can mean either that gaze controls the cursor again, or that the next movement of the mouse will warp/teleport the cursor to the point-of-gaze.
It’s for, “wait, I’m still using the mouse for stability and precision.
The mouse-controlled cursor is still working in this area”. *Mouse-cursor-teleport user setting: the point-of-gaze must be a certain distance from the mouse-controlled cursor before eye control is involved again (eye-tracking is activated for larger cursor jumps)*
Another setting determines how far from the mouse-controlled cursor the point-of-gaze has to be before gaze teleporting is involved.
It’s for, “some of the targets are close enough, so I can just use the mouse.
I’ll save eye teleporting for when the distance is large”.
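The two settings above can be sketched as a small decision function. This is a hypothetical sketch; the names, defaults, and units are my assumptions, not Tobii’s actual API:

```python
class GazeTeleport:
    """Decide when the cursor should warp to the point-of-gaze.

    Hypothetical sketch of the two user settings described above:
    teleport only when (a) the mouse has been at rest long enough, and
    (b) the gaze is far enough away from the current cursor position.
    """

    def __init__(self, rest_seconds=0.5, min_distance_px=200):
        self.rest_seconds = rest_seconds        # mouse must sit still this long
        self.min_distance_px = min_distance_px  # gaze must be this far from cursor
        self.last_mouse_move = 0.0              # timestamp of last mouse movement

    def on_mouse_move(self, now):
        self.last_mouse_move = now

    def should_teleport(self, cursor, gaze, now):
        resting = (now - self.last_mouse_move) >= self.rest_seconds
        dx, dy = gaze[0] - cursor[0], gaze[1] - cursor[1]
        far_enough = (dx * dx + dy * dy) ** 0.5 >= self.min_distance_px
        return resting and far_enough
```

The rest time keeps the mouse in control while it is actively in use; the distance threshold leaves nearby targets to the mouse and reserves eye teleporting for the larger jumps.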
Eye-tracking + keyboard: Eye-tracking doesn’t have the precision of a mouse, but if an interface element and its hit state are large enough, a “click-where-I’m-looking” keyboard button will work.
Eye-tracking + keyboard two-step process: there could be some eye-tracking features that allow an eye-controlled cursor to snap, zoom, etc. onto a smaller target element, or that project smaller elements into larger ones.
Sometimes it’s a two-step process, so even if you have the ability to instantly teleport the cursor, “both-hands-on-keyboard + eye-tracking two-step process” may not be suitable in certain situations.
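One way such a snap feature could work (a sketch under assumed screen coordinates and a hypothetical target list, not any vendor’s actual implementation): pick the nearest interactive element within a snapping radius, otherwise fall back to the raw gaze point.

```python
def snap_to_target(gaze, target_centers, max_snap_px=80):
    """Snap an imprecise gaze point to the nearest clickable element,
    so a "click-where-I'm-looking" key can still hit small targets.
    Falls back to the raw gaze point if nothing is within range."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    best = min(target_centers, key=lambda t: dist(gaze, t), default=None)
    if best is not None and dist(gaze, best) <= max_snap_px:
        return best
    return gaze
```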
Eye tracking teleport + mouse and keyboard: However, whenever you need the mouse, eye-tracking will still be there to provide an initial cursor teleport.
Without eye-tracking: If you have both hands on the keyboard, you lose time switching one hand to the mouse, and bringing the hand back to the keyboard.
You’re usually choosing between both hands on the keyboard, or one hand on the mouse.
With eye-tracking: Eye-tracking can be used either with both hands on the keyboard (a click-what-I’m-looking-at keyboard button) or with one hand on the mouse (an initial cursor teleport, then mouse precision).
You never have to forgo something to use eye-tracking; it’s always ready to make normal computer interaction faster. *Eye-tracking can make on-screen buttons, and thus macros, more prevalent*
Eye-tracking can make macros more popular: it allows for easier activation, and thus more use, of custom widgets and on-screen buttons.
A collection of custom on-screen macro buttons with recognizable, self-documenting text labels is easier to maintain than a collection of Control + Alt + Shift + <whatever> keyboard shortcuts for activating macros.
i.e. Tasker and its AutoInput plugin macros on mobile have a better chance of adoption than AutoHotkey or AutoIt shortcuts on the desktop. *e.g. Control + <whatever> = action vs. visual shortcut: a button labeled with the action*
e.g. I remapped Control + F1 to launch a Google search on whatever is on the clipboard (AutoHotkey):
^F1::Run https://www.google.com/search?hl=en&safe=off&q=%Clipboard%
With another script, Control + F1 could execute something completely different. Within a single script, the use of Control + F1 could change yet again depending on context, such as which program is currently running or which window has focus; it can get confusing.
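The contrast can be sketched in a few lines (hypothetical labels and actions, in Python rather than AutoHotkey for brevity): an on-screen button registry maps a self-documenting label straight to its action, so there is nothing to memorize.

```python
# Hypothetical macro registry: the visible label IS the documentation.
macros = {
    "Google Search the Clipboard":
        lambda clipboard: "https://www.google.com/search?q=" + clipboard,
    "Uppercase the Clipboard":
        lambda clipboard: clipboard.upper(),
}

def activate(label, clipboard=""):
    """Run the macro behind an on-screen button by its visible label."""
    return macros[label](clipboard)
```

Unlike `^F1`, the label survives context switches: the button means the same thing regardless of which script or window is active.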
It would be more intuitive to look at a virtual button that is actually labeled, "Google Search the Clipboard", and then tap my activation key. *Touch (on a virtual element, or a physical keyboard) can be faster than the mouse - Jump VNC Android app*
Any time that you can use touch (on a virtual element, or a physical keyboard) instead of the mouse, you can work faster.
I sometimes use the Jump VNC Android app to use my Nexus 10 as the keyboard and remote into my desktop computer. So I have some experience interacting with my desktop from a touch interface.
Interacting can be more difficult because the Nexus 10 screen is relatively small, and I’m using touch to indirectly control a desktop that has a non-touch UI.
However, when you are able to touch, it’s faster than using a mouse.
Which leads to… *Comfort and ergonomics*
I know of people that experimented with a touchscreen in a desktop environment.
The problem was that it was too unergonomic to keep reaching outward; the shoulders tire out fast.
I’ve seen plenty of pictures of environments with 3+ monitors, and I don't think that repeatedly lifting the arms to touch the screens would be comfortable over time.
Gorilla arm syndrome: "failure to understand the ergonomics of vertically mounted touchscreens for prolonged use.
By this proposition the human arm held in an unsupported horizontal position rapidly becomes fatigued and painful". *Vertical touchscreen + “tap-where-I’m-looking” button*
If you have a vertically propped up tablet with an external keyboard, you could remap a keyboard button to be the “tap-where-I’m-looking” button, and avoid the need to keep reaching out to touch the screen. *Already using your eyes*
Most people are already using their eyes in computer interaction anyway.
For example, before you move your mouse to select something, it is very likely that your gaze goes to the target first.
The same thing goes for touch user interfaces.
Most of the time, a person will see a widget that they want to touch before they actually reach out and physically touch it.