Stream

Szymon Deja

Discussion  - 
 
WebCam Eye Tracking for usability testing
https://sourceforge.net/projects/gazerecorder/

Jeff Kang
moderator

Discussion  - 
 
New PyGaze website - open-source toolbox for eye tracking in Python - PyGaze acts as a wrapper around several existing packages - What it adds is a uniform and user-friendly syntax, as well as some gaze contingent functionality and custom online event detection

#eyetracking #python  

http://www.reddit.com/r/EyeTracking/comments/39ri7f/new_pygaze_website_opensource_toolbox_for_eye/
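For a feel of that uniform syntax, here is a minimal sketch (PyGaze reads its configuration, such as the tracker type and display backend, from constants.py, so treat this as an outline rather than a copy-paste script):

```python
import time

from pygaze.display import Display
from pygaze.screen import Screen
from pygaze.eyetracker import EyeTracker

disp = Display()                 # open a window via the configured backend
scr = Screen()
scr.draw_text(text="Look around!", fontsize=24)

tracker = EyeTracker(disp)       # wraps whichever tracker constants.py selects
tracker.calibrate()              # run the vendor's calibration routine
tracker.start_recording()

disp.fill(scr)
disp.show()

for _ in range(600):             # sample gaze for roughly ten seconds
    print(tracker.sample())      # latest (x, y) gaze position in pixels
    time.sleep(1 / 60.0)

tracker.stop_recording()
tracker.close()
disp.close()
```

The same script is meant to run against EyeLink, SMI, Tobii, or EyeTribe hardware by changing only the configured tracker type.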

Jeff Kang
moderator

Discussion  - 
The next iteration of Google Glass is already in the works, but not much information has surfaced thus far about what the device's hardware will be like. Google has given much of its focus and atte...

Jeff Kang
moderator

Discussion  - 
 
Assassin's Creed Rogue with Eye Tracking released today. Get your SteelSeries Sentry eye tracker now and receive a free coupon for #ACRogue download.
http://ow.ly/KahFs #EyeXImmersion #seethefuture #EyeX

Jeff Kang
moderator

Discussion  - 
 
>Automatic Calibration of Eye Tracking in Stereoscopic Virtual Environments

http://www.reddit.com/r/oculus/comments/2va81c/automatic_calibration_of_eye_tracking_in/

Point of Regard from Eye Velocity in Stereoscopic Virtual Environments Based on Intersections of Hypothesis Surfaces

ABSTRACT A new method is proposed for utilising scene information for stereo eye tracking in stereoscopic 3D virtual environments.
The approach aims to improve gaze tracking accuracy and reduce the required user engagement with eye tracking calibration procedures.
The approach derives absolute Point of Regard (POR) from the angular velocity of the eyes without user-engaged calibration of drift.
The method involves reduction of a hypothesis set for the 3D POR via a process of transformation during saccades and intersection with scene geometry during fixations.
A basic implementation of this concept has been demonstrated in simulation using the depth buffer of the scene and a particle representation for the hypothesis set.
Future research directions will focus on optimisation of the algorithm and improved utilisation of scene information.
The technique shows promise in improving gaze tracking techniques in general, including relative paradigms such as electrooculography.

Jake Fountain
Stephan K Chalup

#eyetracking #virtualreality  
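A toy sketch of the hypothesis-set idea, as I read the abstract (the function names, the particle representation as direction vectors, and the depth-buffer lookup are my own assumptions, not the authors' implementation):

```python
import numpy as np

def update_hypotheses(dirs, R, fixating, depth_along):
    """dirs: (N, 3) unit gaze-direction hypotheses in head coordinates.
    R: 3x3 eye rotation since the last update, integrated from angular
    velocity (available without absolute calibration).
    depth_along(d): depth-buffer distance along direction d, or np.inf
    where the ray leaves the scene."""
    dirs = dirs @ R.T  # saccade: every hypothesis rotates with the eye
    if fixating:
        # Fixation: the true POR must lie on visible scene geometry,
        # so prune hypotheses whose rays hit nothing.
        depths = np.array([depth_along(d) for d in dirs])
        keep = np.isfinite(depths)
        dirs = dirs[keep]
        por_candidates = dirs * depths[keep, None]  # 3D points on surfaces
        return dirs, por_candidates
    return dirs, None
```

Repeated over saccade-fixation cycles, the surviving hypotheses should converge toward the true gaze direction without an explicit calibration step.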

Jeff Kang
moderator

Discussion  - 
 
>Look into my eyes: Tracking your gaze could be the next big gaming input

PC Assassin's Creed Rogue will use eye tracking for an "infinite screen" effect.

#eyetracking #tobii #assassinscreed
http://arstechnica.com/gaming/2015/02/look-into-my-eyes-tracking-your-gaze-could-be-the-next-big-gaming-input/

Jeff Kang
moderator

Discussion  - 
 
Sentry Challenge 2014 (Tobii, SteelSeries, and Overwolf) winners

> Notable Mentions
>
> EyeControl
>
> By Idan “KillingFactory” Aharoni & Barel Azulai
>
> About the app: Eye Control allows users to create dynamic eye controlled onscreen buttons which you set up to do specific functions like screenshot taking or Teamspeak when you look at them.
>
> Think about playing Counter Strike and wanting to change songs, just look at the button on the screen and move on to the next one without skipping a beat.

https://www.youtube.com/watch?v=EzEsXgUOqQE

#eyetracking #tobii #steelseries

http://www.overwolf.com/sentry-challenge-2014/
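A dwell-activated button of the kind EyeControl describes can be sketched in a few lines; this is a hypothetical illustration (class name, threshold, and the gaze-sample source are all assumptions), not the app's actual code:

```python
import time

class GazeButton:
    """An on-screen region that fires an action after sustained gaze."""

    def __init__(self, rect, action, dwell_s=0.6):
        self.rect = rect          # (x, y, w, h) in screen pixels
        self.action = action      # e.g. take_screenshot, next_song
        self.dwell_s = dwell_s    # required gaze time before firing
        self._entered = None

    def update(self, gx, gy):
        """Feed each gaze sample (gx, gy) to every button, every frame."""
        x, y, w, h = self.rect
        if x <= gx <= x + w and y <= gy <= y + h:
            if self._entered is None:
                self._entered = time.monotonic()
            elif time.monotonic() - self._entered >= self.dwell_s:
                self.action()        # fire once, then require a re-dwell
                self._entered = None
        else:
            self._entered = None     # gaze left the button; reset the timer
```

The dwell threshold is the whole trick: long enough that a stray glance doesn't fire the action, short enough that "look and go" still feels instant.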
Ignacio Freiberg
Right, but I don't see it in action. :P

Jeff Kang
moderator

Discussion  - 
 
The Eye Tribe WebSocket server, written in C#. This server makes it possible to write your application in the browser, much like the Leap Motion SDK.

https://github.com/nulltask/the-eye-tribe-websocket-server

http://www.reddit.com/r/EyeTracking/comments/2sggco/the_eye_tribe_websocket_server_written_in_c_this/

#eyetracking #eyetribe #javascript #websockets   #chromeextension  
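A client for a server like this can be only a few lines in any language with a WebSocket library; here is a hypothetical Python sketch (the port, path, and frame format are assumptions; check the repo's README for the actual protocol):

```python
import asyncio
import json

import websockets  # pip install websockets

async def main():
    # Assumed endpoint; the repo defines the real host/port.
    async with websockets.connect("ws://localhost:8080/") as ws:
        async for message in ws:
            frame = json.loads(message)
            print(frame)  # e.g. a gaze frame with x/y screen coordinates

asyncio.run(main())
```

In the browser, the equivalent is the built-in WebSocket object, which is what makes the Leap-Motion-style "web app talks to a local device server" pattern work.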

About this community

The idea behind this community is to gather people who wish to be a part of, and/or contribute to, developing an Android app that uses a tablet's front camera to track where the user's eyes are looking, allowing them to move a cursor with their sight alone. The reason behind this idea is my father. He was diagnosed with ALS and could only move his eyes, so he could no longer communicate with us; this app could have given him a way to communicate again.

Jeff Kang
moderator

Discussion  - 
The world's first eye-tracking virtual reality headset secures investment from Samsung Ventures.

Jeff Kang
moderator

Discussion  - 
 
>Basic eye-gaze tracking systems often use a signal such as blinking the eyes to indicate this choice.
Pradipta's team experimented with several ways to solve the selection problem, including manipulating joystick axes, enlarging predicted targets, and using a spoken keyword such as 'fire' to indicate a target.


#eyetracking  
Mice, and now touchscreens, have become a daily part of our lives in the way we interact with computers. But what about people who lack the ability to use a mouse or touchscreen? Or situations where these would be impractical or outright dangerous?
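The common thread in those fixes is decoupling pointing (gaze) from selection (an explicit trigger), so the screen isn't "clicked" everywhere you look. A hypothetical sketch of that split (all names are mine; the keyword detector is stubbed out):

```python
def select_on_trigger(gaze, targets, triggered):
    """gaze: (x, y) sample.
    targets: list of ((x, y, w, h), action) pairs.
    triggered: True when the separate trigger fired, e.g. a speech
    recognizer heard the keyword 'fire' (detector not shown)."""
    gx, gy = gaze
    for (x, y, w, h), action in targets:
        if x <= gx <= x + w and y <= gy <= y + h:
            if triggered:
                action()  # selection is explicit; stray glances do nothing
            return
```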

Jeff Kang
moderator

Discussion  - 
 
The Eye Tribe Tracker for Windows v0.9.56 released - Automatically loads the default calibration profile

#eyetracking #eyetribe  
Grab it off the download page (login required). Changes as follows: EyeTribe Server - Introducing the default calibration profile. Automatically loads the default calibration profile at startup and saves/overwrites the default calibration profile after every successful calibration.

Jeff Kang
moderator

Discussion  - 
 
Eye Gaze Tracking Using an RGBD Camera (Kinect): A Comparison with an RGB Solution - Microsoft Research

#eyetracking #microsoft #rgbd #kinect #patent  

Eye tracking via depth camera - Microsoft patent (Kinect?)
https://patentscope.wipo.int/search/en/detail.jsf?docId=WO2014209816
Abstract. Most commercial eye gaze tracking systems are based on the use of infrared lights. However, such systems may not work outdoor or may have a very limited head box for them to work. This paper proposes a non-infrared based approach to track one's eye gaze with an RGBD camera (in our case ...

Jeff Kang
moderator

Discussion  - 
 
>"New mobile devices should be launching in 2015 with The Eye Tribe’s technology in them for the first time" -
“We definitely expect to have some of the first integrated devices this year,” says Johansen, gesturing at the Microsoft Surface tablet in front of him.
“We’re talking devices like these.”

#eyetracking #eyetribe  
New, cheaper eye-tracking technology will be in smartphones launching in 2015, potentially changing the way we interact with apps.

Jeff Kang
moderator

Discussion  - 
 
Congrats to the winners of Sentry Challenge 2014! Check out these amazing eye tracking apps! http://www.overwolf.com/sentry-challenge-2014/

Jeff Kang
moderator

Discussion  - 
 
Another eye-tracking HMD. They claim they have the "first eye tracking HMD", which is obviously false (both for Rift-like and traditional HMDs).

I am not convinced that this is going to be the next big thing for general VR either, but gaze tracking is useful for various things like usability testing, user attention tracking, etc.

Let's just hope it won't cost an arm and a leg (and one's firstborn), as most of these HMD eye tracking kits do.
FOVE, a VR headset prototype in the works by a Japan-based team, is quickly closing the experience gap between itself and the Oculus Rift. If they continue at this pace, they could catch up, and with a trick up their sleeve—eye-tracking. When I first took a look at FOVE's VR headset back in November, my…

Jeff Kang
moderator

Discussion  - 
 
OmniVision And SMI Bring High Performance Eye Tracking Technology To Wide Range Of Consumer Applications -
"As early as 2015, computing display solutions such as smart glasses, VR head-mounted displays, tablets, laptops, info terminals and more can affordably incorporate gaze interaction modalities

#eyetracking #smi #omnivision #oculusrift  

Jeff Kang
moderator

Discussion  - 
 
#eyetracking #tobii #steelseries #sentry
Eye-tracking won’t have the precision of a mouse for some time.
However, you can combine eye-tracking with other inputs.
 
Eye-tracking companies ship features that let mouse control and eye control work in conjunction.
E.g. eye-tracking is used to initially teleport your cursor near your target, and then you can use the mouse to precisely place the cursor.
 
*Mouse-cursor-teleport user setting: time that the mouse-controlled cursor must be at rest before eye control is involved again (mouse precision still in use)*
 
Tobii has a time setting that determines how quickly moving the mouse will teleport the cursor to the point of gaze: you set how long the mouse-controlled cursor has to sit still before eye control is involved again (where that can mean either that gaze controls the cursor directly again, or that the next movement of the mouse warps the cursor to the point of gaze).
It's for saying, "wait, I'm still using the mouse for stability and precision; the mouse-controlled cursor is still working in this area."
 
*Mouse-cursor-teleport user setting: point of gaze must be a certain distance from the mouse-controlled cursor before eye control is involved again (eye-tracking is activated for larger cursor jumps)*
 
Another setting decides how far the point of gaze has to be from the mouse-controlled cursor before gaze-teleporting kicks in.
It's for saying, "some of the targets are close enough that I can just use the mouse; I'll save eye teleporting for when the distance is large."
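Both settings are easy to picture as a small gate in front of the warp. A hypothetical sketch (names and defaults are mine, not Tobii's):

```python
import math
import time

class TeleportGate:
    def __init__(self, rest_s=0.4, min_dist_px=200):
        self.rest_s = rest_s            # mouse must have rested this long
        self.min_dist_px = min_dist_px  # gaze must be this far from cursor
        self._last_move = time.monotonic()

    def on_mouse_move(self, cursor, gaze):
        """Call on each mouse-move event; returns a warp target or None."""
        now = time.monotonic()
        rested = now - self._last_move >= self.rest_s
        far = math.dist(cursor, gaze) >= self.min_dist_px
        self._last_move = now
        if rested and far:
            return gaze   # teleport near the target, then refine by hand
        return None       # still mousing, or target is close: keep mouse precision
```

Raising rest_s biases the system toward mouse precision; raising min_dist_px reserves the warp for big cross-screen jumps.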
 
Eye-tracking + keyboard: eye-tracking doesn't have the precision of a mouse, but if an interface element and its hit state are large enough, a "click-where-I'm-looking" keyboard button will work.
 
Eye tracking + keyboard two-step process: there could be some eye tracking features that allow an eye-controlled cursor to snap, zoom, etc. onto a smaller target element, or that project smaller elements into larger ones.
Sometimes it's a two-step process, so even with the ability to instantly teleport the cursor, the "both-hands-on-keyboard + eye-tracking two-step process" may not be suitable in certain situations.
 
Eye tracking teleport + mouse and keyboard: However, whenever you need the mouse, eye-tracking will still be there to provide an initial cursor teleport.
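The "click where I'm looking" key itself is simple to prototype with a developer tracker. A hypothetical sketch using the pynput library (the gaze function is a stand-in for whatever your tracker's SDK provides, and the key choice is arbitrary):

```python
from pynput import keyboard, mouse  # pip install pynput

mouse_ctl = mouse.Controller()

def get_gaze():
    # Stand-in: replace with a real sample from your tracker's SDK.
    return (800, 450)

def gaze_click():
    mouse_ctl.position = get_gaze()        # teleport cursor to point of gaze
    mouse_ctl.click(mouse.Button.left, 1)  # click without leaving the keyboard

# F9 becomes the "click where I'm looking" key.
with keyboard.GlobalHotKeys({"<f9>": gaze_click}) as hotkeys:
    hotkeys.join()
```

GlobalHotKeys runs the callback from a background listener thread, so the script stays out of the way while you type.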
 
Without eye-tracking: If you have both hands on the keyboard, you lose time switching one hand to the mouse, and bringing the hand back to the keyboard.
You’re usually choosing between both hands on the keyboard, or one hand on the mouse.
 
With eye-tracking: it can be used either with both hands on the keyboard (click-what-I'm-looking-at keyboard button) or with one hand on the mouse (initial cursor teleport, then use the mouse).
You never have to forgo something to use eye-tracking; it’s always ready to make normal computer interaction faster.
 
*Eye-tracking can make on-screen buttons, and thus macros, more prevalent*
 
Eye-tracking can make macros more popular because eye-tracking allows for easier activation, and thus more use of custom widgets and on-screen buttons.
A collection of custom on-screen macro buttons with recognizable, self-documenting text labels is easier to maintain than a collection of Control + Alt + Shift + <whatever> keyboard shortcuts for activating macros.
I.e., Tasker and Tasker AutoInput plugin macros on mobile have a better chance of adoption than AutoHotkey or AutoIt shortcuts on the desktop.
 
*e.g. Control + <whatever> = action vs. visual shortcut: button labeled with action*
 
e.g. I remapped Control + F1 to launch a Google search on whatever is on the clipboard (AutoHotkey):
^F1::Run, https://www.google.com/search?hl=en&safe=off&q=%Clipboard%
 
With another script, Control + F1 could execute something completely different.
Within a single script, depending on the context, such as which program is currently running or which window is in focus, the meaning of Control + F1 could change again; it can get confusing.
 
It would be more intuitive to look at a virtual button that is actually labeled, "Google Search the Clipboard", and then tap my activation key.
 
*Touch (on a virtual element, or a physical keyboard) can be faster than the mouse - Jump VNC Android app*
 
Any time that you can use touch (on a virtual element, or a physical keyboard) instead of the mouse, you can work faster.
I sometimes use the Jump VNC Android app to use my Nexus 10 as the keyboard and remote into my desktop computer.
Therefore, I have some experience interacting with my desktop from a touch interface.
Interacting can be more difficult because the Nexus 10 screen is relatively small, and I'm using touch to indirectly operate a desktop that has a non-touch UI.
However, when you are able to touch, it’s faster than using a mouse.
 
Which leads to...
 
*Comfort and ergonomics*
 
I know of people who experimented with a touchscreen in a desktop environment.
The problem was that it was too unergonomic to keep reaching outwards, as the shoulders tire out fast.
 
I’ve seen plenty of pictures of environments with 3+ monitors, and I don't think that repeatedly lifting the arms to touch the screens would be comfortable over time.
 
Gorilla arm syndrome: "failure to understand the ergonomics of vertically mounted touchscreens for prolonged use.
By this proposition the human arm held in an unsupported horizontal position rapidly becomes fatigued and painful".
 
*Vertical touchscreen + “tap-where-I’m-looking” button*
 
If you have a vertically propped up tablet with an external keyboard, you could remap a keyboard button to be the “tap-where-I’m-looking” button, and avoid the need to keep reaching out to touch the screen.
 
*Already using your eyes*
 
Most people are already using their eyes in computer interaction anyway.
 
For example, before you move your mouse to select something, it is very likely that your gaze goes to the target first.
The same thing goes for touch user interfaces.
Most of the time, a person will see a widget that they want to touch before they actually reach out, and physically touch it.
 
Consumer-affordable eye-tracking, and a demonstration of how it could be used in games: aiming, or live-streaming that gives even more insight into pro players' actions.
2 comments
+Pavel S Tobii EyeX eye tracker.
It's a developer eye tracker, like the Eye Tribe eye tracker, but many people are just using them raw and straight out of the box.
It shouldn't be hard to add though.
It's just that the companies are currently working on the accuracy and stability of their eye trackers, while developers are working on their own software.
