Let your eyes do the talking.

Jeff Kang
moderator

Discussion  - 
 
Eye Tracking Pan/Tilt Wireless IP Camera

This experiment demonstrates eye tracking control of a low cost IP camera.
A Tobii EyeX controller mounted beneath the screen tracks subtle eye movements.

Custom software was developed to stream the video image from a Foscam pan/tilt wireless IP Camera.
The software translates EyeX behaviors to commands for controlling the camera's pan and tilt motors.
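The post doesn't include code, so here is a minimal sketch of one plausible edge-activated mapping from gaze position to pan/tilt actions, assuming a dead zone in the middle of the video window. The Foscam-style command codes at the bottom are an assumption modeled on older Foscam CGI firmware, not the author's actual implementation.

```python
def gaze_to_pan_tilt(x, y, width, height, margin=0.1):
    """Map a gaze point on the video window to a pan/tilt action.

    Returns 'up', 'down', 'left', 'right', or None when the gaze
    rests in the central dead zone (no camera movement).
    """
    if not (0 <= x <= width and 0 <= y <= height):
        return None  # gaze is off the window entirely
    if y < height * margin:
        return "up"
    if y > height * (1 - margin):
        return "down"
    if x < width * margin:
        return "left"
    if x > width * (1 - margin):
        return "right"
    return None

# Hypothetical mapping to Foscam-style CGI command codes; the exact
# codes depend on the camera's firmware and are an assumption here.
FOSCAM_CODES = {"up": 0, "down": 2, "left": 4, "right": 6}
```

On older Foscam firmware the resulting code would typically be sent as an HTTP request like `http://<camera>/decoder_control.cgi?command=<code>`, but check your camera's CGI documentation before relying on that.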

We see potential applications for this in perceptual robotics, telepresence, and assistive technologies.

#eyetracking #tobii #wirelesscamera #ipcamera #eyex

Jeff Kang
moderator

Discussion  - 
 
Go language bindings for The Eye Tribe

Golang support

by Zephyyrr » 03 Jul 2014, 07:12

Hey there, people!

I took the liberty to implement EyeTribe bindings for Go (http://golang.org).
You can find it at https://github.com/zephyyrr/thegotribe.

Calibration is not yet implemented (either submit a pull request or wait for me to need it).
Any bugs found are welcome as an issue.
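Bindings like this wrap the tracker's JSON-over-TCP protocol (the Eye Tribe server listens on localhost, by default on port 6555, and exchanges JSON messages). As a language-neutral illustration of what such bindings do under the hood, here is a sketch in Python of building a request and parsing a frame reply; the exact message shapes are taken from the Eye Tribe API as documented at the time, so treat them as assumptions.

```python
import json

TRACKER_HOST = "localhost"
TRACKER_PORT = 6555  # default port of The Eye Tribe server

def build_get_request(values):
    """Serialize a 'get' request for the tracker category."""
    return json.dumps({
        "category": "tracker",
        "request": "get",
        "values": list(values),
    })

def extract_gaze(reply_text):
    """Pull the smoothed (x, y) gaze coordinates out of a frame reply.

    Returns None if the reply carries no frame.
    """
    reply = json.loads(reply_text)
    frame = reply.get("values", {}).get("frame")
    if frame is None:
        return None
    avg = frame["avg"]
    return (avg["x"], avg["y"])
```

A real client would send `build_get_request(["frame"])` over a TCP socket to `TRACKER_HOST:TRACKER_PORT` and feed each reply line to `extract_gaze`.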

#golang #eyetracking #theeyetribe  
#go  

Jeff Kang
moderator

Discussion  - 
 
 
Found on Twitter: TheEyeTribe: We will be going to next week. Let us know if you'd like to meet and learn more about our innovative eye tracking tech http://ow.ly/2IqWTj

Jeff Kang
moderator

Discussion  - 
 
Text 2.0 framework (create eye tracking apps using HTML, CSS and JavaScript) now known as gaze.io

>Gaze.io is the fastest possible way to create stunning cross platform eye tracking apps, and it supports both interaction and analysis. It runs on Chrome, Firefox, Internet Explorer 10+ and Safari, on Windows, Mac OS and Linux, given the eye tracker runs on said platform and has an open API.

http://gaze.io/

#eyetracking #html   #javascript #css  
It's not too often we see a truly novel gaming accessory; generally speaking companies are happy to slap a few stickers or a new coat of paint on existing..
 
Mouse and Keyboard Cursor Warping to Accelerate and Reduce the Effort of Routine HCI Input Tasks

"Gaze tracking has traditionally been suggested as a possible alternative to traditional pointing mechanisms for computer input. However, the accuracy limitations of gaze estimation algorithms have constrained the potential of gaze tracking to become a reliable substitute for the mouse or the keyboard. Additionally, overloading a perceptual channel such as vision with a motor control task can be cumbersome and fatiguing. This work explores how to use gaze tracking to aid traditional cursor positioning methods with both the mouse and the keyboard during standard human computer interaction (HCI) tasks. The proposed approach consists of eliminating a large portion of the manual effort involved in cursor movement by warping the cursor to the estimated point of regard (PoR) of the user on the screen, as estimated by video-oculography gaze tracking. With the proposed approach, bringing the mouse cursor or the keyboard cursor to a target position on the screen still involves a manual task, but the effort involved is substantially reduced in terms of mouse movement amplitude or number of keystrokes performed. This is accomplished by the cursor warping from its original position on the screen to whatever position the user is looking at when a single keystroke or a slight mouse movement is detected. The user then adjusts the final fine-grained positioning of the cursor manually. Requiring the user to be looking at the target position to bring the cursor there requires only marginal adaptation on the part of the user, since most of the time that is the default behavior during standard HCI. This work has carried out an extensive user study on the effects of cursor warping in common computer tasks involving cursor repositioning.
The results show how cursor warping using gaze tracking information can speed up and reduce the physical effort required to complete several common computer tasks: mouse/trackpad target acquisition, text cursor positioning, mouse/trackpad/keyboard based text selection, and drag and drop operations. The effects of gaze tracking and cursor warping on some of these tasks have never been studied before. The results show unequivocally that cursor warping using gaze tracking data can significantly speed up and reduce the manual effort involved in HCI for most, but not all, of the previously listed tasks."
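The core decision the paper describes (warp on a trigger event only when the point of regard is far from the cursor, then let the user finish manually) can be sketched as a small pure function. This is an illustration of the idea, not the authors' implementation; the 80-pixel threshold is an invented example value.

```python
def warp_target(cursor, gaze, trigger, min_distance=80.0):
    """Decide where the cursor should be after an input event.

    cursor, gaze: (x, y) positions in pixels.
    trigger: True when a keystroke or slight mouse movement was detected.
    The cursor warps to the estimated point of regard only when the gaze
    is far enough away that warping saves manual effort; small offsets
    are left to ordinary fine positioning by the user.
    """
    if not trigger:
        return cursor
    dx = gaze[0] - cursor[0]
    dy = gaze[1] - cursor[1]
    if (dx * dx + dy * dy) ** 0.5 < min_distance:
        return cursor  # target is close: cheaper to move manually
    return gaze
```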

#eyetracking #researchpaper  
 
TETBeams is an experimental target-shooting game that uses an eye-controlled interface.

Read more at GameDes.in: http://gamedes.in/
Full source on GitHub: https://github.com/johnmquick/tetbeams

http://www.reddit.com/r/EyeTracking/comments/249grb/tetbeams_is_a_prototype_video_game_that/

#eyetracking  #theeyetribe  #tobii  #opensource  

Jeff Kang
moderator

Discussion  - 
 
#eyetracking
PDFs and webpages automatically scroll when the point of gaze is at the top or bottom edge of the window.
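A minimal sketch of the edge-triggered auto-scroll logic described here, assuming top and bottom scroll bands as fractions of the window height (the band size and step are invented example values):

```python
def scroll_step(gaze_y, window_top, window_height, band=0.15, step=40):
    """Return a scroll delta in pixels for the current gaze height.

    Positive scrolls down, negative scrolls up, 0 means no scrolling.
    The top and bottom `band` fraction of the window act as scroll zones.
    """
    rel = (gaze_y - window_top) / window_height
    if rel < 0 or rel > 1:
        return 0  # gaze is outside the window
    if rel < band:
        return -step
    if rel > 1 - band:
        return step
    return 0
```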
 
*Gazespeaker – design your own grids that have cells that launch actions, predictive keyboard, automatic scrolling, shareable grids, desktop or tablet*

http://www.gazespeaker.org/  
Gazespeaker is another open source accessibility program for controlling the computer with an eye tracker.
 
Some of the functionalities that are listed on its website include:
 
    “display communication grids
    integrated grid visual editor
    write a text with an auto-adaptative predictive keyboard
    read web pages on the internet
    read ebooks (in html format)
    read scanned books or comic strips”

I found the creator’s blog; a post from last year has screenshots and info about the program when it was a work in progress. The creator mentions that the ITU Gazetracker was used to test the program as it was being created.
 
*Work from within the program with built-in interfaces for eye-tracking e.g. custom email interface, and web browser*
 
It feels similar to GazeTalk, as the user works more within the program. For example, you can enter a POP/SMTP server, and pull the data from an email service (?). The program then provides a user with an email interface that works with eye-tracking (i.e. large buttons).  Gazespeaker can also pull websites into its own built-in web-viewer. The browser is compatible with eye tracking, where scrolling down automatically occurs when a user’s gaze is at the bottom of a window.
 
Similarly, Gazetalk has its own email, web, and media viewer.
 
By contrast, programs like bkb, PCEye, and GazeMouse try to assist the user in working with outside interfaces. That is, they have features like magnification to deal with the more standard-sized elements.
 
*Customized grids with customized cells made with a visual editor (cells launch AutoHotkey, AutoKey, Autoit, PYAHK, Sikuli etc.?)*
 
One awesome feature of the program is the ability to design your own grids and cells with a visual editor. Grids hold cells. The software lets you define the dimensions of grids and cells, label and decorate cells, and you can have cells launch some of the predefined actions that the program provides.
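The grid/cell model lends itself to simple hit-testing: given the grid geometry and a gaze point, find the cell being looked at and fire its action. This is a generic sketch of that idea, not Gazespeaker's code; the action names in the table are hypothetical.

```python
def cell_at(gaze, origin, cell_w, cell_h, rows, cols):
    """Return the (row, col) of the grid cell under the gaze point,
    or None when the gaze falls outside the grid."""
    gx = int((gaze[0] - origin[0]) // cell_w)
    gy = int((gaze[1] - origin[1]) // cell_h)
    if 0 <= gy < rows and 0 <= gx < cols:
        return (gy, gx)
    return None

# Hypothetical action table: each cell is bound to a predefined action.
actions = {(0, 0): "speak", (0, 1): "keyboard", (1, 0): "web"}
```

A gaze-driven loop would call `cell_at` on each sample and, on activation (dwell or a hotkey), look the cell up in `actions`.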
 
(Perhaps in the future, the program could work with other programs like AutoHotkey, Autoit, AutoKey, PYAHK, Sikuli etc.. In addition to launching predefined actions, the cells could launch more customized actions that can be built with the scripting programs).
 
(AutoIt has a macro recorder called Au3Record)
 
(Sikuli uses the Tesseract optical character recognition program – instead of writing keystrokes to access the interface elements that could be involved in macros, you just take screenshots of the interface elements, e.g. click <screenshot of interface element>. (Picture of the in-line screenshots that are used in Sikuli scripting: i.imgur.com/2dqGSPr.png).)
 
*Sharing custom grids and cells – visual grids are easier to share (as opposed to sharing something like an AutoHotkey or Autoit text file)*
 
On the website, it mentions that grids are stored in a standard XML file, and can be shared with other people.
 
I have some AutoHotkey scripts, and macros are launched by inputting keystrokes. I wouldn’t bother to try sharing some of my text files, as I doubt anyone’s going to take the time to memorize the very personalized set of keystrokes.
 
Gazespeaker cells can have customized labeling, unlike physical keyboard buttons. With an eye tracker, on-screen buttons like the cells are just as fast to activate. Look at the on-screen button, and then press a “click-where-I’m-looking-at” keyboard button.
 
E.g. instead of memorizing and pressing something like Control + F6 to launch a favorite command, you could take a cell, stick an easily recognizable text label on it, and then activate the cell.

http://www.reddit.com/r/EyeTracking/comments/25f3dl/gazespeaker_design_your_own_grids_that_have_cells/

#eyetracking #assistivetechnology #opensource #aac #autism #spinalcordinjury #spinalcord #rettsyndrome #als #lougehrigsdisease #stroke #cerebralpalsy   #theeyetribe #tobii #disability #accessibility  
 
Thanks for that Jeff. 

About this community

The idea for creating this community is to gather people who wish to take part in and/or contribute to developing an Android app that uses a tablet's front camera to capture where the user's eyes are looking, allowing him to move a cursor around with only his sight. The reason behind this idea is my father. He was diagnosed with ALS and could only move his eyes, so he couldn't communicate with us anymore; this could have provided him with a way to communicate again.
 
Eye-tracking wheelchair helps the severely disabled steer new course - "move around simply by looking to where they wish to travel" - also for robots and drones


Eye-tracking wheelchair helps the severely disabled steer new course

Wednesday, Jul 02, 2014 - 02:26
 
Scientists in London have developed an algorithm-based decoder system that enables wheelchair users to move around simply by looking to where they wish to travel.
The researchers at Imperial College London say the system is inexpensive and easy to use and could transform the lives of people who are unable to use their limbs.
Jim Drury has more.
 
▲ Hide Transcript
 
Algorithms working with inexpensive software could help quadriplegics steer wheelchairs simply by looking in their desired direction of travel.
An Imperial College London team says their newly devised system can read eye movements to tell if a person is merely gazing or wants to move.
Co-designer and student Kirubin Pillay says it's simple to use.
SOUNDBITE (English) KIRUBIN PILLAY, STUDENT, IMPERIAL COLLEGE LONDON, SAYING: "At the moment I'm just moving forward by looking to the floor, but exactly at points on the floor that I would like to go to, and the wheelchair is responding. So if I look right, slightly towards Will, I'll move there, and if I look left as well I'll move there as well, and it just responds to my gaze and my desired location that I would like to go to."
Visual information detected by cameras trained on both eyes is analysed by algorithms within 10 milliseconds and translated into instructions for movement that is almost instantaneous, says researcher William Abbott.
SOUNDBITE (English) WILLIAM ABBOTT, RESEARCHER, IMPERIAL COLLEGE LONDON, SAYING: "We actually move our eyes upwards of three times a second, so there's huge information there. So essentially we track the pupil of the eye and, via a calibration process, we relate that to where the subject's looking in the world around them."
Multiple sclerosis or spinal cord injury patients with severe paralysis are usually able to move their eyes because the eyes are directly connected to the brain.
For now, the team is keeping details of its decoding technology secret, but Pillay says it's an improvement on existing eye tracking systems.
SOUNDBITE (English) KIRUBIN PILLAY, PHD STUDENT, IMPERIAL COLLEGE LONDON, SAYING: "Current tracking software often uses a screen-based system where you have a screen open and you look at locations on the screen. The problem with that is that it's very simplistic, and it also diverts the user's attention from the outside world, and therefore there's more risk of not noticing obstacles or other things in the way."
While the technology has been designed for the disabled, team leader Dr Aldo Faisal, from Imperial's Brain and Behaviour Lab, says it has much wider application.
SOUNDBITE (English) DR ALDO FAISAL, PROJECT LEADER, IMPERIAL COLLEGE LONDON, SAYING: "You could use it maybe one day to drive your car, you could use it to operate a robot, you may be able to use it to fly planes or drones or spaceships with this type of technology."
Tests on able-bodied volunteers found they steered through crowded buildings faster and with fewer mistakes than when using other eye tracking technologies.
Trials on disabled patients are about to start, and the team hopes its system could be commercially available within three years.
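The Imperial team keeps its decoder secret, so the following is purely an illustrative sketch of the general idea the transcript describes: distinguish merely gazing from intending to move, then map gaze direction to a coarse motion command. It is not their method; the 15-degree threshold and command names are invented for the example.

```python
def steer(gaze_azimuth_deg, fixating):
    """Translate a gaze direction into a coarse steering command.

    gaze_azimuth_deg: horizontal gaze angle; 0 = straight ahead,
    negative = left, positive = right.
    fixating: True when the gaze has been stable long enough to count
    as an intention to move rather than casual looking around.
    """
    if not fixating:
        return "stop"  # merely gazing: do not move the chair
    if gaze_azimuth_deg < -15:
        return "left"
    if gaze_azimuth_deg > 15:
        return "right"
    return "forward"
```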

#als #eyetracking #lougehrigsdisease #assistivetechnology #autism #spinalcordinjury #spinalcord #rettsyndrome #stroke #cerebralpalsy #disability #accessibility  
#multiplesclerosis  

Jeff Kang
moderator

Discussion  - 
 
*BBC video interview: Gal Sont, a programmer with ALS, creates Click2Speak, an on-screen keyboard that is powered by Swiftkey*

Gal Sont is a programmer who was diagnosed with ALS in 2009.
He created Click2Speak, an on-screen keyboard that is powered by Swiftkey.
 
Features:
 
    Works with all standard Microsoft Windows applications.
    Includes Swiftkey’s powerful features like the award-winning prediction engine, and 'Flow'.
    Supports more than 60 languages.
    Floats over other applications.
    Includes advanced visual and audio features.
    Auto-spacing and auto-capitalization.
    Choose between different layouts and sizing options.
    Contains Dwell feature that allows you to imitate a mouse click by hovering.
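The Dwell feature in the list above (imitating a mouse click by hovering) boils down to detecting that the gaze has stayed near one point long enough. Here is a generic sketch of that detection over timestamped gaze samples, not Click2Speak's code; the radius and dwell time are invented example values.

```python
def dwell_click(samples, radius=30.0, dwell_ms=800):
    """Detect a dwell click from timestamped gaze samples.

    samples: list of (t_ms, x, y), oldest first. Returns True when the
    gaze has stayed within `radius` pixels of the newest sample for at
    least `dwell_ms` milliseconds.
    """
    if not samples:
        return False
    t_now, cx, cy = samples[-1]
    for t, x, y in reversed(samples):
        if t_now - t >= dwell_ms:
            return True   # stable fixation long enough: click
        if ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 > radius:
            return False  # gaze moved away within the dwell window
    return False          # not enough history yet
```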
 
"After being diagnosed with the disease, I contacted other individuals who suffer from ALS at different stages, and began to learn about the different challenges that I would face as my disease progressed.
I also learned about the tech solutions they used to cope with these challenges.
The most basic challenge was typing, which is done using a virtual on-screen keyboard, a common solution shared not only by individuals affected by ALS, but by victims of a variety of illnesses such as brain trauma, MS and spinal cord injuries.
The fully featured, advanced on-screen keyboards again proved relatively expensive (starting at $250), so I decided to develop the ultimate on-screen keyboard on my own.
Through the development process, my own physical condition continued to deteriorate and I reached the point of needing to use these cameras and on screen keyboards myself.
I started with Microsoft’s 'ease of access’ keyboard that comes with Windows.
This is an acceptable keyboard and it has a reasonable prediction engine.
 
For my own development needs I purchased the developer version of TOBII’s eye gaze camera.
This allowed me to code (with my eyes!) additional important features that were lacking in the Microsoft keyboard for eye control such as highlighted keys, virtual keys, auto scroll, right click, drag and much more.
 
It quickly became apparent that using our 'powered by Swiftkey’ keyboard enabled me to work faster and more accurately.
Friends who used other solutions prior to ours (not necessarily Microsoft’s) were delighted with the results, albeit a small sample size.
 
This started a new journey that introduced me to Swiftkey’s revolutionary technologies and how we customize them to our specific needs.
I reached a first version of our keyboard and distributed it to friends who also suffer from ALS.
They gave us invaluable feedback through the development process, and they all raved about its time saving capabilities and accuracy and how it makes their lives a little easier.
Even Swiftkey’s 'Flow’ feature is translated successfully to this environment; basically, it replaces the finger when using Swiftkey on an Android device with an eye/head/leg when using a PC/Tablet/laptop + camera/other input device + our Swiftkey powered keyboard installed.
 
At this point I had my good friend Dan join me in this endeavor as I needed help with detail design, quality assurance, market research, project management, and many other tasks.
We formed 'Click2Speak’, and we plan to make the world a better place! ...”.
 
http://www.click2speak.net/

http://www.reddit.com/r/EyeTracking/comments/29vprj/bbc_video_interview_gal_sont_a_programmer_with/

#als #eyetracking #lougehrigsdisease #swiftkey #tobii #assistivetechnology #aac #autism #spinalcordinjury #spinalcord #rettsyndrome #stroke #cerebralpalsy #tobii #disability #accessibility  
 
Homebrew Oculus Rift Eye Tracker
I recently purchased two commercial eye trackers: the Tobii EyeX and the Eye Tribe tracker, both $99. I'm excited about the possibilities for eye tracking as an input method and for telepresence, but the reality of these trackers is disappointing. The accur...

Jeff Kang
moderator

Discussion  - 
 
2nd batch of The Eye Tribe Tracker has now started shipping to developers around the world. Some have already received the shipping confirmation email, while others will receive this as the package leaves the warehouse during the next couple of weeks. Please be patient... We are doing everything we can to get The Eye Tribe Tracker to the world as fast as possible.

Jeff Kang
moderator

Discussion  - 
 
Tobii EyeX and Gazespeaker installation

"This video presents the installation of the Tobii EyeX eye tracker, the installation of the Gazespeaker program, and the first steps with the program. Gazespeaker is designed for eye tracking and for people with disabilities. You can speak with only your eyes! Moreover, Gazespeaker is free."

http://www.gazespeaker.org/

#eyetracking #assistivetechnology #opensource #aac #autism #spinalcordinjury #spinalcord #rettsyndrome #als #lougehrigsdisease #stroke #cerebralpalsy #tobii #theeyetribe #disability #accessibility  

Jeff Kang
moderator

Discussion  - 
 
Today we are happy to announce the release of the first EyeX SDK for .NET!
Available on the Dev Zone downloads page. http://ow.ly/xbB4f
 
Google Glass typing concepts with Minuum Keyboard: eye-typing at 0:45, typing on forearms (computer-vision gesture recognition) at 0:49.

#googleglass  #minuumkeyboard  #computervision  #eyetracking #augmentedreality  

Jeff Kang
moderator

Discussion  - 
 
Open source, eye-tracking, predictive-typing software program by Team Gleason takes 1st place in the Washington State University EECS Senior Design Poster Contest – Uses: Android, Windows-8, The Eye Tribe, The Pupil

Team Gleason takes 1st place in the EECS Senior Design Poster Contest
 
"Fifteen competing senior design teams from EECS displayed their posters in the halls of the department on April 24th. The judging was administered by five industry representatives specializing in areas such as: microelectronics, power systems, electrical engineering and software development. The winning team, Team Gleason, was chosen based on their poster, their project as a whole, and their presentation.
 
Team Gleason has been developing a reliable predictive-typing software program which runs on a generic Android or Windows-8 tablet; and uses two hardware platforms for eye tracking: The Eye Tribe and The Pupil."
 
http://school.eecs.wsu.edu/story_team_gleason_wins_senior_design

About WSU Team Gleason
 
"Former WSU football star and New Orleans Saints special teams cult hero Steve Gleason has ALS, a debilitating and cruel disease that strips its victims of the ability to control their muscles and kills them in 2-5 years. However, until the very end patients can control their eyes and eyelids. This is the primary way they communicate – by moving their eyes whereby a tablet computer tracks their eyes – after their voice goes.
 
The equipment for this is crude, e.g. it does not do much predictive typing like any smart phone does when it guesses the rest of your word. It is also very expensive, $4K to start with a barebones system and often way more (Steve’s system costs $20K). Even $4K is far beyond what Medicare will cover and what many victims of ALS can afford to pay themselves, so sadly they simply can’t communicate effectively once their voice is gone. Tragically, the patients’ minds are still perfectly good; they are basically trapped incommunicado.
 
WSU’s “World Class, Face to Face” students and faculty will develop inexpensive technology and release it under open source license with no royalties in order to disrupt this unacceptable status quo for ALS patients."

http://teamgleason.eecs.wsu.edu/

Washington State Magazine - Predictive software helps communication
 
"“I can crank out about 20 words per minute,” Gleason wrote in SportsIllustrated.com. “For 4,500 words, that’s almost four hours to finish this column.” This slow typing rate makes it difficult for ALS patients to actively participate in conversations even with the text-to-speech software.
 
As part of their senior design project, the students are combating that issue by programming eye-tracking software that is predictive. Like a smartphone’s auto-complete function, it anticipates a word or phrase based on a couple of letters. Currently, the students are putting the software on PUPIL, a 3-D printed set of glasses that connects to a computer to translate eye movement into computer action. The program will be open source with no royalties, making it freely available to the public."
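The prediction idea described above (anticipating a word from a couple of typed letters, like a smartphone's auto-complete) can be sketched as a simple frequency-ranked prefix lookup. This is an illustration of the concept, not the students' software; the word-frequency table is a made-up stand-in for a real language model.

```python
def predict(prefix, frequencies, k=3):
    """Suggest up to k completions for a typed prefix, most frequent first.

    frequencies: dict mapping word -> usage count, standing in for a
    real language model; ties are broken alphabetically.
    """
    matches = [w for w in frequencies if w.startswith(prefix.lower())]
    matches.sort(key=lambda w: (-frequencies[w], w))
    return matches[:k]

# Hypothetical frequency table for the example.
freqs = {"the": 500, "there": 200, "they": 120, "them": 90, "hello": 40}
```

A real system would update the counts from the user's own typing so the suggestions adapt over time.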

http://wsm.wsu.edu/s/index.php?id=1097

http://www.reddit.com/r/EyeTracking/comments/24num6/open_source_eyetracking_predictivetyping_software/

#als #eyetracking #teamgleason #theeyetribe #pupil #windows8 #android #accessibility   #a11y #AAC
 
Oculus rift hacked to include eye tracking!

#oculusrift #vr  