On Friday 26 July, Barney Hawes from communications technology company Sensory Software joined us to talk about their latest experiments using eye tracking as a control mechanism, allowing you to interact with digital and physical media just by looking at it.

For people with physical disabilities, getting access to a computer can open up a world of opportunity. Advances in eye gaze technology are allowing people to talk, helping very young children to interact with the world, and enabling adults to control their home environment.

Look to Learn

Barney started by introducing us to their new tool, Look to Learn. Look to Learn is a software package of 40 activities designed for people starting out with eye gaze technology. The activities have been specially created to provide a fun way to improve access and choice-making skills. Each activity develops a different skill, ranging from early cause and effect through to accurate eye gaze control.

Barney explained that the 40 Look to Learn activities have been split into five areas of learning and development:

    Sensory - Designed to teach cause and effect
    Explore - Encourages the user to engage with the whole screen
    Target - Helps improve accuracy of eye gaze access
    Choose - Develops choice-making skills
    Control - Fine-tunes eye gaze access

He then set up Look to Learn on his computer and invited us to demo the software. You can watch a trailer of it in action here. Barney explained that the software has been developed in conjunction with teachers and therapists to provide a tool for assessment and for developing eye tracking skills. REACT producer Matt Davenport had a go at ‘fart clouds’, wherein staring at a particular cloud caused it to animate and give off a sound that... well, you can imagine.

The assessment criteria are:

    Eye tracking – where on the screen is the user looking? Are they engaged with appropriate content?
    Targeting – can the user find targets and hold their gaze on them?

He explained that the built-in analysis tool shows where somebody has looked on the screen during an activity in the form of a heat map. Heat maps can be saved, printed off and used to measure progress and record successes. The games can serve as a progression step, developing the eye tracking control needed for more advanced software such as The Grid 2.
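
As a rough illustration of how a gaze heat map can be produced (a minimal sketch only, not Sensory Software's actual implementation), raw gaze samples recorded as (x, y) screen coordinates can be binned into a 2D histogram and rendered as an image. The function name and synthetic data below are hypothetical:

    import numpy as np
    import matplotlib.pyplot as plt

    def gaze_heatmap(samples, screen_w=1920, screen_h=1080, bin_size=40):
        # Bin raw (x, y) gaze samples into a coarse grid; each cell
        # counts how often the gaze fell in that region of the screen.
        xs, ys = zip(*samples)
        heat, _, _ = np.histogram2d(
            ys, xs,
            bins=[screen_h // bin_size, screen_w // bin_size],
            range=[[0, screen_h], [0, screen_w]],
        )
        return heat

    # Synthetic data: 1000 noisy gaze samples clustered near screen centre.
    rng = np.random.default_rng(0)
    samples = rng.normal(loc=(960.0, 540.0), scale=120.0, size=(1000, 2))
    plt.imshow(gaze_heatmap(samples), cmap="hot", interpolation="bilinear")
    plt.title("Gaze heat map (synthetic data)")
    plt.savefig("heatmap.png")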

The Grid 2

Barney explained that The Grid 2 is software that allows people with limited or unclear speech to use a computer as a voice output communication aid, using symbols or text to build sentences. You can also access your Windows desktop and other programs with the built-in Computer Control features, using your gaze as a mouse cursor. He told us about one client, paralysed from the neck down, who used The Grid 2 software to fix his son’s computer. He also used it to send and receive email and SMS messages, browse the web and listen to music.
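
To make ‘gaze as a mouse cursor’ concrete, here is a minimal sketch of one common pattern in eye gaze interfaces: smooth the noisy gaze stream, then fire a click when the gaze dwells within a small radius for long enough. This is an assumed illustration of how such control can work, not The Grid 2's actual Computer Control implementation, and the class and parameter names are hypothetical:

    import math
    import time

    class DwellClicker:
        def __init__(self, dwell_time=1.0, radius=30.0, smoothing=0.2):
            self.dwell_time = dwell_time  # seconds the gaze must hold still
            self.radius = radius          # pixels of jitter tolerated
            self.smoothing = smoothing    # exponential smoothing factor
            self.cursor = None            # smoothed cursor position
            self.anchor = None            # where the current dwell started
            self.anchor_time = 0.0

        def update(self, gx, gy, now=None):
            # Feed one raw gaze sample; returns (x, y) when a click fires.
            now = time.monotonic() if now is None else now
            if self.cursor is None:
                self.cursor = (gx, gy)
            else:
                cx, cy = self.cursor
                a = self.smoothing
                self.cursor = (cx + a * (gx - cx), cy + a * (gy - cy))
            if self.anchor is None or math.dist(self.cursor, self.anchor) > self.radius:
                # Gaze moved: restart the dwell timer at the new position.
                self.anchor, self.anchor_time = self.cursor, now
                return None
            if now - self.anchor_time >= self.dwell_time:
                self.anchor, self.anchor_time = self.cursor, now
                return self.cursor  # dwell completed: click here
            return None

Each incoming tracker sample nudges the smoothed cursor, and a click is issued only when the user deliberately holds their gaze, which is what keeps unintended selections (the ‘Midas touch’ problem) in check.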

Barney explained that the software just needs a PC running Windows XP, Vista, Windows 7 or Windows 8. Its benefit is its range of inputs, from eye gaze, switches and head pointers to touchscreens and mice, meaning it is completely adaptable to the ability of the user.

Barney then spoke about Tony Nicklinson, who suffered from locked-in syndrome after a brain stem stroke in 2005. Tony used The Grid 2 software linked to an eye tracker to argue in court for his right to end his own life. Although he lost his court case, the software allowed him to represent himself and respond to circumstances in real time, giving him back his voice. You can read more about the case here.

In the Q&A we discussed all sorts of ways in which this technology could be developed and deployed, from things we can do right now, such as using eye tracking as a games controller, to things coming over the horizon, like integrating eye gaze control with Google Glass or the Oculus Rift for more responsive augmented and virtual reality experiences.