EyeCommander

EyeCommander is a desktop application that uses the front-facing camera of any Windows device to detect when you blink. EyeCommander then uses that blink as a switch input for communication software.

Check out the EyeCommander documentation, including the version history.

System Requirements

EyeCommander is compatible with all modern versions of Windows; it has been tested on Windows 10 and up.

You MUST grant Administrator rights to EyeCommander when prompted.

EyeCommander uses a lot of CPU power to run, because every frame of video has to be analysed by a machine learning model to extract your facial features.
EyeCommander shows a ‘frames per second’ counter in the top left-hand corner of the video feed. The higher the frame rate, the more responsive and accurate the blink detection will be. The maximum is 30 frames per second, and anything below 5 frames per second is too low to work at all. You can run EyeCommander on GridPad devices, but we have found that they achieve fairly low frame rates. EyeCommander still works at a low frame rate, though it will be less accurate and responsive; you might find it is still usable for your use case.
In our experience we have had the best success with Surface Pro tablets: they have enough processing power to run EyeCommander easily at 30 frames per second and meet our clients’ needs.
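
To illustrate why every frame is expensive, here is a minimal Python sketch of the same idea: each camera frame is passed through a facial-landmark model, a simple eye-openness ratio flags a blink, and a frame-rate counter is drawn on the feed. This is not EyeCommander’s own code; it assumes the OpenCV and MediaPipe libraries are installed, and the landmark indices and blink threshold are illustrative assumptions.

# Minimal sketch: per-frame landmark inference, a blink heuristic, and an
# FPS counter. Illustrative only -- not EyeCommander's actual implementation.
import time

import cv2
import mediapipe as mp

# Landmark indices commonly used for the left eye in MediaPipe Face Mesh
# (an assumption for this sketch).
TOP, BOTTOM, OUTER, INNER = 159, 145, 33, 133
BLINK_RATIO = 0.2  # illustrative threshold; a real app calibrates per user

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)
prev = time.time()

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]

    # The CPU-heavy step: every single frame goes through the
    # facial-landmark model, which is why frame rate depends on hardware.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Eye openness: lid-to-lid height relative to corner-to-corner
        # width. The ratio drops sharply when the eye closes.
        eye_h = abs(lm[TOP].y - lm[BOTTOM].y) * h
        eye_w = abs(lm[OUTER].x - lm[INNER].x) * w
        if eye_w > 0 and eye_h / eye_w < BLINK_RATIO:
            print("blink detected")  # a real app would fire the switch here

    # FPS counter like the one EyeCommander draws on its video feed.
    now = time.time()
    fps = 1.0 / max(now - prev, 1e-6)
    prev = now
    cv2.putText(frame, f"{fps:.0f} fps", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)

    cv2.imshow("blink sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()

On a fast machine the landmark model keeps up with the camera’s 30 frames per second; on slower hardware the per-frame inference is what drags the counter down, which is why the FPS display is a useful first diagnostic.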


Feedback on any beta features is always welcome.

Check out this video, which outlines what EyeCommander is and includes a short demo:

Getting started with EyeCommander

To install EyeCommander, go to our downloads page and follow the instructions there.

Watch our help series on YouTube, which shows you how to install EyeCommander and calibrate it:

Who would use EyeCommander?

To help explain EyeCommander, we created two personas. They are not based on specific individuals but on a collection of people who could benefit from using this technology.

Our first persona is Alexis Anderson, who had a brain stem stroke and cannot use body movement to trigger switches. She uses paper-based AAC with a limited group of communication partners who understand how she indicates her selections by blinking. Alexis is frustrated that she can only communicate when one of these communication partners is available. Her current solution also means that she cannot access environmental control.

Our second persona is Ben Brown, who has MND and currently uses eye gaze when he has the energy to do so. Typically, he has to move to a low-tech solution as he becomes tired throughout the day. Ben is very proficient with his high-tech device and has learnt where all his vocabulary is stored. It is very problematic that his access depends on having enough energy to use eye gaze, and Ben’s family is worried that he will not be able to access his eye gaze as time goes on.

Ben and Alexis are both examples of people who would benefit from using EyeCommander. Ben could use it when he is tired and he can continue to use the same vocabulary no matter his energy levels. Alexis can use EyeCommander to communicate independently and not be reliant on the presence of a specific communication partner.

Previous Work

Detecting an eye blink is something that has been done before; however, the existing solutions don’t work in every scenario, which is what led us to create EyeCommander. Some of the problems we found with existing solutions are explained below.

Android recently added an accessibility feature that allows users to control their device using eye gestures. If an individual uses an Android app to communicate, this can be a great option.

Mac computers also have a similar solution built into their operating system, allowing users to trigger switches using blinks or other facial gestures. If you want to access your Mac this way, it is a good option. However, we found that the Mac facial gestures don’t allow you to adjust and calibrate the detection enough for it to work for everyone.

Another similar software solution is SmyleMouse. Like EyeCommander, it uses consumer-grade hardware to detect facial gestures. However, SmyleMouse doesn’t detect blinks, or the eye gestures that we plan to add to EyeCommander in the future. Other software solutions exist, including SViaCam and Blink1.5, but we found they didn’t fully meet our needs.

Nous is a hardware device that users wear to detect their blinks. However, we wanted our solution to work on consumer-grade hardware that is readily available. There are a variety of other hardware options on the market, such as IST Switch, EyeControl, and a-blinx.

The main difference between EyeCommander and these previous solutions is that it lets you control Windows devices and works with the computer you already have, meaning you do not have to purchase and maintain new hardware.

Background

EyeCommander was initially developed by a group of students and graduates as part of the Project Propel program, designed to give them real-world experience. They were given the brief to create something that could detect eye gestures using consumer-grade hardware.

The group created a machine learning model that could detect when someone looked up, down, left, or right. The model was trained on a varied set of videos of people performing these gestures, then retrained for individual users so that it worked well for them.

The work the group did proved that it is possible to create a very accurate eye gesture detection algorithm.

The current version of EyeCommander doesn’t use the eye gesture model, but the plan is to include it in future versions.

Future of EyeCommander

EyeCommander will continue to be developed by Ace Centre and the wider AAC community. We have created a roadmap of the features we want to add and our future plans for the project. If you have ideas you would like us to add, please let us know.

We are looking for people to contribute to EyeCommander and help us add more features. If you want to get involved, check out the code on GitHub. Have a read through the open issues and comment if you want to be part of one, and we can help you get up and running.

Feedback

If you have any feedback on EyeCommander then email me at [email protected]