Photo: Mikal Schlosser

Eye movements can open doors

DTU students win prizes for developing new technology based on eye tracking via the mobile phone.

A few eye movements in front of your mobile phone are enough to open doors and turn off lights.

Two BSc Eng students at DTU have developed a new technology—Enable—which via eye tracking in smartphones allows people with physical disabilities to turn on machines, turn up the heat, or switch off the lights in their home. Eye tracking can activate functions in apps developed for smart home technology.

“We have developed eye tracking for people with motor disabilities, because this group will benefit most—also financially—from using the technology. Where the homes of people with physical disabilities today are designed with individual solutions depending on whether they can use their arms or legs, eye tracking via the mobile will be a solution which can be used by virtually all, regardless of disability,” says Elias Lundgaard Pedersen, a BSc Eng student on the Design and Innovation programme.

Eye tracking is already used today on computers at hospitals for children with cerebral palsy, i.e. paralysis caused by brain damage, or for adults with the nervous system disease ALS and other neurological disorders, where the patient cannot speak or use a keyboard and a mouse. Using eye tracking, they can write emails, go on the internet, and communicate with their surroundings with text that is read aloud with synthetic speech. The technology can also be used to control technology in their home, but this requires additional equipment to be connected to their computer.

According to Elias Lundgaard Pedersen, the technology has not been introduced in mobiles until now because mobiles have so far not been used extensively as a link to other technologies. So linking eye tracking, apps on the mobile, and smart home technology opens up a whole new user interface.

“Instead of having a computer installed on the wheelchair, people with physical disabilities will have access to everything from their mobile. And the technology is inexpensive. Everyone can buy a mobile with this technology in a store—including in the USA, where it can be difficult for people with disabilities to get access to aids without health insurance,” says Elias Lundgaard Pedersen.


A grant and two prizes
The technology behind eye tracking via the mobile is described in the bachelor project written by Elias Lundgaard Pedersen in 2017 in collaboration with fellow student Frederik Østergaard Neble, also a BSc Eng student on the Design and Innovation programme. The two students have subsequently received a grant from the Bevica Foundation and two prizes for their work.

Today, they are working as project employees at the company Social Digital in Copenhagen, where they have access to a business developer and are writing the code required to move the eye tracking from a microprocessor to a mobile phone.

The technology has already been tested by users at Geelsgårdskolen—a school for disabled children and children with special needs—in the form of a microprocessor connected to a computer. In the coming months, the students must write a business plan, file a patent application for their invention, and find an investor.

Useful in other industries
To be able to develop the technology for other business areas, the students need funding that allows them to employ more experienced programmers and developers.

Elias Lundgaard Pedersen and Frederik Østergaard Neble see excellent opportunities to further develop Enable into solutions targeted at other sectors, for example the pharmaceutical industry, where eye tracking of equipment may be a solution in labs, where the staff may not touch any surfaces.

In the long term, widespread use of eye tracking is expected as a way of navigating on mobiles, in the same way as consumers today scroll and swipe their way through apps.

Two kinds of eye tracking

Traditional eye tracking

Uses so-called gaze point tracking that works by comparing the location of the pupil in the eye with known points on the screen. The first time you use the system, you must calibrate it by looking at specific points on the screen. In addition, gaze point tracking requires very accurate pupil recognition. The technology often involves the use of stereoscopic cameras, infrared lamps, and comprehensive image recognition, which places great demands on the processors.
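As a rough illustration of the gaze-point approach, calibration can be thought of as fitting a mapping from detected pupil coordinates to known points on the screen. The sketch below is a hypothetical minimal version in Python; the affine model, the function names, and the sample coordinates are assumptions for illustration, not the students' implementation.

```python
import numpy as np

def fit_calibration(pupil_pts, screen_pts):
    """Least-squares affine fit: screen ~ A @ pupil + b.

    pupil_pts: pupil positions recorded while the user looked at
    screen_pts, the known calibration points on the screen.
    """
    pupil = np.asarray(pupil_pts, dtype=float)
    screen = np.asarray(screen_pts, dtype=float)
    # Augment with a constant column so the fit includes the offset b.
    X = np.hstack([pupil, np.ones((len(pupil), 1))])
    coef, *_ = np.linalg.lstsq(X, screen, rcond=None)
    return coef  # shape (3, 2)

def gaze_to_screen(coef, pupil_xy):
    """Map a new pupil position to an estimated screen coordinate."""
    x, y = pupil_xy
    return np.array([x, y, 1.0]) @ coef

# Calibration step: the user looks at four known screen corners
# (illustrative numbers for a 1080 x 1920 display).
pupil = [(0.2, 0.3), (0.8, 0.3), (0.2, 0.7), (0.8, 0.7)]
screen = [(0, 0), (1080, 0), (0, 1920), (1080, 1920)]
coef = fit_calibration(pupil, screen)
```

In a real system the pupil positions would come from the image-recognition pipeline the paragraph describes; the heavy processing cost lies in producing those positions accurately, not in this mapping step.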

 

Gesture-based eye tracking

Uses the movements of the eyes to control the system. An eye gesture can be compared to swiping left with your finger on the phone to open the camera, but instead the same movement is made with the eyes. To perform an action such as ‘turn on the light’, you first look to the right, then to the left, and then to the right again. Using gesture-based eye tracking reduces the technical requirements for accurate pupil recognition in the system, and both the hardware and software can be made less complex and costly.
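The 'look right, left, right' example above can be sketched as a simple sequence matcher. This is a minimal illustration, assuming the camera pipeline already reports a coarse gaze direction per frame; the gesture table and names are invented for the example and are not the Enable implementation.

```python
# Map direction sequences (eye gestures) to actions. The right-left-right
# gesture is the article's example; the second entry is an assumed extra.
GESTURES = {
    ("right", "left", "right"): "turn on the light",
    ("left", "right", "left"): "turn off the light",
}

def recognize(directions):
    """Return the action whose gesture pattern ends the sequence, if any."""
    # Ignore frames where the gaze is centered: only deliberate
    # glances to the side count as part of a gesture.
    moves = tuple(d for d in directions if d != "center")
    for pattern, action in GESTURES.items():
        if moves[-len(pattern):] == pattern:
            return action
    return None
```

Because only coarse directions are needed, a recognizer like this can run on far simpler hardware than gaze-point tracking, which is the cost advantage the paragraph describes.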