For people suffering from conditions such as cerebral palsy, motor neurone disease (MND) or so-called locked-in syndromes, being able to move around and interact in a virtual environment is a "truly liberating experience", said Howell Istance, a computer scientist who helped develop the software.
"Until now, gaze-tracking technology has mainly been used for typing with visual keyboards, for browsing the web and other text-based applications. We have taken it to an entirely new level by using eye movements to control an avatar in a virtual environment, allowing people with disabilities to appear and interact just like able-bodied people if they wish", explained Howell Istance of De Montfort University in the United Kingdom.
The gaming-with-gaze software works in combination with commercially available eye trackers, which use cameras to monitor users' eye movements as they gaze at a computer screen. The developers studied the eye movements of able-bodied gamers to build a visual heat map of where players look during play, and used it to decide which regions of the screen should trigger which commands. Distinct patterns of eye movement are translated into so-called gaze gestures, which trigger movement or action commands.
Glancing to the left or right will turn the virtual character in that direction, for example, while staring at the centre of the screen will make the avatar run forwards. Because the software is independent of the game itself, it can be used to play virtually any game that requires mouse and keyboard inputs.
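The region-based control described above can be sketched roughly as follows. The screen dimensions, thresholds and command names here are illustrative assumptions, not the actual COGAIN software, and `gaze_to_command` is a hypothetical helper:

```python
# Minimal sketch of region-based gaze control, as described above.
# Screen size, the edge threshold and the key bindings are illustrative
# assumptions, not the actual COGAIN implementation.

SCREEN_W, SCREEN_H = 1920, 1080
EDGE = 0.25  # outer quarter of the screen counts as a "glance"

def gaze_to_command(x, y):
    """Translate a gaze point (in pixels) into a game command."""
    fx, fy = x / SCREEN_W, y / SCREEN_H  # normalise to 0..1
    if fx < EDGE:
        return "turn_left"       # e.g. inject an 'a' key press
    if fx > 1 - EDGE:
        return "turn_right"      # e.g. inject a 'd' key press
    if EDGE <= fy <= 1 - EDGE:
        return "run_forward"     # staring at the centre: inject 'w'
    return "idle"                # top/bottom margins: no command

print(gaze_to_command(100, 540))   # glance left   -> turn_left
print(gaze_to_command(960, 540))   # screen centre -> run_forward
```

Because the mapped commands are ultimately delivered as ordinary key and mouse events, the game being controlled needs no modification, which is why the software works with virtually any title.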
Communicating with other players is made possible by gazing at letters on an onscreen visual keyboard, while different combinations of gestures can be used to perform different actions.
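Gaze typing of this kind is commonly dwell-based: a letter is selected only after the gaze rests on it for long enough. The sketch below assumes a stream of per-sample gaze targets and a sample-count threshold; both are illustrative, not the project's actual parameters:

```python
# Sketch of dwell-based gaze typing on an onscreen keyboard.
# The dwell threshold and the per-sample gaze stream are assumptions.

DWELL_SAMPLES = 5  # consecutive samples on one key needed to select it

def dwell_type(gaze_keys):
    """Turn a stream of per-sample gaze targets into typed text."""
    text, current, count = [], None, 0
    for key in gaze_keys:
        if key == current:
            count += 1
        else:
            current, count = key, 1
        if count == DWELL_SAMPLES:
            text.append(key)   # dwelt long enough: type the letter
            count = 0          # require a fresh dwell for repeats

    return "".join(text)

# Fleeting glances (here at 'x') are ignored; sustained fixations are typed.
print(dwell_type(["h"] * 5 + ["x"] * 2 + ["i"] * 5))  # -> "hi"
```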
"In the current set up, we have programmed 12 gesture sequences to activate different keyboard or mouse events", Howell Istance stated. "Many more commands are possible but the total number is limited by the users' memory and the need to differentiate between when someone wants to input a command and when they are just looking at the screen."
The approach contrasts with earlier, more laborious gaze-based input techniques, which work well enough for typing a message or browsing the web but are too slow and tiring to deliver the speed and accuracy needed for real-time 3D games.
The gaming-with-gaze software should make the avatars of people with disabilities almost indistinguishable in their behaviour and abilities from those of able-bodied people in on-line games and environments.
"It could be life changing for the large number of paralysed people whose only means of communicating is with their eyes. Second Life, for example, could really be a second life for them, providing not only entertainment but versatile electronic services, for example, education", stated Aulikki Hyrskykari, a researcher at Tampere University.
"Obviously there will be limitations to what users with disabilities can do - such as not being able to perform several actions simultaneously - but they can choose what activities they participate in. That is a choice they did not have before", Howell Istance added.
The free software was made available to download on 26 May, coinciding with an annual public conference in Copenhagen organised by participants in the COGAIN network. The researchers plan to encourage people with disabilities who are already using eye-tracking systems to install the software and use it with whatever on-line games they would like to play.
Another team of researchers working in the COGAIN network developed free software to turn a high-definition video camera into an eye-tracker, providing a low-cost alternative to expensive commercial systems.
"Based on the feedback we received, we will continue to develop the gaming software", Howell Istance stated. "The goal is to have a single version of the software that is flexible enough to adapt to the interface of any on-line game and to the requirements and limitations of any user."
The COGAIN network is funded under the European Union's Sixth Framework Programme for research. More information is available at the COGAIN project website. This article has been reprinted from the ICT Results website.