For those of us who remember the first days of computers, the mouse was a glorious invention that removed the tedium of navigating with keyboard and cursor keys. Its ease of use, versatility and speed were a huge step forward in making computers user-friendly. In today's ever-expanding innovation explosion, the emphasis is on increasing human-computer interaction and reducing dependence on intermediary interfaces. Enter eye gaze technology.
Systems Engineering Prof. Thomas Hutchinson first developed the idea in the 1980s. The basis of the invention comes down to a simple concept: If we can monitor the movement of our eyes, we can determine where we are looking. If a computer can map the movement of our eyes and determine our fixations, then we can directly navigate and control a computer without the use of intermediate technology such as a mouse.
Hutchinson's 1980s prototype was MS-DOS based, with limited accuracy and a narrow set of users. The project, entitled "Eye-gaze Response Interface Computer Aid," or ERICA, has achieved significant success since then. It now runs on Windows for all types of users, including those who wear glasses, and is fully capable of emulating mouse functions. Some of ERICA's more celebrated users include physicist Stephen Hawking and Irish writer Christy Brown, whose life story was chronicled in the movie "My Left Foot." Most users have been stroke victims or people with conditions such as Lou Gehrig's disease and muscular dystrophy.
How does it work?
The device works with the simplest of equipment: a light-emitting diode (LED), available from Radio Shack for just $2.38; a CCD (charge-coupled device) camera; two gold mirrors; and a frame-grabber card responsible for carrying data from the camera to the computer. The LED emits a weak beam of infrared light (880 nanometers) at an intensity low enough not to hurt the eye. The beam strikes the surface of the eye and the light is absorbed by the retina. Since the proteins in the retina resonate at the 880-nanometer infrared wavelength, the light is re-emitted, and a large portion of it is focused by the eye's lens into the CCD camera, situated just behind the LED. The frame-grabber card digitizes the CCD camera's signal and reports it to the computer.
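Once a frame has been digitized, the software's first job is to locate the illuminated pupil in it. A minimal sketch of that step, with the frame modeled as a 2D list of 8-bit grayscale values (the threshold and frame format here are invented for illustration, not taken from ERICA):

```python
# Hypothetical sketch: locate the bright-eye blob in a digitized frame
# by thresholding and taking the centroid of the bright pixels.

def bright_eye_centroid(frame, threshold=200):
    """Return the (row, col) centroid of pixels at or above `threshold`."""
    row_sum = col_sum = count = 0
    for r, line in enumerate(frame):
        for c, value in enumerate(line):
            if value >= threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None  # no bright-eye visible in this frame
    return (row_sum / count, col_sum / count)

# A tiny 5x5 frame with a bright 2x2 blob; its centroid is (1.5, 2.5).
frame = [
    [10, 10, 10,  10, 10],
    [10, 10, 250, 250, 10],
    [10, 10, 250, 250, 10],
    [10, 10, 10,  10, 10],
    [10, 10, 10,  10, 10],
]
print(bright_eye_centroid(frame))  # -> (1.5, 2.5)
```

A production system would use proper blob detection on real camera frames, but the centroid idea is the same.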
Some of the incident LED light also reflects off the front surface of the eye; this reflection is called the "glint." The light re-emitted from the retina appears as a bright circle called the "bright eye," whose center is effectively the illuminated pupil. The camera observes both spots, and the vector difference between them determines the eye's motion.
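The vector difference the paragraph describes is just a subtraction of the two spot centers in camera coordinates. A sketch, assuming the centers have already been found by image processing:

```python
# Illustrative sketch: the gaze signal is the vector from the glint
# to the bright-eye (pupil) center, in camera pixel coordinates.

def gaze_vector(glint, pupil):
    """Vector difference between the glint and the bright-eye center."""
    gx, gy = glint
    px, py = pupil
    return (px - gx, py - gy)

# Example spot centers (hypothetical values in camera pixels).
print(gaze_vector((120.0, 96.0), (131.5, 88.0)))  # -> (11.5, -8.0)
```

Because the glint stays nearly fixed while the pupil moves with the eye, this difference changes as the eye rotates, which is what makes it a usable gaze signal.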
Since the position of the pupil determines where you are looking on the monitor, any quantified measurement of the eye's motion can be translated into monitor coordinates. Similarly, if the eye comes to rest, the computer can determine that the user is fixating on a particular object.
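The two steps above can be sketched in a few lines: a mapping from gaze vectors to screen pixels (a simple linear calibration is assumed here; ERICA's actual mapping may differ) and a dwell test that flags a fixation when successive gaze points stay within a small radius. All constants are invented for illustration.

```python
# Hedged sketch: camera-space gaze vector -> monitor coordinates,
# plus a simple dwell-based fixation test.

def to_monitor(vector, scale=(40.0, 40.0), offset=(512.0, 384.0)):
    """Linearly map a gaze vector to screen pixels (assumed calibration)."""
    vx, vy = vector
    sx, sy = scale
    ox, oy = offset
    return (ox + sx * vx, oy + sy * vy)

def is_fixation(points, radius=15.0):
    """True if all gaze points lie within `radius` pixels of their mean."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    return all((px - mx) ** 2 + (py - my) ** 2 <= radius ** 2
               for px, py in points)

steady = [(500, 380), (503, 382), (498, 379)]  # eye at rest
moving = [(100, 100), (300, 250), (600, 400)]  # eye sweeping the screen
print(is_fixation(steady), is_fixation(moving))  # -> True False
```

In practice the scale and offset would come from a calibration routine in which the user looks at known screen targets.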
The system is difficult to use at first: the user must concentrate for long periods and keep his head perfectly still. After a few tries, however, users become accustomed to its style of operation. Chris Lankford, a Systems Engineering graduate student, now manipulates ERICA with ease.
"Once you get practice you are just able to go," Lankford said. He admitted that the system cannot work for everyone, though. "It won't work for people who can't fixate, or who have eye-related trauma." He also added that ERICA would not work outside, in strong sunlight, and that at present, users have to keep their heads perfectly still.
Applications
ERICA revolutionizes word processing packages such as Microsoft Word. A keyboard image appears on the screen, and the user types a letter by focusing on it, or selects a suggested word beginning with that letter. The ERICA keyboard software can also be coupled with speech recognition software.
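The selection mechanism behind such a gaze keyboard is typically dwell-to-type: a key is entered once the user has fixated on it long enough. A minimal sketch, assuming the system reports which key each gaze sample lands on (the dwell length and sample rate are invented, not ERICA's actual parameters):

```python
# Hypothetical dwell-to-type sketch for an on-screen gaze keyboard.

DWELL_SAMPLES = 10  # e.g. roughly half a second at a 20 Hz sample rate

def dwell_typing(gazed_keys, dwell=DWELL_SAMPLES):
    """Emit a key each time the user fixates it for `dwell` samples."""
    typed = []
    current, run = None, 0
    for key in gazed_keys:
        if key == current:
            run += 1
        else:
            current, run = key, 1
        if run == dwell:
            typed.append(key)
            run = 0  # require a fresh dwell before the key repeats
    return "".join(typed)

samples = ["h"] * 10 + ["i"] * 12  # gaze lingers on "h", then on "i"
print(dwell_typing(samples))  # -> hi
```

Resetting the run counter after each emission is what lets a user type a doubled letter by simply continuing to stare at the key.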
Hutchinson also has developed gaze-tracking analysis software that enables researchers to understand how users interact with the World Wide Web and other Windows-based software. Well-defined areas of interest on the screen, called "Look Zones," track viewing characteristics such as the total time spent viewing, the number of fixations and the average length of fixation.
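The three Look Zone statistics named above can be computed from a simple fixation log. A sketch, assuming each fixation has already been assigned to a zone and timed (the data model is assumed, not taken from the ERICA software):

```python
# Hedged sketch: per-zone viewing time, fixation count, and average
# fixation length from a list of (zone, duration_in_seconds) records.

def look_zone_stats(fixations):
    """Aggregate fixations into the per-zone metrics Look Zones report."""
    totals = {}
    for zone, duration in fixations:
        total, count = totals.get(zone, (0.0, 0))
        totals[zone] = (total + duration, count + 1)
    return {
        zone: {"time": total, "fixations": count, "avg": total / count}
        for zone, (total, count) in totals.items()
    }

# Hypothetical viewing session on a Web page.
fixations = [("headline", 0.5), ("photo", 1.25), ("headline", 1.5)]
print(look_zone_stats(fixations))
# headline: 2.0 s over 2 fixations (avg 1.0 s); photo: 1.25 s over 1
```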
Lucent Technologies is taking advantage of this software on its Web page. The gaze-tracking software enables Lucent to determine the most popular images on the page. "One could be reading a headline and the article just pops up," Lankford said. This technique "non-invasively knows where you are looking."
Fourth-year Systems Engineering student Ben Darling used the software at the University of Cambridge's Autistic Research Center in England to study the eye motion of autistic subjects. He wanted to determine whether autistic subjects identify images differently from non-autistic subjects. He said that while "control subjects [non-autistic children] concentrate on the eyes of a particular image, autistic subjects may actually be looking at different features of a face."
In the future
The market for eye gaze technology is vast, spanning not just the present applications but a wide range of conceivable ones: arranging icons on a screen, virtual reality, non-invasive lie detection, gauging pupil response to information, and the use of both eyes in 3D applications.
"Over the past couple of years it has grown at the University ... it's now used in the Education School," said fourth-year Systems Engineering student Kirsten Ramm.
But regardless of eye gaze's importance in the current technology explosion and its creative and innovative conception, it fulfills another important role. As fourth-year Systems Engineering student Mary Hubbard summed it up, "It's a wonderful tool for people to communicate with friends and family, and for people who can't speak anymore."