A gaze point based cursor positioning system

Design team members: Julie Agar, Lianne Chong, Po-Yan Tsang

Supervisor: Prof. M.E. Jernigan


Although the mouse is an increasingly popular computer input device, its traditional design can produce adverse physiological effects in long-term users. Attempts have been made to find alternatives to the mouse that reduce or eliminate these effects. One such alternative is the eye-tracker, which monitors a user's eye movements and positions the cursor on the screen accordingly. Unfortunately, many existing eye-trackers are expensive and difficult to use.

Project description

The objective of this project is to use an image-capturing device and image-processing techniques to design and develop an affordable, user-friendly computer cursor-positioning system controlled by the user's point of gaze. The system is targeted primarily at general computer users, with people with hand disabilities as a secondary audience. The system will use an image-capturing device to capture images of the user's eyes. Through image-processing techniques, it will then translate these images into eye movements and, in turn, interpret those movements as cursor positions. The system should be affordable in the sense that a user who can afford a computer system should also be able to purchase this one. It should also be user-friendly, requiring as little effort from the user as possible: the user should not have to wear extra equipment, such as headsets or electrode sensors, to help locate the eyes.

Design methodology

To achieve the project objectives, an iterative approach was chosen, as seen in Figure 1. There are five major stages. The first stage, selecting the method of image acquisition, occurs only once. Stages 2 through 4 form an ongoing cycle. After the first pass through the cycle, the system will contain basic functionality in all necessary areas. Additional iterations of these three stages, referred to as the improvement stage, allow the system to gain further functionality and user-friendliness.

1. Determine image data acquisition method

There are currently many image-capturing devices that could be used for data acquisition in this project. This step involves finding a suitable low-cost, commercially available device that meets the project's needs.

2. Develop method for detecting a pupil in an image

Once an image-capturing device has been selected, a process for identifying the pupil within a captured image must be found. By the end of this stage, a software program will exist that can accurately isolate the eye and the pupil from other data in the captured images.
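The project does not specify a detection algorithm; one common low-cost approach is intensity thresholding, since the pupil is usually the darkest region of an eye image. The following is a minimal sketch in Python, assuming the captured frame is already a grayscale image (a 2D list of 0-255 intensities); the function name and threshold value are illustrative, not part of the project specification.

```python
def find_pupil(image, threshold=50):
    """Estimate the pupil centre in a grayscale image by
    thresholding for dark pixels and taking their centroid.

    Returns (row, col), or None if no pixel is darker than
    the threshold.
    """
    dark = [(r, c)
            for r, row in enumerate(image)
            for c, value in enumerate(row)
            if value < threshold]
    if not dark:
        return None
    mean_row = sum(r for r, _ in dark) / len(dark)
    mean_col = sum(c for _, c in dark) / len(dark)
    return (mean_row, mean_col)

# Synthetic test frame: bright background with a dark 2x2 "pupil".
image = [[200] * 6 for _ in range(6)]
for r in (2, 3):
    for c in (3, 4):
        image[r][c] = 10

print(find_pupil(image))  # → (2.5, 3.5)
```

In practice a real frame would first be cropped to the eye region and cleaned of noise (eyelashes and shadows are also dark), but the thresholding-and-centroid idea remains the core of the stage.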

3. Determine direction of eye gaze based on pupil location

An algorithm is needed to determine the user's gaze point from information extracted from the pupil image isolated in the previous stage. During this stage, this algorithm will be developed and implemented in a standard programming language.
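The stage description leaves the mapping unspecified; one plausible approach is a simple linear mapping of the pupil centre within a calibrated eye bounding box to a normalised gaze point. The sketch below assumes such a calibration has produced the eye's corner coordinates; all names are illustrative.

```python
def gaze_point(pupil, eye_top_left, eye_bottom_right):
    """Normalise the pupil centre to a gaze point in [0, 1] x [0, 1]
    relative to the eye's bounding box: (0, 0) corresponds to gazing
    toward the top-left of the screen, (1, 1) the bottom-right.
    """
    pr, pc = pupil
    tr, tc = eye_top_left
    br, bc = eye_bottom_right
    y = (pr - tr) / (br - tr)
    x = (pc - tc) / (bc - tc)
    # Clamp to the valid range to tolerate small detection errors.
    return (min(max(x, 0.0), 1.0), min(max(y, 0.0), 1.0))

print(gaze_point((30, 60), (20, 40), (40, 80)))  # → (0.5, 0.5)
```

A real system would likely need per-user calibration (e.g. asking the user to look at known screen corners) to fit this mapping, since eye geometry varies between users.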

4. Enable cursor movements using point of gaze locations

After eye-gaze tracking has been successfully implemented, an interface to the cursor position on the display will be developed. The cursor position will be set to the region of the screen in which the eye gaze is located.
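Since the cursor is set to a region rather than an exact pixel, the screen can be divided into a coarse grid and the cursor snapped to the centre of the region containing the gaze point. A minimal sketch, assuming a normalised gaze point as input (the grid size and function name are illustrative; the actual OS call to move the cursor is platform-specific and omitted):

```python
def cursor_for_gaze(gaze, screen_w, screen_h, cols=4, rows=3):
    """Map a normalised gaze point (x, y in [0, 1]) to the centre
    of the screen region it falls in. Snapping to coarse regions,
    rather than exact pixels, tolerates the limited accuracy of
    gaze tracking.
    """
    gx, gy = gaze
    col = min(int(gx * cols), cols - 1)
    row = min(int(gy * rows), rows - 1)
    x = int((col + 0.5) * screen_w / cols)
    y = int((row + 0.5) * screen_h / rows)
    return (x, y)

print(cursor_for_gaze((0.9, 0.1), 1024, 768))  # → (896, 128)
```

The region count trades precision against robustness: fewer, larger regions make the cursor steadier but limit what the user can select by gaze alone.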

5. Improvements

With each additional pass through stages 2-4, enhancements and error-handling capabilities will be added to the existing system.