What is it?
Eye and motion tracking both work by using some type of sensor or optical hardware (such as a camera) to track movements, paired with software that translates those movements into the desired actions.
Eye trackers (or gaze trackers) allow users to control the cursor and interact with the computer using their eyes instead of a mouse. They are commonly used by people with very limited mobility. Implementations vary: some eye trackers are worn on the head, others are positioned on top of the screen or device being used, and some track magnetic dots placed on contact lenses. In all cases, they allow the cursor to follow the user's gaze. Dwelling (fixing the gaze on one part of the screen for a set amount of time) or consciously blinking in a particular way can then be used to “click”, for example on icons, buttons, or a virtual keyboard.
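The dwell-to-click idea above can be sketched in a few lines. Assuming the tracker delivers a stream of timestamped `(t, x, y)` gaze samples, a "click" fires once the gaze has stayed within a small radius for long enough; the sample format, radius, and timing values here are illustrative assumptions, not a real tracker API.

```python
import math

DWELL_RADIUS = 40   # pixels the gaze may drift and still count as "dwelling" (assumed value)
DWELL_TIME = 1.0    # seconds the gaze must hold still to trigger a click (assumed value)

def detect_dwell_click(samples, radius=DWELL_RADIUS, dwell_time=DWELL_TIME):
    """Scan timestamped gaze samples (t, x, y) and return the (x, y)
    position of the first dwell-click, or None if the gaze never settles."""
    anchor = None  # (t, x, y) where the current dwell started
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) > radius:
            anchor = (t, x, y)       # gaze moved away: restart the dwell timer
        elif t - t0 >= dwell_time:
            return (x0, y0)          # gaze held still long enough: "click" here
    return None
```

For example, samples that hover near one point for over a second produce a click at the dwell's starting position, while a gaze that keeps jumping produces none.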
Motion trackers are commonly used to track head movements (in which case they are sometimes called “head pointers”, even though they are not physical pointing devices). They may also track the movement of other body parts through wearable technology, such as gloves. They work similarly to eye trackers, but they require that the user be able to move and control the head or the relevant body part.
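A head pointer's software side can be sketched the same way. Assuming the tracker reports changes in head yaw and pitch in degrees, each change is converted into cursor motion and clamped to the screen; the gain factor and screen dimensions below are hypothetical values chosen for illustration.

```python
SCREEN_W, SCREEN_H = 1920, 1080  # assumed screen size in pixels
GAIN = 25.0                      # pixels of cursor travel per degree of head rotation (assumed)

def head_to_cursor(cursor, yaw_delta, pitch_delta, gain=GAIN):
    """Translate a change in head orientation (degrees) into a new cursor
    position, clamped to the screen. Turning the head right moves the
    cursor right; tilting the head down moves it down."""
    x, y = cursor
    x = min(max(x + yaw_delta * gain, 0), SCREEN_W - 1)
    y = min(max(y + pitch_delta * gain, 0), SCREEN_H - 1)
    return (x, y)
```

A dwell or blink mechanism like the one used by eye trackers would then supply the click.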
Motion trackers were more commonly used in the past, but they have been losing popularity as eye trackers (which can be used by a wider array of people) have increased in accuracy and decreased in price.
- Interactive voice response system and eye-tracking interface in assistive technology for disabled (PDF) by Kalle Kenttälä (2019)
- Move the pointer using head pointer on Mac by Apple
- Get started with eye control in Windows 10 by Windows Support
- 10 Free Eye Tracking Software Programs [Pros and Cons] by Bryn Farnsworth (2019)