Aga Bojko

Eye Tracking the User Experience



reflection of the light in this room in each of them. If I keep my head still and look to the left, to the right, up, and down (demonstrating), the corneal reflection doesn’t move—only the pupil does. You can see that the relationship between the pupil center and corneal reflection changes (see Figure 1.2).

[Figure 1.2]

      JOHN: So where you are looking can be determined from the location of the pupil center relative to the corneal reflection.

      YOU: Exactly. Now, if I move my head slightly while looking at the same spot (demonstrating), the relationship between the pupil center and corneal reflection remains the same (see Figure 1.3). Even though I’m moving, the eye tracker would know I’m looking at the same spot.

[Figure 1.3]
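      For readers who want to see the arithmetic behind that explanation, here is a minimal sketch in Python of the pupil-corneal reflection idea: compute the vector from the corneal reflection to the pupil center in the eye camera image, then map it to screen coordinates through a calibration. The function name, feature locations, and calibration numbers are invented for illustration; commercial trackers use far more sophisticated models.

      def gaze_from_eye_image(pupil_center, corneal_reflection, calibration):
          # Vector from the corneal reflection to the pupil center (camera pixels).
          # Moving the head shifts both features together, so the vector barely
          # changes; rotating the eye moves the pupil relative to the reflection.
          vx = pupil_center[0] - corneal_reflection[0]
          vy = pupil_center[1] - corneal_reflection[1]
          # Map the vector to screen coordinates with a simple linear calibration
          # (the coefficients would normally be fit while the participant looks
          # at a few known points on the screen).
          ax, bx, ay, by = calibration
          return (ax * vx + bx, ay * vy + by)

      # Hypothetical feature locations (camera pixels) and calibration coefficients.
      print(gaze_from_eye_image((412, 305), (398, 310), (40.0, 960.0, 40.0, 540.0)))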

      JOHN: So what’s inside of the eye tracker that allows it to do something like that?

      YOU: Modern commercial eye trackers consist of two main components. The first one, a source of near-infrared light, creates the reflection in the eye. The second component is a video camera sensitive to near-infrared light. The camera is focused on the eye and records the reflection. The software then figures out the location of the gaze and superimposes it onto an image of what you were looking at, such as a Web page.

      JOHN: Why is infrared light needed? Wouldn’t regular light work?

      YOU: The trick is to use a wavelength that is invisible to people, and thus not distracting, yet reflected by the eye.

      JOHN: But isn’t infrared light dangerous?

      YOU: Any light wavelength—ultraviolet, visible, and infrared—can be harmful in high intensities, but the exposure from the eye tracker is just a tiny fraction of the maximum exposure allowed by safety guidelines. There is no danger, even if I were to track your eyes for hours.

      This is when you and John realize that everyone else who was initially listening to your conversation has already walked away, and you decide to rejoin the party.

       Webcam Eye Tracking

      While most commercial eye trackers are based on the infrared illumination approach described in this chapter, it is worth mentioning the emerging appearance-based systems. Instead of relying on infrared light, these low-cost solutions use off-the-shelf webcams to detect and track eye features on the face. Webcam eye tracking is most often employed in remote testing, during which participants use their own computers at home or at work without having to come to a lab.

      One of the current constraints of webcam eye tracking is poorer accuracy as compared to the standard infrared devices. The accuracy decreases even further when participants move around or move their computer—something that’s difficult to control in a remote session (see Figure 1.4). In addition, the rate at which the gaze location is sampled by webcams is relatively low, which greatly limits data analysis.

[Figure 1.4]

      If you take a step back just for a bit, you’ll realize that when people talk about eye trackers recording eye movements, they usually take it for granted that the eyes move. Out of the hundreds of conversations I’ve had with people new (and not so new) to eye tracking, not once has anyone (not even John) questioned why the eyes move. They just do, right?

      Human eyes, without rotating, cover a visual field of about 180 degrees horizontally (90 degrees to the left and 90 degrees to the right) and 90 degrees vertically (see Figure 1.5). Any time your eyes are open, the image of what you see is projected onto the retina. The retinal cells convert that image into signals, which are then transmitted to the brain. The cells responsible for high visual acuity are clustered in the center of the retina, which is called the fovea (refer to Figure 1.1). When you are looking at something directly, its image falls upon your fovea, and thus it is much sharper and more colorful than images that fall outside of the fovea.

[Figure 1.5]

      The foveal area is quite small—it spans only two degrees, which is often compared to the size of a thumbnail at arm’s length. Even though you typically do not realize it, the image becomes blurry right outside of the fovea in the area called the parafovea (2–5 degrees) and even more blurry in the periphery (see Figure 1.6). Therefore, eye movements are necessary to bring things into focus. This is an important information-filtering mechanism—if everything were in focus all at once, your brain would be overloaded with information!

[Figure 1.6]
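      The thumbnail comparison is easy to check with a little trigonometry. Assuming a round-number arm’s length of about 60 centimeters (an assumption, not a figure from the text), an object that spans two degrees of visual angle at that distance is roughly two centimeters wide:

      import math

      arm_length_cm = 60.0   # assumed viewing distance, roughly arm's length
      foveal_span_deg = 2.0  # foveal span cited above
      # Width of an object subtending 2 degrees at 60 cm.
      width_cm = 2 * arm_length_cm * math.tan(math.radians(foveal_span_deg / 2))
      print(f"{width_cm:.1f} cm")  # about 2.1 cm, roughly the width of a thumbnail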

      Your eyes jump from place to place a few times per second (three to four times, on average). These rapid movements, called saccades, are the fastest movements produced by an external part of the human body. To prevent blurring, your vision is mostly suppressed during saccades. Visual information is only extracted during fixations, which is when the eyes are relatively motionless and are focusing on something (see Figure 1.7). Fixations tend to last between one-tenth and one-half of a second, after which the eye moves (via a saccade) to the next part of the visual field. Although there are a few other types of eye movements, saccadic eye movements, consisting of saccades and fixations, are most common and of the greatest interest to UX research.

[Figure 1.7]
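      This distinction between saccades and fixations is also what eye tracking software relies on when it turns raw gaze samples into fixations. One common approach, sketched below in Python, is a dispersion-threshold algorithm: it looks for stretches of samples that last long enough and stay close enough together to count as a fixation. The thresholds here are illustrative assumptions, not the settings of any particular eye tracker.

      def _dispersion(points):
          # Dispersion = (max x - min x) + (max y - min y) of the gaze points.
          xs = [p[0] for p in points]
          ys = [p[1] for p in points]
          return (max(xs) - min(xs)) + (max(ys) - min(ys))

      def find_fixations(samples, max_dispersion=35.0, min_samples=6):
          # samples: list of (x, y) gaze points; returns (start, end) index pairs.
          # Six samples at a 60 Hz sampling rate correspond to the ~100 ms
          # minimum fixation duration mentioned above.
          fixations = []
          start = 0
          while start + min_samples <= len(samples):
              end = start + min_samples  # candidate window of minimum length
              if _dispersion(samples[start:end]) <= max_dispersion:
                  # Samples are tightly clustered: grow the window until they spread out.
                  while end < len(samples) and _dispersion(samples[start:end + 1]) <= max_dispersion:
                      end += 1
                  fixations.append((start, end - 1))
                  start = end
              else:
                  start += 1  # no fixation starts here; this sample is part of a saccade
          return fixations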