The eyes are the window to the… body?

Author: Kerry Hudson


Eye tracking can be a great research tool for understanding a wide range of behaviours across development. Improvements in eye tracking technology mean we no longer need to rig up participants in heavy head-mounted equipment or use chin rests to keep their heads still. Screen-based and mobile eye tracking has opened up the technique for easy use with infants, people with neurodevelopmental disorders and even animals. This makes eye tracking an excellent tool for embodiment and multimodal research (see Eye tracking: what can we use it for in multimodal studies?).

So far much of my work with eye tracking has not directly addressed issues of embodiment, although eye tracking data can be used to infer behavioural and cognitive states in a participant: for example, we can use pupil size as an indicator of physiological arousal, or blink duration as a measure of drowsiness. I used these types of measures in a project at the Centre for Integrative Neuroscience and Neurodynamics at the University of Reading, in partnership with Neurosense. In this project we simultaneously recorded eye movement and electroencephalography (EEG) data from adults watching television adverts for a range of products, as well as health advertisements. This allowed us to understand how engaged people were with the adverts, and these measures even helped us to predict which adverts led to the most product sales.
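As a hypothetical sketch of how such measures might be derived, the snippet below computes mean pupil size and blink durations from a toy list of tracker samples. The column layout, the units and the convention that pupil size is recorded as 0 while the eye is closed are all assumptions for illustration, not the format of any particular tracker.

```python
# Toy eye-tracker samples: (timestamp in ms, pupil size in mm).
# Pupil size of 0.0 is assumed to mean the eye was closed (a blink).
samples = [
    (0, 3.1), (16, 3.2), (33, 0.0), (50, 0.0),
    (66, 0.0), (83, 3.0), (100, 3.1),
]

# Mean pupil size over eye-open samples: a rough arousal proxy.
valid = [p for _, p in samples if p > 0]
mean_pupil = sum(valid) / len(valid)

# Blink durations: length of each run of consecutive zero-pupil samples.
blinks, start = [], None
for t, p in samples:
    if p == 0 and start is None:
        start = t                   # blink begins at this sample
    elif p > 0 and start is not None:
        blinks.append(t - start)    # eye reopened; record the duration
        start = None

print(f"mean pupil size: {mean_pupil:.2f} mm")
print(f"blink durations (ms): {blinks}")
```

In practice one would also smooth the pupil trace and discard samples around blink edges, where the tracker's pupil estimate is unreliable.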

In another project at the University of Reading we used eye movements as a marker of early language processing in typically developing children as well as children with Down’s syndrome and Williams syndrome. For this task we looked at the pattern of saccades (fast eye movements) and fixations while participants heard an ambiguous sentence and had to select between two visual representations of it. Here eye movements told us how participants were processing the sentence and working out its meaning, so we could understand participants’ cognition even in groups who could not tell us verbally how they disambiguated the sentences. So, simple measurements can tell us a great deal about how the body responds to and processes a range of stimuli.
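To illustrate how saccades and fixations can be separated in raw gaze data, here is a minimal velocity-threshold (I-VT) classifier: samples whose point-to-point velocity exceeds a threshold are labelled saccades, the rest fixations. The coordinates, sample rate and 30 deg/s threshold below are illustrative assumptions, not the parameters used in the study.

```python
import math

SAMPLE_RATE_HZ = 500      # assumed tracker sample rate
THRESHOLD_DEG_S = 30.0    # a commonly used I-VT velocity threshold

def classify(gaze):
    """Label each inter-sample interval of a (x_deg, y_deg) gaze trace."""
    labels = []
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        # Angular distance between samples times sample rate = velocity (deg/s).
        velocity = math.hypot(x1 - x0, y1 - y0) * SAMPLE_RATE_HZ
        labels.append("saccade" if velocity > THRESHOLD_DEG_S else "fixation")
    return labels

# A steady fixation followed by a rapid jump to a new location.
trace = [(0.0, 0.0), (0.01, 0.0), (0.02, 0.01), (5.0, 3.0), (5.01, 3.0)]
print(classify(trace))  # → ['fixation', 'fixation', 'saccade', 'fixation']
```

Consecutive same-label intervals would then be merged into fixation and saccade events, whose counts and durations are the measures analysed in tasks like the one above.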

Section of a datafile from an eye tracker.

Section of a datafile from an eye tracker.

As well as measuring responses to stimuli, eye tracking can be used to interact with virtual environments; it has useful applications in navigating virtual worlds and in human-computer interaction more broadly. This not only makes interaction with virtual worlds more immersive but can also open up technology to people with physical disabilities, by using eye movements to control a computer. Recently, Borland, Peck and Slater (2013) compared conditions in which a self-avatar’s eye movements were either mapped from the participant’s own eye movements or simulated. The study showed that gaze-contingent eye movements of a self-avatar increase the sense that the avatar belongs to you and so represents your own body. This means we can use eye movements not only as a marker of cognitive and physiological states, but also as a means of using the physical body as a virtual tool; this is what makes eye tracking such an interesting and useful technique for understanding embodiment.

Borland, D., Peck, T., & Slater, M. (2013). An evaluation of self-avatar eye movement for virtual embodiment. IEEE Transactions on Visualization and Computer Graphics, 19(4), 591-596. doi:10.1109/TVCG.2013.24

