eyes tea :: boston


[Image: Driver in simulator]

What the driver’s eyes tell the car’s brain.
or
Using eye movements in intelligent vehicles

Andrew Liu, Ph.D., Research Scientist
Massachusetts Institute of Technology, Man Vehicle Laboratory
70 Vassar Street, Cambridge, MA 02139-4307
Phone: (617) 253-7758
Email: amliu@MIT.EDU
WWW: web.mit.edu/amliu/www/

Thanks, Matthias, for giving me an opportunity to talk about some of my research interests in eye movements today. I've been involved mostly with microgravity and space research for the past few years, but it's nice to put my feet back on the ground and blow a little dust off this topic. To the few familiar faces, I hope I have a few new tidbits for you; for the others, I hope you find it interesting.

I must first acknowledge the paper from which the title comes: Lettvin et al.'s (1959) paper "What the frog's eye tells the frog's brain" illustrated how neural activity in the retina encoded or represented information about the visual environment (e.g., motion detectors, edge detectors) and passed that information to the brain, rather than merely transducing a complex image onto some inner screen in the brain. Similarly, I believe there is information in the driver's eye movement sequences (which reflect what is going on inside the driver's head) that can tell the car's brain about what is going on in the driver's environment. So in my talk today, I will describe one promising method I have explored for extracting this eye movement information so that a "smart car," a car with a "brain," can use it to improve its interaction with the driver.


Outline

Here is a little road map for today’s talk.
First, I'll start with a few comments on the bigger picture of driving and how it fits into society; basically, what issues the new technology is trying to address. I'll also describe a few considerations about the user that need to be kept in mind when thinking about these "smart" systems. These are probably quite familiar to you, so I won't spend much time on them. Then I'll turn to the technology currently under development for future cars: some systems are beginning to be introduced, others are being studied, and some are at a very early stage or not in production at all. I'll also mention a cautionary example to keep in the back of our minds.

The second part will focus on what's going on with the driver's eyes. The basic assumption is that eye movements are the output of a top-down controller: the driver's mental model. I will also briefly survey past research on driver eye movements, mainly touching on studies that illustrate top-down cognitive influences; I won't get into how experience, age, and so on affect eye movements. Then I will describe some of the approaches that have been used to analyze and interpret these eye movement data.

Finally, in the last portion, I'll focus on what's happening in the car's brain. I'll talk about one possible methodology that could use eye movement information to help predict the intentions of drivers. The approach, called Markov Dynamic Models (MDM), is based on the Hidden Markov Model (HMM) techniques used in speech recognition, and it has parallels with previous work on eye movements. I will close by describing some implementations of this approach for recognizing driver behavior.
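To make the recognition step concrete, here is a minimal sketch of the HMM-style scoring idea. It is written after the fact and is not the implementation from the talk: the gaze regions, the two-phase maneuver models, and all probabilities are illustrative assumptions. Each candidate behavior (here, a hypothetical lane-change model versus a lane-keeping model) is a small hidden Markov model over discretized gaze regions, and the model that assigns the highest likelihood to the observed gaze sequence is taken as the recognized behavior.

import numpy as np

# Illustrative only: gaze regions an eye tracker might report, discretized
# into the observation symbols 0..3 used below.
GAZE_REGIONS = ["road_ahead", "left_mirror", "rear_mirror", "speedometer"]

def log_likelihood(obs, pi, A, B):
    """Score an observation sequence under an HMM (forward algorithm, log domain).

    pi: initial state probabilities, A: state transition matrix,
    B: per-state observation probabilities (states x symbols).
    """
    alpha = np.log(pi) + np.log(B[:, obs[0]])
    for o in obs[1:]:
        # log-sum-exp over previous states, then emit the next observation
        alpha = np.logaddexp.reduce(alpha[:, None] + np.log(A), axis=0) + np.log(B[:, o])
    return np.logaddexp.reduce(alpha)

# Two hypothetical two-state models; the hidden states are unobserved
# "phases" of the behavior. All numbers are made up for illustration.
lane_change = dict(
    pi=np.array([0.8, 0.2]),
    A=np.array([[0.7, 0.3],
                [0.1, 0.9]]),
    B=np.array([[0.30, 0.50, 0.15, 0.05],    # phase 1: checking mirrors
                [0.80, 0.10, 0.05, 0.05]]))  # phase 2: eyes mostly ahead

lane_keeping = dict(
    pi=np.array([0.9, 0.1]),
    A=np.array([[0.9, 0.1],
                [0.5, 0.5]]),
    B=np.array([[0.85, 0.05, 0.05, 0.05],    # mostly road ahead
                [0.40, 0.05, 0.05, 0.50]]))  # occasional instrument checks

# An observed gaze sequence: road, left mirror, left mirror, road, rear mirror.
obs = [0, 1, 1, 0, 2]

for name, model in [("lane change", lane_change), ("lane keeping", lane_keeping)]:
    print(f"{name}: log-likelihood = {log_likelihood(obs, **model):.2f}")

In a real system the model parameters would be trained from recorded driver data (for example, with the Baum-Welch algorithm) rather than set by hand, and the observation stream would come from an eye tracker with fixations discretized into regions of interest.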


Part 1: The big picture ...

Part 2: What's going on with the driver's eyes?

Part 3: What’s going on in the car’s brain?

References


© 2003 • contact Matthias Roetting • last revision December 3rd, 2003