Most laboratory research on cognition focuses on people acting alone.  But in the real world, cognition is often collaborative; people work with others when they search for things, communicate, learn, remember, and solve problems. To act jointly, people need to coordinate their individual processes and actions with others. 


When two people are face-to-face, eyegaze is a potent form of communication; people are quite sensitive to where their interactive partners are looking. Increasingly, however, communication takes place between two people at a distance, mediated electronically. We have built the first shared gaze system in which the output from two lightweight head-mounted eyetrackers is not only synchronized with speech for analysis as experimental data, but is also transmitted as a continuous stream of interpersonal cues (see Zelinsky, Dickinson, Chen, Neider, & Brennan, 2005). The eyegaze of each partner is displayed on the screen of the other in the form of a gaze cursor, so that each can tell where the other is looking, as well as whether they're both looking at the same thing.
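To make the idea concrete, the core of such a system can be sketched as a stream of timestamped gaze samples exchanged between the two machines, with a simple distance test for whether the partners' gaze cursors land on the same region. This is an illustrative sketch only; the sample format, field sizes, and the `same_object` threshold are assumptions, not the actual system's protocol.

```python
import struct

# One gaze sample per packet: network byte order, float64 timestamp (seconds),
# then uint16 screen coordinates x and y. (Illustrative format, not the
# system's actual wire protocol.)
GAZE_FMT = "!dHH"

def pack_gaze(timestamp: float, x: int, y: int) -> bytes:
    """Serialize one gaze sample for transmission (e.g. over UDP)."""
    return struct.pack(GAZE_FMT, timestamp, x, y)

def unpack_gaze(payload: bytes):
    """Deserialize a received gaze sample into (timestamp, x, y)."""
    return struct.unpack(GAZE_FMT, payload)

def same_object(sample_a, sample_b, radius_px: int = 40) -> bool:
    """Crude test that both partners fixate the same region: the two gaze
    points fall within `radius_px` pixels of each other. A real system would
    map gaze to display objects rather than use raw pixel distance."""
    (_, x1, y1), (_, x2, y2) = sample_a, sample_b
    return (x1 - x2) ** 2 + (y1 - y2) ** 2 <= radius_px ** 2
```

On each side, samples from the local eyetracker would be packed and sent as they arrive, while received samples drive the remote partner's gaze cursor on the local display.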


This material is based upon work supported by NSF under Grant #0527585.



Gregory Zelinsky
Susan E. Brennan
Dimitris Samaras
Former Students: Mark Neider (Postdoc, Beckman Institute), Chris Dickinson (Postdoc, U. Delaware), Xin Chen (Postdoc, York University)
Former Postdoc: Joy E. Hanna (Ass't Prof., Oberlin College)
Current Students: Hyejin Yang, Karla Batres, Anna Kuhlen, Alexia Galati