TY - JOUR
T1 - Human classifier
T2 - Observers can deduce task solely from eye movements
AU - Bahle, Brett
AU - Mills, Mark
AU - Dodd, Michael D.
N1 - Funding Information:
The study was partially supported by NIH Grant R01 EY022974. The authors thank Jordan Marshall and Alex Olsen for their assistance with data collection, and Gerald McDonnell and Monica Rosen for helpful comments on an earlier version of this manuscript. They also thank Greg Zelinsky, Franco Amati, Joe Schmidt, and an anonymous reviewer for helpful comments on a previous version of this manuscript.
Publisher Copyright:
© 2017, The Psychonomic Society, Inc.
PY - 2017/7/1
Y1 - 2017/7/1
AB - Computer classifiers have been successful at classifying various tasks using eye movement statistics. However, the question of human classification of task from eye movements has rarely been studied. Across two experiments, we examined whether humans could classify task based solely on the eye movements of other individuals. In Experiment 1, human classifiers were shown one of three sets of eye movements: Fixations, which were displayed as blue circles, with larger circles indicating longer fixation durations; Scanpaths, which were displayed as yellow arrows; and Videos, in which a neon green dot moved around the screen. There was an additional Scene manipulation in which eye movement properties were displayed either on the original scene where the task (Search, Memory, or Rating) was performed or on a black background in which no scene information was available. Experiment 2 used similar methods but displayed only Fixations and Videos with the same Scene manipulation. The results of both experiments showed successful classification of Search. Interestingly, Search was best classified in the absence of the original scene, particularly in the Fixation condition. Memory was also classified above chance, with the strongest classification occurring with Videos in the presence of the scene. Additional analyses of the pattern of correct responses in these two conditions demonstrated which eye movement properties successful classifiers were using. These findings demonstrate the conditions under which humans can extract information from eye movement characteristics, in addition to providing insight into the relative success or failure of previous computer classifiers.
KW - Categorization
KW - Cognitive
KW - Eye movements
KW - Visual search
UR - http://www.scopus.com/inward/record.url?scp=85019151632&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85019151632&partnerID=8YFLogxK
U2 - 10.3758/s13414-017-1324-7
DO - 10.3758/s13414-017-1324-7
M3 - Article
C2 - 28493106
AN - SCOPUS:85019151632
SN - 1943-3921
VL - 79
SP - 1415
EP - 1425
JO - Attention, Perception, & Psychophysics
JF - Attention, Perception, & Psychophysics
IS - 5
ER -