Differential Eye Movements in Verbal and Nonverbal Search

Hurley R. S., Sander J., Nemeth K., Lapin B. R., Huang W., Seckin M.

FRONTIERS IN COMMUNICATION, vol. 6, 2021 (Journal Indexed in ESCI)

  • Publication Type: Article
  • Volume: 6
  • Publication Date: 2021
  • DOI: 10.3389/fcomm.2021.654575
  • Keywords: visual search, eye movements, language, nonverbal processing, picture superiority, time course, neural mechanisms, attention, information, words, fixation, shape


In addition to "nonverbal search" for objects, modern life also necessitates "verbal search" for written words in variable configurations. We know less about how we locate words in novel spatial arrangements, as occurs on websites and menus, than about how words are located in passages. In this study we leveraged eye-tracking technology to examine the hypothesis that objects are screened simultaneously in parallel, while words can only be found when each is directly foveated in serial fashion. Participants were provided with a cue (e.g., rabbit) and tasked with finding a thematically related target (e.g., carrot) embedded within an array including a dozen distractors. The cues and arrays consisted of object pictures on nonverbal trials and of written words on verbal trials. In keeping with the well-established "picture superiority effect," picture targets were identified more rapidly than word targets. Eye movement analysis showed that picture superiority was promoted by parallel viewing of objects, while words were viewed serially. Different factors influenced performance in each stimulus modality: lexical characteristics such as word frequency modulated viewing times during verbal search, while taxonomic category affected viewing times during nonverbal search. In addition to within-platform task conditions, performance was examined in cross-platform conditions in which picture cues were followed by word arrays, and vice versa. Although taxonomically related words did not capture gaze on verbal trials, they were viewed disproportionately when preceded by cross-platform picture cues. Our findings suggest that verbal and nonverbal search are associated with qualitatively different search strategies and forms of distraction, and that cross-platform search incorporates characteristics of both.