|Title:||How low can you go? Detecting style in extremely low resolution images|
|Citation:||Journal of Experimental Psychology: Human Perception and Performance, 2019; 45(5):573-584|
|Publisher:||American Psychological Association|
|Author:||Rachel A. Searston, Matthew B. Thompson, John R. Vokey, Luke A. French, Jason M. Tangen|
|Abstract:||Humans can see through the complexity of scenes, faces, and objects by quickly extracting their redundant low-spatial and low-dimensional global properties, or their style. It remains unclear, however, whether semantic coding is necessary, or whether visual stylistic information is sufficient, for people to recognize and discriminate complex images and categories. In two experiments, we systematically reduce the resolution of hundreds of unique paintings, birds, and faces, and test people’s ability to discriminate and recognize them. We show that the stylistic information retained at extremely low image resolutions is sufficient for visual recognition of images and visual discrimination of categories. Averaging over the 3 domains, people were able to reliably recognize images reduced down to a single pixel, with large differences from chance discriminability across 8 different image resolutions. People were also able to discriminate categories substantially above chance with an image resolution as low as 2 x 2 pixels. We situate our findings in the context of contemporary computational accounts of visual recognition and contend that explicit encoding of the local features in the image, or knowledge of the semantic category, is not necessary for recognizing and distinguishing complex visual stimuli.|
|Keywords:||Visual recognition; visual discrimination; ensemble; gist; perceptual expertise|
|Rights:||© 2019 American Psychological Association|
|Appears in Collections:||Psychology publications|