Things are not always as they appear. New visual perception research at The University of Texas at Austin, published in the Proceedings of the National Academy of Sciences, explains the natural limits of what humans can see and how to find what nature hides.
UT Austin researchers investigated the three main background properties that affect the ability to see objects: luminance (brightness), contrast (the variation in luminance), and the similarity of the background to the orientation and shape of the object. Combining experiments with a theoretical analysis of millions of natural images, the researchers found that the ability to detect an object against its background could be predicted directly from the physical properties of natural stimuli.
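As a rough illustration of what those three properties measure, the sketch below estimates the luminance, contrast and similarity of a background patch using standard textbook formulas (mean intensity, RMS contrast and normalized cross-correlation). The function names and exact definitions are illustrative and are not taken from the study.

```python
import numpy as np

def background_properties(background, target):
    """Illustrative estimates of the three masking properties named in the
    article: luminance, contrast, and background-target similarity.
    These are common textbook formulations, not the study's definitions."""
    bg = background.astype(float)
    tg = target.astype(float)

    # Luminance: mean intensity of the background region.
    luminance = bg.mean()

    # Contrast: root-mean-square (RMS) contrast, the standard deviation
    # of intensity divided by the mean intensity.
    contrast = bg.std() / luminance

    # Similarity: normalized cross-correlation between the zero-mean target
    # pattern and the zero-mean background patch, a rough proxy for how much
    # the background resembles the target's orientation and shape.
    bg0 = bg - bg.mean()
    tg0 = tg - tg.mean()
    similarity = abs((bg0 * tg0).sum()) / (
        np.linalg.norm(bg0) * np.linalg.norm(tg0) + 1e-12)

    return luminance, contrast, similarity

# Example with random stand-in patches; real use would crop a background
# region from a natural image and use a target template of the same size.
rng = np.random.default_rng(0)
patch = rng.uniform(20, 80, size=(64, 64))
template = rng.uniform(0, 1, size=(64, 64))
print(background_properties(patch, template))
```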
“The discovery of highly systematic laws for perception in natural scenes — made possible by constrained statistical sampling — is a potential game changer,” said the paper’s lead author Wilson Geisler, a UT Austin professor of psychology and director of the Center for Perceptual Systems. “It demonstrates how to study complex real-world perception with the same level of rigor that was previously achieved only with simple synthetic stimuli.”
For people to pick out an object against a background, the object must differ from the background by a “just noticeable difference,” the minimum difference a person can detect the majority of the time. Even as the properties of the object and the background vary, this threshold stays in constant proportion to the product of the three background properties, a generalized version of Weber’s law.
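In its simplest form, that relationship says the just-noticeable difference scales with the product of the three background properties. The short sketch below assumes this simplest form; the proportionality constant and the example values are hypothetical.

```python
# Generalized Weber's law, in its simplest assumed form: the detection
# threshold is proportional to the product of background luminance,
# contrast and similarity. The constant k and the values are hypothetical.

def detection_threshold(luminance, contrast, similarity, k=1.0):
    """Amplitude by which an object must differ to be just noticeable."""
    return k * luminance * contrast * similarity

# Doubling any one background property doubles the threshold, so the ratio
# threshold / (luminance * contrast * similarity) stays constant.
print(detection_threshold(50.0, 0.2, 0.5))    # baseline background
print(detection_threshold(100.0, 0.2, 0.5))   # brighter background -> higher threshold
```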
“The ability of these background properties to mask objects is well known for simple laboratory stimuli,” Geisler said. “However, it was not known how these properties combine to mask objects in natural scenes.”
The researchers also considered the effects of stimulus uncertainty. Under real-world conditions, the properties of the object and of the background against which it appears vary randomly from one occasion to the next, and this uncertainty can also reduce accuracy in detecting the object.
Their findings showed that the detrimental effects of this uncertainty can be minimized by estimating the luminance, contrast and similarity at the object’s possible locations, and then dividing the neural responses at each of these locations by the product of these estimates. The researchers found strong evidence that these computations are done automatically in the human visual system.
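A minimal sketch of that normalization idea is shown below, assuming simple per-location property estimates and a plain response score at each candidate location; the names and numbers are illustrative, not the study’s model.

```python
import numpy as np

def normalized_responses(responses, luminance, contrast, similarity, eps=1e-12):
    """Divide the raw response at each candidate location by the product of
    the locally estimated luminance, contrast and similarity, reducing the
    effect of location-to-location variability (stimulus uncertainty)."""
    responses = np.asarray(responses, dtype=float)
    gain = np.asarray(luminance) * np.asarray(contrast) * np.asarray(similarity)
    return responses / (gain + eps)

# Example with three candidate locations: the raw response is largest at a
# location with a bright, busy background, but after normalization the
# location at index 1 stands out as the most likely target location.
raw = np.array([4.0, 3.0, 1.0])
L   = np.array([80.0, 40.0, 60.0])   # local luminance estimates
C   = np.array([0.30, 0.10, 0.20])   # local contrast estimates
S   = np.array([0.60, 0.40, 0.50])   # local similarity estimates
scores = normalized_responses(raw, L, C, S)
print(scores, scores.argmax())
```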
Knowing this may lead to improved radiology technology that helps radiologists identify abnormalities in the human body, better security imaging at airports for detecting suspicious items in luggage, or enhanced camouflage designs to conceal soldiers in war zones, Geisler said.
“There are many potential applications of these findings. For example, radiological images are highly complex, like the natural images that drove the evolution of the human visual system. Thus, the perceptual laws for natural images may predict when a radiologist will have difficulty detecting suspicious objects in a radiological image. These predictions could be used to alert the radiologist to locations where extra scrutiny would be advised,” Geisler said.