Multiple distance cues do not prevent systematic biases in reach to grasp movements

Publication year: 2019
Published in: Psychological Research
Authors: C. Bozzacchi, Karl K. Kopiske, Robert Volcic, Fulvio Domini

The perceived distance of objects is biased depending on the distance from the observer at which objects are presented, such that the egocentric distance tends to be overestimated for closer objects, but underestimated for objects further away. This leads to the perceived depth of an object (i.e., the perceived distance from the front to the back of the object) also being biased, decreasing with object distance. Several studies have found the same pattern of biases in grasping tasks. However, in most of those studies, object distance and depth were solely specified by ocular vergence and binocular disparities. Here we asked whether grasping objects viewed from above would eliminate distance-dependent depth biases, since this vantage point introduces additional information about the object’s distance, given by the vertical gaze angle, and its depth, given by contour information. Participants grasped objects presented at different distances (1) at eye-height and (2) 130 mm below eye-height, along their depth axes. In both cases, grip aperture was systematically biased by the object distance along most of the trajectory. The same bias was found whether the objects were seen in isolation or above a ground plane to provide additional depth cues. In two additional experiments, we verified that a consistent bias occurs in a perceptual task. These findings suggest that grasping actions are not immune to biases typically found in perceptual tasks, even when additional cues are available. However, online visual control can counteract these biases when direct vision of both digits and final contact points is available.
