The extent of visual space inferred from perspective angles
Related papers
The perspective structure of visual space
Luneburg’s model has been the reference for experimental studies of visual space for almost seventy years. His claim of a curved visual space has been a source of inspiration for visual scientists as well as philosophers. Many experimental studies have concluded that Luneburg’s model does not describe visual space across various tasks and conditions, yet, remarkably, no alternative model has been suggested. The current study explores perspective transformations of Euclidean space as a model for visual space. Computations show that the geometry of perspective spaces differs considerably from that of Euclidean space. Collinearity, but not parallelism, is preserved in perspective space, and angles are not invariant under translation and rotation. Similar relationships have been shown to be properties of visual space. Alley experiments performed early in the twentieth century were instrumental in hypothesizing curved visual spaces. Alleys were computed in perspective space and compared with reconstructed alleys of Blumenfeld. Parallel alleys were accurately described by perspective geometry. Accurate distance alleys were derived from parallel alleys by adjusting the interstimulus distances according to the size-distance invariance hypothesis. Agreement between computed and experimental alleys, together with accommodation of experimental results that rejected Luneburg’s model, shows that perspective space is an appropriate model for how we perceive orientations and angles. The model is also appropriate for perceived distance ratios between stimuli but fails to predict perceived distances.
Perception of Perspective Angles
We perceive perspective angles, that is, angles that have an orientation in depth, differently from what they are in physical space. Extreme examples are the angles between the rails of a railway line or between the lane dividers of a long, straight road. In this study, subjects judged perspective angles between bars lying on the floor of the laboratory. Perspective angles were also estimated from pictures taken from the same point of view. Converging and diverging angles were judged to test three models of visual space. Four subjects evaluated the perspective angles by matching them to nonperspective angles, that is, angles between the legs of a compass oriented in the frontal plane. All subjects judged both converging and diverging angles to be larger than the physical angle and smaller than the angles in the proximal stimuli. A model of shallow visual space describes the results. According to the model, lines parallel to visual lines, which vanish at infinity in physical space, converge to visual lines in visual space. The perceived shape of perspective angles is incompatible with the perceived length and width of the bars. The results have significance for models of visual perception and practical implications for driving and flying in poor visibility conditions.
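The gap between a physical angle on the floor and its proximal (image-plane) counterpart can be illustrated with a pinhole projection. The setup below (eye height, viewing geometry, bar length) is an illustrative assumption, not the paper's apparatus:

```python
import math

# Illustrative pinhole sketch (assumed setup, not the paper's apparatus):
# eye at the origin, EYE_H above the floor, optical axis horizontal (+y),
# image plane at distance F. A symmetric angle lies flat on the floor,
# opening away from the observer.
EYE_H, F = 1.5, 1.0

def project(p):
    """Project a 3-D point (x, y, z) onto the image plane y = F."""
    x, y, z = p
    return (F * x / y, F * z / y)

def proximal_angle(y0, half_angle_deg, length=1.0):
    """Image-plane (proximal) angle of a floor angle of 2*half_angle_deg,
    with its vertex y0 meters ahead of the observer."""
    a = math.radians(half_angle_deg)
    vertex = (0.0, y0, -EYE_H)
    ends = [(s * length * math.sin(a), y0 + length * math.cos(a), -EYE_H)
            for s in (1.0, -1.0)]
    pv = project(vertex)
    u, w = [(px - pv[0], py - pv[1]) for px, py in map(project, ends)]
    cos_t = (u[0] * w[0] + u[1] * w[1]) / (math.hypot(*u) * math.hypot(*w))
    return math.degrees(math.acos(cos_t))

# A 20-degree physical angle with its vertex 5 m ahead projects to a
# proximal angle of roughly 61 degrees.
print(round(proximal_angle(5.0, 10.0), 1))
```

The computation reproduces the ordering reported in the abstract: the proximal angle is much larger than the physical angle, and the judged angles fell between the two.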
Geometric Constraints of Visual Space
i-Perception, 2021
Perspective space has been introduced as a computational model of visual space. The model is based on geometric features of visual space and has proven to describe a range of phenomena related to the visual perception of distance and size. Until now, however, the model has lacked a mathematical description that holds for complete 3D space. Starting from a previously derived equation for perceived distance in the viewing direction, the suitability of various functions is analyzed. A function must fulfill the requirement that straight lines, oriented in whatever direction in physical space, transfer to straight lines in visual space. A second requirement is that parallel lines oriented in depth in physical space converge to a finite vanishing point in visual space. A rational function for perceived distance, compatible with the perspective-space model of visual space, satisfies both requirements. The function is unique: analysis of alternative functions shows there is little tolerance for deviations. Conservation of the straightness of lines constrains visual space to a single geometry. Visual space is described by an analytical function having one free parameter, namely the distance of the vanishing point.
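The abstract does not reproduce the equation itself; a minimal sketch of a rational perceived-distance function with a single vanishing-point parameter, using a form assumed here for illustration, is:

```python
# Sketch of a rational perceived-distance function with a single free
# parameter d_v, the distance of the vanishing point in visual space.
# The specific form d' = d_v * d / (d_v + d) is an assumption for
# illustration; the abstract does not state the equation.

def perceived_distance(d, d_v=50.0):
    """Perceived distance d' for physical distance d (same units as d_v)."""
    return d_v * d / (d_v + d)

# Near the observer d' tracks d; far away d' saturates at d_v, so lines
# parallel in depth converge to a finite vanishing point in visual space.
for d in (1.0, 10.0, 100.0, 1000.0):
    print(d, round(perceived_distance(d), 2))
```

Any function of this family is nearly veridical at short range and bounded at long range, which is the qualitative behavior the two requirements demand.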
Perspective Space as a Model for Distance and Size Perception
i-Perception, 2017
In the literature, perspective space has been introduced as a model of visual space. Perspective space is grounded in the perspective nature of visual space during both binocular and monocular vision. A single parameter, the distance of the vanishing point, transforms the geometry of physical space into that of perspective space. The perspective-space model predicts perceived angles, distances, and sizes. Here the model is compared with other models of distance and size perception. Perspective space predicts that perceived distance and size as a function of physical distance are described by hyperbolic functions. Alternatively, power functions have been widely used to describe perceived distance and size. Comparison of the two shows that power and hyperbolic functions are equivalent within the range of distances that have been judged in experiments. Two models describing perceived distance on the ground plane also prove equivalent to the perspective-space model. The conclusion is that perspective space unifies a number of models of distance and size perception.
Spatial Judgments from Different Vantage Points: A Different Perspective
PsycEXTRA Dataset
Todorovic (2008) reported systematic errors in the perception of 3-D space when viewing 2-D linear perspective drawings, depending on the observer's vantage point. Because Todorovic's findings were restricted to the horizontal plane, the current study was designed to determine whether the magnitude of these errors would be similar in the vertical plane. Participants viewed a 2-D image containing rows of columns aligned on lines that are parallel in the depicted scene and recede to a vanishing point in the image. They were asked to judge where in the physical room the next column should be placed. The results support Todorovic (2008) in that systematic deviations in the spatial judgments depended on vantage point for both the horizontal and vertical planes. However, the pattern of deviation differed between the two planes. While judgments in both planes failed to compensate adequately for the vantage-point shift, the vertical plane induced greater distortions of the stimulus image itself within each vantage point.
The illusion of perceived metric 3D structure
IEEE Symposium on Information Visualization (InfoVis), 2002
A large body of results on the characteristics of human spatial vision suggests that space perception is distorted. Recent studies indicate that the geometry of visual space is best understood as affine. If this is the case, it has far-reaching implications for how 3D visualizations can be successfully employed. For instance, all attempts to build visualization systems in which users are expected to discover relations based on Euclidean distances or shapes will be ineffective. Because visualization can, and sometimes does, employ all possible types of depth information, and because results from vision research usually concentrate on one or two such types, three experiments were performed under near-optimal viewing conditions. The aim of the experiments was twofold: to test whether the earlier findings generalize to shape perception under near-optimal viewing conditions, and to get a sense of the size of the error under such conditions. The results show that the findings do generalize and that the errors are large. The implications of these results for successful visualizations are discussed.
Visual perception of extent and the geometry of visual space
Vision Research, 2004
The question of how perceived extents are related to the corresponding physical extents is a very old one that has not been satisfactorily answered. The common model is that perceived extent is proportional to the product of image size and perceived distance. We describe an experiment showing that perceived extents are substantially larger than this model predicts. We propose a model that accounts for our results and for a large set of other results. The principal assumption of the model is that, in the computation of perceived extent, the visual-angle signal undergoes a magnifying transform. Extent is often perceived more accurately than the common model predicts, so the computation is adaptive. The model implies that, although the perception of location and the perception of extent are related, they are not related by Euclidean geometry, nor by any metric geometry. Nevertheless, it is possible to describe the perception of location and extent using a simple model.
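The common model, and the effect of magnifying the visual-angle signal, can be sketched numerically. The constant gain used for the transform below is a placeholder assumption; the abstract does not specify the form of the paper's transform:

```python
import math

def visual_angle(extent, distance):
    """Visual angle (radians) subtended by a frontal extent at a distance."""
    return 2.0 * math.atan(extent / (2.0 * distance))

def common_model(extent, distance, perceived_dist):
    """Common model: perceived extent ~ image size x perceived distance."""
    return visual_angle(extent, distance) * perceived_dist

def magnified_model(extent, distance, perceived_dist, k=1.3):
    """Same computation after a magnifying transform of the visual-angle
    signal; the simple gain k is an illustrative placeholder, since the
    abstract does not specify the transform."""
    return k * visual_angle(extent, distance) * perceived_dist

# A 1 m frontal rod at 10 m, with perceived distance 9 m: the common model
# predicts less than 1 m, while the magnified signal yields a larger extent.
print(round(common_model(1.0, 10.0, 9.0), 3),
      round(magnified_model(1.0, 10.0, 9.0), 3))
```

Whenever perceived distance underestimates physical distance, the common model under-predicts extent; a magnifying transform on the visual angle pushes the prediction back toward, or beyond, the physical value.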
Vision, 2018
This paper attempts to differentiate between two models of visual space. One model suggests that visual space is a simple affine transformation of physical space. The other proposes that it is a transformation of physical space via the laws of perspective. The present paper reports two experiments in which participants were asked to judge the size of the interior angles of squares at five different distances from the participant. The perspective-based model predicts that the angles within each square on the side nearest to the participant should seem smaller than those on the far side. The simple affine model under our conditions predicts that the perceived size of the angles of each square should remain 90°. Results of both experiments were most consistent with the perspective-based model. The angles of each square on the near side were estimated to be significantly smaller than the angles on the far side for all five squares in both experiments. In addition, the sum of the estimated sizes of the four angles of each square declined with increasing distance from the participant to the square and was less than 360° for all but the nearest square.
Axiomathes, 2019
Every linear perspective image has a center of the perspective construction. Only when observed from that location does a 2D image provide the same stimulus as the original 3D scene. Geometric analyses indicate that observing the image from other vantage points should affect the perceived spatial structure of the scene conveyed by the image, involving transformations such as shear, compression, and dilation. Based on previous research, this paper presents a detailed account of these transformations. The analyses are presented in a uniform manner, illustrated with special 3D diagrams, and embedded in a wider framework of related perspective paradigms. Such analyses provide the potential theoretical basis for empirical work on the effects of changes of observer vantage points on the perception of spatial structure in perspective images.