Calibration Method For An Augmented Reality System
Related papers
Fast calibration for augmented reality
Proceedings of the ACM symposium on Virtual reality software and technology - VRST '99, 1999
Augmented Reality overlays computer-generated images over the real world. To correctly depict spatial relations between real and virtual objects, these images have to be generated using transformations which correctly project a point in virtual space onto its corresponding point in the real world.
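The projection the abstract describes can be sketched as a standard pinhole camera model: a world point is mapped into camera coordinates by the extrinsics (R, t) and then into pixels by the intrinsic matrix K. All numeric values below are illustrative, not taken from the paper.

```python
def project_point(K, R, t, X):
    """Project world point X through extrinsics (R, t) and intrinsics K.
    Inputs are plain nested lists; returns (u, v) pixel coordinates."""
    # Camera-frame coordinates: Xc = R @ X + t
    Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # Perspective division, then intrinsic mapping (focal lengths, principal point)
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]
    u = K[0][0] * x + K[0][2]
    v = K[1][1] * y + K[1][2]
    return u, v

# Illustrative intrinsics: 800 px focal length, 640x480 principal point at center
K = [[800.0, 0.0, 320.0],
     [0.0, 800.0, 240.0],
     [0.0, 0.0, 1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # identity rotation
t = [0.0, 0.0, 2.0]  # camera 2 m in front of the world origin
print(project_point(K, R, t, [0.5, 0.25, 0.0]))  # -> (520.0, 340.0)
```

Calibration, in this framing, is the problem of recovering K, R, and t accurately enough that virtual points land on their real-world counterparts.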
Calibration Requirements and Procedures for Augmented Reality
1997
Augmented reality entails the use of models and their associated renderings to supplement information in a real scene. In order for this information to be relevant or meaningful, the models must be positioned and displayed in such a way that they blend into the real world in terms of alignments, perspectives, illuminations, etc. For practical reasons the information necessary to obtain this realistic blending cannot be known a priori, and cannot be hard-wired into a system. Instead a number of calibration procedures are necessary so that the location and parameters of each of the system components are known. In this paper we identify the calibration steps necessary to build a complete computer model of the real world and then, using the augmented reality system developed at ECRC (Grasp) as an example, we describe each of the calibration processes.
Object calibration for augmented reality
Computer Graphics …, 1995
Augmented reality involves the use of models and their associated renderings to supplement information in a real scene. In order for this information to be relevant or meaningful, the models must be positioned and displayed in such a way that they align with their corresponding ...
Calibration Errors in Augmented Reality: A Practical Study
2005
This paper compares several theoretical camera models against reality and evaluates the suitability of these models for effective augmented reality (AR). It analyses what level of accuracy can be expected in real situations using a particular camera model and how robust the results are against realistic calibration errors. An experimental protocol is used that consists of taking images of a particular scene from cameras of different quality mounted on a 4DOF micro-controlled device. The scene is made of a calibration target and three markers placed at different distances from the target. This protocol enables us to consider assessment criteria specific to AR, such as alignment error and visual impression, in addition to the classical camera positioning error.
Robust camera pose estimation using 2d fiducials tracking for real-time augmented reality systems
Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry - VRCAI '04, 2004
Augmented reality (AR) deals with the problem of dynamically and accurately aligning virtual objects with the real world. Among existing methods, vision-based techniques are advantageous for AR applications: their registration can be very accurate, and there is no delay between the motion of the real and virtual scenes. However, the downfall of these approaches is their high computational cost and lack of robustness. To address these shortcomings we propose a robust camera pose estimation method based on tracking calibrated fiducials in a known 3D environment, in which the camera location is dynamically computed by the Orthogonal Iteration Algorithm. Experimental results show the robustness and effectiveness of our approach in the context of real-time AR tracking.
Camera Constraint on Multi-Range Calibration of Augmented Reality Systems for Construction Sites
2008
Accurate calibration methods are critical to achieving accurate registration in Augmented Reality (AR) systems. Most calibration methods developed so far focus on achieving accurate registration within certain view distance ranges. Examination of work tasks in construction, however, indicates that the AR system registration accuracy may not satisfy the accuracy demands of work tasks due to variations in the view ...
Automated camera calibration and 3D egomotion estimation for augmented reality applications
Lecture Notes in Computer Science, 1997
This paper addresses the problem of accurately tracking the 3D motion of a monocular camera in a known 3D environment and dynamically estimating the 3D camera location. For that purpose we propose a fully automated landmark-based camera calibration method to initialize a motion estimator, which employs extended Kalman filter techniques to track landmarks and to estimate the camera location at any given time. The implementation of our approach has proven to be efficient and robust, and our system successfully tracks in real time at approximately 10 Hz. We show tracking results of various augmented reality scenarios.
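The filtering step the abstract mentions can be illustrated, in heavily simplified form, with a linear Kalman filter on a single constant-velocity state (the paper's estimator is an extended Kalman filter over the full 6DOF camera pose; the scalar model, noise values, and time step below are illustrative assumptions only):

```python
def kf_step(x, P, z, dt=0.1, q=1e-3, r=1e-2):
    """One predict/update cycle of a linear Kalman filter with state
    x = [position, velocity] and a scalar position measurement z."""
    # --- predict: x' = F x with F = [[1, dt], [0, 1]]; P' = F P F^T + qI
    xp = [x[0] + dt * x[1], x[1]]
    Pp = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
           P[0][1] + dt * P[1][1]],
          [P[1][0] + dt * P[1][1],
           P[1][1] + q]]
    # --- update: H = [1, 0] observes position only
    S = Pp[0][0] + r                      # innovation covariance
    K = [Pp[0][0] / S, Pp[1][0] / S]      # Kalman gain
    y = z - xp[0]                         # innovation (measurement residual)
    xn = [xp[0] + K[0] * y, xp[1] + K[1] * y]
    Pn = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
          [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
    return xn, Pn

# Track a target moving at 1 unit/s using noiseless position measurements
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for k in range(1, 50):
    x, P = kf_step(x, P, z=k * 0.1)  # measurements follow position = t
print(x[1])  # estimated velocity converges toward the true 1.0
```

In a camera tracker of this kind, the measurements would be image-plane landmark positions rather than a scalar, and the state would hold pose and velocity terms, but the predict/update structure is the same.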
Marker-based non-overlapping camera calibration methods with additional support camera views
Image and Vision Computing, 2018
Simple methods to calibrate non-overlapping cameras using markers on the cameras are proposed. By adding an augmented reality (AR) marker to a camera, we can find the transformation between the fixed AR marker and the camera. With such information, the relative pose between cameras can be found as long as the markers are visible to additional support cameras. The proposed method consists of two steps: (1) use of an extra support camera and a chessboard to find the transformation between the AR marker and the camera and (2) use of the transformation between markers to calibrate non-overlapping cameras. Compared to an existing method, the proposed method works stably and uses fewer images.
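The chaining of transformations described in step (2) can be sketched by composing 4x4 homogeneous rigid transforms: camera 1 to its mounted marker A, marker A to the support camera, support camera to marker B, and marker B to camera 2. The frame names and all numeric poses below are hypothetical (pure translations, identity rotations, for clarity); T_Y_X denotes the transform taking X-frame coordinates into frame Y.

```python
def compose(*Ts):
    """Chain 4x4 homogeneous transforms (plain nested lists)."""
    out = Ts[0]
    for T in Ts[1:]:
        out = [[sum(out[i][k] * T[k][j] for k in range(4)) for j in range(4)]
               for i in range(4)]
    return out

def invert(T):
    """Invert a rigid transform [R | t] as [R^T | -R^T t]."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(Rt[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [Rt[0] + [t[0]], Rt[1] + [t[1]], Rt[2] + [t[2]], [0.0, 0.0, 0.0, 1.0]]

def translation(x, y, z):
    return [[1.0, 0, 0, x], [0, 1.0, 0, y], [0, 0, 1.0, z], [0, 0, 0, 1.0]]

# Hypothetical measurements:
T_markerA_cam1 = translation(0.0, 0.05, 0.0)     # step 1: marker A rigidly on camera 1
T_markerB_cam2 = translation(0.0, 0.05, 0.0)     # step 1: marker B rigidly on camera 2
T_support_markerA = translation(-1.0, 0.0, 2.0)  # support camera observes marker A
T_support_markerB = translation(1.0, 0.0, 2.0)   # ... and marker B

# Step 2: camera 1 -> camera 2 by chaining through both markers and the support view
T_cam1_cam2 = compose(invert(T_markerA_cam1),
                      invert(T_support_markerA),
                      T_support_markerB,
                      T_markerB_cam2)
print([row[3] for row in T_cam1_cam2[:3]])  # relative translation: [2.0, 0.0, 0.0]
```

The key point is that the two cameras never need to share a field of view: the support camera only needs to see both mounted markers simultaneously.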
Calibration-free augmented reality
IEEE Transactions on Visualization and Computer Graphics, 1998
Camera calibration and the acquisition of Euclidean 3D measurements have so far been considered necessary requirements for overlaying three-dimensional graphical objects with live video. In this article, we describe a new approach to video-based augmented reality that avoids both requirements: It does not use any metric information about the calibration parameters of the camera or the 3D locations and dimensions of the environment's objects. The only requirement is the ability to track across frames at least four fiducial points that are specified by the user during system initialization and whose world coordinates are unknown.
Large Scale Calibration for Augmented Reality on Construction Sites
2000
Accurate calibration methods are key to achieving accurate registration in AR systems. Most calibration methods developed thus far represent the system accuracy as pixel errors regardless of the view distance. Examination of work tasks in construction, however, indicates that the system accuracy may not satisfy the accuracy demands of specific work tasks due to variations in the view distance. This ...