This thesis describes an approach for the online calibration of stereo cameras on an embedded system. It introduces a new functionality in cyber-physical systems: measuring the quality of service of this calibration. The manuscript thus proposes dynamically monitoring and computing the internal sensor parameters required by many computer vision tasks. The method both increases safety and improves the performance of systems that use stereo cameras. This self-repair procedure prolongs the life of the devices and can increase their autonomy. Systems such as mobile robots or smart glasses in particular can benefit directly from this technique.
A stereo camera is a sensor capable of providing a wide spectrum of data, but it must first know the relative position of each camera; the system therefore requires an initial calibration. However, the camera positions can change over time due to interactions with the external environment (shocks, vibrations...). A recalibration operation must then correct these effects, since misinterpreted data can lead to errors and malfunctions in applications. To counter such a scenario, the system must have an internal quality-of-service mechanism that decides whether the current parameters are still correct and, if necessary, computes new ones.
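The quality-of-service decision described above hinges on a measurable calibration error. One common metric for stereo rigs (offered here as an illustration, not necessarily the metric chosen in the thesis) is the Sampson epipolar error of matched feature points under the current fundamental matrix: when the cameras have moved, matched points stop satisfying the epipolar constraint and the error grows. A minimal NumPy sketch, with a hypothetical `sampson_error` helper:

```python
import numpy as np

def sampson_error(F, pts_left, pts_right):
    """Mean Sampson epipolar error of point matches under fundamental matrix F.

    pts_left, pts_right: (N, 3) arrays of homogeneous pixel coordinates.
    A well-calibrated rig yields an error near zero; decalibration raises it.
    """
    Fx1 = pts_left @ F.T      # rows are F @ x1 (epipolar lines in the right image)
    Ftx2 = pts_right @ F      # rows are F^T @ x2 (epipolar lines in the left image)
    num = np.sum(pts_right * Fx1, axis=1) ** 2          # (x2^T F x1)^2
    den = Fx1[:, 0]**2 + Fx1[:, 1]**2 + Ftx2[:, 0]**2 + Ftx2[:, 1]**2
    return float(np.mean(num / den))

def calibration_ok(F, pts_left, pts_right, threshold=0.5):
    """Hypothetical QoS decision: parameters are 'correct' below a pixel threshold."""
    return sampson_error(F, pts_left, pts_right) < threshold
```

For an ideal rectified rig (pure horizontal baseline, identity rotation), matched points differ only by a horizontal disparity and the error is zero; shifting the right-image points vertically, as a decalibration would, makes the error nonzero and trips the threshold.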
The approach proposed in this thesis realizes a self-calibration method based solely on data coming from the observed scene (without controlled calibration targets). We first consider calibration as a system process that runs continuously in the background, in real time. This internal calibration is not the main task of the system, but a procedure on which high-level applications rely. For this reason, system constraints severely limit the algorithm in terms of complexity, memory, and time. The proposed calibration method requires few resources and reuses standard data from computer vision applications, so it is hidden within the application pipeline.
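Run as a background service, such a check can be folded into the normal frame loop with very little state. The sketch below shows a hypothetical `CalibrationMonitor` that averages a per-frame error metric over a sliding window before flagging recalibration, so that a single noisy frame does not trigger the costly procedure; it illustrates the service idea under these assumptions and is not the thesis implementation.

```python
class CalibrationMonitor:
    """Hypothetical quality-of-service monitor for a background calibration service.

    Each frame reports its calibration error (e.g., a mean epipolar error);
    recalibration is flagged only when the sliding-window mean exceeds the
    threshold, keeping the per-frame cost to a few arithmetic operations.
    """

    def __init__(self, threshold=0.5, window=5):
        self.threshold = threshold
        self.window = window
        self.errors = []

    def update(self, frame_error):
        """Record one frame's error; return True if recalibration is needed."""
        self.errors.append(frame_error)
        self.errors = self.errors[-self.window:]   # keep only the last `window` frames
        return self.needs_recalibration()

    def needs_recalibration(self):
        # Wait for a full window, then decide on the mean so one outlier
        # frame cannot trigger the recalibration procedure by itself.
        if len(self.errors) < self.window:
            return False
        return sum(self.errors) / len(self.errors) > self.threshold
```

The sliding window is the design choice that keeps the service cheap enough to hide inside the application pipeline: memory is bounded by `window`, and the main task only pays for the full recalibration when the flag is raised.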
This manuscript also discusses many topics related to online stereo calibration on embedded systems, such as the extraction of robust points of interest, the computation of the scale factor, hardware aspects, and the high-level applications that require this approach. It then describes a methodology for building a new type of dataset, one that represents a change in camera position, in order to validate the approach, and explains the different work environments used to produce the datasets as well as the camera calibration procedure. In addition, it presents a first prototype of a smart helmet on which the proposed self-calibration service is executed dynamically. Finally, a real-time characterization is performed on an embedded ARM Cortex-A7 processor.