Maria Rendas | Centre National de la Recherche Scientifique / French National Centre for Scientific Research
Papers by Maria Rendas
We consider a non-parametric Maximum Likelihood (ML) density estimation problem with censored observations for a biophysical model describing the production of nitrogen bubbles during deep-sea diving. The ultimate objective is to predict the bubble production associated with a diving profile, in order to prevent decompression sickness accidents. Observations correspond to quantized counts of bubbles circulating in the blood (right ventricle), called grades, for a series of divers that used various diving profiles, each diver having an individual parameter value for the biophysical model. Assuming that the relation between observed grades and volume of gas produced is known, we estimate the distribution of the model parameters in the population of divers considered by maximum likelihood. (Work partially funded by the DGA–RAPID project “SAFE DIVE”.)
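A minimal sketch may help fix ideas: if the distribution of the individual parameter is restricted to a finite set of support points, the likelihood of the observed grades is a finite mixture, and the non-parametric ML weights can be obtained by EM. The grade-likelihood matrix `L` below is a toy stand-in; the paper's biophysical grade model is not reproduced here.

```python
# Sketch: non-parametric ML estimation of a mixing distribution from
# quantized (censored) observations. Illustrative assumptions only.
import numpy as np

def npml_em(L, n_iter=500):
    """EM for the weights w of a discrete mixing distribution.

    L[i, j] = P(observed grade of diver i | support point theta_j),
    assumed precomputed from the (known) grade/volume relation.
    """
    n, m = L.shape
    w = np.full(m, 1.0 / m)               # uniform initial weights
    for _ in range(n_iter):
        r = L * w                         # unnormalised responsibilities
        r /= r.sum(axis=1, keepdims=True) # posterior over support points
        w = r.mean(axis=0)                # M-step: update weights
    return w

# Toy usage: 3 support points, 5 divers with known grade likelihoods.
L = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6],
              [0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
print(npml_em(L))
```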
2018 26th European Signal Processing Conference (EUSIPCO), 2018
This work compares the performance of parametric mixed-effects models to a completely non-parametric (NP) approach for modelling the life-long evolution of competition performance of athletes. The difficulty of the problem lies in the strongly unbalanced character of the functional dataset. The prediction performance of the identified models is compared, revealing the advantages and limitations of the two approaches. To the best of our knowledge, this is the first time NP modelling of athletic performance has been attempted; our study confirms its appropriateness whenever sufficiently rich datasets are available.
2010 20th International Conference on Pattern Recognition, 2010
IFAC Proceedings Volumes
In this paper we address the problem of chemical plume tracing using an autonomous underwater vehicle. We assume that turbulent water flow induces a random pattern in the readings of a binary (on/off) sensor used to detect the chemical being dispersed in the water. Our algorithm is completely sensor driven, not requiring independent measurements of the water flow direction and intensity. Instead, it relies on a stochastic model of the plume dispersion which is parametrised by the chemical source position, which we want to estimate, and by other nuisance parameters, comprising the water flow, which are jointly estimated along with the desired source location. Instead of being based on a discrete controller that chains a series of perception-driven behaviors, the mapping trajectory is chosen, at each step, as the direction along which the expected probability (given all data observed so far) of detecting the chemical is maximal.
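A hedged sketch of this guidance rule: maintain a particle approximation of the posterior over the source position, and move along the heading that maximises the expected detection probability. The exponential detection model and all numeric values below are illustrative assumptions, not the paper's plume dispersion model.

```python
# Sketch: choose the heading that maximises the expected probability of
# detecting the chemical under the current (particle) posterior.
import numpy as np

rng = np.random.default_rng(0)
particles = rng.normal([50.0, 20.0], 10.0, size=(1000, 2))  # posterior samples
weights = np.full(len(particles), 1e-3)                     # uniform weights

def p_detect(x, s, lam=15.0):
    # assumed detection model: probability decays with distance to source
    return np.exp(-np.linalg.norm(x - s, axis=-1) / lam)

def best_heading(pos, step=2.0, n_headings=16):
    headings = np.linspace(0, 2 * np.pi, n_headings, endpoint=False)
    scores = []
    for h in headings:
        x_next = pos + step * np.array([np.cos(h), np.sin(h)])
        scores.append(np.sum(weights * p_detect(x_next, particles)))
    return headings[int(np.argmax(scores))]

print(best_heading(np.array([0.0, 0.0])))
```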
The paper is a contribution to the problem of producing sensor-specific seabed maps that describe the morphological features of each distinct region in the area considered. It applies the Minimum Description Length (MDL) principle to identify the dimensionality of a statistical shape model that describes them. Our shape model combines the finite-dimensional representation of continuous curves with the notion of shape space as introduced by Kendall to obtain a finite representation of closed curves with a prespecified set of invariances. The work presented can be considered an extension of previous work by other authors who use MDL to adjust the complexity of a spline representation of a single contour present in an image, while we use MDL to adjust the complexity of a statistical model for the spline representation of a set of contours.
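The MDL selection step can be illustrated with a small self-contained example: the chosen model dimension minimises a two-part description length, a negative log-likelihood plus a parameter-cost term. Polynomial regression stands in here for the paper's spline shape model.

```python
# Sketch: MDL-style order selection via a two-part code length.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 60)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)

def description_length(k):
    coeffs = np.polyfit(x, y, k)
    resid = y - np.polyval(coeffs, x)
    sigma2 = max(resid.var(), 1e-12)
    # Gaussian negative log-likelihood of the residuals ...
    nll = 0.5 * x.size * np.log(2 * np.pi * sigma2) + 0.5 * x.size
    # ... plus the cost of encoding the k+1 parameters
    return nll + 0.5 * (k + 1) * np.log(x.size)

best = min(range(1, 12), key=description_length)
print("selected order:", best)
```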
This paper describes adaptive sampling of the bathymetry of a prescribed area using an inexpensive autonomous vehicle equipped with a bathymetric sensor. The proposed mapping approach should not be understood as an alternative to the present use of oceanographic vessels equipped with sophisticated sensors, but rather as a complementary observation tool, enabling rapid, inexpensive and frequent observation of areas under study. In some sense, the high quality of traditional surveying techniques is traded for the possibility of achieving a denser temporal sampling, and direct acquisition of relevant surface features, which may be important to enable analysis of the impact of temporally localized perturbations, like storms, or of external (human) intervention in a given area. The work described has been carried out in the context of the EU-funded (IST program) project SUMARE (Survey of Marine Resources), which develops adaptive sampling techniques for the observation and monitoring...
The paper presents work on the construction of video mosaics of the sea bottom, acquired by an autonomous underwater robot. The approach presented, based on the Minimum Description Length, while strongly grounded in probabilistic theory, is able to solve the image registration problem without need for user-defined probabilistic models (no need for either world or sensor models), presenting several advantages over traditional techniques. Mosaic construction is based on a dense, locally estimated, pure translational displacement field. The fact that the proposed metric for image co-registration does not require the presence of distinctive features (like corner-like structures), together with the fact that it is inherently robust to smooth illumination variations across the image plane, makes it particularly suitable for the underwater environment, where individual features most often present a strong resemblance, and where illumination variations are known to be frequent, due to the need to carry the light source on board the platform. Even if a pure translational...
We study the Cramér-Rao lower bound for source localization in the context of multiple stochastic sources, multipath propagation, and observations in an array of sensors. We derive a general expression which is then specialized to simpler configurations and related to results previously reported in the literature. The special case of a single stochastic source in a multipath environment is treated in detail. We assess the relative importance for source localization of the temporal (multipath) and spatial (array baseline) structures of the incoming wavefield. We show that for an array of K sensors the multipath contribution to the Fisher information matrix can be interpreted as the contribution of K independent arrays whose size depends on the number of spatially resolved replicas. We analyze the degradation due to unknown source spectra. When the source spectrum is completely arbitrary, source location is not possible with a single sensor. If a parametric form of the source spectrum...
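For readers who want to see a bound of this type computed numerically, the sketch below builds the Fisher information matrix for a toy deterministic-mean Gaussian range model and inverts it. The array geometry and signal model are assumptions for illustration; the paper's stochastic-source, multipath setting is far more general.

```python
# Sketch: Cramér-Rao bound for a toy source-localisation problem with a
# deterministic-mean Gaussian observation model (Slepian-Bangs form).
import numpy as np

sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
sigma = 0.5   # assumed noise std of the range measurements

def mean_obs(theta):
    # idealised noiseless ranges from source position theta to each sensor
    return np.linalg.norm(sensors - theta, axis=1)

def crb(theta, eps=1e-5):
    J = np.empty((len(sensors), 2))
    for a in range(2):                       # finite-difference Jacobian
        d = np.zeros(2)
        d[a] = eps
        J[:, a] = (mean_obs(theta + d) - mean_obs(theta - d)) / (2 * eps)
    fim = J.T @ J / sigma**2                 # Fisher information matrix
    return np.linalg.inv(fim)                # CRB on the position estimate

print(crb(np.array([3.0, 4.0])))
```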
Increasingly complex numerical models are involved in a variety of modern engineering applications, ranging from evaluation of environmental risks to optimisation of sophisticated industrial processes. The study of climate change is an extremely well-known example, while current uses in other domains like pharmaceutics (the so-called in silico experiments), aeronautics or even cosmetics are less well known to the general public. These models allow the prediction of a number of variables of interest for a given configuration of a number of factors that potentially affect them. Complex models depend in general on a large number of such factors, and their execution time may range from a couple of hours to several days. In many cases, collectively falling in the domain of risk analysis, the interest is in identifying how often, under what conditions, or how strongly a certain phenomenon may happen. In addition to the numerical model that predicts the variable of interest, it is then necessary...
The incremental construction of nested designs having good spreading properties over the d-dimensional hypercube is considered, for values of d such that the 2^d vertices of the hypercube are too numerous to all be inspected. A greedy algorithm is used, with guaranteed efficiency bounds in terms of packing and covering radii, using a 2^(d−m) fractional-factorial design as candidate set for the sequential selection of design points. The packing and covering properties of fractional-factorial designs are investigated and a review of the related literature is provided. An algorithm for the construction of fractional-factorial designs with maximum packing radius is proposed. The spreading properties of the obtained incremental designs, and of their lower-dimensional projections, are investigated. An example with d = 50 is used to illustrate that their projection in a space of dimension close to d has a much higher packing radius than projections of more classical designs based on Latin hypercubes...
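The greedy step itself is simple to state in code: each new point maximises its distance to the current design (the "coffee-house" rule), which is what yields the packing/covering efficiency bounds. A random binary candidate set stands in below for the paper's 2^(d−m) fractional-factorial candidate set.

```python
# Sketch: greedy maximin (coffee-house) construction of a nested design.
import numpy as np

rng = np.random.default_rng(2)
# random subset of {0,1}^50 standing in for a fractional-factorial candidate set
candidates = rng.integers(0, 2, size=(4096, 50)).astype(float)

def greedy_maximin(cand, n_points):
    design = [cand[0]]
    dmin = np.linalg.norm(cand - design[0], axis=1)   # distance to design
    for _ in range(n_points - 1):
        i = int(np.argmax(dmin))                      # farthest candidate
        design.append(cand[i])
        dmin = np.minimum(dmin, np.linalg.norm(cand - cand[i], axis=1))
    return np.array(design)

D = greedy_maximin(candidates, 20)
print(D.shape)
```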
OCEANS 2015 - MTS/IEEE Washington, Oct 1, 2015
The paper proposes rCV, a new randomised Cross Validation (CV) criterion specially designed for use with data acquired over non-uniformly scattered designs, like the linear transect surveys typical in environmental observation. The new criterion enables a robust parameterisation of interpolation algorithms, in a manner completely driven by the data and free of any modelling assumptions. The new CV method randomly chooses the hold-out sets such that they reflect, statistically, the geometry of the design with respect to the unobserved points of the area where the observations are to be extrapolated, minimising biases due to the particular geometry of the designs. Numerical results on both simulated and realistic datasets show its robustness and superiority, leading to interpolated fields with smaller error.
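One plausible reading of the hold-out construction is sketched below: sample unobserved prediction sites at random and hold out, for each, the closest design point, so that hold-out distances statistically mimic extrapolation distances. This is an illustration under stated assumptions, not the authors' exact rCV procedure.

```python
# Sketch: randomised hold-out sets reflecting the design geometry.
import numpy as np

rng = np.random.default_rng(3)
design = rng.uniform(0, 1, size=(40, 2))   # e.g. transect-like survey points
grid = rng.uniform(0, 1, size=(500, 2))    # unobserved prediction sites

def rcv_holdout(design, grid, k=5):
    sites = grid[rng.choice(len(grid), size=k, replace=False)]
    # for each sampled site, hold out the design point closest to it
    idx = np.unique(np.argmin(
        np.linalg.norm(design[None] - sites[:, None], axis=2), axis=1))
    return idx                              # indices of held-out design points

print(rcv_holdout(design, grid))
```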
Technometrics, 2016
We consider the problem of constructing metamodels for computationally expensive simulation codes; that is, we construct interpolators/predictors of function values (responses) from a finite collection of evaluations (observations). We use Gaussian process modeling and Kriging, and combine a Bayesian approach, based on a finite set of covariance functions, with the use of localized models, indexed by the point where the prediction is made. Our approach does not yield a single generative model for the unknown function, but by letting the weights of the different covariance functions depend on the prediction site, it gives enough flexibility for predictions to accommodate non-stationarity. Contrary to Kriging prediction with plug-in parameter estimates, the resulting Bayesian predictor is constructed explicitly, without requiring any numerical optimization. It inherits the smoothness properties of the covariance functions that are used, and its superiority over the plug-in Kriging predictor (sometimes also called empirical-best-linear-unbiased predictor) is illustrated on various examples, including the reconstruction of an oceanographic field from a small number of observations.
We consider the problem of constructing metamodels for computationally expensive simulation codes; that is, we construct interpolators/predictors of function values (responses) from a finite collection of evaluations (observations). We use Gaussian process (GP) modeling and kriging, and combine a Bayesian approach, based on a finite set of GP models, with the use of localized covariances indexed by the point where the prediction is made. Our approach is not based on postulating a generative model for the unknown function, but by letting the covariance functions depend on the prediction site, it provides enough flexibility to accommodate arbitrary nonstationary observations. Contrary to kriging prediction with plug-in parameter estimates, the resulting Bayesian predictor is constructed explicitly, without requiring any numerical optimization, and locally adjusts the weights given to the different models according to the data variability in each neighborhood. The predictor inherits the ...
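A compact sketch of the site-dependent mixture idea behind the two abstracts above: compute a simple kriging prediction for each candidate covariance on the nearest observations, and weight the predictions by each model's local GP marginal likelihood. The kernels, the nearest-neighbour localisation and the weighting rule are illustrative assumptions, not the paper's exact construction.

```python
# Sketch: Bayesian mixture of kriging predictors with site-dependent weights.
import numpy as np

rng = np.random.default_rng(4)
X = rng.uniform(0, 1, size=(30, 1))
y = np.sin(6 * X[:, 0]) + 0.05 * rng.normal(size=30)
length_scales = [0.05, 0.2, 0.8]                 # candidate covariance models

def k_se(A, B, ls):
    return np.exp(-0.5 * ((A - B.T) / ls) ** 2)  # squared-exponential kernel

def predict(x, k_local=10):
    near = np.argsort(np.abs(X[:, 0] - x))[:k_local]   # local neighbourhood
    Xl, yl = X[near], y[near]
    preds, logml = [], []
    for ls in length_scales:
        K = k_se(Xl, Xl, ls) + 1e-4 * np.eye(k_local)
        a = np.linalg.solve(K, yl)
        preds.append(k_se(np.array([[x]]), Xl, ls) @ a)  # kriging prediction
        sign, logdet = np.linalg.slogdet(K)
        logml.append(-0.5 * yl @ a - 0.5 * logdet)       # local GP evidence
    w = np.exp(logml - np.max(logml))
    w /= w.sum()                                  # site-dependent model weights
    return float(np.dot(w, np.ravel(preds)))

print(predict(0.37))
```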
In this paper we present a novel approach to mobile robot navigation in unstructured environments. Natural scenes can very often be considered as random fields where a large number of individual objects appear to be randomly scattered. This randomness can be described by statistical models. In this paper we consider that a natural scene can be interpreted as a realisation of Random Closed Sets (RCS), whose global characteristics are mapped. Contrary to the feature-based approach, this environment representation does not require the existence of outstanding objects in the workspace, and is robust with respect to small dynamic changes. We address the problem of estimating the position of a mobile robot, assuming that a statistical model, serving as a map of the environment, is available to it a priori. Simulation results demonstrate the feasibility of our approach.
This paper presents a communication/coordination/processing architecture for distributed adaptive observation of a spatial field using a fleet of autonomous mobile sensors. One of the key difficulties in this context is to design scalable algorithms for incremental fusion of information across platforms robust to what is known as the “rumor problem”. Incremental fusion is in general based on a Bayesian approach, and algorithms (e.g. Covariance Intersection, CI) which propagate consistent characterizations of the estimation error under this challenging situation have been proposed. In this paper, we propose to base inter-sensor fusion on a deterministic approach which considers that bounds on the observation errors are known, which is intrinsically robust to the rumor problem. We present the equations that enable the determination of the ellipsoidal domain of uncertainty that covers the intersection of the individual sets describing the sensors' uncertainty, and show that they solve ...
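The convex-combination fusion rule common to Covariance Intersection (the Bayesian baseline mentioned above) and to ellipsoidal outer-bounding of set intersections is easy to sketch. The trace-minimising choice of the mixing weight below is one common criterion, not necessarily the paper's.

```python
# Sketch: Covariance Intersection fusion of two estimates (x1, P1), (x2, P2).
import numpy as np

def fuse(x1, P1, x2, P2, n_grid=99):
    best = None
    for w in np.linspace(0.01, 0.99, n_grid):    # scan the mixing weight
        Pinv = w * np.linalg.inv(P1) + (1 - w) * np.linalg.inv(P2)
        P = np.linalg.inv(Pinv)                  # fused (consistent) covariance
        if best is None or np.trace(P) < np.trace(best[1]):
            x = P @ (w * np.linalg.solve(P1, x1)
                     + (1 - w) * np.linalg.solve(P2, x2))
            best = (x, P)
    return best

x, P = fuse(np.array([0.0, 0.0]), np.diag([1.0, 4.0]),
            np.array([0.5, 0.2]), np.diag([4.0, 1.0]))
print(x, P)
```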
We study the problem of model fitting in the framework of nested probabilistic families. Our criteria are: (i) sparsity of the identified representation; (ii) its ability to fit the (finite-length) data set available. As we show in this paper, current methodologies, often taking the form of penalized versions of the data likelihood, cannot simultaneously satisfy these requirements, as the examples presented clearly demonstrate. On the contrary, maximization of the Bayesian model posterior, even without assuming a complexity-penalizing prior, is able to select models of appropriate complexity, enabling sound determination of their parameters in a second step.
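A small numeric illustration of the claim: for nested polynomial models with a Gaussian prior on the coefficients, the Bayesian marginal likelihood is available in closed form and selects the correct order without any explicit complexity penalty. The priors and noise level are assumptions of this sketch.

```python
# Sketch: model-order selection by maximising the Bayesian marginal likelihood.
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(-1, 1, 40)
y = 1.0 - 2.0 * x + 0.1 * rng.normal(size=x.size)    # true model: order 1

def log_marginal(order, tau=1.0, sigma=0.1):
    Phi = np.vander(x, order + 1)                    # polynomial design matrix
    # marginal covariance of y under a N(0, tau^2 I) prior on coefficients
    C = tau**2 * Phi @ Phi.T + sigma**2 * np.eye(x.size)
    sign, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y)
                   + x.size * np.log(2 * np.pi))

best = max(range(6), key=log_marginal)
print("selected order:", best)
```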
We address the problem of surveying oceanic parameters using autonomous instrumented mobile platforms. As an example, we consider the problem of current mapping in coastal areas. We study the impact on survey efficiency of using a priori knowledge concerning the surveyed field for on-line guidance of the sensors, as an alternative to the classical approach of executing a predefined trajectory, or to the more recently proposed perception-driven observation strategies. Availability of this a priori model enables extrapolation of the measurements, as well as the determination of the information yielded by future observations, allowing the search for the best next observation point. In the paper, we present simulation results of the proposed on-line guidance based on information gain, and compare its efficiency to standard survey strategies.
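A toy version of guidance by information gain, under an assumed GP prior: the next observation point is the candidate with the largest posterior variance, which for a Gaussian model coincides with the largest entropy reduction. The kernel and candidate grid are illustrative assumptions.

```python
# Sketch: pick the next observation point by maximum GP posterior variance.
import numpy as np

rng = np.random.default_rng(5)
visited = rng.uniform(0, 1, size=(8, 2))    # points already sampled
grid = rng.uniform(0, 1, size=(200, 2))     # candidate next points

def k_se(A, B, ls=0.2):
    d2 = ((A[:, None] - B[None]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def next_point(visited, grid):
    K = k_se(visited, visited) + 1e-6 * np.eye(len(visited))
    Kvg = k_se(visited, grid)
    # posterior variance = prior variance (1) minus the explained part
    post_var = 1.0 - np.sum(Kvg * np.linalg.solve(K, Kvg), axis=0)
    return grid[int(np.argmax(post_var))]   # most informative candidate

print(next_point(visited, grid))
```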
In this communication we address the problem of shape recognition in the context of autonomous navigation of mobile robots. We present an association procedure which is able to establish the correspondence between elements of a learned fuzzy internal representation of the environment and currently perceived objects. Using this procedure, a mobile robot is able to periodically decrease the uncertainty affecting its position.
This paper addresses the impact of uncertainty modeling and of the use of perceptual guidance on the navigational autonomy of vehicles operating in unknown environments, concentrating on four main topics: security control, planning, perceptual guidance and mapping. We present some of the results obtained in the European Union project NARVAL, which globally aims at using the perception capabilities of an autonomous vehicle to extend its autonomy range. One of the major issues addressed is related to guaranteeing the vehicle's safety. We propose novel methodologies to (i) minimize the risk of losing the vehicle and (ii) avoid destructive interference with objects present in its workspace. Planning under uncertainty is also studied, and algorithms to define trajectories for guiding the robot to a target region, exploring a delimited environment, or for optimizing autonomous beacon laying, are presented. The problem of using perceptual data for direct control is discussed in connection...