Eric PARENT | AgroParisTech - Academia.edu
Papers by Eric PARENT
Climatic Change, 2009
Based on a previous township-scale model, a spatio-temporal framework is proposed to study the fluctuations of avalanche occurrence possibly resulting from climate change. The regional annual component is isolated from the total variability using a two-factor nonlinear analysis of variance. Moreover, relying on a Conditional AutoRegressive sub-model for the spatial effects, the structured time trend is distinguished from the random noise with different time series sub-models including autocorrelative, periodic and change-point models. The hierarchical structure obtained takes into account the uncertainty related to the estimation of the annual component for the quantification of the time trend. Bayesian inference is performed using Monte Carlo simulations. This allows a comparison of the different time series models and the prediction of future activity in an explicitly unsteady context. Application to the northern French Alps illustrates the information provided by the model's different components, mainly the spatial and temporal terms as well as the spatio-temporal fluctuation of the relative risk. For instance, it shows no strong modifications in mean avalanche activity or in the number of winters of low or high activity over the last 60 years. This suggests that climate change has recently had little impact on the avalanching rhythm in this region. However, significant temporal patterns are highlighted: a complex combination of abrupt changes and pseudo-periodic cycles of approximately 15 years. To anticipate the future response of snow avalanches to climate change, it is now necessary to correlate them with fluctuations of the constraining climatic factors.
Atmosphere, 2021
Assessment of human exposure to atmospheric metals is a challenge, and mosses seem to be good biomonitors for this purpose. Lacking roots, they are easy to collect and analyze. However, to our knowledge, no formal comparison has been made between cadmium (Cd) measurements in Grimmia mosses and alternative forecasts of atmospheric Cd pollution such as those produced by the CHIMERE chemistry transport model. This work aims at studying this link to improve further biomonitoring. We compare 128 Cd measurements in the cemetery mosses of the Paris and Lyon metropolitan areas (France) to CHIMERE Cd atmospheric forecasts. The area to consider around each cemetery for the CHIMERE forecasts has been defined by Kendall rank correlations between both information sources (Cd in mosses and CHIMERE Cd forecasts) computed for different area sizes. Then, we fit linear models to those two data sets, including step by step different sources of uncertainty. Finally, we calculate moss predictions to compare predictions and m...
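As a rough illustration of the workflow described above (selecting an averaging area by Kendall rank correlation, then fitting a linear model), here is a minimal Python sketch on synthetic data; the arrays, radii and noise levels are hypothetical stand-ins, not the paper's measurements or CHIMERE outputs.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
n_sites = 128
moss_cd = rng.lognormal(mean=-1.0, sigma=0.5, size=n_sites)   # Cd measured in mosses (toy values)
radii_km = [1, 2, 5, 10]                                       # candidate averaging areas around each cemetery
chimere = {r: moss_cd * (1.0 + 0.1 * r) + rng.normal(0.0, 0.1 + 0.05 * r, n_sites)
           for r in radii_km}                                  # toy CHIMERE forecasts per radius

# 1) Rank agreement between the two information sources for each area size
taus = {r: kendalltau(moss_cd, chimere[r])[0] for r in radii_km}
best_r = max(taus, key=taus.get)
print("Kendall tau by radius:", taus, "-> selected radius:", best_r, "km")

# 2) Simple linear model linking moss measurements to the selected forecasts
X = np.column_stack([np.ones(n_sites), chimere[best_r]])
beta, *_ = np.linalg.lstsq(X, moss_cd, rcond=None)
moss_pred = X @ beta
print("intercept, slope:", beta, "| residual sd:", np.std(moss_cd - moss_pred))
```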
Quantitative simulation is probably the major tool of industrial R&D studies. Computer codes are widely used to predict the behavior and the reliability of a complex system in given operating conditions or, in the design phase, to ensure that it will fulfill given required performances. Whatever their complexity, quantitative models are essentially physics-based and, consequently, deterministic. On the other hand, their inputs and/or the code itself may be affected by uncertainties of various natures which affect the final results and must be properly taken into account. Uncertainty analysis has gained more and more importance in industrial practice in recent years and is at the heart of several working groups and funded projects involving industrial and academic researchers. We present hereby how the common industrial approach to uncertainty analysis can be formulated in a full Bayesian and decisional setting. First, we will introduce the methodological approach to industrial unce...
Soil carbon is important not only to ensure food security by maintaining soil fertility, but also to limit global warming by increasing carbon sequestration in the soil. There is an urgent need to understand the response of soil carbon to climate warming and to changes in agricultural practices. Bio-physical models have been developed over the past few decades to study soil organic matter (SOM). However, there is still strong uncertainty about the mechanisms controlling SOM dynamics, from the microbial level to global scales. In this article, we propose a Bayesian variable-selection approach to better characterize soil carbon dynamics by examining the variation of radiocarbon with depth for 159 profiles under different climate conditions (temperature, precipitation, ...) and environments (soil type, land-use type, ...). The research ...
arXiv: Methodology, 2019
When computer codes are used for modeling complex physical systems, their unknown parameters are tuned by calibration techniques. A discrepancy function may be added to the computer code in order to capture its discrepancy with the real physical process. By considering the validation of a computer code as a Bayesian model selection problem, Damblin et al. (2016) have highlighted a possible confounding effect, in certain configurations, between the code discrepancy and a linear computer code, using a Bayesian testing procedure based on the intrinsic Bayes factor. In this paper, we investigate the issue of code error identifiability by applying another Bayesian model selection technique recently developed by Kamary et al. (2014). By embedding the competing models within an encompassing mixture model, Kamary et al. (2014)'s method allows each observation to belong to a different mixing component, providing a more flexible inference, while remaining competiti...
SIAM/ASA Journal on Uncertainty Quantification, 2018
Making good predictions of a physical system using a computer code requires the inputs to be carefully specified. Some of these inputs, called control variables, reproduce physical conditions whereas other inputs, called parameters, are specific to the computer code and most often uncertain. The goal of statistical calibration is to reduce their uncertainty with the help of a statistical model which links the code outputs with field measurements. In a Bayesian setting, the posterior distribution of these parameters is typically sampled using MCMC methods. However, these are impractical when the code runs are highly time-consuming. A way to circumvent this issue is to replace the computer code with a Gaussian process emulator, then sample a surrogate posterior distribution based on it. Doing so, calibration is subject to an error which strongly depends on the numerical design of experiments used to fit the emulator. Under the assumption that there is no code discrepancy, we aim to reduce this error by constructing a sequential design by means of the Expected Improvement criterion. Numerical illustrations in several dimensions assess the efficiency of such sequential strategies.
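The following sketch illustrates the general idea of enriching a design sequentially with an Expected Improvement criterion under a Gaussian process emulator. It uses a toy one-parameter "code", a sum-of-squares misfit as the quantity to minimise, and scikit-learn's GaussianProcessRegressor; this is a generic EI loop under those assumptions, not the calibration-specific criterion of the paper.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def code(theta):                        # toy "computer code": scalar parameter -> scalar output
    return np.sin(3 * theta) + theta

field_obs = code(0.7)                   # toy field measurement at the "true" parameter value
def misfit(theta):                      # quantity to minimise over the uncertain parameter
    return (code(theta) - field_obs) ** 2

rng = np.random.default_rng(1)
design = rng.uniform(0, 2, size=5)      # initial numerical design of experiments
y = np.array([misfit(t) for t in design])

cand = np.linspace(0, 2, 400)
for _ in range(10):                     # sequential enrichment by Expected Improvement
    gp = GaussianProcessRegressor(kernel=RBF(0.3), normalize_y=True).fit(design[:, None], y)
    mu, sd = gp.predict(cand[:, None], return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sd, 1e-12)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
    t_new = cand[np.argmax(ei)]
    design = np.append(design, t_new)
    y = np.append(y, misfit(t_new))

print("calibrated parameter (misfit minimiser):", design[np.argmin(y)])
```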
Revue De Statistique Appliquee, 2004
A Bayesian differential selection/validation procedure for determining the domain of attraction of extreme values. Revue de statistique appliquée, tome 52, no. 4 (2004), pp. 5-31. <http://www.numdam.org/item?id=RSA_2004__52_4_5_0>
Journal De La Societe Francaise De Statistique Revue De Statistique Appliquee, 2011
Uncertainty analysis aims at quantifying the degree of knowledge about the value of a quantity of interest that characterizes the behaviour of a physical system and is tied to decision stakes. Most approaches encountered in engineering rely on statistical inference and fall into three broad classes. So-called "plug-in" techniques provide a point estimate of the quantity of interest that is only valid when a large amount of data is available. When data are scarce, it is preferable to resort to Bayes procedures, which derive an optimal value of the quantity of interest from a prior distribution describing the uncertainty on the model parameters and a loss function formalizing the decision stakes. Finally, purely descriptive approaches aim at describing the uncertainty on the quantity of interest rather than providing a point estimate of it. Many heuristics have been proposed to sidestep the problem of choosing a loss function for estimating the quantity of interest in a Bayesian framework. We consider in particular the one that consists in replacing, in the definition of the quantity of interest, the true distribution of the system output variable, which is generally unknown, by its predictive distribution. We show that this approach implicitly amounts to using a Bayesian estimator, relative to a loss function that depends entirely on the expression of the quantity of interest. This result demonstrates that point estimation under uncertainty necessarily rests on the choice, conscious or not, of a loss function. We illustrate this point on a real data set of stage and discharge measurements of a river, and discuss more generally the relevance of each approach depending on the stakes of the study and on the more or less explicit knowledge available to the analyst.
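A toy numerical contrast between a plug-in estimate and a predictive-distribution estimate of a quantity of interest (here an exceedance probability for a normal output with known variance and a flat prior on the mean); the data, threshold and distributional assumptions below are illustrative only, not the river data set of the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
data = rng.normal(loc=100.0, scale=20.0, size=8)     # small sample of discharges (toy)
sigma = 20.0                                          # assumed known output spread
threshold = 140.0                                     # decision-relevant level

# Plug-in: replace the unknown mean by its point estimate
mu_hat = data.mean()
p_plugin = 1 - norm.cdf(threshold, loc=mu_hat, scale=sigma)

# Predictive: integrate the unknown mean over its posterior (flat prior => normal posterior),
# which implicitly corresponds to a Bayes estimator for a loss induced by the quantity of interest
post_sd = sigma / np.sqrt(len(data))                  # posterior sd of the mean
pred_sd = np.sqrt(sigma**2 + post_sd**2)              # predictive sd of a new output
p_predictive = 1 - norm.cdf(threshold, loc=mu_hat, scale=pred_sd)

print("plug-in exceedance prob:", round(p_plugin, 4),
      "| predictive exceedance prob:", round(p_predictive, 4))
```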
Statistics in medicine, Jan 30, 2015
Following the pattern of phase I clinical trials for cytotoxic drugs, dose-finding clinical trials in oncology of molecularly targeted agents (MTA) aim at determining the maximum tolerated dose (MTD). In classical phase I clinical trials, the MTD is generally defined by the number of patients with short-term major treatment toxicities (usually called dose-limiting toxicities, DLT) occurring during the first cycle of study treatment (e.g. within the first 3 weeks of treatment). However, S. Postel-Vinay (2011) highlighted that half of grade 3 to 4 toxicities, usually considered as DLT, occur after the first cycle of MTA treatment. In addition, MTAs could induce other moderate (e.g. grade 2) toxicities which could be taken into account depending on their clinical importance, chronic nature and duration. Ignoring these late toxicities may lead to an underestimation of the drug toxicity and to wrong dose recommendations for phase II and III clinical trials. Some methods have been proposed, s...
Cybergeo, 2001
Environmental scientists often face situations where: (i) stimulus-response relationships are non-linear; (ii) data are rare or imprecise; (iii) facts are uncertain and stimulus-response relationships are questionable. In this paper, we focus on the first two points. A powerful and easy-to-use statistical method, the Metropolis-Hastings algorithm, allows the quantification of the uncertainty attached to any model response. This stochastic simulation technique is able to reproduce the joint statistical distribution of the whole parameter set of any model. The Metropolis-Hastings algorithm is described and illustrated on a typical environmental model: the biochemical oxygen demand (BOD) model. The aim is to provide a helpful guideline for further, and ultimately more complex, models. As a first illustration, the MH method is also applied to a simple regression example to demonstrate to the practitioner the ability of the algorithm to produce valid results.
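A minimal random-walk Metropolis-Hastings sketch on the classical two-parameter BOD curve BOD(t) = a(1 - exp(-bt)); the synthetic observations, flat prior and proposal scales below are assumptions for illustration, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(1, 9, dtype=float)                       # observation times (days)
a_true, b_true, noise_sd = 10.0, 0.3, 0.5
obs = a_true * (1 - np.exp(-b_true * t)) + rng.normal(0, noise_sd, t.size)

def log_post(theta):
    a, b = theta
    if a <= 0 or b <= 0:                               # flat prior restricted to the positive quadrant
        return -np.inf
    resid = obs - a * (1 - np.exp(-b * t))
    return -0.5 * np.sum(resid**2) / noise_sd**2       # Gaussian likelihood, known noise sd

theta = np.array([5.0, 0.5])                           # starting point
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.3, 0.03])          # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                                   # accept, otherwise keep current state
    samples.append(theta)
samples = np.array(samples[5000:])                     # discard burn-in
print("posterior means (a, b):", samples.mean(axis=0))
```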
Water Resources Research, 2003
This paper presents a distribution-free statistical test aimed at detecting gradual changes in the hydrological behavior of watersheds. The proposed test uses a rainfall-runoff model to identify watershed behavior over successive time periods and a resampling approach to quantify the significance of trends. The method can be applied with any model deemed suitable for the studied watershed. To assess test efficiency, we used three different case studies: an afforested agricultural watershed, a burnt-over forested watershed, and a watershed covered by old-growth forest. All three watersheds had a long period of rainfall and runoff records (60, 35, and 40 years, respectively), on which stationarity could be tested. The test was shown to adequately detect gradual changes, and it can therefore be useful to identify hydrological trends wherever rainfall and streamflow time series are available.
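The sketch below conveys the resampling idea on synthetic data, using a yearly runoff coefficient as a crude stand-in for the calibrated rainfall-runoff model and a permutation-based null distribution for the trend statistic; the data and the choice of statistic are hypothetical, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(60)
rain = rng.gamma(shape=50, scale=20, size=years.size)                 # annual rainfall (toy)
coef = 0.4 + 0.002 * years + rng.normal(0, 0.03, years.size)          # slowly drifting behaviour
runoff = coef * rain

behaviour = runoff / rain                                             # yearly "model" identification
slope = np.polyfit(years, behaviour, 1)[0]                            # observed trend statistic

n_perm = 5000
null = np.empty(n_perm)
for i in range(n_perm):                                               # resample under "no gradual change"
    null[i] = np.polyfit(years, rng.permutation(behaviour), 1)[0]
p_value = np.mean(np.abs(null) >= abs(slope))
print("observed slope:", slope, "| permutation p-value:", p_value)
```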
Environmental and Ecological Statistics, 2009
A parsimonious model is presented as an alternative to delta approaches to modelling zero-inflated continuous data. The data model relies on an exponentially compound Poisson process, also called the law of leaks (LOL). It represents the process of sampling resources that are spatially distributed as Poisson distributed patches, each containing a quantity of biomass drawn from an exponential distribution. In an application of the LOL, two latent structures are proposed to account for spatial dependencies between zero values at different scales within a hierarchical Bayesian framework. The LOL is compared to the delta-gamma distribution using bottom-trawl survey data. Results of this case study emphasize that the LOL provides slightly better fits to learning samples with a very high proportion of zero values and small strictly positive abundance data. Additionally, it offers better predictions of validation samples.
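A short simulation sketch of the law of leaks as described above (a Poisson number of patches, each with exponentially distributed biomass, so that a zero patch count yields an exact zero catch); the intensity and mean patch biomass are arbitrary toy values.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, mean_patch = 0.7, 2.5             # patch intensity and mean biomass per patch (toy values)
n_hauls = 10000

n_patches = rng.poisson(lam, size=n_hauls)
catch = np.array([rng.exponential(mean_patch, size=k).sum() for k in n_patches])

print("proportion of exact zeros:", np.mean(catch == 0.0))   # close to exp(-lam) in expectation
print("mean non-zero catch:", catch[catch > 0].mean())
```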
Modelling has become an unavoidable step in hydrological analyses. Intensive monitoring of experimental or representative catchments in different environments has allowed the identification of dominant processes, which in turn has enabled the choice of an appropriate model structure to be used for simulation, forecasting or scenario analyses. Progress has been made in building different modelling structures (conceptual and physically based, lumped and distributed) that are now being extensively used at different scales, with different amounts of input data and together with different calibration procedures. The present study analyses the role of uncertainty, in a Bayesian context, as a tool to learn how input data (discharge, soil moisture, chemistry, rainfall) can be used to evaluate two semi- and fully distributed modelling structures (TOPMODEL and WASIM-ETH) in two different catchments: a small one (2 km²) in the Swiss Plateau region (the Haute-Mentue catchment) and a medium one (100 km²) in the low mountain ranges of central Germany (the Weisse Elster catchment).
Journal of the Royal Statistical Society: Series C (Applied Statistics), 2014
Journal of Hydrology, 2000
A Bayesian method is presented for the analysis of two types of sudden change at an unknown time point in a sequence of energy inflows modeled by independent normal random variables. First, the case of a single shift in the mean level is revisited to show how such a problem can be straightforwardly addressed through the Bayesian framework. Second, a change in variability is investigated. In hydrology, to our knowledge, this problem has not been studied from a Bayesian perspective. Even if this model is quite simple, no analytic solutions for parameter inference are available, and recourse to approximations is needed. It is shown that the Gibbs sampler is particularly suitable for change-point analysis, and this Markovian updating scheme is used. Finally, a case study involving annual energy inflows of two large hydropower systems managed by Hydro-Québec is presented, in which informative prior distributions are specified from regional information.
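For the single shift in the mean, a Gibbs sampler alternates between the two segment means and the discrete change point. The sketch below assumes a known variance, flat priors on the means and a uniform prior on the change point, with synthetic inflows; it is not the Hydro-Québec case study.

```python
import numpy as np

rng = np.random.default_rng(6)
n, sigma = 80, 1.0
y = np.concatenate([rng.normal(10, sigma, 45), rng.normal(12, sigma, 35)])  # synthetic inflows

tau, mu1, mu2 = n // 2, y.mean(), y.mean()
taus = []
for it in range(5000):
    # segment means given the change point (conjugate normal full conditionals)
    mu1 = rng.normal(y[:tau].mean(), sigma / np.sqrt(tau))
    mu2 = rng.normal(y[tau:].mean(), sigma / np.sqrt(n - tau))
    # change point given the means (discrete full conditional over all admissible positions)
    logp = np.array([-0.5 * (np.sum((y[:k] - mu1) ** 2) + np.sum((y[k:] - mu2) ** 2)) / sigma**2
                     for k in range(1, n)])
    prob = np.exp(logp - logp.max()); prob /= prob.sum()
    tau = rng.choice(np.arange(1, n), p=prob)
    taus.append(tau)

print("posterior mode of the change point:", np.bincount(taus).argmax())
```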
Sciences Eaux & Territoires
Predetermination and risk calculation for avalanches: advantages and limitations of the methods used in engineering, and future prospects. While the main zones where avalanches occur are fairly well known, it is harder to predict the precise characteristics of these extreme events. After an overview of the traditional methods used in avalanche engineering to predetermine extreme events and compute risks, the authors present new methods combining statistics with dynamical flow models, which allow the notions of return period and risk to be used more rigorously.
Revue des sciences de l'eau
To study the uncertainties of a hydrological model, one can use reliability analysis, a method commonly used in structural and hydraulic engineering. This tool can be transposed to hydrology to judge the quality of a model. A hydrological model is reliable if, on a test data set, the fit obtained with a parameter set recommended for general-purpose use is, in all probability, of a quality close to the ideal fit corresponding to calibrating the parameters on the test sample. This paper shows how to adapt each concept of reliability analysis to hydrology and details the first-two-moments technique for computing explicitly the reliability of a rainfall-runoff model, by means of a Taylor expansion around the model's operating point. This reliability-analysis approach is applied to the mod...
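A minimal sketch of the first-two-moments idea: propagate an assumed parameter covariance through a toy performance function by a first-order Taylor expansion around the nominal operating point and form a reliability index. The performance function, nominal parameters and covariance are hypothetical, not the rainfall-runoff model of the paper.

```python
import numpy as np

def performance(theta):
    # toy performance of a rainfall-runoff fit: positive means "close enough to the ideal fit"
    k, s = theta
    return 0.9 - 0.2 * (k - 1.2) ** 2 - 0.1 * (s - 0.4) ** 2

theta0 = np.array([1.0, 0.5])            # recommended general-purpose parameter set
cov = np.diag([0.2**2, 0.1**2])          # assumed parameter uncertainty (covariance)

eps = 1e-5                               # central differences for the gradient at the operating point
grad = np.array([(performance(theta0 + eps * e) - performance(theta0 - eps * e)) / (2 * eps)
                 for e in np.eye(2)])

mean_g = performance(theta0)             # first-order mean of the performance
var_g = grad @ cov @ grad                # first-order variance of the performance
beta = mean_g / np.sqrt(var_g)           # reliability index from the first two moments
print("reliability index beta:", beta)
```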
Revue des sciences de l'eau
How many samples should be taken for bacteriological analysis in a drinking water distribution network in order to achieve self-monitoring that is economically optimal (analytical costs plus the costs of curative actions) while limiting the risks of water quality degradation? To answer this question, we propose a probabilistic model that simulates the choice of the curative decision when analyses give unsatisfactory results, as well as the effect of this decision on the water quality in the network. The different curative actions and their efficiency were determined empirically from the expertise of the manager of the Banlieue de Paris network and from data collected from 1992 to 1996. The model is based on a Markovian scheme for the evolution of the pair (water quality, curative action). Using dynamic programming, we compute the mean cost of the Banlieue de Paris decision policy and the risk...
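A compact sketch of the policy-evaluation step: a two-state Markov chain over water quality under a fixed curative policy, whose expected discounted cost is obtained by solving the dynamic-programming fixed point V = c + gamma * P V. The states, costs and transition probabilities are invented for illustration, not the Banlieue de Paris figures.

```python
import numpy as np

states = ["good", "degraded"]
# transition matrix under the fixed policy (row = current state, column = next state);
# the curative action triggered in "degraded" pulls the network back towards "good"
P = np.array([[0.95, 0.05],
              [0.60, 0.40]])
cost = np.array([1.0, 8.0])              # analytical cost vs analytical + curative cost per step
gamma = 0.95                             # discount factor

# expected discounted cost V of the policy solves V = cost + gamma * P @ V
V = np.linalg.solve(np.eye(2) - gamma * P, cost)
print(dict(zip(states, V.round(2))))
```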