Robert Tardif - Profile on Academia.edu
Papers by Robert Tardif
The Last Millennium Reanalysis (LMR) utilizes an ensemble methodology to assimilate paleoclimate data for the production of annually resolved climate field reconstructions of the Common Era. Two key elements are the focus of this work: the set of assimilated proxy records and the forward models that map climate variables to proxy measurements. Results based on an updated proxy database and seasonal regression-based forward models are compared to the LMR prototype, which was based on a smaller set of proxy records and simpler proxy models formulated as univariate linear regressions against annual temperature. Validation against various instrumental-era gridded analyses shows that the new reconstructions of surface air temperature and 500 hPa geopotential height are significantly improved (from 10% to more than 100%), while improvements in reconstruction of the Palmer Drought Severity Index are more modest. Additional experiments designed to isolate the sources of improvement reveal the importance of the updated proxy records, including coral records for improving tropical reconstructions, and tree-ring density records for temperature reconstructions, particularly in high northern latitudes. Proxy forward models that account for seasonal responses, and dependence on both temperature and moisture for tree-ring width, also contribute to improvements in reconstructed thermodynamic and hydroclimate variables in midlatitudes. The variability of temperature at multidecadal to centennial scales is also shown to be sensitive to the set of assimilated proxies, especially to the inclusion of primarily moisture-sensitive tree-ring-width records.
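The ensemble assimilation at the heart of the LMR updates a prior ensemble of climate fields toward each proxy observation using an ensemble Kalman update. A minimal numpy sketch of that style of update; the forward model, proxy value, and error variance below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy prior ensemble: 100 draws of a 5-point "climate field".
ens = rng.normal(0.0, 1.0, size=(100, 5))

# Hypothetical proxy forward model: the proxy records the mean of the
# first two grid points (a stand-in for a regression-based proxy model).
def forward(x):
    return 0.5 * (x[..., 0] + x[..., 1])

y_obs, r_err = 1.2, 0.3 ** 2      # proxy value and its error variance (made up)
ye = forward(ens)                 # proxy estimate from each ensemble member

# Kalman gain from ensemble statistics: cov(state, ye) / (var(ye) + R).
cov_xy = np.cov(ens.T, ye)[:-1, -1]
gain = cov_xy / (ye.var(ddof=1) + r_err)

# Update every field point toward the observed proxy value.
ens_upd = ens + np.outer(y_obs - ye, gain)
```

The posterior ensemble mean moves toward the proxy wherever the state covaries with it, and the ensemble spread in proxy space shrinks; applied record by record, this is the kind of update an ensemble paleoclimate reanalysis performs.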
Atmosphere
Accurate and timely fog forecasts are needed to support decision making for various activities which are critically affected by low visibility conditions [...]
Although Atlantic hurricane risk is expected to increase in a warming climate, projecting trends in hurricane frequency over the present century is still highly uncertain. The short instrumental record limits our understanding of hurricane activity and its relationship to climate, especially on multi-decadal and longer time scales. Here we extend the instrumental hurricane frequency record using two independent sources of information: (1) a reconstruction of basin-wide Atlantic hurricane frequency over the last millennium, developed from sedimentary paleohurricane records; (2) a statistical model of hurricane activity using sea surface temperatures (SSTs) from the Last Millennium Reanalysis (LMR) datasets. We find statistically significant agreement between the two estimates, suggesting that a robust climate signal of hurricane frequency over the past millennium can be captured from proxy data. Neither estimate of hurricane frequency indicates that the late 20th century hurricane fr...
Nature, 2021
Climate changes across the last 24,000 years provide key insights into Earth system responses to external forcing. Climate model simulations 1,2 and proxy data 3-8 have independently allowed for study of this crucial interval; however, they have at times yielded disparate conclusions. Here, we leverage both types of information using paleoclimate data assimilation 9,10 to produce the first observationally constrained, full-field reanalysis of surface temperature change spanning the Last Glacial Maximum to present. We demonstrate that temperature variability across the last 24 kyr was linked to two modes: radiative forcing from ice sheets and greenhouse gases; and a superposition of changes in thermohaline circulation and seasonal insolation. In contrast with previous proxy-based reconstructions 6,7, our reanalysis results show that global mean temperatures warmed between the early and middle Holocene and were stable thereafter. When compared with recent temperature changes 11, our reanalysis indicates that both the rate and magnitude of modern observed warming are unprecedented relative to the changes of the last 24 kyr.
Journal of Climate, 2021
In the past 40 years, the global annual mean surface temperature has experienced a nonuniform warming, differing from the spatially uniform warming simulated by the forced responses of large multimodel ensembles to anthropogenic forcing. Rather, it exhibits significant asymmetry between the Arctic and Antarctic, with intermittent and spatially varying warming trends along the Northern Hemisphere (NH) midlatitudes and a slight cooling in the tropical eastern Pacific. In particular, this “wavy” pattern of temperature changes over the NH midlatitudes features strong cooling over Eurasia in boreal winter. Here, we show that these nonuniform features of surface temperature changes are likely tied together by tropical eastern Pacific sea surface temperatures (SSTs), via a global atmospheric teleconnection. Using six reanalyses, we find that this teleconnection can be consistently obtained as a leading circulation mode in the past century. This tropically driven teleconnection is associate...
Monthly Weather Review, 2020
A sensitivity analysis methodology recently developed by the authors is applied to COAMPS and WRF. The method involves varying model parameters according to Latin Hypercube Sampling, and developing multivariate multiple regression models that map the model parameters to forecasts over a spatial domain. The regression coefficients and p values testing whether the coefficients are zero serve as measures of sensitivity of forecasts with respect to model parameters. Nine model parameters are selected from COAMPS and WRF, and their impact is examined on nine forecast quantities (water vapor, convective and gridscale precipitation, and air temperature and wind speed at three altitudes). Although the conclusions depend on the model parameters and specific forecast quantities, it is shown that sensitivity to model parameters is often accompanied by nontrivial spatial structure, which itself depends on the underlying forecast model (i.e., COAMPS vs WRF). One specific difference between these...
Climate of the Past, 2018
The Last Millennium Reanalysis (LMR) employs a data assimilation approach to reconstruct climate fields from annually resolved proxy data over years 0-2000 CE. We use the LMR to examine Atlantic multidecadal variability (AMV) over the last 2 millennia and find several robust thermodynamic features associated with a positive Atlantic Multidecadal Oscillation (AMO) index that reveal a dynamically consistent pattern of variability: the Atlantic and most continents warm; sea ice thins over the Arctic and retreats over the Greenland, Iceland, and Norwegian seas; and equatorial precipitation shifts northward. The latter is consistent with anomalous southward energy transport mediated by the atmosphere. Net downward shortwave radiation increases at both the top of the atmosphere and the surface, indicating a decrease in planetary albedo, likely due to a decrease in low clouds. Heat is absorbed by the climate system and the oceans warm. Wavelet analysis of the AMO time series shows a reddening of the frequency spectrum on the 50- to 100-year timescale, but no evidence of a distinct multidecadal or centennial spectral peak. This latter result is insensitive to both the choice of prior model and the calibration dataset used in the data assimilation algorithm, suggesting that the lack of a distinct multidecadal spectral peak is a robust result.
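The "reddening without a distinct peak" finding can be illustrated with a toy red-noise (AR(1)) series: its spectrum rises smoothly toward low frequencies, so multidecadal bands carry more power than higher-frequency bands without any isolated spectral line. A small numpy sketch (Fourier rather than wavelet, and with an invented series, purely to show what a reddened spectrum looks like):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy AR(1) "red noise" series standing in for an AMO-like annual index.
n, phi = 2000, 0.9
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

freq = np.fft.rfftfreq(n, d=1.0)                      # cycles per year
power = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n    # raw periodogram

# Mean power in a low band (50-100 yr) vs a higher band (10-20 yr):
low = power[(freq >= 0.01) & (freq <= 0.02)].mean()
high = power[(freq >= 0.05) & (freq <= 0.10)].mean()
```

For red noise the low band dominates the high band, yet no single frequency stands out above the smooth background, which is the character of the spectrum the paper describes.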
Monthly Weather Review, 2018
A methodology is proposed for examining the effect of model parameters (assumed to be continuous) on the spatial structure of forecasts. The methodology involves several statistical methods of sampling and inference to assure the sensitivity results are statistically sound. Specifically, Latin hypercube sampling is employed to vary the model parameters, and multivariate multiple regression is used to account for spatial correlations in assessing the sensitivities. The end product is a geographic “map” of p values for each model parameter, allowing one to display and examine the spatial structure of the sensitivity. As an illustration, the effect of 11 model parameters in a mesoscale model on forecasts of convective and grid-scale precipitation, surface air temperature, and water vapor is studied. A number of spatial patterns in sensitivity are found. For example, a parameter that controls the fraction of available convective clouds and precipitation fed back to the grid scale influe...
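The sampling-plus-regression pipeline described above can be sketched end to end: Latin hypercube samples of the parameters, a toy "forecast" field, an ordinary-least-squares fit at every grid point, and a p value per (parameter, grid point) pair. Everything below (the response function, noise level, sample sizes) is invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def latin_hypercube(n, d, rng):
    """n stratified-uniform samples in d dimensions, one per row bin."""
    perm = rng.permuted(np.tile(np.arange(n), (d, 1)).T, axis=0)
    return (perm + rng.random((n, d))) / n

n_runs, n_par, n_grid = 40, 3, 50
params = latin_hypercube(n_runs, n_par, rng)

# Toy forecasts: grid point k responds to parameter 0 with strength k/(n_grid-1);
# the other parameters have no effect.
strength = np.linspace(0.0, 1.0, n_grid)
forecasts = params[:, [0]] * strength + 0.1 * rng.normal(size=(n_runs, n_grid))

# OLS of each grid point's forecast on all parameters (plus intercept),
# then a t test of each regression coefficient against zero.
X = np.column_stack([np.ones(n_runs), params])
beta, *_ = np.linalg.lstsq(X, forecasts, rcond=None)
dof = n_runs - X.shape[1]
sigma2 = ((forecasts - X @ beta) ** 2).sum(axis=0) / dof
se = np.sqrt(np.outer(np.diag(np.linalg.inv(X.T @ X)), sigma2))
pvals = 2 * stats.t.sf(np.abs(beta / se), dof)   # one p value per coefficient and grid point
```

Each row of pvals, mapped back onto the grid, is the geographic "map" of p values the methodology produces; here only parameter 0 should show significance, and only where its imposed response is strong.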
Climate of the Past Discussions, 2018
The Last Millennium Reanalysis utilizes an ensemble methodology to assimilate paleoclimate data for the production of annually resolved climate field reconstructions of the Common Era. Two key elements are the focus of this work: the set of assimilated proxy records, and the forward models that map climate variables to proxy measurements. Results based on an extensive proxy database and seasonal regression-based forward models are compared to the prototype reanalysis of Hakim et al. ( ), which was based on a smaller set of proxy records and simpler proxy models formulated as univariate linear regressions against annual temperature. Validation against various instrumental-era gridded analyses shows that the new reconstructions of surface air temperature, 500 hPa geopotential height and the Palmer Drought Severity Index are significantly improved, with skill scores increasing from 10% to more than 200%, depending on the variable and verification measure. Additional experiments designed to isolate the sources of improvement reveal the importance of additional proxy records, including coral records for improving tropical reconstructions; tree-ring-width chronologies, including moisture-sensitive trees, for thermodynamic and hydroclimate variables in mid-latitudes; and tree-ring density records for temperature reconstructions, particularly in high northern latitudes. Proxy forward models that account for seasonal responses, and the dual sensitivity to temperature and moisture characterizing tree-ring-width proxies, are also found to be particularly important. Other experiments highlight the beneficial role of covariance localization on reanalysis ensemble characteristics. This improved paleoclimate data assimilation system served as the basis for the production of the first publicly released NOAA Last Millennium Reanalysis.
Meteorological Applications, 2018
Stochastic kinetic energy backscatter schemes (SKEBSs) are introduced in numerical weather forecast models to represent uncertainties related to unresolved subgrid-scale processes. These schemes are formulated using a set of parameters that must be determined from physical knowledge and/or tuned to obtain a desired outcome. Here, a methodology is developed for assessing the effect of four factors on spatial features of forecasts simulated by the SKEBS-enabled Weather Research and Forecasting model. The four factors include two physically motivated SKEBS parameters (determining the amplitude of perturbations applied to the streamfunction and potential temperature tendencies), a purely stochastic element (the seed used in generating random perturbations) and a factor reflecting daily variability. A simple threshold-based approach for identifying coherent objects within forecast fields is employed, and the effect of the four factors on object features (e.g. number, size and intensity) is assesse...
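A threshold-based object identification of the kind described can be sketched with scipy.ndimage: threshold the field, label connected super-threshold regions, and read off per-object features. The field below (two Gaussian blobs plus weak noise) and the 0.5 threshold are invented for illustration:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Toy forecast field: two localized "objects" on a weakly noisy background.
yy, xx = np.mgrid[0:80, 0:80]
field = (np.exp(-((xx - 20) ** 2 + (yy - 20) ** 2) / 50.0)
         + np.exp(-((xx - 60) ** 2 + (yy - 55) ** 2) / 80.0)
         + 0.02 * rng.normal(size=(80, 80)))

# Threshold, then label connected super-threshold regions as objects.
mask = field > 0.5
labels, n_objects = ndimage.label(mask)

# Per-object features: size (number of grid points) and peak intensity.
idx = range(1, n_objects + 1)
sizes = ndimage.sum(mask, labels, index=idx)
peaks = ndimage.maximum(field, labels, index=idx)
```

Repeating this across runs that vary the SKEBS factors yields the object count, size, and intensity statistics whose sensitivity the paper assesses.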
Climate of the Past Discussions, 2017
The Last Millennial Reanalysis (LMR) employs a data assimilation approach to reconstruct climate fields from annually-resolved proxy data over years 0–2000 CE. We use the LMR to examine Atlantic Multidecadal Variability (AMV) over the last two millennia, and find several robust thermodynamic features associated with a positive Atlantic Multidecadal Oscillation (AMO) index that reveal a dynamically-consistent pattern of variability: the Atlantic and most continents warm; sea ice thins over the Arctic and retreats over the Greenland-Iceland-Norwegian Seas; and equatorial precipitation shifts northward. The latter is consistent with anomalous southward energy transport mediated by the atmosphere. Net downward shortwave radiation increases at both the top-of-atmosphere and surface, indicating a decrease in planetary albedo, likely due to a decrease in low clouds. Heat is absorbed by the climate system and the oceans warm. Wavelet analysis of the AMV time series shows a reddening of...
La Météorologie, 2008
A measurement campaign dedicated to fog, called Paris-Fog, took place in the Paris region during the winter of 2006-2007. The campaign was ambitious in both the instrumentation deployed and the length of the documented period. We describe here the instruments used and the way the campaign was run, bringing together research laboratories from different backgrounds. We then present a first assessment, along with a closer look at one intensive observation period.
Weather and Forecasting, 2009
A specific event, called a low-visibility procedure (LVP), has been defined when visibility is under 600 m and/or the ceiling is under 60 m at Paris-Charles de Gaulle Airport, Paris, France, to ensure air traffic safety and to reduce the economic losses related to poor visibility conditions. The Local Ensemble Prediction System (LEPS) has been designed to estimate LVP likelihood in order to help forecasters in their tasks. This work evaluates the skill of LEPS for each type of LVP that takes place in the airport area during five winter seasons from 2002 to 2007. An event-based classification reveals that stratus base lowering, advection, and radiation fogs account for 78% of the LVP cases that occurred near the airport during this period. This study also demonstrates that LEPS is skillful for these types of events in short-term forecasts. When the ensemble runs start with initialized LVP events, the prediction of advection fogs is as skillful as the prediction of radiation fog events...
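The paper's exact verification measures are not quoted here, but a standard way to score probabilistic event forecasts such as LEPS's LVP probabilities is the Brier score, together with a skill score against the climatological base rate. A small numpy sketch with invented forecast/observation pairs:

```python
import numpy as np

# Invented ensemble LVP probabilities and matching observed outcomes
# (1 = LVP occurred, 0 = it did not).
probs = np.array([0.9, 0.7, 0.2, 0.1, 0.8, 0.3, 0.05, 0.6])
obs   = np.array([1,   1,   0,   0,   1,   0,   0,    1  ])

# Brier score: mean squared error of the probability forecasts.
brier = np.mean((probs - obs) ** 2)

# Reference forecast: always issue the climatological event frequency.
clim = obs.mean()
brier_ref = np.mean((clim - obs) ** 2)

# Brier skill score: positive values mean skill over climatology.
bss = 1.0 - brier / brier_ref
```

A BSS near 1 indicates near-perfect probabilistic discrimination of LVP events; a value at or below 0 would mean the system adds nothing over climatology.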
Pure and Applied Geophysics, 2007
The scientific community that includes meteorologists, physical scientists, engineers, medical doctors, biologists, and environmentalists has shown interest in a better understanding of fog for years because of its effects, direct or indirect, on the daily life of human beings. The total economic losses associated with the impact of the presence of fog on aviation, marine and land transportation can be comparable to those of tornadoes or, in some cases, winter storms and hurricanes. The number of articles including the word "fog" in journals of the American Meteorological Society alone was found to be about 4700, indicating that there is substantial interest in this subject. In spite of this extensive body of work, our ability to accurately forecast/nowcast fog remains limited due to our incomplete understanding of the fog processes over various time and space scales. Fog processes involve droplet microphysics, aerosol chemistry, radiation, turbulence, large/small-scale dynamics, and surface conditions (e.g., pertaining to the presence of ice, snow, liquid, plants, and various types of soil). This review paper summarizes past achievements related to the understanding of fog formation, development and decay, and in this respect, the analysis of observations and the development of forecasting models and remote sensing methods are discussed in detail. Finally, future perspectives for fog-related research are highlighted.
Journal of the Atmospheric Sciences, 2010
To gain insights into the poorly understood phenomenon of precipitation fog, this study assesses the evaporation of freely falling drops departing from equilibrium as a possible contributing factor to fog formation in rainy conditions. The study is based on simulations performed with a microphysical column model describing the evolution of the temperature and mass of evaporating raindrops within a Lagrangian reference frame. Equilibrium defines a state where the latent heat loss of an evaporating drop is balanced by the sensible heat flux from the ambient air, hence defining a steady-state drop temperature. Model results show that the assumption of equilibrium leads to small but significant errors in calculated precipitation evaporation rates for drops falling in continuously varying ambient near-saturated or saturated conditions. Departure from equilibrium depends on the magnitude of the vertical gradients of the ambient temperature and moisture as well as the drop-size-dependent t...
The Last Millennium Reanalysis (LMR) utilizes an ensemble methodology to assimilate paleoclimate ... more The Last Millennium Reanalysis (LMR) utilizes an ensemble methodology to assimilate paleoclimate data for the production of annually resolved climate field reconstructions of the Common Era. Two key elements are the focus of this work: the set of assimilated proxy records and the forward models that map climate variables to proxy measurements. Results based on an updated proxy database and seasonal regression-based forward models are compared to the LMR prototype, which was based on a smaller set of proxy records and simpler proxy models formulated as univariate linear regressions against annual temperature. Validation against various instrumental-era gridded analyses shows that the new reconstructions of surface air temperature and 500 hPa geopotential height are significantly improved (from 10 % to more than 100 %), while improvements in reconstruction of the Palmer Drought Severity Index are more modest. Additional experiments designed to isolate the sources of improvement reveal the importance of the updated proxy records, including coral records for improving tropical reconstructions, and tree-ring density records for temperature reconstructions, particularly in high northern latitudes. Proxy forward models that account for seasonal responses, and dependence on both temperature and moisture for treering width, also contribute to improvements in reconstructed thermodynamic and hydroclimate variables in midlatitudes. The variability of temperature at multidecadal to centennial scales is also shown to be sensitive to the set of assimilated proxies, especially to the inclusion of primarily moisturesensitive tree-ring-width records.
Atmosphere
Accurate and timely fog forecasts are needed to support decision making for various activities wh... more Accurate and timely fog forecasts are needed to support decision making for various activities which are critically affected by low visibility conditions [...]
Although Atlantic hurricane risk is expected to increase in a warming climate, projecting trends ... more Although Atlantic hurricane risk is expected to increase in a warming climate, projecting trends in hurricane frequency over the present century is still highly uncertain. The short instrumental record limits our understanding of hurricane activity and its relationship to climate, especially on multi-decadal and longer time scales. Here we extend the instrumental hurricane frequency record using two independent sources of information: (1) a reconstruction of basin-wide Atlantic hurricane frequency over the last millennium, developed from sedimentary paleohurricane records; (2) a statistical model of hurricane activity using sea surface temperatures (SSTs) from the Last Millennium Reanalysis (LMR) datasets. We find statistically significant agreement between the two estimates, suggesting that a robust climate signal of hurricane frequency over the past millennium can be captured from proxy data. Neither estimate of hurricane frequency indicates that the late 20th century hurricane fr...
Nature, 2021
Climate changes across the last 24,000 years provide key insights into Earth system responses to ... more Climate changes across the last 24,000 years provide key insights into Earth system responses to external forcing. Climate model simulations 1, 2 and proxy data 3-8 have independently allowed for study of this crucial interval; however, they have at times yielded disparate conclusions. Here, we leverage both types of information using paleoclimate data assimilation 9, 10 to produce the first observationally constrained, full-field reanalysis of surface temperature change spanning the Last Glacial Maximum to present. We demonstrate that temperature variability across the last 24 kyr was linked to two modes: radiative forcing from ice sheets and greenhouse gases; and a superposition of changes in thermohaline circulation and seasonal insolation. In contrast with previous proxy-based reconstructions 6, 7 our reanalysis results show that global mean temperatures warmed between the early and middle Holocene and were stable thereafter. When compared with recent temperature changes 11 , our reanalysis indicates that both the rate and magnitude of modern observed warming are unprecedented relative to the changes of the last 24 kyr.
Journal of Climate, 2021
In the past 40 years, the global annual mean surface temperature has experienced a nonuniform war... more In the past 40 years, the global annual mean surface temperature has experienced a nonuniform warming, differing from the spatially uniform warming simulated by the forced responses of large multimodel ensembles to anthropogenic forcing. Rather, it exhibits significant asymmetry between the Arctic and Antarctic, with intermittent and spatially varying warming trends along the Northern Hemisphere (NH) midlatitudes and a slight cooling in the tropical eastern Pacific. In particular, this “wavy” pattern of temperature changes over the NH midlatitudes features strong cooling over Eurasia in boreal winter. Here, we show that these nonuniform features of surface temperature changes are likely tied together by tropical eastern Pacific sea surface temperatures (SSTs), via a global atmospheric teleconnection. Using six reanalyses, we find that this teleconnection can be consistently obtained as a leading circulation mode in the past century. This tropically driven teleconnection is associate...
Monthly Weather Review, 2020
A sensitivity analysis methodology recently developed by the authors is applied to COAMPS and WRF... more A sensitivity analysis methodology recently developed by the authors is applied to COAMPS and WRF. The method involves varying model parameters according to Latin Hypercube Sampling, and developing multivariate multiple regression models that map the model parameters to forecasts over a spatial domain. The regression coefficients and p values testing whether the coefficients are zero serve as measures of sensitivity of forecasts with respect to model parameters. Nine model parameters are selected from COAMPS and WRF, and their impact is examined on nine forecast quantities (water vapor, convective and gridscale precipitation, and air temperature and wind speed at three altitudes). Although the conclusions depend on the model parameters and specific forecast quantities, it is shown that sensitivity to model parameters is often accompanied by nontrivial spatial structure, which itself depends on the underlying forecast model (i.e., COAMPS vs WRF). One specific difference between these...
Climate of the Past, 2018
The Last Millennium Reanalysis (LMR) employs a data assimilation approach to reconstruct climate ... more The Last Millennium Reanalysis (LMR) employs a data assimilation approach to reconstruct climate fields from annually resolved proxy data over years 0-2000 CE. We use the LMR to examine Atlantic multidecadal variability (AMV) over the last 2 millennia and find several robust thermodynamic features associated with a positive Atlantic Multidecadal Oscillation (AMO) index that reveal a dynamically consistent pattern of variability: the Atlantic and most continents warm; sea ice thins over the Arctic and retreats over the Greenland, Iceland, and Norwegian seas; and equatorial precipitation shifts northward. The latter is consistent with anomalous southward energy transport mediated by the atmosphere. Net downward shortwave radiation increases at both the top of the atmosphere and the surface, indicating a decrease in planetary albedo, likely due to a decrease in low clouds. Heat is absorbed by the climate system and the oceans warm. Wavelet analysis of the AMO time series shows a reddening of the frequency spectrum on the 50to 100-year timescale, but no evidence of a distinct multidecadal or centennial spectral peak. This latter result is insensitive to both the choice of prior model and the calibration dataset used in the data assimilation algorithm, suggesting that the lack of a distinct multidecadal spectral peak is a robust result.
Monthly Weather Review, 2018
A methodology is proposed for examining the effect of model parameters (assumed to be continuous)... more A methodology is proposed for examining the effect of model parameters (assumed to be continuous) on the spatial structure of forecasts. The methodology involves several statistical methods of sampling and inference to assure the sensitivity results are statistically sound. Specifically, Latin hypercube sampling is employed to vary the model parameters, and multivariate multiple regression is used to account for spatial correlations in assessing the sensitivities. The end product is a geographic “map” of p values for each model parameter, allowing one to display and examine the spatial structure of the sensitivity. As an illustration, the effect of 11 model parameters in a mesoscale model on forecasts of convective and grid-scale precipitation, surface air temperature, and water vapor is studied. A number of spatial patterns in sensitivity are found. For example, a parameter that controls the fraction of available convective clouds and precipitation fed back to the grid scale influe...
Climate of the Past Discussions, 2018
The Last Millennium Reanalysis utilizes an ensemble methodology to assimilate paleoclimate data f... more The Last Millennium Reanalysis utilizes an ensemble methodology to assimilate paleoclimate data for the production of annually resolved climate field reconstructions of the Common Era. Two key elements are the focus of this work: the set of assimilated proxy records, and the forward models that map climate variables to proxy measurements. Results based on an extensive proxy database and seasonal regression-based forward models are compared to the prototype reanalysis of Hakim et al. ( ), which was based on a smaller set of proxy records and simpler proxy models formulated as univariate linear regressions against annual temperature. Validation against various instrumental-era gridded analyses shows that the new reconstructions of surface air temperature, 500 hPa geopotential height and the Palmer Drought Severity Index are significantly improved, with skill scores increasing from 10% to more than 200%, depending on the variable and verification measure. Additional experiments designed to isolate the sources of improvement reveal the importance of additional proxy records, including coral records for improving tropical reconstructions; tree-ring-width chronologies, including moisture-sensitive trees, for thermodynamic and hydroclimate variables in mid-latitudes; and tree-ring density records for temperature reconstructions, particularly in high northern latitudes. Proxy forward models that account for seasonal responses, and the dual sensitivity to temperature and moisture characterizing tree-ring-width proxies, are also found to be particularly important. Other experiments highlight the beneficial role of covariance localization on reanalysis ensemble characteristics. This improved paleoclimate data assimilation system served as the basis for the production of the first publicly released NOAA Last Millennium Reanalysis.
Meteorological Applications, 2018
Stochastic kinetic energy backscatter schemes (SKEBSs) are introduced in numerical weather foreca... more Stochastic kinetic energy backscatter schemes (SKEBSs) are introduced in numerical weather forecast models to represent uncertainties related to unresolved subgrid‐scale processes. These schemes are formulated using a set of parameters that must be determined using physical knowledge and/or to obtain a desired outcome. Here, a methodology is developed for assessing the effect of four factors on spatial features of forecasts simulated by the SKEBS‐enabled Weather Research and Forecasting model. The four factors include two physically motivated SKEBS parameters (the determining amplitude of perturbations applied to stream function and potential temperature tendencies), a purely stochastic element (a seed used in generating random perturbations) and a factor reflecting daily variability. A simple threshold‐based approach for identifying coherent objects within forecast fields is employed, and the effect of the four factors on object features (e.g. number, size and intensity) is assesse...
Climate of the Past Discussions, 2017
The Last Millennial Reanalysis (LMR) employs a data assimilation approach to reconstruct climate ... more The Last Millennial Reanalysis (LMR) employs a data assimilation approach to reconstruct climate fields from annually-resolved proxy data over years 0–2000CE. We use the LMR to examine Atlantic Multidecadal Variability (AMV) over the last two millennia, and find several robust thermodynamic features associated with a positive Atlantic Multidecadal Oscillation (AMO) index that reveal a dynamically-consistent pattern of variability: the Atlantic and most continents warm; sea ice thins over the Arctic and retreats over the Greenland-Iceland-Norwegian Seas; and equatorial precipitation shifts northward. The latter is consistent with anomalous southward energy transport mediated by the atmosphere. Net downward shortwave radiation increases at both the top-of-atmosphere and surface, indicating a decrease in planetary albedo, likely due to a decrease in low clouds. Heat is absorbed by the climate system and the oceans warm. Wavelet analysis of the AMV time series shows a reddening of...
La Météorologie, 2008
Une campagne de mesures dédiée au brouillard, appelée Paris-Fog, s'est déroulée en région parisie... more Une campagne de mesures dédiée au brouillard, appelée Paris-Fog, s'est déroulée en région parisienne au cours de l'hiver 2006-2007. Cette campagne est ambitieuse de par les moyens qui ont été mis en oeuvre et la longueur de la période documentée. On décrit ici les instruments utilisés ainsi que le mode de fonctionnement de la campagne de mesures regroupant des laboratoires de recherche d'horizons différents. Puis on présente un premier bilan ainsi qu'un zoom sur une période d'observations intensives.
Weather and Forecasting, 2009
A specific event, called a low-visibility procedure (LVP), is defined when visibility is under 600 m and/or the ceiling is under 60 m at Paris-Charles de Gaulle Airport, Paris, France, to ensure air traffic safety and to reduce the economic losses related to poor visibility conditions. The Local Ensemble Prediction System (LEPS) has been designed to estimate the likelihood of LVP conditions in order to assist forecasters in their tasks. This work evaluates the skill of LEPS for each type of LVP event that occurred in the airport area during five winter seasons from 2002 to 2007. An event-based classification reveals that stratus-base lowering, advection fog, and radiation fog account for 78% of the LVP cases observed near the airport during this period. This study also demonstrates that LEPS is skillful for these types of events at short forecast ranges. When the ensemble runs start from initialized LVP events, the prediction of advection fog is as skillful as the prediction of radiation fog events...
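The LVP criterion and an ensemble-derived event probability can be sketched as below. This is an illustration of the stated thresholds, not LEPS code; the member values, `is_lvp`, and the Brier-score helper are assumptions for demonstration.

```python
def is_lvp(visibility_m, ceiling_m):
    """LVP event: visibility below 600 m and/or ceiling below 60 m."""
    return visibility_m < 600 or ceiling_m < 60

def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against binary outcomes
    (0 is a perfect score)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Ensemble LVP probability = fraction of members forecasting an LVP event
members = [(500, 100), (800, 50), (1000, 200)]   # (visibility m, ceiling m)
prob = sum(is_lvp(v, c) for v, c in members) / len(members)
print(prob)   # 2 of 3 members meet the LVP criterion -> 0.666...
```

Probabilities of this kind, verified against observed LVP occurrences with scores such as the Brier score, are the usual way the skill of an ensemble system like LEPS is quantified.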
Pure and Applied Geophysics, 2007
The scientific community, which includes meteorologists, physical scientists, engineers, medical doctors, biologists, and environmentalists, has long shown interest in a better understanding of fog because of its direct and indirect effects on the daily life of human beings. The total economic losses associated with the impact of fog on aviation, marine, and land transportation can be comparable to those of tornadoes or, in some cases, winter storms and hurricanes. The number of articles including the word ''fog'' in American Meteorological Society journals alone was found to be about 4700, indicating substantial interest in this subject. In spite of this extensive body of work, our ability to accurately forecast/nowcast fog remains limited due to our incomplete understanding of fog processes over various time and space scales. Fog processes involve droplet microphysics, aerosol chemistry, radiation, turbulence, large- and small-scale dynamics, and surface conditions (e.g., pertaining to the presence of ice, snow, liquid water, plants, and various types of soil). This review paper summarizes past achievements related to the understanding of fog formation, development, and decay; in this respect, the analysis of observations and the development of forecasting models and remote sensing methods are discussed in detail. Finally, future perspectives for fog-related research are highlighted.
Journal of the Atmospheric Sciences, 2010
To gain insight into the poorly understood phenomenon of precipitation fog, this study assesses the evaporation of freely falling drops departing from equilibrium as a possible contributing factor to fog formation in rainy conditions. The study is based on simulations performed with a microphysical column model describing the evolution of the temperature and mass of evaporating raindrops within a Lagrangian reference frame. Equilibrium defines a state in which the latent heat loss of an evaporating drop is balanced by the sensible heat flux from the ambient air, hence defining a steady-state drop temperature. Model results show that the assumption of equilibrium leads to small but significant errors in calculated precipitation evaporation rates for drops falling through continuously varying ambient near-saturated or saturated conditions. Departure from equilibrium depends on the magnitude of the vertical gradients of the ambient temperature and moisture as well as the drop-size-dependent t...
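The steady-state drop temperature described above, where latent heat loss balances the sensible heat flux from the air, can be sketched with a simple bisection solve of a psychrometric-style balance. The constant values and the neglect of ventilation and drop-size effects are simplifying assumptions for illustration, not the study's column model.

```python
import math

K_A = 0.025     # thermal conductivity of air (W m-1 K-1), assumed constant
D_V = 2.5e-5    # water-vapor diffusivity in air (m2 s-1), assumed constant
L_V = 2.5e6     # latent heat of vaporization (J kg-1)
R_V = 461.5     # gas constant for water vapor (J kg-1 K-1)

def esat(t_c):
    """Saturation vapor pressure (Pa), Magnus approximation, t_c in deg C."""
    return 611.2 * math.exp(17.62 * t_c / (t_c + 243.12))

def rho_v(t_c, rh):
    """Water-vapor density (kg m-3) at temperature t_c and relative humidity rh."""
    return rh * esat(t_c) / (R_V * (t_c + 273.15))

def equilibrium_temp(t_air, rh, tol=1e-6):
    """Steady-state drop temperature: sensible heat supply from the air
    balances latent heat loss to evaporation. Solved by bisection; it
    equals t_air when the air is saturated (no net evaporation)."""
    def imbalance(t_d):
        sensible = K_A * (t_air - t_d)
        latent = L_V * D_V * (rho_v(t_d, 1.0) - rho_v(t_air, rh))
        return sensible - latent       # decreases monotonically with t_d
    lo, hi = t_air - 40.0, t_air
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if imbalance(mid) > 0:         # drop too cold: net heating, raise bracket
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(equilibrium_temp(10.0, 0.8))   # a couple of degrees below air temperature
print(equilibrium_temp(10.0, 1.0))   # essentially 10.0 in saturated air
```

A drop whose temperature departs from this steady state (for example, one falling rapidly through layers of changing temperature and humidity) evaporates at a different rate than the equilibrium assumption predicts, which is the effect the abstract quantifies.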