Juan M Restrepo | Oregon State University
Papers by Juan M Restrepo
Tellus B, 2006
Emily Lane, Synte Peacock, Juan M. Restrepo. ... phosphate levels and particulate flux, which is lacking in the conventional carbon-cycle box models that have been applied to the glacial-interglacial atmospheric pCO2 problem (Sarmiento and Toggweiler, 1984) ...
Journal of Geophysical Research, 2011
... the conservative vortex force. Following Yu and Slinn (2003), we consider rip currents initially generated by alongshore variation of wave breaking due to a perturbation of a barred bottom topography. This variation is reduced ...
Physica D: Nonlinear Phenomena, 2004
This paper considers the problem of data assimilation into nonlinear stochastic dynamic equations, from the point of view that the optimal solution is provided by the probabilities conditioned upon observations. An implementation of Bayes' formula is described to calculate such probabilities. In the context of a simple model with multimodal statistics, it is shown that the conditional statistics succeed in ...
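A minimal sketch of the idea, assuming a scalar state, a bimodal prior, and a Gaussian observation error (all illustrative, not the paper's model): the posterior is computed on a grid directly from Bayes' formula, and its conditional moments retain the multimodal structure that a Gaussian-based scheme would miss.

```python
# Grid-based Bayes update for a bimodal prior and a noisy observation
# y = x + eta. All distributions here are illustrative assumptions.
import numpy as np

x = np.linspace(-5.0, 5.0, 1001)                    # state grid
prior = 0.5 * np.exp(-0.5 * (x + 2.0) ** 2) \
      + 0.5 * np.exp(-0.5 * (x - 2.0) ** 2)         # bimodal prior (unnormalized)
prior /= np.trapz(prior, x)

y, sigma = 1.0, 1.5                                 # observation and its noise std
likelihood = np.exp(-0.5 * ((y - x) / sigma) ** 2)  # Gaussian observation error

posterior = prior * likelihood                      # Bayes' formula, then normalize
posterior /= np.trapz(posterior, x)

mean = np.trapz(x * posterior, x)                   # conditional mean
var = np.trapz((x - mean) ** 2 * posterior, x)      # conditional variance
print(f"posterior mean {mean:.3f}, variance {var:.3f}")
```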
Monthly Weather Review, 2003
... h_m(x), N times. This scheme can be shown to be convergent to the optimal filter estimate in the limit as N → ∞ [see Del Moral (1996) and also the useful, recent review of Crisan and Doucet (2002)]. However, the rate of convergence ...
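A minimal sketch of the N-sample scheme in question, a bootstrap particle filter (sequential importance resampling) whose estimate converges to the optimal filter as N → ∞; the dynamics f and the noise levels are illustrative assumptions, not the model of the paper.

```python
# Bootstrap particle filter on a toy scalar nonlinear model.
import numpy as np

rng = np.random.default_rng(0)
N, T = 5000, 50                       # particles, time steps
q, r = 0.5, 1.0                       # process and observation noise std

def f(x):                             # illustrative nonlinear dynamics
    return 0.9 * x + 2.0 * np.sin(x)

# synthesize a truth trajectory and observations y_t = x_t + noise
x_true, ys = 0.0, []
for _ in range(T):
    x_true = f(x_true) + q * rng.standard_normal()
    ys.append(x_true + r * rng.standard_normal())

particles = rng.standard_normal(N)    # initial ensemble
for y in ys:
    particles = f(particles) + q * rng.standard_normal(N)  # propagate
    w = np.exp(-0.5 * ((y - particles) / r) ** 2)          # weight by likelihood
    w /= w.sum()
    particles = particles[rng.choice(N, size=N, p=w)]      # resample

print("filter mean:", particles.mean(), " truth:", x_true)
```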
Journal of Geophysical Research, 2009
Physica D: Nonlinear Phenomena, 2008
We present a data assimilation methodology capable of capturing at least the first few moments of the posterior distribution of the state of a system, given data. Both the system dynamics and the data are assumed to have errors, with well-known distributions. However, the assimilation procedure requires no Gaussianity or linearity assumptions. The method's computational complexity is comparable to that of the extended Kalman filter in a variety of general problems. We show comparisons of the sample-based method to a benchmark as well as to extended and ensemble Kalman filter calculations, in the context of oceanic Lagrangian data assimilation.
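For reference, a minimal sketch of the ensemble Kalman filter analysis step used as one of the comparison baselines; the linear observation operator H, the noise covariance R, and the two-state setup are illustrative assumptions, not the paper's configuration.

```python
# One perturbed-observation EnKF analysis step.
import numpy as np

rng = np.random.default_rng(1)

def enkf_analysis(E, y, H, R):
    """E is (n_state, n_ens); y is the observation vector."""
    n = E.shape[1]
    A = E - E.mean(axis=1, keepdims=True)        # ensemble anomalies
    HE = H @ E
    HA = HE - HE.mean(axis=1, keepdims=True)
    Pyy = HA @ HA.T / (n - 1) + R                # innovation covariance
    Pxy = A @ HA.T / (n - 1)                     # cross covariance
    K = Pxy @ np.linalg.inv(Pyy)                 # Kalman gain
    yp = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=n).T           # perturbed observations
    return E + K @ (yp - HE)

E = rng.standard_normal((2, 100))                # 2-state, 100-member ensemble
H = np.array([[1.0, 0.0]])                       # observe the first component
R = np.array([[0.25]])
E = enkf_analysis(E, np.array([0.8]), H, R)
print("analysis mean:", E.mean(axis=1))
```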
Journal of Physical Oceanography, 2001
During the last glacial era, the partial pressure of atmospheric carbon dioxide (CO2) was around 200 ppmv, approximately 80 ppmv less than the preindustrial value. Estimates suggest that the ocean is the only reservoir large enough to account for such a dramatic change in the carbon budget over these time scales. The distribution of total CO2 (dissolved CO2, carbonate, and bicarbonate) throughout the ocean is controlled by air-sea exchange, the strength of the thermohaline flow, mixing between the surface and the deep water, and the particulate flux of organic matter sinking to the bottom of the ocean. There is debate over the roles played by the high- and low-latitude oceans in the storage of CO2 and their relative importance in the process: box models of the CO2 cycle have been proposed which suggest that conditions in the high-latitude oceans largely determine the amount of CO2 the ocean is able to store. However, general circulation models (GCMs) indicate that the low-latitude oceans influence CO2 storage to a greater degree than the box models suggest. We propose a box model that to a large extent resolves the debate, while also retaining consistency with the fundamental fluid and thermodynamic equations governing the flow. We describe this carbon-dioxide-preserving model and show that, by changing the forcing, the model yields glacial and interglacial solutions when realistic parameter values are used.
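To fix ideas, a minimal two-box sketch (one high-latitude box, one low-latitude box) of the generic ingredients named above, thermohaline exchange plus air-sea exchange; the overturning strength, box volumes, exchange rate, and equilibrium concentrations are illustrative assumptions, not the parameters of the paper's model.

```python
# Toy two-box ocean carbon model: overturning exchange + air-sea restoring.
import numpy as np

psi = 2.0e7                      # overturning transport (m^3/s), illustrative
V = np.array([3.0e16, 1.0e17])   # high- and low-latitude box volumes (m^3)
k_as = 1.0 / (10 * 3.15e7)       # air-sea exchange rate (1/s), ~10-yr timescale
C_eq = np.array([2.3, 2.0])      # equilibrium total CO2 (mol/m^3); cold water holds more

def dCdt(C):
    exchange = psi * np.array([C[1] - C[0], C[0] - C[1]]) / V  # thermohaline flow
    air_sea = k_as * (C_eq - C)                                # air-sea exchange
    return exchange + air_sea

C = np.array([2.2, 2.1])         # initial total CO2 per box (mol/m^3)
dt = 3.15e7 / 12                 # one month (s)
for _ in range(12 * 2000):       # forward Euler over 2000 years
    C = C + dt * dCdt(C)
print("equilibrated box concentrations (mol/m^3):", C)
```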
arXiv:1307.0584, Jul 1, 2013
Beyond a methodological comparison, this study gives estimates of parameter values for the bottom drag and surface forcing that make the particular model most consistent with data; furthermore, we derive sensitivity estimates that provide useful insights into the estimation procedure as well as the model itself.
The assimilation of oceanic or atmospheric data that takes into account path information is generally highly nonlinear and non-Gaussian. The assimilation problem is compounded by the operational necessity of relating statistics along paths to statistics on fixed grids, which are more commonly used for the numerical computation of model flows. I will present a comparison of the path integral Monte Carlo method (PIMC), the Diffusion Kernel Filter (DKF), and the more common extended and ensemble-based Kalman filter estimation strategies in the assimilation of model and data. The outcomes are compared to a benchmark calculation of the same problem. The comparison will highlight the relative merits of the methods, particularly of the PIMC and DKF. Moreover, the calculation will be used to argue that statistical convergence and the choice of the right estimator are crucial in arriving at meaningful nonlinear/non-Gaussian estimators.
We propose criteria that define a trend for time series with inherent multi-scale features. We call this trend the tendency of a time series. The tendency is defined empirically by a set of criteria and captures the large-scale temporal variability of the original signal as well as the most frequent events in its histogram. Among other properties, the tendency has a variance no larger than that of the original signal; the histogram of the difference between the original signal and the tendency is as symmetric as possible; and, with reduced complexity, the tendency captures essential features of the signal. To find the tendency we first use the Intrinsic Time-Scale Decomposition (ITD) of the signal, introduced in 2007 by Frei and Osorio, to produce a set of candidate tendencies. We then apply the criteria to each of the candidates to single out the one that best agrees with them. While the criteria for the tendency are independent of the signal decomposition scheme, it is found ...
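A sketch of the selection step only: given candidate tendencies built as partial sums of multiscale components (here hypothetical components standing in for the ITD decomposition of Frei and Osorio), keep candidates whose variance does not exceed the signal's and pick the one whose residual histogram is most symmetric, measured here by skewness as one possible symmetry proxy.

```python
# Candidate-tendency selection by the variance and symmetry criteria.
import numpy as np

def skewness(r):
    r = r - r.mean()
    return np.mean(r ** 3) / (np.std(r) ** 3 + 1e-12)

def pick_tendency(signal, components):
    """components: arrays summing to `signal`, ordered fine to coarse scale."""
    candidates = [np.sum(components[k:], axis=0)
                  for k in range(1, len(components))]   # coarse partial sums
    best, best_skew = None, np.inf
    for cand in candidates:
        if np.var(cand) > np.var(signal):               # variance no larger
            continue
        s = abs(skewness(signal - cand))                # residual symmetry
        if s < best_skew:
            best, best_skew = cand, s
    return best

# Illustrative usage with hand-made components: noise, fast mode, slow trend.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 2000)
slow, fast = np.sin(0.5 * t), 0.3 * np.sin(7.0 * t)
noise = 0.1 * rng.standard_normal(t.size)
sig = slow + fast + noise
tendency = pick_tendency(sig, [noise, fast, slow])
```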
We propose a simple model of the coupling between oceanic circulation and sea ice dynamics on long time scales. The model begins with a one-dimensional Budyko-Sellers energy balance model of ice-albedo feedback, with a linearized temperature dependence of outgoing longwave radiation. This sits atop a box model of ocean circulation, with conventional thermohaline forcing except that surface heat exchange occurs via the Budyko-Sellers model. The ocean and the ice sheet are coupled via advection and plastic flow of ice, and by the thermodynamics of the ice/seawater interface. We use this model to assess how (and by what mechanisms) ocean circulation and ice sheet dynamics affect one another, primarily to investigate the role played by changes in solar input and greenhouse gas forcing, e.g. in the Snowball Earth scenario.
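A minimal one-dimensional Budyko-Sellers sketch with the linearized outgoing longwave radiation A + B·T named in the abstract; the diffusivity, albedo values, and forcing constants are textbook-style assumptions, and the ocean box coupling of the paper is omitted.

```python
# 1-D Budyko-Sellers energy balance model on a sine-of-latitude grid.
import numpy as np

n = 60
x = np.linspace(-0.95, 0.95, n)        # x = sine of latitude
dx = x[1] - x[0]
S = 341.3 * (1.0 - 0.482 * (3 * x**2 - 1) / 2)  # annual-mean insolation (W/m^2)
A, B = 203.3, 2.09                     # OLR = A + B*T (W/m^2, T in deg C)
D = 0.55                               # horizontal heat diffusivity (W/m^2/K)
C = 9.8                                # heat capacity (W yr / m^2 / K)

def albedo(T):
    # smooth ice-albedo feedback: albedo rises toward 0.62 below ~ -10 deg C
    return 0.30 + 0.32 / (1.0 + np.exp((T + 10.0) / 2.0))

T = 25.0 * (1 - x**2) - 10.0           # initial temperature guess (deg C)
dt = 0.005                             # years; satisfies the diffusion CFL here
for _ in range(6000):                  # ~30 model years, near equilibrium
    diff = D * np.gradient((1 - x**2) * np.gradient(T, dx), dx)
    T = T + dt / C * (S * (1 - albedo(T)) - (A + B * T) + diff)
print("global-mean temperature (deg C):", T.mean())
```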
DIASS-M4C, a digital additive instrument, was implemented on the Argonne National Laboratory's IBM POWER parallel System (SP). This paper discusses the need for a massively parallel supercomputer and shows how the code was parallelized. The resulting sounds and the degree of control afforded to the user justify the effort and the use of such a large computer.
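A sketch of why additive synthesis parallelizes naturally: each partial (sine oscillator) can be rendered independently and then summed, so partials can be distributed across processors. The envelope, spectral rolloff, and partial count below are illustrative, not DIASS-M4C's actual control scheme.

```python
# Additive synthesis: each partial is an independent unit of parallel work.
import numpy as np

sr = 44100                                 # sample rate (Hz)
t = np.arange(sr) / sr                     # one second of samples
f0, n_partials = 110.0, 64

def render_partial(k):
    """Render partial k independently -- the unit of parallel work."""
    amp = 1.0 / k                          # illustrative 1/k spectral rolloff
    env = np.exp(-3.0 * t)                 # illustrative decay envelope
    return amp * env * np.sin(2 * np.pi * k * f0 * t)

# Summed serially here; on a parallel machine each rank would render a
# subset of the partials and the partial sums would be reduced at the end.
signal = sum(render_partial(k) for k in range(1, n_partials + 1))
signal /= np.max(np.abs(signal))           # normalize to [-1, 1]
```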