Simon Scheidegger - Academia.edu

Papers by Simon Scheidegger

Ricardian Business Cycles

Deep Equilibrium Nets

International Economic Review, 2022

The climate in climate economics

Research Papers in Economics, Jul 1, 2021

We develop a generic and transparent calibration strategy for climate models used in economics. The key idea is to choose the free model parameters to match the output of large-scale Earth System Models, which are run on pre-defined future emissions scenarios and collected in the Coupled Model Intercomparison Project (CMIP5). We propose to jointly use four different test cases that are considered pivotal in the climate science literature. Two of these tests are highly idealized to allow for the separate quantitative examination of the carbon cycle and the temperature response. The other two tests are closer to the scenarios that arise from economic models. They test the climate module as a whole; that is, they incorporate gradual changes in CO2 emissions, exogenous forcing, and ultimately the temperature response. To illustrate the applicability of our method, we re-calibrate the free parameters of the climate part of the seminal DICE-2016 model for three different CMIP5 model responses: the multi-model mean as well as two other CMIP5 models that exhibit extreme but still permissible equilibrium climate sensitivities. As an additional novelty, our calibrations of DICE-2016 explicitly allow for an arbitrary time step in the model. By applying our comprehensive suite of tests, we show that (i) both the temperature equations and the carbon cycle in DICE-2016 are miscalibrated and that (ii) by re-calibrating its coefficients, we can match all three CMIP5 targets we consider. Finally, we apply the economic model from DICE-2016 in combination with the newly calibrated climate model to compute the social cost of carbon and the optimal warming. We find that in our updated model, the social cost of carbon is very similar to that of DICE-2016. A significant difference, however, is that the optimal long-run temperature lies almost one degree below that obtained by DICE-2016. This difference in climate behavior is reflected in the over-sensitivity of the social cost of carbon to the discount rate of the social planner. We also show that under the optimal mitigation scenario, the temperature predictions of DICE-2016 (in contrast to our proposed calibration) fall outside of the CMIP5 scenarios, suggesting that one might want to be skeptical about policy predictions derived from DICE-2016.
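
To make the matching idea concrete, here is a minimal sketch in Python, assuming a toy two-box energy balance model and a made-up CMIP5-style target series; `two_box_response` and all coefficients are illustrative assumptions, and the paper's actual test suite and the DICE-2016 climate equations are considerably richer.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical CMIP5-style target: temperature response (K) to a forcing step.
years = np.arange(100)
target = 3.0 * (1.0 - np.exp(-years / 30.0))  # stand-in for a multi-model mean

def two_box_response(params, forcing=3.7, dt=1.0, n=100):
    """Toy two-box energy balance model: c1, c2 are heat capacities,
    lam the climate feedback, gamma the heat exchange coefficient."""
    c1, c2, lam, gamma = params
    t_up, t_lo = 0.0, 0.0
    out = np.empty(n)
    for i in range(n):
        t_up += dt / c1 * (forcing - lam * t_up - gamma * (t_up - t_lo))
        t_lo += dt / c2 * (gamma * (t_up - t_lo))
        out[i] = t_up
    return out

# Choose the free parameters so the model output matches the target series.
fit = least_squares(lambda p: two_box_response(p) - target,
                    x0=[5.0, 100.0, 1.2, 0.7], bounds=(1e-3, np.inf))
print(fit.x)  # calibrated (c1, c2, lam, gamma)
```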

Optimal Dynamic Fiscal Policy with Endogenous Debt Limits

Since the financial crisis of 2008 and the increase in government debt levels worldwide, fiscal austerity has been a focal point in public debates. Central to these debates is the natural debt limit, i.e., the level of public debt that is sustainable in the long run, and the design of fiscal policy consistent with that limit. In much of the earlier work on dynamic fiscal policy, governments are not allowed to lend, and the upper limit on debt is determined in an ad-hoc manner. The seminal paper of Aiyagari et al. (2002) (AMSS) on fiscal policy in incomplete markets relaxed the lending assumption and revisited earlier work of Barro (1979) and Lucas and Stokey (1983) to study the implications for tax policy. Their results implied that taxes should roughly follow a random walk. They also presented examples where the long-run tax rate is zero and any spending is financed out of the government's asset income (i.e., the government holds debt of the people). However, their approach had some weaknesses. F...

Machine Learning for High-Dimensional Dynamic Stochastic Economies

SSRN Electronic Journal, 2017

We present the first computational framework that can compute global solutions to very-high-dimensional dynamic stochastic economic models on arbitrary state space geometries. This framework can also resolve the local features of value and policy functions and perform uncertainty quantification in a self-consistent manner. We achieve this by combining Gaussian process machine learning with the active subspace method; we then embed this into a massively parallelized discrete-time dynamic programming algorithm. To demonstrate the broad applicability of our method, we compute solutions to stochastic optimal growth models of up to 500 continuous dimensions. We also show that our framework can address parameter uncertainty and can provide predictive confidence intervals for policies that correspond to the epistemic uncertainty induced by limited data. Finally, we propose an algorithm, based on this framework, that is capable of learning irregularly shaped ergodic sets as well as performing dynamic programming on them.
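
A compact sketch of the core combination the abstract describes, Gaussian process regression on an active subspace, using NumPy and scikit-learn on a synthetic function; the test function, sample sizes, and finite-difference gradients are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Hypothetical high-dimensional function with low-dimensional active structure.
d = 20
w = rng.normal(size=d); w /= np.linalg.norm(w)
f = lambda X: np.sin(X @ w) + 0.5 * (X @ w) ** 2

X = rng.uniform(-1, 1, size=(200, d))
y = f(X)

# Active subspace: eigenvectors of the average outer product of gradients
# (gradients estimated here by one-sided finite differences).
eps = 1e-4
G = np.stack([(f(X + eps * np.eye(d)[i]) - y) / eps for i in range(d)], axis=1)
C = G.T @ G / len(X)
eigvals, eigvecs = np.linalg.eigh(C)
W1 = eigvecs[:, -1:]  # leading active direction

# Fit the GP on the projected, low-dimensional coordinates.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6).fit(X @ W1, y)
mean, std = gp.predict(rng.uniform(-1, 1, size=(5, d)) @ W1, return_std=True)
print(mean, std)  # predictive mean plus an epistemic uncertainty band
```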

High-frequency and high-dimensionality in Finance

This thesis consists of two fundamental chapters. First, I analyze the presence of jumps in high-frequency financial series, with an extensive application to the most liquid American equities. I develop a statistical test that formally treats the multiple-testing issue and show, in substance, that discontinuities are much rarer events than previously thought. Second, I introduce a novel, scalable numerical pricing framework for American options under multi-factor models. I make use of adaptive sparse grids to alleviate the curse of dimensionality. My methodology offers a new direction for research in the pricing of derivative contracts with early-exercise features under models actually capable of fitting the statistical properties of financial series.
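
As a stand-in for the first chapter's idea (not the thesis's actual test), the sketch below flags jumps in simulated returns while controlling the false discovery rate across the many simultaneous tests; the window length, significance level, and data-generating process are all assumptions.

```python
import numpy as np
from scipy.stats import norm
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)

# Simulated 1-minute log returns with a few injected jumps.
n = 10_000
r = 0.0005 * rng.normal(size=n)
r[rng.choice(n, size=5, replace=False)] += 0.01  # rare genuine jumps

# Local volatility from a trailing window of absolute returns.
window = 120
local_vol = np.array([np.mean(np.abs(r[max(0, i - window):i])) for i in range(1, n + 1)])
local_vol *= np.sqrt(np.pi / 2)  # rescale mean absolute deviation to a std. dev.
z = np.abs(r) / local_vol
pvals = 2 * (1 - norm.cdf(z))

# Testing all n returns at a fixed level inflates false positives;
# control the false discovery rate across the whole panel instead.
reject, *_ = multipletests(pvals, alpha=0.01, method="fdr_bh")
print("flagged jumps:", np.flatnonzero(reject))
```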

Rethinking large-scale Economic Modeling for Efficiency: Optimizations for GPU and Xeon Phi Clusters

2018 IEEE International Parallel and Distributed Processing Symposium (IPDPS), 2018

We propose a massively parallelized and optimized framework to solve high-dimensional dynamic stochastic economic models on modern GPU- and MIC-based clusters. First, we introduce a novel approach for adaptive sparse grid index compression alongside a surplus matrix reordering, which significantly reduces the global memory throughput of the compute kernels and maps randomly accessed data onto cache or fast shared memory. Second, we fully vectorize the compute kernels for AVX, AVX2, and AVX512 CPUs, respectively. Third, we develop a hybrid, cluster-oriented, work-preempting scheduler based on TBB, which evenly distributes the time-iteration workload onto the available CPU cores and accelerators. Numerical experiments on the Cray XC40 KNL "Grand Tave" and Cray XC50 "Piz Daint" systems at the Swiss National Supercomputing Centre (CSCS) show that our framework scales nicely to at least 4,096 compute nodes, resulting in an overall speedup of more than four orders of magnitude compared to a single, optimized CPU thread. As an economic application, we compute global solutions to an annually calibrated stochastic public finance model with sixteen discrete, stochastic states with unprecedented performance.
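
The flavor of the compute kernel being optimized can be illustrated with a vectorized evaluation of a d-linear (hat function) sparse grid interpolant; this NumPy sketch only mimics, in spirit, the AVX-vectorized kernels and compressed index sets described in the paper, and the grid points and surpluses below are made up.

```python
import numpy as np

# A d-linear (hat) basis on level l, index i over [0, 1]:
#   phi_{l,i}(x) = max(0, 1 - |2^l * x - i|).
# The interpolant is a sum of hat functions weighted by hierarchical surpluses.

def eval_sparse_grid(levels, indices, surpluses, X):
    """levels, indices: (n_pts, d) int arrays; surpluses: (n_pts,); X: (n_q, d)."""
    # basis[q, p, k] = phi_{l_pk, i_pk}(X[q, k]); take the product over dimensions k.
    basis = np.maximum(0.0, 1.0 - np.abs(2.0 ** levels * X[:, None, :] - indices))
    return basis.prod(axis=2) @ surpluses

# Tiny 2-D example: three grid points with made-up surpluses.
levels = np.array([[1, 1], [2, 1], [1, 2]])
indices = np.array([[1, 1], [1, 1], [1, 3]])
surpluses = np.array([0.8, -0.1, 0.05])
X = np.random.default_rng(2).uniform(0, 1, size=(4, 2))
print(eval_sparse_grid(levels, indices, surpluses, X))
```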

Uniformly Self-Justified Equilibria

SSRN Electronic Journal, 2021

We consider dynamic stochastic economies with heterogeneous agents and introduce the concept of uniformly self-justified equilibria (USJE): temporary equilibria for which forecasts are the best uniform approximations to a selection of the equilibrium correspondence. In a USJE, individuals' forecasting functions for the next period's endogenous variables are assumed to lie in a compact, finite-dimensional set of functions, and the forecasts constitute the best approximation within this set. We show that USJE always exist and develop a simple algorithm to compute them. They are therefore more tractable than rational expectations equilibria, which do not always exist. As an application, we discuss a stochastic overlapping generations exchange economy and provide numerical examples to illustrate the concept of USJE and the computational method.
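
A minimal numerical sketch of the USJE idea, assuming a hypothetical one-dimensional temporary-equilibrium map and a linear forecast family; the key ingredient is that the forecast rule is updated by best uniform (sup-norm) approximation over a compact coefficient set, unlike least-squares-based learning schemes.

```python
import numpy as np
from scipy.optimize import minimize

# Toy setting: agents forecast the next-period price from state x with a linear
# rule f(x; theta) = theta0 + theta1 * x, restricted to a compact coefficient box.
# The "temporary equilibrium" map below is a made-up stand-in; in the paper it
# comes from market clearing given the forecasts.
def realized_price(x, theta):
    forecast = theta[0] + theta[1] * x
    return 0.5 + 0.8 * np.tanh(x) + 0.2 * forecast

grid = np.linspace(-2, 2, 101)
theta = np.array([0.0, 0.0])
for _ in range(50):
    target = realized_price(grid, theta)
    # Best *uniform* approximation: minimize the sup-norm of the forecast error.
    obj = lambda t: np.max(np.abs(t[0] + t[1] * grid - target))
    theta_new = np.clip(minimize(obj, theta, method="Nelder-Mead").x, -5, 5)
    if np.max(np.abs(theta_new - theta)) < 1e-8:
        break  # forecasts are self-justified: best approximation of what they induce
    theta = theta_new
print("USJE-style forecast coefficients:", theta)
```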

Large-Scale Precision Matrix Estimation With SQUIC

SSRN Electronic Journal, 2021

High-dimensional sparse precision matrix estimation is a ubiquitous task in multivariate analysis, with applications that cross many disciplines. In this paper, we introduce the SQUIC package, which offers superior runtime performance and scalability, significantly exceeding that of the available state-of-the-art packages. The package implements a second-order method that solves the L1-regularized maximum likelihood problem using highly optimized linear algebra subroutines, which leverage the underlying sparsity and the intrinsic parallelism in the computation. We provide two sets of numerical tests: the first consists of didactic examples using synthetic datasets, highlighting the performance and accuracy of the package, and the second is a real-world classification problem on high-dimensional medical datasets. The base algorithm is implemented in C++ with interfaces for R and Python.
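
The optimization problem SQUIC targets can be written down with scikit-learn's GraphicalLasso as a (much slower) reference stand-in; the synthetic tridiagonal precision matrix and penalty value are illustrative, and SQUIC's own R/Python interfaces and its optimized C++ solver are precisely the package's contribution.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(4)

# Synthetic data from a sparse (tridiagonal) precision matrix.
p, n = 30, 2_000
theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
cov = np.linalg.inv(theta)
Y = rng.multivariate_normal(np.zeros(p), cov, size=n)

# L1-regularized maximum likelihood estimate of the precision matrix.
model = GraphicalLasso(alpha=0.05).fit(Y)
est = model.precision_
print("nonzeros recovered:", np.count_nonzero(np.abs(est) > 1e-3))
```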

Deep Structural Estimation: With an Application to Option Pricing

ERN: Estimation (Topic), 2021

We propose a novel structural estimation framework in which we train a surrogate of an economic model with deep neural networks. Our methodology alleviates the curse of dimensionality and speeds up evaluation and parameter estimation by orders of magnitude, which significantly enhances one's ability to conduct analyses that require frequent parameter re-estimation. As an empirical application, we compare two popular option pricing models (the Heston model and the Bates model with double-exponential jumps) against a non-parametric random forest model. We document that: a) the Bates model produces better out-of-sample pricing on average, but both structural models fail to outperform the random forest for large areas of the volatility surface; b) the random forest is more competitive at short horizons (e.g., 1 day), for short-dated options (with less than 7 days to maturity), and on days with poor liquidity; c) both structural models outperform the random forest in out-of-sample delta hedging; ...
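
A toy sketch of the surrogate-based estimation loop, with a made-up two-parameter pricing function standing in for the Heston or Bates model and a small MLP in place of the paper's deep network; all functional forms and sample sizes are assumptions. The expensive step (simulating the model) happens once, up front; re-estimation then only queries the cheap surrogate.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)

# Stand-in "structural model": a toy pricing function of two parameters theta
# and a strike-like feature k.
def model_price(theta, k):
    return np.exp(-theta[0] * k) + theta[1] * k**2

# 1) Train the surrogate once on simulated (parameters, feature) -> price pairs.
thetas = rng.uniform([0.1, 0.0], [2.0, 1.0], size=(20_000, 2))
ks = rng.uniform(0.5, 1.5, size=20_000)
X = np.column_stack([thetas, ks])
y = model_price(thetas.T, ks)
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300).fit(X, y)

# 2) Re-estimation is now cheap: minimize the surrogate's pricing error.
k_obs = np.linspace(0.6, 1.4, 9)
p_obs = model_price([0.7, 0.4], k_obs)  # "observed" prices
def loss(theta):
    Xq = np.column_stack([np.tile(theta, (len(k_obs), 1)), k_obs])
    return np.mean((surrogate.predict(Xq) - p_obs) ** 2)
print(minimize(loss, x0=[1.0, 0.5], method="Nelder-Mead").x)  # ~ [0.7, 0.4]
```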

Pareto-improving carbon-risk taxation

Economic Policy, 2021

Anthropogenic climate change produces two conceptually distinct negative economic externalities. The first is an expected path of climate damage. The second, the focus of this paper, is an expected path of economic risk. To isolate the climate-risk problem, we consider three mean-zero, symmetric shocks in our 12-period, overlapping generations model. These shocks impact dirty energy usage (carbon emissions), the relationship between carbon concentration and temperature, and the connection between temperature and damages. By construction, our model exhibits a de minimis climate problem absent its shocks. However, due to non-linearities, symmetric shocks deliver negatively skewed impacts, including the potential for climate disasters. As we show, Pareto-improving carbon taxation can dramatically lower climate risk, in general, and disaster risk, in particular. The associated climate-risk tax, which is focused exclusively on limiting climate risk, can be as large as, or larger t...
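
The mechanism in the abstract, symmetric shocks producing negatively skewed impacts through non-linearity, can be demonstrated in a few lines; the convex damage function and its coefficients below are made up purely for illustration.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(6)

# Mean-zero, symmetric temperature shocks...
eps = rng.normal(0.0, 0.5, size=1_000_000)

# ...pushed through a convex damage function (made-up coefficients)
# yield negatively skewed welfare impacts.
temperature = 2.0 + eps
damages = 0.01 * temperature**2 + 0.002 * temperature**6  # convex in T
welfare_impact = -damages

print("skew of shocks:        ", round(skew(eps), 3))             # ~ 0
print("skew of welfare impact:", round(skew(welfare_impact), 3))  # < 0
```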

Parallelized Dimensional Decomposition for Dynamic Stochastic Economic Models

This work is concerned with a generalized solution method for high-dimensional dynamic stochastic economic models. The introduced method combines dimensional decomposition with sparse grid approximation to formulate a massively parallelized computational framework. The algorithm is applied to high-dimensional economic models, with testing conducted at the Swiss National Supercomputing Centre. Our results show that the method can effectively mitigate the so-called "curse of dimensionality", allowing models with up to 200 continuous dimensions to be computed.
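
A minimal sketch of the dimensional decomposition idea (an anchored, first-order cut-HDMR expansion); the toy model function and anchor point are assumptions, and the paper combines such component terms with sparse grid approximation rather than the exact evaluations used here.

```python
import numpy as np

# Anchored (cut-HDMR) decomposition around an anchor point a:
#   f(x) ~ f(a) + sum_i [ f(x_i, a_{-i}) - f(a) ] + higher-order terms.
# Each first-order component varies only one coordinate, so the terms can be
# built independently -- the source of the massive parallelism.

def f(x):  # stand-in for an expensive economic model evaluation
    return np.sin(x[0]) + x[1] ** 2 + 0.1 * x[0] * x[2]

d = 3
anchor = np.full(d, 0.5)
f0 = f(anchor)

def first_order_surrogate(x):
    total = f0
    for i in range(d):  # each term is an independent 1-D problem
        xi = anchor.copy()
        xi[i] = x[i]
        total += f(xi) - f0
    return total

x = np.array([0.9, 0.2, 0.7])
print(f(x), first_order_surrogate(x))  # close when interactions are weak
```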

Making Carbon Taxation a Generational Win Win

Felix Kubler and Simon Scheidegger are generously supported by a grant from the Swiss Platform for Advanced Scientific Computing (PASC) under project ID "Computing equilibria in heterogeneous agent macro models on contemporary HPC platforms". Simon Scheidegger gratefully acknowledges support from the Cowles Foundation at Yale University. Laurence Kotlikoff thanks Boston University and the Gaidar Institute for research support. The views expressed herein are those of the authors and do not necessarily reflect the views of the National Bureau of Economic Research.

Block-enhanced precision matrix estimation for large-scale datasets

The l1-regularized Gaussian maximum likelihood method is a common approach for sparse precision matrix estimation, but one that poses a computational challenge for high-dimensional datasets. We present a novel l1-regularized maximum likelihood method for performant large-scale sparse precision matrix estimation that utilizes the block structures in the underlying computations. We identify the computational bottlenecks and contribute a block coordinate descent update as well as a block approximate matrix inversion routine, which is then parallelized using a shared-memory scheme. We demonstrate the effectiveness, accuracy, and performance of these algorithms. Our numerical examples and comparative results with various modern open-source packages reveal that these precision matrix estimation methods can accelerate the computation of covariance matrices by two to three orders of magnitude, while keeping memory requirements modest. Furthermore, we conduct large-scale case studies for applica...

Detecting Edgeworth Cycles

SSRN Electronic Journal, 2021

We propose algorithms to detect "Edgeworth cycles," asymmetric price movements that have caused antitrust concerns in many countries. We formalize four existing methods and propose six new methods based on spectral analysis and machine learning. We evaluate their accuracy on station-level gasoline-price data from Western Australia, New South Wales, and Germany. Most methods achieve high accuracy in the first two regions, but only a few can detect the nuanced cycles in the third. The results suggest that whether researchers find a positive or negative statistical relationship between cycles and markups, and hence their implications for competition policy, crucially depends on the choice of method.
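
In the spirit of the simple existing methods the paper formalizes, a stand-in detector can score the asymmetry between price rises and falls; the synthetic sawtooth series and the specific statistic below are illustrative assumptions, not one of the paper's ten methods.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic station-level prices: sharp restorations followed by slow
# undercutting phases (the sawtooth shape of Edgeworth cycles), plus noise.
t = np.arange(400)
prices = 1.50 + 0.10 * ((-t % 25) / 25) + 0.003 * rng.normal(size=t.size)

def cycle_score(p):
    """Asymmetry statistic: mean rise vs. mean fall of price changes.
    Values well above 1 suggest sawtooth (cycling) behaviour."""
    dp = np.diff(p)
    rises, falls = dp[dp > 0], dp[dp < 0]
    if len(rises) == 0 or len(falls) == 0:
        return np.nan
    return np.mean(rises) / np.mean(np.abs(falls))

print("cycling series score:", round(cycle_score(prices), 2))
print("random-walk score:   ", round(cycle_score(np.cumsum(0.003 * rng.normal(size=400))), 2))
```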

Can Today's and Tomorrow's World Uniformly Gain from Carbon Taxation?

Climate change will impact current and future generations in different regions very differently. This paper develops a large-scale, annually calibrated, multi-region, overlapping generations model of climate change to study its heterogeneous effects across space and time. We model the relationship between carbon emissions and the global average temperature based on the latest climate science. The predicted average global temperature is used to determine, via pattern scaling, region-specific temperatures and damages. Our main focus is determining the carbon policy that delivers present and future mankind the highest uniform percentage welfare gains, arguably the policy with the highest chance of global adoption. Damages from climate change are positive for all regions apart from Russia and Canada, with India and South Asia Pacific suffering the most. The optimal policy is implemented via a time-varying global carbon tax plus region- and generation-specific net transfers. Uniform welfare-improving carbon policy can materially limit global emissions, dramatically shorten the use of fossil fuels, and raise the welfare of all current and future agents by over four percent. Unfortunately, the pursuit of carbon policy by individual regions, even large ones, makes only a limited difference. However, coalitions of regions, particularly ones including China, can materially limit carbon emissions.

Parallelized Dimensional Decomposition for Large-Scale Dynamic Stochastic Economic Models

Proceedings of the Platform for Advanced Scientific Computing Conference, 2017

We introduce and deploy a generic, highly scalable computational method to solve high-dimensional dynamic stochastic economic models on high-performance computing platforms. Within an MPI–TBB parallel, nonlinear time iteration framework, we approximate economic policy functions using an adaptive sparse grid algorithm with d-linear basis functions that is combined with a dimensional decomposition scheme. Numerical experiments on "Piz Daint" (Cray XC30) at the Swiss National Supercomputing Centre show that our framework scales nicely to at least 1,000 compute nodes. As an economic application, we compute global solutions to international real business cycle models with up to 200 continuous dimensions, with significant speedups over state-of-the-art techniques.

High-Dimensional Dynamic Stochastic Model Representation

SSRN Electronic Journal, 2020

We propose a generic and scalable method for computing global solutions of nonlinear, high-dimensional dynamic stochastic economic models. First, within an MPI–TBB parallel time-iteration framework, we approximate economic policy functions using an adaptive, high-dimensional model representation scheme, combined with adaptive sparse grids. With increasing dimensions, the number of points in this efficiently chosen combination of low-dimensional grids grows much more slowly than in standard tensor product grids, sparse grids, or even adaptive sparse grids. Moreover, the adaptivity within the individual component functions adds an additional layer of sparsity, since grid points are added only where they are most needed, that is, in regions of the computational domain with steep gradients or at non-differentiabilities. Second, we introduce a performant vectorization scheme for the interpolation compute kernel. Third, we validate our claims with numerical experiments conducted on "Piz Daint" (Cray XC50) at the Swiss National Supercomputing Centre. We observe significant speedups over state-of-the-art techniques, and almost ideal strong scaling up to at least 1,000 compute nodes. Fourth, to demonstrate the broad applicability of our method, we compute global solutions to two different versions of a dynamic stochastic economic model: a high-dimensional international real business cycle model with capital adjustment costs, with or without irreversible investment. We solve these models with up to 300 continuous state variables globally.

Pareto-Improving Carbon-Risk Taxation

At least one co-author has disclosed a financial relationship of potential relevance for this research. Further information is available online at http://www.nber.org/papers/w26919.ack. NBER working papers are circulated for discussion and comment purposes. They have not been peer-reviewed or subjected to the review by the NBER Board of Directors that accompanies official NBER publications.
