George Tauchen - Profile on Academia.edu
Papers by George Tauchen
Journal of Econometrics, 2022
[1] "Continuous-Time Volatility Regression in Large Panels," (Job Market Paper).
We consider specification and inference for the stochastic scale of discretely observed pure-jump semimartingales with locally stable Lévy densities in the setting where both the time span of the data set increases and the mesh of the observation grid decreases. The estimation is based on constructing a nonparametric estimate of the empirical Laplace transform of the stochastic scale over a given interval of time by aggregating high-frequency increments of the observed process on that time interval into a statistic we call the realized Laplace transform. The realized Laplace transform depends on the activity of the driving pure-jump martingale, and we consider both the case in which the latter is known and the case in which it has to be inferred from the data.
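The aggregation step can be illustrated with a short numerical sketch. The function below is a hypothetical illustration of the realized-Laplace-transform idea, not the paper's exact statistic: the normalization and the argument scaling are assumptions for the sketch, and the activity index beta is treated as known.

```python
import numpy as np

def realized_laplace_transform(increments, delta_n, u, beta=2.0):
    """Sketch of a realized Laplace transform of the stochastic scale.

    increments : high-frequency increments of the observed process
    delta_n    : mesh of the observation grid (length of one sampling interval)
    u          : point at which the Laplace transform is evaluated
    beta       : activity index of the driving pure-jump martingale (assumed
                 known here; the paper also treats the case where it must be
                 inferred from the data)
    """
    # Scale increments by delta_n**(1/beta) so small moves are of order one,
    # then aggregate a cosine transform of the scaled increments over the interval.
    scaled = increments / delta_n ** (1.0 / beta)
    return delta_n * np.sum(np.cos(u ** (1.0 / beta) * scaled))

# Toy usage on simulated Brownian increments (activity index beta = 2).
rng = np.random.default_rng(0)
delta_n = 1.0 / 390
x_incr = np.sqrt(delta_n) * rng.standard_normal(390)
print(realized_laplace_transform(x_incr, delta_n, u=1.0, beta=2.0))
```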
We derive a nonparametric test for constant beta over a fixed time interval from high-frequency observations of a bivariate Itô semimartingale. Beta is defined as the ratio of the spot continuous covariation between an asset and a risk factor to the spot continuous variation of the latter. The test is based on the asymptotic behavior of the covariation between the risk factor and an estimate of the residual component of the asset, which is orthogonal (in the martingale sense) to the risk factor, over blocks with asymptotically shrinking time span. Rate optimality of the test over smoothness classes is derived.
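A minimal sketch of the main ingredient, the block-level spot beta, is given below. It is an illustration under simplifying assumptions (no jump truncation, equally sized blocks), not the paper's test statistic; the function name and block scheme are hypothetical.

```python
import numpy as np

def spot_betas(dy, dx, block_size):
    """Block-level estimates of spot beta = spot covariation(Y, X) / spot variation(X).

    dy, dx     : high-frequency increments of the asset and the risk factor
    block_size : number of increments per local block (a block's time span
                 shrinks asymptotically in the paper's setting)
    """
    n_blocks = len(dx) // block_size
    betas = np.empty(n_blocks)
    for b in range(n_blocks):
        s = slice(b * block_size, (b + 1) * block_size)
        cov_xy = np.sum(dy[s] * dx[s])   # local covariation of Y and X
        var_x = np.sum(dx[s] ** 2)       # local variation of X
        betas[b] = cov_xy / var_x
    return betas

# Toy check: a constant true beta of 1.5 gives roughly flat block estimates.
rng = np.random.default_rng(1)
dx = 0.01 * rng.standard_normal(2000)
dy = 1.5 * dx + 0.005 * rng.standard_normal(2000)
print(spot_betas(dy, dx, block_size=100).round(2))
```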
The copyright to this Article is held by the Econometric Society. It may be downloaded, printed and reproduced only for educational or research purposes, including use in course packs. No downloading or copying may be done for any commercial purpose without the explicit permission of the Econometric Society. For such commercial purposes contact the Office of the Econometric Society (contact information may be found at the website).
JEL No. G0, G00, G1, G10, G12. The paper estimates and examines the empirical plausibility of asset pricing models that attempt to explain features of financial markets such as the size of the equity premium and the volatility of the stock market. In one model, the long-run risks model of Bansal and Yaron (2004), low-frequency movements and time-varying uncertainty in aggregate consumption growth are the key channels for understanding asset prices. In another, as typified by Campbell and Cochrane (1999), habit formation, which generates time-varying risk aversion and consequently time variation in risk premia, is the key channel. These models are fitted to data using simulation estimators. Both models are found to fit the data equally well at conventional significance levels, and they can track quite closely a new measure of realized annual volatility. Further scrutiny using a rich array of diagnostics suggests that the long-run risks model is preferred.
This online supplement contains all proofs for the results in the main text. ∗Department of Economics, Duke University, Durham, NC 27708; e-mail: jl410@duke.edu. †Department of Finance, Kellogg School of Management, Northwestern University, Evanston, IL 60208; e-mail: v-todorov@northwestern.edu. ‡Department of Economics, Duke University, Durham, NC 27708; e-mail: george.tauchen@duke.edu.
We developed a model-free Bayesian extraction procedure for the stochastic discount factor under a yield-curve prior. Previous methods in the literature directly or indirectly use some particular parametric asset-pricing model, such as one with long-run risks or habits, as the prior. Here, in contrast, we used no such model; rather, we adopted a prior that enforces external information about the historically very low levels of U.S. short- and long-term interest rates. For clarity and simplicity, our data were annual time series. We used the extracted stochastic discount factor to determine the stripped cash flow risk premiums on a panel of industrial profits and consumption. Interestingly, the results align very closely with recent limited-information (bounded rationality) models of the term structure of equity risk premiums, although nowhere did we use any theory on the discount factor other than its implied moment restrictions.
This document contains three supplemental appendices for the main text. Supplemental Appendix A presents additional theoretical results. Supplemental Appendix B contains some numerical analysis for the econometric procedures proposed in the main text. Supplemental Appendix C contains all proofs. JEL classification: C51, C52, G12. ∗Department of Economics, Duke University, Durham, NC 27708; e-mail: jl410@duke.edu. †Department of Finance, Kellogg School of Management, Northwestern University, Evanston, IL 60208; e-mail: v-todorov@northwestern.edu. ‡Department of Economics, Duke University, Durham, NC 27708; e-mail: george.tauchen@duke.edu. Supplemental Appendix A: Additional theoretical results. SA.1 Inference when some jumps arrive at deterministic times. In this subsection, we extend the results in the main text to a setting where a subset of jump times can be identified using prior information. Examples of such jump events are the ones caused by pre-scheduled macro announcements (And...
We propose nonparametric estimators of the occupation measure of the diffusion coefficient (stochastic volatility) of a discretely observed Itô semimartingale, and of its density, on a fixed interval when the mesh of the observation grid shrinks to zero asymptotically. In a first step we estimate the volatility locally over blocks of shrinking length, and in a second step we use these estimates to construct a sample analogue of the volatility occupation time and a kernel-based estimator of its density. We prove the consistency of our estimators and further derive bounds for their rates of convergence. We use these results to estimate nonparametrically the quantiles associated with the volatility occupation measure. Annals of Statistics: forthcoming.
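The two-step construction can be sketched numerically. The code below is a simplified illustration under assumptions not in the abstract (no jump truncation, equally sized blocks, occupation time only, with the kernel density step omitted); the function name and arguments are hypothetical.

```python
import numpy as np

def volatility_occupation(increments, delta_n, block_size, levels):
    """Sample analogue of the volatility occupation time (a sketch).

    Step 1: estimate the spot variance locally over blocks of shrinking length.
    Step 2: the occupation time of level v is the total time the estimated spot
            variance spends at or below v.
    """
    n_blocks = len(increments) // block_size
    spot_var = np.empty(n_blocks)
    for b in range(n_blocks):
        s = slice(b * block_size, (b + 1) * block_size)
        spot_var[b] = np.sum(increments[s] ** 2) / (block_size * delta_n)
    block_span = block_size * delta_n
    # Occupation time evaluated at each requested variance level.
    occ = np.array([block_span * np.sum(spot_var <= v) for v in levels])
    return spot_var, occ

# Toy usage with constant spot variance 0.04 over 5 "days" of 390 increments:
# the occupation time should be near 0 well below 0.04 and near 5 well above it.
rng = np.random.default_rng(2)
delta_n = 1.0 / 390
dx = np.sqrt(0.04 * delta_n) * rng.standard_normal(390 * 5)
spot_var, occ = volatility_occupation(dx, delta_n, block_size=39, levels=[0.02, 0.05])
print(occ)
```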
Range-Based Estimation of Stochastic Volatility Models, or Exchange Rate Dynamics Are More Interesting Than You Think (Financial Institutions Center)
We propose using the price range, a recently neglected volatility proxy with a long history in finance, in the estimation of stochastic volatility models. We show both theoretically and empirically that the log range is approximately Gaussian, in sharp contrast to popular volatility proxies, such as log absolute or squared returns. Hence Gaussian quasi-maximum likelihood estimation based on the range is not only simple, but also highly efficient. We illustrate and enrich our theoretical results with a Monte Carlo study and a substantive empirical application to daily exchange rate volatility. Our empirical work produces sharp conclusions. In particular, the evidence points strongly to the inadequacy of one-factor volatility models, favoring instead two-factor models with one highly persistent factor and one quickly mean-reverting factor. Acknowledgments: This work was supported by the National Science Foundation. Siem Koopman graciously shared both his wisdom and his Ox routines. We...
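The distributional claim about the log range can be checked with a small simulation. This is a hedged illustration, not the paper's estimation procedure: a constant-volatility log-price path stands in for the exchange rate, and the comparison is simply the skewness of the log range versus the log absolute return.

```python
import numpy as np
from scipy.stats import skew

# For a constant-volatility diffusion, the log of the daily range
# (log high minus log low) is close to Gaussian, whereas the log absolute
# daily return is strongly skewed.
rng = np.random.default_rng(3)
n_days, n_intraday, sigma = 5000, 390, 0.01
# Simulate daily log-price paths with daily volatility sigma.
logp = np.cumsum(sigma / np.sqrt(n_intraday) * rng.standard_normal((n_days, n_intraday)), axis=1)
ranges = logp.max(axis=1) - logp.min(axis=1)          # log(high) - log(low) per day
print("skewness of log range:    ", round(skew(np.log(ranges)), 2))
print("skewness of log |return|: ", round(skew(np.log(np.abs(logp[:, -1]) + 1e-12)), 2))
```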
This web-based appendix contains additional empirical results, robustness checks, and calibrations that complement the findings reported in the paper. ∗Department of Economics, Duke University, Durham, NC 27708, and NBER and CREATES; e-mail: boller@duke.edu. †Department of Economics, Rice University, Houston, TX 77251; e-mail: natalia.sizova@rice.edu. ‡Department of Economics, Duke University, Durham, NC 27708; e-mail: george.tauchen@duke.edu. 1 Impulse Response Analysis. A reviewer noted that “.. a way to think of the left-side of the cross-correlation diagram between returns and volatility is as an impulse-response function (with time running from right to left).” We agree. To further elaborate on this point, Figure A1 shows the observed orthogonalized VAR impulse response functions for the variance risk premium and the VIX, respectively, to a one-standard-deviation shock in the return, in a system comprised of the return, the variance premium, and the VIX, in that order. Figure A2...
Journal of Econometrics, 2018
We modify the efficient method of moments (EMM) to perform exact Bayesian inference, where exact means no reliance on asymptotic approximations. We use this modification to evaluate the empirical plausibility of recent predictions from high-frequency financial theory regarding the small-time movements of an Itô semimartingale. The theory indicates that the probability distribution of the small moves should be locally stable around the origin. It makes no predictions regarding large rare jumps, which get filtered out. Our exact Bayesian procedure imposes support conditions on parameters as implied by this theory. The empirical application uses S&P Index options extending over a wide range of moneyness, including deep out-of-the-money puts. The evidence is consistent with a locally stable distribution valid over most of the support of the observed data while mildly failing in the extreme tails, about which the theory makes no prediction. We undertake diagnostic checks on all aspects of the procedure. In particular, we evaluate the distributional assumptions regarding a semi-pivotal statistic, and we test by Monte Carlo that the posterior distribution is properly centered with short credibility intervals. Taken together, our results suggest a more important role than previously thought for pure-jump-like models with a diminished, if not absent, diffusive component.
Quantitative Economics, 2019
We develop tests for deciding whether a large cross-section of asset prices obeys an exact factor structure at the times of factor jumps. Such jump dependence is implied by standard linear factor models. Our inference is based on a panel of asset returns with asymptotically increasing cross-sectional dimension and sampling frequency, and essentially no restriction on the relative magnitude of these two dimensions of the panel. The test is formed from the high-frequency returns at the times when the risk factors are detected to have a jump. The test statistic is a cross-sectional average of a measure of discrepancy in the estimated jump factor loadings of the assets at consecutive jump times. Under the null hypothesis, the discrepancy in the factor loadings is due to measurement error, which shrinks with the increase of the sampling frequency, while under an alternative of a noisy jump factor model this discrepancy also contains nonvanishing firm-specific shocks. The limit behavior ...
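The raw ingredient of the statistic can be sketched as follows. This is only an illustration of the loading-discrepancy idea under assumed names and a toy data layout; the paper's studentization and limit theory are omitted.

```python
import numpy as np

def loading_discrepancy(returns_at_jumps, factor_jumps):
    """Cross-sectional average discrepancy of jump factor loadings (a sketch).

    returns_at_jumps : (n_assets x n_jumps) high-frequency returns at the
                       detected factor jump times
    factor_jumps     : length-n_jumps vector of the factor's jump returns

    Under an exact one-factor jump structure the per-jump loading estimates
    returns_at_jumps[i, t] / factor_jumps[t] agree across consecutive jump times
    up to measurement error; the statistic averages squared differences of
    loadings at consecutive jumps across assets.
    """
    loadings = returns_at_jumps / factor_jumps[None, :]    # per-asset, per-jump loadings
    return np.mean(np.diff(loadings, axis=1) ** 2)

# Toy usage: an exact factor structure gives a near-zero statistic, while
# firm-specific jump shocks make it nonvanishing.
rng = np.random.default_rng(9)
beta = rng.uniform(0.5, 1.5, size=50)
fj = rng.normal(0.0, 0.02, size=6)
R_exact = np.outer(beta, fj)
R_noisy = R_exact + 0.01 * rng.standard_normal((50, 6))
print(loading_discrepancy(R_exact, fj), loading_discrepancy(R_noisy, fj))
```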
SSRN Electronic Journal, 2015
This paper develops a method to select the threshold in threshold-based jump detection methods. The method is motivated by an analysis of threshold-based jump detection in the context of jump-diffusion models. We show that, over the range of sampling frequencies a researcher is most likely to encounter, the usual in-fill asymptotics provide a poor guide for selecting the jump threshold. Because of this we develop a sample-based method. Our method estimates the number of jumps over a grid of thresholds and selects the optimal threshold at what we term the 'take-off' point in the estimated number of jumps. We show that this method consistently estimates the jumps and their indices as the sampling interval goes to zero. In several Monte Carlo studies we evaluate the performance of our method based on its ability to accurately locate jumps and its ability to distinguish between true jumps and large diffusive moves. In one of these Monte Carlo studies we evaluate the performance of our method in a jump regression context. Finally, we apply our method in two empirical studies. In one we estimate the number of jumps and report the jump threshold our method selects for three commonly used market indices. In the other empirical application we perform a series of jump regressions using our method to select the jump threshold.
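The count-over-a-grid idea can be sketched with a simple elbow heuristic. The heuristic below (farthest point from the chord of the count-versus-threshold curve) is a stand-in assumption, not the paper's actual take-off rule; it only illustrates locating the point where the detected jump count starts to explode as the threshold shrinks.

```python
import numpy as np

def jump_counts(increments, thresholds):
    """Number of increments whose absolute value exceeds each candidate threshold."""
    abs_incr = np.abs(increments)
    return np.array([np.sum(abs_incr > u) for u in thresholds])

def take_off_threshold(increments, thresholds):
    """Crude stand-in for the 'take-off' rule: on the count-versus-threshold
    curve, pick the threshold whose point lies farthest from the straight line
    joining the endpoints, i.e. roughly where diffusive moves begin to be
    counted as jumps."""
    u = np.sort(np.asarray(thresholds, dtype=float))[::-1]   # large to small
    counts = jump_counts(increments, u).astype(float)
    # Normalize both axes to [0, 1] so the distance measure is scale free.
    un = (u - u.min()) / (u.max() - u.min())
    cn = (counts - counts.min()) / max(counts.max() - counts.min(), 1.0)
    dx, dy = un[-1] - un[0], cn[-1] - cn[0]
    dist = np.abs(dx * (cn - cn[0]) - dy * (un - un[0])) / np.hypot(dx, dy)
    return u[np.argmax(dist)]

# Toy usage: diffusive increments plus a handful of genuine jumps.
rng = np.random.default_rng(4)
incr = 0.001 * rng.standard_normal(5000)
incr[rng.choice(5000, size=5, replace=False)] += 0.02
grid = np.linspace(0.002, 0.03, 50)
print(take_off_threshold(incr, grid))
```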
SSRN Electronic Journal, 2012
In this paper, we develop a classical moment-based inference framework with Bayesian properties. We prove that there exists an intensity distribution of the solutions to empirical moment conditions over the parameter space. We approximate it with the empirical saddlepoint (ESP) technique and call the result the ESP intensity. A higher ESP intensity value indicates a higher estimated probability weight of being a solution to the empirical moment conditions. We propose to use the ESP intensity in the same way as posteriors are used in Bayesian inference to obtain point estimators, confidence regions, and tests. We call this the ESP approach, and explain the rationale behind it. We prove consistency and asymptotic normality of the ESP intensity. The ESP approach provides a unique answer to multiple concerns especially acute in consumption-based asset pricing, such as lack of identification and multiple hypothesis testing on the same data set. It also sheds new light on consumption-based asset pricing and, in particular, indicates that consumption-based asset pricing theory is more consistent with the data than existing inference approaches suggest.
Journal of Business & Economic Statistics, 2017
We propose a test for the rank of a cross-section of processes at a set of jump events. The jump events are either specific known times or are random and associated with jumps of some process. The test is formed from discretely sampled data on a fixed time interval with asymptotically shrinking mesh. In the first step, we form nonparametric estimates of the jump events via thresholding techniques. We then compute the eigenvalues of the outer product of the cross-section of increments at the identified jump events. The test for rank r is based on the asymptotic behavior of the sum of the squared eigenvalues excluding the largest r. A simple resampling method is proposed for feasible testing. The test is applied to financial data spanning the period 2007-2015 at the times of stock market jumps. We find support for a one-factor model of both industry portfolio and Dow 30 stock returns at market jump times. This stands in contrast with earlier evidence for a higher-dimensional factor structure of stock returns during "normal" (non-jump) times. We identify the latent factor driving the stocks and portfolios as the size of the market jump.
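The eigenvalue construction can be illustrated directly. The sketch below computes only the raw ingredient of the statistic; the studentization and the resampling step used for critical values are omitted, and the function name and toy data are assumptions.

```python
import numpy as np

def rank_test_statistic(jump_increments, r):
    """Raw ingredient of the rank test (a sketch).

    jump_increments : (n_assets x n_jump_events) matrix of returns of the
                      cross-section at the identified jump times
    r               : rank under the null hypothesis

    Returns the sum of the squared eigenvalues of the outer-product matrix,
    excluding the largest r.
    """
    X = np.asarray(jump_increments, dtype=float)
    M = X @ X.T                                  # outer product across the cross-section
    eig = np.sort(np.linalg.eigvalsh(M))[::-1]   # eigenvalues in decreasing order
    return np.sum(eig[r:] ** 2)

# Toy usage: a one-factor jump structure plus tiny measurement error makes the
# statistic for r = 1 much smaller than for r = 0.
rng = np.random.default_rng(5)
loadings = rng.uniform(0.5, 1.5, size=30)            # 30 assets
factor_jumps = rng.normal(0.0, 0.02, size=8)          # 8 market jump events
X = np.outer(loadings, factor_jumps) + 1e-4 * rng.standard_normal((30, 8))
print(rank_test_statistic(X, r=0), rank_test_statistic(X, r=1))
```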
Journal of Econometrics, 2017
We develop an efficient mixed-scale estimator for jump regressions using high-frequency asset returns. A fine time scale is used to accurately identify the locations of large rare jumps in the explanatory variables, such as the price of the market portfolio. A coarse scale is then used in the estimation in order to attenuate the effect of trading frictions in the dependent variable, such as the prices of potentially less liquid assets. The proposed estimator has a non-standard asymptotic distribution that cannot be made asymptotically pivotal via studentization. We propose a novel bootstrap procedure for feasible inference and justify its asymptotic validity. We show that the bootstrap provides an automatic higher-order asymptotic approximation by accounting for the sampling variation in estimates of nuisance quantities that are used in efficient estimation. The Monte Carlo analysis indicates good finite-sample performance of the general specification test and confidence intervals based on the bootstrap. We apply the method to a high-frequency panel of Dow stock prices together with the market index defined by the S&P 500 index futures over the period 2007-2014. We document remarkable temporal stability in the way that stocks react to market jumps. However, this relationship for many of the stocks in the sample is significantly noisier and more unstable during sector-specific jump events.
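The fine/coarse idea can be sketched as follows. This is an assumed, simplified illustration (threshold-based jump location on the fine grid, plain OLS on coarse increments around the detected jumps), not the paper's estimator or its bootstrap inference.

```python
import numpy as np

def mixed_scale_jump_beta(y_fine, x_fine, threshold, coarse_factor):
    """Mixed-scale jump regression slope (a sketch).

    y_fine, x_fine : fine-scale increments of the asset and the explanatory
                     variable (e.g. the market index)
    threshold      : truncation level used to locate jumps of x on the fine scale
    coarse_factor  : number of fine increments aggregated into one coarse
                     increment around each detected jump, to attenuate
                     microstructure noise in y
    """
    jump_idx = np.flatnonzero(np.abs(x_fine) > threshold)   # jump locations on the fine grid
    half = coarse_factor // 2
    xs, ys = [], []
    for i in jump_idx:
        lo, hi = max(i - half, 0), min(i + half + 1, len(x_fine))
        xs.append(np.sum(x_fine[lo:hi]))   # coarse increment of x around the jump
        ys.append(np.sum(y_fine[lo:hi]))   # coarse increment of y around the jump
    xs, ys = np.array(xs), np.array(ys)
    return np.sum(xs * ys) / np.sum(xs ** 2)

# Toy usage: y reacts to x's jumps with beta = 0.8, plus fine-scale noise.
rng = np.random.default_rng(6)
x = 0.0005 * rng.standard_normal(10000)
x[[2000, 7000]] += 0.02
y = 0.8 * x + 0.0005 * rng.standard_normal(10000)
print(mixed_scale_jump_beta(y, x, threshold=0.01, coarse_factor=10))
```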
Econometrica, 2017
We develop econometric tools for studying the jump dependence of two processes from high-frequency observations on a fixed time interval. In this context, only segments of data around a few outlying observations are informative for the inference. We derive an asymptotically valid test for stability of a linear jump relation over regions of the jump size domain. The test has power against general forms of nonlinearity in the jump dependence as well as temporal instabilities. We further propose an efficient estimator for the linear jump regression model that is formed by optimally weighting the detected jumps with weights based on the diffusive volatility around the jump times. We derive the asymptotic limit of the estimator and a semiparametric lower efficiency bound for the linear jump regression, and show that our estimator attains the latter. The analysis covers both deterministic and random jump arrivals. A higher-order asymptotic expansion for the optimal estimator further allows for finite-sample refinements. In an empirical application, we use the developed inference techniques to test the temporal stability of market jump betas.
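The weighting idea can be sketched numerically. The code below uses a simple inverse-local-variance weight as an assumed stand-in for the paper's optimal weights, with hypothetical function and argument names; the asymptotic theory and higher-order refinements are not part of the sketch.

```python
import numpy as np

def weighted_jump_beta(dy, dx, jump_idx, window):
    """Weighted linear jump regression slope (a sketch of the weighting idea).

    Each detected jump increment is weighted by the inverse of the local
    diffusive variance of y estimated from the non-jump increments in a window
    around the jump time.
    """
    weights, xs, ys = [], [], []
    for i in jump_idx:
        lo, hi = max(i - window, 0), min(i + window + 1, len(dx))
        local = np.concatenate([dy[lo:i], dy[i + 1:hi]])   # increments near the jump, jump excluded
        local_var = np.mean(local ** 2) + 1e-12
        weights.append(1.0 / local_var)
        xs.append(dx[i])
        ys.append(dy[i])
    w, xs, ys = map(np.array, (weights, xs, ys))
    return np.sum(w * xs * ys) / np.sum(w * xs ** 2)

# Toy usage with two jumps and heteroskedastic diffusive noise in y.
rng = np.random.default_rng(7)
sig = np.where(np.arange(10000) < 5000, 0.0003, 0.0010)
dx = 0.0005 * rng.standard_normal(10000)
dx[[2500, 7500]] += 0.02
dy = 1.2 * dx + sig * rng.standard_normal(10000)
print(weighted_jump_beta(dy, dx, jump_idx=[2500, 7500], window=50))
```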
Journal of the American Statistical Association, 2017
We develop robust inference methods for studying linear dependence between the jumps of discretely observed processes at high frequency. Unlike classical linear regressions, jump regressions are determined by a small number of jumps occurring over a fixed time interval and the rest of the components of the processes around the jump times. The latter are the continuous martingale parts of the processes as well as observation noise. By sampling more frequently, the role of these components, which are hidden in the observed price, shrinks asymptotically. The robustness of our inference procedure is with respect to outliers, which are of particular importance in the current setting of a relatively small number of jump observations. This is achieved by using non-smooth loss functions (like L1) in the estimation. Unlike classical robust methods, the limit of the objective function here remains non-smooth. The proposed method is also robust to measurement error in the observed processes, which is achieved by locally smoothing the high-frequency increments. In an empirical application to financial data we illustrate the usefulness of the robust techniques by contrasting the behavior of robust and OLS-type jump regressions in periods including disruptions of the financial markets such as so-called "flash crashes."
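The contrast between an L1-based and an OLS-type jump regression can be illustrated with a toy example. The sketch assumes jump increments are already identified and omits the local smoothing of increments described in the abstract; it shows only how the non-smooth loss downweights an outlying jump observation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lad_jump_beta(dy, dx, jump_idx):
    """Least-absolute-deviation (L1) jump regression slope: minimizes
    sum_i |dy_i - beta * dx_i| over the detected jump increments."""
    ys, xs = np.asarray(dy)[jump_idx], np.asarray(dx)[jump_idx]
    obj = lambda b: np.sum(np.abs(ys - b * xs))
    return minimize_scalar(obj).x

def ols_jump_beta(dy, dx, jump_idx):
    """OLS-type jump regression slope on the same jump increments."""
    ys, xs = np.asarray(dy)[jump_idx], np.asarray(dx)[jump_idx]
    return np.sum(xs * ys) / np.sum(xs ** 2)

# Toy usage: one contaminated jump observation (a "flash crash"-like outlier)
# moves OLS much more than the L1 estimate.
dx = np.array([0.02, -0.015, 0.03, 0.025, -0.02])
dy = 1.0 * dx
dy[2] += 0.05
idx = np.arange(5)
print("LAD:", round(lad_jump_beta(dy, dx, idx), 2), " OLS:", round(ols_jump_beta(dy, dx, idx), 2))
```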
The Relative Contributions of Jumps to Total Variance