Interpolation and backdating with a large information set

Real-time data for estimating a forward-looking interest rate rule of the ECB

Data in brief, 2017

The data presented in this article are intended for use in ex post estimations of interest rate decisions by the European Central Bank (ECB), as in Bletzinger and Wieland (2017) [1]. The data are quarterly, from 1999 Q1 until 2013 Q2, and consist of the ECB's policy rate, the inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the dataset includes expectations about future inflation and output dynamics. While potential output is constructed from the European Commission's annual macro-economic database, inflation and real output growth are taken from two different sources, both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was given to the publication dates of the collected data to ensure a real-time dataset consisting only of information which was available to the decisio...
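
Once such real-time series are assembled, a forward-looking rule of this kind can be estimated by ordinary least squares. The sketch below is minimal and hypothetical: the file name and column names are placeholders rather than part of the dataset described above, and the specification is a generic forward-looking Taylor-type rule, not the exact Bletzinger-Wieland (2017) equation.

```python
# Minimal, hypothetical sketch: OLS estimation of a generic forward-looking
# Taylor-type rule, i_t = a + b*E_t[inflation] + c*(E_t[growth] - potential),
# on quarterly real-time data. File and column names are placeholders.
import numpy as np
import pandas as pd

df = pd.read_csv("ecb_realtime.csv")            # hypothetical file with the series

y = df["policy_rate"].to_numpy()                # ECB policy rate
X = np.column_stack([
    np.ones(len(df)),                           # intercept
    df["expected_inflation"].to_numpy(),        # e.g. SPF inflation forecast
    df["expected_growth"].to_numpy()
    - df["potential_growth"].to_numpy(),        # expected growth gap
])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS coefficients
print(dict(zip(["const", "inflation", "growth_gap"], beta.round(2))))
```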

Interpolation in Time Series: An Introductive Overview of Existing Methods, Their Performance Criteria and Uncertainty Assessment

Water, 2017

A thorough review has been performed of interpolation methods for filling gaps in time series, of efficiency criteria, and of uncertainty quantification. On the one hand, numerous methods are available: interpolation, regression, autoregressive and machine-learning methods, among others. On the other hand, while many methods and criteria exist to assess the efficiency of these methods, uncertainties on the interpolated values are rarely calculated. Furthermore, even when uncertainties are estimated according to standard methods, the prediction uncertainty is not taken into account; a discussion is therefore presented on the uncertainty estimation of interpolated and extrapolated data. Finally, some suggestions for further research and a new method are proposed.
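
To make the gap-filling comparison concrete, the following sketch (not taken from the paper) knocks artificial holes into a synthetic series, fills them with two simple interpolators of the kind such reviews cover, and scores each with RMSE, one common efficiency criterion. The spline method in pandas requires SciPy to be installed.

```python
# Toy comparison: delete interior points from a synthetic series, fill the
# gaps with two interpolators, and score each by RMSE on the deleted points.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200
true = pd.Series(np.sin(np.arange(n) / 10) + 0.1 * rng.standard_normal(n))

gappy = true.copy()
holes = rng.choice(np.arange(1, n - 1), size=30, replace=False)  # interior gaps
gappy.iloc[holes] = np.nan

for method, kwargs in [("linear", {}), ("spline", {"order": 3})]:
    filled = gappy.interpolate(method=method, **kwargs)
    rmse = np.sqrt(np.mean((filled.iloc[holes] - true.iloc[holes]) ** 2))
    print(f"{method:>6s}  RMSE = {rmse:.4f}")
```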

Using Interpolation for Generating Input Data for the Gross Domestic Product Monte Carlo Simulation

Beneficium, 2023

Input modelling is a complex task within Monte Carlo simulation, especially when the systems and processes under investigation exhibit non-linear behavior of several interdependent variables. Commonly used approaches to Monte Carlo simulation input modelling include selecting probability distributions and fitting them to existing data; resampling random variates from historical data; and using real-world data as an input model, which in the age of big data becomes more feasible. Each of these approaches comes with its own set of drawbacks. This note aims to describe a new method of input modelling for GDP Monte Carlo simulation based on interpolation of historical GDP records. The method has also been implemented as a publicly available online tool using the Microsoft Azure Machine Learning Studio. A similar approach can be applied to other macroeconomic indicators, e.g., the consumer price index (inflation) or current employment statistics. This note is intended for economists, data scientists, and operations research analysts interested in GDP Monte Carlo simulation. It can also be used by academics, researchers, and practitioners in a broad subject area for generating input data for Monte Carlo simulation. In particular, it may be of interest to Ph.D. candidates (VAC specialty 5.2.6) working on the development of theory and methods of decision-making in economic and social systems and on the application of artificial intelligence and big-data methods in management.
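
The note's exact algorithm is not reproduced here, but one standard way to build an input model by interpolating historical records is to interpolate the empirical quantile function and sample from it by inverse transform. The sketch below does exactly that with made-up annual GDP growth figures.

```python
# Inverse-transform sampling from an interpolated empirical quantile function.
# The growth figures are toy numbers, not real GDP data.
import numpy as np

growth = np.array([2.5, 1.8, -0.1, 3.0, 2.2, 0.9, 1.5, 2.8, -2.4, 2.0])  # %, toy

xs = np.sort(growth)
ps = (np.arange(len(xs)) + 0.5) / len(xs)      # plotting positions in (0, 1)

rng = np.random.default_rng(42)
u = rng.uniform(size=10_000)                   # uniform draws
samples = np.interp(u, ps, xs)                 # interpolated inverse CDF

print(f"mean {samples.mean():.2f}%, 5th pct {np.percentile(samples, 5):.2f}%")
```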

Estimating the European Central Bank's "Extended Period of Time"

On July 4, 2013, the ECB Governing Council provided more specific forward guidance than in the past by stating that it expects ECB interest rates to remain at present or lower levels for an extended period of time. As explained by ECB President Mario Draghi, this expectation is based on the Council's medium-term outlook for inflation conditional on economic activity and money and credit. Draghi also stressed that there is no precise deadline for this extended period of time, but that a reasonable period can be estimated by extracting a reaction function. In this note, we use such a reaction function, namely the interest rate rule from Orphanides and Wieland (2013) that matches past ECB interest rate decisions quite well, to project the rate path consistent with inflation and growth forecasts from the Survey of Professional Forecasters (SPF) published by the ECB on August 8, 2013. This evaluation suggests an increase in ECB interest rates by May 2014 at the latest. We also use the Eurosystem staff projection from June 6, 2013 for comparison. While it would imply a longer period of low rates, it does not match past ECB decisions as well as the reaction function with SPF forecasts.
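
A stylized version of such a projection exercise is easy to reproduce. The sketch below iterates a first-difference rule of the Orphanides-Wieland type forward over a sequence of quarterly forecasts; the 0.5 response coefficients, the targets and the forecast numbers are illustrative assumptions, not the paper's estimates.

```python
# Stylized rate-path projection with a first-difference rule,
# i_t = i_{t-1} + a*(pi_forecast - pi_star) + b*(g_forecast - g_star).
# All coefficients, targets and forecasts below are illustrative only.
pi_star, g_star = 1.9, 1.5            # assumed inflation aim and trend growth, %
a = b = 0.5                           # illustrative response coefficients

rate = 0.5                            # starting policy rate, %
forecasts = [(1.5, 0.9), (1.6, 1.2), (1.7, 1.6), (1.8, 1.9)]  # (pi, g) per quarter

for q, (pi_f, g_f) in enumerate(forecasts, 1):
    rate = max(0.0, rate + a * (pi_f - pi_star) + b * (g_f - g_star))
    print(f"quarter {q}: projected rate {rate:.2f}%")
```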

A note on prediction and interpolation errors in time series

Statistics & Probability Letters, 2005

In this note we analyze the relationship between one-step-ahead prediction errors and interpolation errors in time series. We obtain an expression for the prediction errors in terms of the interpolation errors and then show that minimizing the sum of squares of the one-step-ahead standardized prediction errors is equivalent to minimizing the sum of squares of standardized interpolation errors.
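
For the AR(1) special case, one such expression can be verified directly: with prediction error e_t = x_t − φx_{t−1} and interpolation error i_t = x_t − φ(x_{t−1} + x_{t+1})/(1 + φ²), the algebraic identity (1 + φ²)·i_t = e_t − φ·e_{t+1} holds exactly. The sketch below checks this numerically; it is an AR(1) illustration, not the note's general derivation.

```python
# Numerical check of the AR(1) identity (1 + phi^2)*i_t = e_t - phi*e_{t+1}
# linking interpolation errors i_t and one-step prediction errors e_t.
import numpy as np

rng = np.random.default_rng(1)
phi, n = 0.6, 1000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

e = x[1:] - phi * x[:-1]                                # prediction errors
i = x[1:-1] - phi * (x[:-2] + x[2:]) / (1 + phi**2)     # interpolation errors

lhs = (1 + phi**2) * i
rhs = e[:-1] - phi * e[1:]                              # e_t - phi*e_{t+1}
print("max |lhs - rhs| =", np.abs(lhs - rhs).max())     # ~ machine precision
```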

Short-Term Forecasting of GDP and Inflation in Real-Time: Norges Bank's System for Averaging Models

2011

In this paper we describe Norges Bank's system for averaging models (SAM), which produces model-based density forecasts for Norwegian mainland GDP and inflation. We combine the forecasts from three main types of models typically used at central banks: vector autoregressive models, leading indicator models and factor models. By combining models we hedge against uncertain instabilities. We update SAM several times during the quarter to highlight the importance of new data releases, and we show how the performance of SAM improves steadily as new information arrives. The framework is robust with regard to the alternative vintages of data used for evaluation. We show that our chosen weighting scheme is superior to or on a par with some common alternative weighting schemes, and, finally, that a strategy of trying to pick the best model, ex ante, is inferior to model combination. JEL codes: C32, C52, C53, E37, E52.
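
The combination step itself is simple to illustrate. The sketch below uses a generic scheme, not Norges Bank's actual SAM weights: three toy models' point forecasts are weighted by inverse historical mean squared error, so models that have forecast well recently receive more weight.

```python
# Generic forecast combination with inverse-MSE weights on toy data.
import numpy as np

rng = np.random.default_rng(7)
actual = rng.normal(2.0, 0.5, size=20)                      # past GDP growth, toy
forecasts = {                                               # three toy "models"
    "VAR": actual + rng.normal(0, 0.3, 20),
    "indicator": actual + rng.normal(0, 0.5, 20),
    "factor": actual + rng.normal(0, 0.4, 20),
}

mse = {m: np.mean((f - actual) ** 2) for m, f in forecasts.items()}
w = {m: 1.0 / v for m, v in mse.items()}
total = sum(w.values())
w = {m: v / total for m, v in w.items()}                    # weights sum to 1

new = {"VAR": 2.1, "indicator": 2.6, "factor": 2.3}         # next-quarter forecasts
combined = sum(w[m] * new[m] for m in new)
print({m: round(v, 2) for m, v in w.items()}, f"combined = {combined:.2f}")
```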

On the performance of the Chow-Lin procedure for quarterly interpolation of annual data: Some Monte-Carlo analysis

Spanish Economic Review, 2003

Most OECD countries publish quarterly estimates of their national growth or Gross National Product. Official statistical agencies in western countries have to deal with the problem of estimating Quarterly National Accounts series consistently with Annual National Accounts. In Spain, the Instituto Nacional de Estadística uses the Chow-Lin disaggregation method, which is based on information provided by a group of high-frequency related variables, to estimate the quarterly components of the National Accounts from annual components. In this paper, we analyse the relative quality of the estimates obtained through the Chow-Lin procedure under different sets of hypotheses. JEL Classification: C15, C43, M40.
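
The core of the Chow-Lin procedure can be sketched in a few lines of linear algebra. Below, the AR(1) parameter rho is assumed known for brevity (in practice it is estimated, e.g. by maximum likelihood) and the data are synthetic: a GLS regression of annual totals on the aggregated quarterly indicator, followed by a residual-distribution step that guarantees the quarterly estimates add up to the annual totals.

```python
# Chow-Lin disaggregation with known rho: beta = (Xa'W Xa)^(-1) Xa'W y_a with
# W = (C V C')^(-1), then y_q_hat = X beta + V C' W (y_a - Xa beta).
import numpy as np

rng = np.random.default_rng(3)
n_years = 10
nq = 4 * n_years

x = np.cumsum(rng.normal(1.0, 0.3, nq))                 # quarterly indicator
y_q_true = 2.0 + 1.5 * x + rng.normal(0, 0.5, nq)       # unobserved quarterly target
C = np.kron(np.eye(n_years), np.ones((1, 4)))           # annual aggregation matrix
y_a = C @ y_q_true                                      # observed annual totals

X = np.column_stack([np.ones(nq), x])
rho = 0.5                                               # assumed AR(1) parameter
V = rho ** np.abs(np.subtract.outer(np.arange(nq), np.arange(nq)))  # AR(1) cov (up to scale)

W = np.linalg.solve(C @ V @ C.T, np.eye(n_years))       # (C V C')^(-1)
Xa = C @ X
beta = np.linalg.solve(Xa.T @ W @ Xa, Xa.T @ W @ y_a)   # GLS on annual data
y_q_hat = X @ beta + V @ C.T @ W @ (y_a - Xa @ beta)    # distribute residuals

print("beta_hat =", beta.round(2))
print("annual totals matched:", np.allclose(C @ y_q_hat, y_a))
```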

Short-term forecasting of GDP using large monthly datasets -- A pseudo real-time forecast evaluation exercise

2008


Taylor rules for the euro area: the issue of real-time data

2004

Recently, a number of studies have attempted to deal with the key issue of the incompleteness of the information available to the central bank when taking its monetary policy decisions. This study adds to that literature by tackling the problem with regard to the euro area. The analysis is based on the simplifying assumption that the central bank follows a simple monetary policy rule à la Taylor. Along the lines of the work of Orphanides, the study assesses whether estimates of reaction functions carried out using revised euro area data can convey a misleading message in terms of policy recommendations.
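
The mechanism behind the real-time critique can be illustrated with synthetic data: if policy reacts to real-time measurements but the rule is later estimated on revised series, the measurement error in the regressors distorts the estimated responses (classical attenuation). The sketch below is a stylized demonstration under these assumptions, not the study's actual estimation.

```python
# Stylized attenuation demo: the rule is set on real-time data, then
# estimated on both real-time and revised series. Synthetic numbers only.
import numpy as np

rng = np.random.default_rng(5)
n = 60                                            # quarters
pi_rt = 2.0 + rng.normal(0, 0.8, n)               # real-time inflation
gap_rt = rng.normal(0, 1.2, n)                    # real-time output gap
rate = 2.0 + 1.5 * pi_rt + 0.5 * gap_rt + rng.normal(0, 0.3, n)

# Simplification: treat revised series as real-time series plus revision
# noise, largest for the output gap (the point stressed by Orphanides).
pi_rev = pi_rt + rng.normal(0, 0.2, n)
gap_rev = gap_rt + rng.normal(0, 0.8, n)

for label, p, g in [("real-time", pi_rt, gap_rt), ("revised", pi_rev, gap_rev)]:
    X = np.column_stack([np.ones(n), p, g])
    b, *_ = np.linalg.lstsq(X, rate, rcond=None)
    print(f"{label:>9s}: const={b[0]:.2f}, inflation={b[1]:.2f}, gap={b[2]:.2f}")
```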