Asymptotically Unbiased Estimation of a Nonsymmetric Dependence Measure Applied to Sensor Data Analytics and Financial Time Series
Related papers
Financial data analysis using the informational energy unilateral dependency measure
2015 International Joint Conference on Neural Networks (IJCNN), 2015
Our research area is the unilateral dependency (UD) analysis of non-linear relationships within pairs of simultaneous data. The application is in financial analysis, using data reported by Kodak and Apple for the period 1999-2014. We compute and analyze the UD between Kodak's and Apple's financial time series in order to understand how the two companies influence each other through their assets and liabilities. We also analyze, within each of the two companies, the UD between assets and liabilities. Our formal approach is based on the informational energy UD measure derived in our previous work. This measure is estimated here from available sample data using a non-parametric, asymptotically unbiased and consistent kNN estimator.
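As a rough illustration of the kind of estimator involved (not the paper's exact asymptotically unbiased construction), the sketch below forms a kNN plug-in estimate of the informational energy IE(X) = E[f_X(X)] that such a UD measure builds on; the choice of k and the simulated sample are assumptions made for the example.

```python
# Illustrative sketch, not the paper's exact asymptotically unbiased estimator:
# a kNN plug-in estimate of informational energy IE(X) = E[f_X(X)], where the
# density at each sample is approximated from the distance to its k-th neighbour.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def informational_energy_knn(sample, k=5):
    """kNN plug-in estimate of IE(X) = integral of f_X^2 for an (n, d) sample."""
    x = np.atleast_2d(np.asarray(sample, dtype=float))
    if x.shape[0] == 1:
        x = x.T                      # accept 1-d input as a column of points
    n, d = x.shape
    tree = cKDTree(x)
    # distance to the k-th nearest neighbour (the query returns the point itself first)
    r_k = tree.query(x, k=k + 1)[0][:, -1]
    unit_ball = np.pi ** (d / 2) / gamma(d / 2 + 1)   # volume of the d-dimensional unit ball
    f_hat = k / ((n - 1) * unit_ball * r_k ** d)      # kNN density estimate at each sample
    return f_hat.mean()                               # Monte Carlo average of f_X(X)

rng = np.random.default_rng(0)
print(informational_energy_knn(rng.normal(size=1000)))  # roughly 1/(2*sqrt(pi)) ≈ 0.28 for N(0, 1)
```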
A Dependence Metric for Possibly Nonlinear Processes
Journal of Time Series Analysis, 2004
A transformed metric entropy measure of dependence is studied which satisfies many desirable properties, including being a proper measure of distance. It performs well in identifying dependence even in possibly nonlinear time series, and is applicable to both continuous and discrete variables. A nonparametric kernel density implementation is considered here for many stylized models, including linear and nonlinear MA, AR, GARCH, integrated series and chaotic dynamics. A related permutation test of independence is proposed and compared with several alternatives.
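A minimal sketch of one such kernel-density implementation for a continuous pair, assuming the Hellinger-type form S = 0.5 ∫∫ (sqrt(f_XY) - sqrt(f_X f_Y))^2 dx dy evaluated on a rectangular grid; the grid size and scipy's default bandwidth are illustrative choices, not the paper's exact settings.

```python
# A minimal sketch, assuming the Hellinger-type form of the measure,
# S = 0.5 * integral of ( sqrt(f_XY) - sqrt(f_X * f_Y) )^2 over the plane,
# with Gaussian kernel density estimates evaluated on a rectangular grid.
import numpy as np
from scipy.stats import gaussian_kde

def metric_entropy_dependence(x, y, grid=100):
    xs = np.linspace(x.min(), x.max(), grid)
    ys = np.linspace(y.min(), y.max(), grid)
    gx, gy = np.meshgrid(xs, ys, indexing="ij")
    f_xy = gaussian_kde(np.vstack([x, y]))(np.vstack([gx.ravel(), gy.ravel()])).reshape(grid, grid)
    f_x = gaussian_kde(x)(xs)
    f_y = gaussian_kde(y)(ys)
    integrand = 0.5 * (np.sqrt(f_xy) - np.sqrt(np.outer(f_x, f_y))) ** 2
    return integrand.sum() * (xs[1] - xs[0]) * (ys[1] - ys[0])   # Riemann-sum approximation

rng = np.random.default_rng(1)
x = rng.normal(size=500)
print(metric_entropy_dependence(x, 0.8 * x + rng.normal(size=500)))  # dependent pair: clearly positive
print(metric_entropy_dependence(x, rng.normal(size=500)))            # independent pair: near zero
```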
An Entropy-Based Measure of Dependence Between Two Groups of Random Variables
ETS Research Report Series, 2007
In multivariate statistics, the linear relationship among random variables has been fully explored in the past. This paper looks into the dependence of one group of random variables on another group using (conditional) entropy. A new measure, called the K-dependence coefficient (or dependence coefficient), is defined using (conditional) entropy. The paper shows that the K-dependence coefficient measures the degree of dependence of one group of random variables on another, and that this dependence includes both linear and nonlinear components; the coefficient therefore measures the total degree of dependence, not just the linear component of the dependence between the two groups. Furthermore, the concept is extended by defining the partial K-dependence coefficient and the semipartial K-dependence coefficient, whose properties are also explored.
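The report's exact definition is not reproduced here; as an illustration of an entropy-based coefficient of this type for discrete data, the sketch below uses the normalisation 1 - H(Y|X)/H(Y), which is an assumption made for the example rather than the report's formula.

```python
# Illustrative sketch for discrete variables, assuming the normalisation
# 1 - H(Y|X) / H(Y); this is an example of an entropy-based dependence
# coefficient, not necessarily the report's exact K-dependence definition.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def dependence_coefficient(x, y):
    """x, y: integer-coded 1-d arrays of equal length."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    np.add.at(joint, (x, y), 1)
    joint /= joint.sum()
    h_y = entropy(joint.sum(axis=0))                                   # marginal entropy H(Y)
    h_y_given_x = entropy(joint.ravel()) - entropy(joint.sum(axis=1))  # H(Y|X) = H(X,Y) - H(X)
    return 1.0 - h_y_given_x / h_y            # 0 under independence, 1 when Y is determined by X

rng = np.random.default_rng(2)
x = rng.integers(0, 4, size=5000)
print(dependence_coefficient(x, (x + rng.integers(0, 2, size=5000)) % 4))  # strong dependence
print(dependence_coefficient(x, rng.integers(0, 4, size=5000)))            # close to zero
```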
On Detecting the Dependence of Time Series
Communications in Statistics - Theory and Methods, 2012
This short note suggests a heuristic method for detecting the dependence of random time series that can be used when this dependence is relatively weak and the traditional methods are not effective. The method requires comparing certain functionals of the sample characteristic functions with the same functionals computed for benchmark time series with a known degree of correlation. Some experiments for financial time series are presented. The note presents statistical experiments whose purpose is to estimate the dependence of time series: historical time series are compared with a series of given, known correlation using a functional formed from empirical characteristic functions defined similarly to . This gives a simple empirical method that allows the dependence to be estimated by comparing the values of this functional for two time series.
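A simplified version of such a functional is sketched below: it compares the joint empirical characteristic function with the product of the marginal ones over a grid of frequencies. The frequency grid and the sup-type functional are illustrative assumptions; the note's actual procedure also benchmarks the value against series of known correlation.

```python
# Simplified sketch: a sup-type functional over a frequency grid comparing the
# joint empirical characteristic function with the product of the marginal ones.
# The grid and the functional are illustrative assumptions; the note compares
# such values against benchmark series of known correlation.
import numpy as np

def ecf_dependence_functional(x, y, freqs=np.linspace(0.1, 2.0, 20)):
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    diffs = []
    for s in freqs:
        for t in freqs:
            joint = np.exp(1j * (s * x + t * y)).mean()                   # joint ECF at (s, t)
            marg = np.exp(1j * s * x).mean() * np.exp(1j * t * y).mean()  # product of marginal ECFs
            diffs.append(abs(joint - marg))
    return max(diffs)

rng = np.random.default_rng(3)
x = rng.normal(size=2000)
print(ecf_dependence_functional(x, 0.3 * x + rng.normal(size=2000)))  # weakly dependent pair
print(ecf_dependence_functional(x, rng.normal(size=2000)))            # independent benchmark
```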
Information theory provides ideas for conceptualising information and measuring relationships between objects. It has found wide application in the sciences, but economics and finance have made surprisingly little use of it. We show that time series data can usefully be studied as information: by noting the relationship between statistical redundancy and dependence, we are able to use the results of information theory to construct a test for joint dependence of random variables. The test is in the same spirit as those developed by Ryabko and Astola (2005, 2006b,a), but differs from them in that we add extra randomness to the original stochastic process. It uses data compression to estimate the entropy rate of a stochastic process, which allows it to measure dependence among sets of random variables, whereas the existing econometric literature that uses entropy is restricted to pairwise tests of dependence. We show how serial dependence may be detected in S&P 500 and PSI20 stock returns over different sample periods and frequencies, and we apply the test to synthetic data to judge its ability to recover known temporal dependence structures.
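A minimal sketch of the compression idea, under assumed choices (quantile binning into 8 symbols, zlib as the compressor, a joint symbol alphabet for the pair): if two series are dependent, coding them jointly should compress relatively better than coding them separately.

```python
# Minimal sketch of the compression idea, assuming quantile binning into 8
# symbols, zlib as the compressor, and a joint symbol alphabet for the pair.
# If the series are dependent, coding them jointly compresses relatively better
# than coding them separately.
import numpy as np
import zlib

def quantile_symbols(x, bins=8):
    edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(x, edges).astype(np.uint8)

def compression_dependence(x, y, bins=8):
    sx, sy = quantile_symbols(x, bins), quantile_symbols(y, bins)
    cx = len(zlib.compress(sx.tobytes(), 9))                 # code length of x alone
    cy = len(zlib.compress(sy.tobytes(), 9))                 # code length of y alone
    cxy = len(zlib.compress((sx * bins + sy).tobytes(), 9))  # code length of the pair
    return (cx + cy - cxy) / cxy   # > 0 suggests redundancy (dependence) between the series

rng = np.random.default_rng(4)
x = rng.normal(size=20000)
print(compression_dependence(x, 0.7 * x + rng.normal(size=20000)))  # dependent pair
print(compression_dependence(x, rng.normal(size=20000)))            # independent pair
```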
ISRN Signal Processing, 2013
We present a computationally tractable approach for dynamically measuring statistical dependencies in multivariate non-Gaussian signals. The approach uses extensions of independent component analysis to calculate information coupling, as a proxy for mutual information between multiple signals, and can be used to estimate the uncertainty associated with the information coupling measure in a straightforward way. We empirically validate the relative accuracy of the information coupling measure on a set of synthetic data examples and showcase the practical utility of the measure when analysing multivariate financial time series.
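The sketch below is not the paper's ICA-based coupling estimator; as a deliberately simple stand-in for a proxy mutual-information measure on non-Gaussian signals, it uses a Gaussian-copula approximation (rank-transform to normal scores, then the Gaussian MI formula).

```python
# Not the paper's ICA-based coupling estimator: a deliberately simple stand-in
# proxy for mutual information between two non-Gaussian signals, using a
# Gaussian-copula approximation (normal scores of the ranks, then the
# bivariate-Gaussian MI formula).
import numpy as np
from scipy.stats import norm, rankdata

def copula_mi_proxy(x, y):
    u = norm.ppf(rankdata(x) / (len(x) + 1))   # normal scores of x
    v = norm.ppf(rankdata(y) / (len(y) + 1))   # normal scores of y
    rho = np.corrcoef(u, v)[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)       # MI of a bivariate Gaussian with correlation rho

rng = np.random.default_rng(5)
x = rng.standard_t(df=3, size=5000)                             # heavy-tailed, non-Gaussian signal
print(copula_mi_proxy(x, x + rng.standard_t(df=3, size=5000)))  # coupled signals
print(copula_mi_proxy(x, rng.standard_t(df=3, size=5000)))      # unrelated signals
```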
A test for independence via Bayesian nonparametric estimation of mutual information
Canadian Journal of Statistics, 2021
Mutual information is a well-known tool to measure the mutual dependence between variables. In this article, a Bayesian nonparametric estimator of mutual information is established by means of the Dirichlet process and the k-nearest neighbour distance. As a result, an easy-to-implement test of independence is introduced through the relative belief ratio. Several theoretical properties of the approach are presented. The procedure is illustrated through various examples and is compared with its frequentist counterpart.
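The Bayesian construction itself is not sketched here; as a point of reference for the frequentist kNN counterpart mentioned above, scikit-learn's kNN-based mutual information estimator can be used, with the neighbour count and the simulated data below being illustrative choices.

```python
# The Bayesian construction itself is not sketched here. As a reference point
# for the frequentist kNN counterpart, scikit-learn's kNN-based mutual
# information estimator is used below; n_neighbors and the simulated data
# are illustrative choices.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(6)
x = rng.normal(size=2000)
y_dep = np.sin(x) + 0.2 * rng.normal(size=2000)   # nonlinear dependence on x
y_ind = rng.normal(size=2000)                     # independent of x

print(mutual_info_regression(x.reshape(-1, 1), y_dep, n_neighbors=5)[0])  # clearly positive
print(mutual_info_regression(x.reshape(-1, 1), y_ind, n_neighbors=5)[0])  # close to zero
```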
An information-theoretic measure of dependency among variables in large datasets
2015 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2015
The maximal information coefficient (MIC), which measures the amount of dependence between two variables, is able to detect both linear and non-linear associations. However, its computational cost grows rapidly with dataset size. In this paper, we develop a computationally efficient approximation to the MIC that replaces its dynamic programming step with a much simpler technique based on a uniform partitioning of the data grid. A variety of experiments demonstrate the quality of our approximation.
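A rough sketch in the spirit of that approximation: score each uniform grid by the normalised mutual information of the binned data and take the maximum over resolutions up to the usual MIC bound B(n) = n^0.6. The exact MIC instead optimises non-uniform partitions via dynamic programming, so this is an assumption-laden simplification.

```python
# Rough sketch in the spirit of the approximation: score each uniform nx-by-ny
# grid by the normalised mutual information of the binned data and take the
# maximum over resolutions up to the usual MIC bound B(n) = n**0.6. The exact
# MIC instead optimises non-uniform partitions via dynamic programming.
import numpy as np

def grid_mi(x, y, nx, ny):
    counts, _, _ = np.histogram2d(x, y, bins=(nx, ny))
    p = counts / counts.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return (p[nz] * np.log(p[nz] / (px @ py)[nz])).sum()

def mic_uniform(x, y):
    b = int(len(x) ** 0.6)
    best = 0.0
    for nx in range(2, b + 1):
        for ny in range(2, b // nx + 1):   # keep nx * ny within the resolution bound
            best = max(best, grid_mi(x, y, nx, ny) / np.log(min(nx, ny)))
    return best

rng = np.random.default_rng(7)
x = rng.uniform(-1, 1, 500)
print(mic_uniform(x, x ** 2 + 0.1 * rng.normal(size=500)))  # strong nonlinear association
print(mic_uniform(x, rng.uniform(-1, 1, 500)))              # near-independent pair
```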
Note on convergence rates of semiparametric estimators of dependence index
The Annals of Statistics, 1997
Considerable recent attention has been devoted to semiparametric estimation of the dependence index, or the Hurst constant, using methods based on information in either frequency or time domains. Convergence rates of estimators in the frequency domain have been derived, and in the present paper we obtain them for estimators in the time domain. It is shown that the latter can have superior performance for moderate-range time series, but is inferior in the context of long-range dependence.
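As an illustration of a classic time-domain estimator in this family, the sketch below implements the aggregated-variance method for the Hurst index, with a truncated ARFIMA(0, d, 0) surrogate as a long-memory test signal; the block sizes and the surrogate are illustrative choices, and the paper's contribution concerns convergence rates rather than any particular implementation.

```python
# Illustrative sketch of a classic time-domain estimator in this family: the
# aggregated-variance method for the Hurst index. The block sizes and the
# truncated ARFIMA(0, d, 0) surrogate (whose Hurst index is d + 0.5) are
# illustrative choices, not the paper's specific estimator.
import numpy as np

def hurst_aggregated_variance(x, block_sizes=(4, 8, 16, 32, 64, 128)):
    variances = []
    for m in block_sizes:
        n_blocks = len(x) // m
        block_means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        variances.append(block_means.var())
    # Var(block mean) scales like m^(2H - 2), so the log-log slope is 2H - 2.
    slope = np.polyfit(np.log(block_sizes), np.log(variances), 1)[0]
    return 1.0 + slope / 2.0

def long_memory_noise(n, d, rng, trunc=1000):
    psi = np.ones(trunc)                      # MA(inf) weights of ARFIMA(0, d, 0), truncated
    for j in range(1, trunc):
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    return np.convolve(rng.normal(size=n + trunc), psi, mode="valid")[:n]

rng = np.random.default_rng(8)
print(hurst_aggregated_variance(rng.normal(size=50000)))              # close to 0.5, no long memory
print(hurst_aggregated_variance(long_memory_noise(50000, 0.3, rng)))  # well above 0.5 (theoretical H = 0.8)
```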