Autocorrelation Research Papers - Academia.edu
Most economic and financial time series are nonstationary. There are different types of nonstationary processes, such as those with a stochastic trend and those with a deterministic trend, and in practice it can be quite difficult to distinguish between the two. In this paper, we compare random walk and deterministic trend processes using sample autocorrelation, sample partial autocorrelation and periodogram-based metrics.
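The difficulty described here is easy to reproduce. The sketch below (illustrative only, not from the paper; series length, trend slope and random seed are arbitrary choices) computes the sample autocorrelation function of a simulated random walk and of a trend-stationary series, and shows that both decay slowly, which is why the ACF alone struggles to separate them:

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelation function up to lag nlags."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(nlags + 1)])

rng = np.random.default_rng(0)
n = 1000
random_walk = np.cumsum(rng.standard_normal(n))           # stochastic trend
det_trend = 0.05 * np.arange(n) + rng.standard_normal(n)  # deterministic trend + noise

acf_rw = sample_acf(random_walk, 20)
acf_dt = sample_acf(det_trend, 20)
# Both ACFs start near 1 and decay very slowly, so the two kinds of
# nonstationarity look alike through this lens.
```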
This correspondence describes a method for estimating the parameters of an autoregressive (AR) process from a finite number of noisy measurements. The method uses a modified set of Yule-Walker (YW) equations that lead to a quadratic eigenvalue problem that, when solved, gives estimates of the AR parameters and the measurement noise variance.
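As context for why modified equations are needed, the following sketch (illustrative only; the AR(2) coefficients, noise level and sample size are arbitrary) applies the standard Yule-Walker estimator to clean and to noisy measurements, exposing the bias that the quadratic eigenvalue formulation is designed to remove:

```python
import numpy as np
from scipy.linalg import toeplitz, solve

def yule_walker(x, p):
    """Standard Yule-Walker AR(p) estimate from sample autocovariances."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])
    return solve(toeplitz(r[:p]), r[1:p + 1])

rng = np.random.default_rng(1)
n = 20000
a_true = np.array([1.3, -0.4])   # x_t = 1.3 x_{t-1} - 0.4 x_{t-2} + e_t
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + e[t]

a_clean = yule_walker(x, 2)
a_noisy = yule_walker(x + rng.standard_normal(n), 2)  # noisy measurements
# a_clean is close to (1.3, -0.4); a_noisy is strongly biased because
# white measurement noise inflates the lag-0 autocovariance only --
# exactly the distortion the modified Yule-Walker equations correct.
```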
In this paper, we investigate the effectiveness of a financial time-series forecasting strategy which exploits the multiresolution property of the wavelet transform. A financial series is decomposed into an overcomplete, shift-invariant, scale-related representation. In transform space, each individual wavelet series is modeled by a separate multilayer perceptron (MLP). To better utilize the detailed information in the lower scales of wavelet coefficients (high frequencies) and the general (trend) information in the higher scales of wavelet coefficients (low frequencies), we applied the Bayesian method of automatic relevance determination (ARD) to choose short past windows (short-term history) for the inputs to the MLPs at lower scales and long past windows (long-term history) at higher scales. To form the overall forecast, the individual forecasts are then recombined by the linear reconstruction property of the inverse transform with the chosen autocorrelation shell representatio...
Although a number of studies have already been conducted on the economic impact of the development of the transport and logistics sector and international trade, these are regarded as two separate topics, and little has been done so far to study in depth the relationship between them. This paper seeks to shed light on this issue in the context of Australia. To this end, the vector autoregression (VAR) framework is employed to explore the causal relationship between Australia-China trade and the development of the Australian transport and logistics sector. This framework is then extended to allow for the effect of Australia's trade with the US, Japan, the rest of the world and other variables. Based on the analysis results, implications for the transport and logistics sector are discussed.
This paper describes a software tool for: (1) assessing uncertainties in environmental data; and (2) generating realisations of uncertain data for use in uncertainty propagation analyses: the "Data Uncertainty Engine (DUE)". Data may be imported into DUE from file or from a database, and are represented in DUE as objects whose positions and attribute values may be uncertain. Objects supported by DUE include spatial vectors, spatial rasters, time-series of spatial data, simple time-series and objects that are constant in space and time. Attributes supported by DUE include continuous numerical variables (e.g. rainfall), discrete numerical variables (e.g. bird counts) and categorical variables (e.g. land-cover). Once data are imported, an uncertainty model can be developed for the positional and attribute uncertainties of environmental objects. This is currently limited to probability models, but confidence intervals and scenarios will be provided in the future. Using DUE, the spatial and temporal patterns of uncertainty (autocorrelation), as well as cross-correlations between related inputs, can be incorporated into an uncertainty analysis. Alongside expert judgement, sample data may be used to help estimate uncertainties, and to reduce the uncertainty of the simulated output by ensuring each realisation reproduces the sample data. Most importantly, DUE provides a conceptual framework for structuring an uncertainty analysis, allowing users without direct experience of uncertainty methods to develop realistic uncertainty models for their data.
In many applications, there are multiple time series that are hierarchically organized and can be aggregated at several different levels in groups based on products, geography or some other features. We call these "hierarchical time series". They are commonly forecast using either a "bottom-up" or a "top-down" method. In this paper we propose a new approach to hierarchical forecasting which provides optimal forecasts that are better than forecasts produced by either a top-down or a bottom-up approach. Our method is based on independently forecasting all series at all levels of the hierarchy and then using a regression model to optimally combine and reconcile these forecasts. The resulting revised forecasts add up appropriately across the hierarchy, are unbiased and have minimum variance amongst all combination forecasts under some simple assumptions. We show in a simulation study that our method performs well compared to the top-down approach and the bottom-up method. We demonstrate our proposed method by forecasting Australian tourism demand where the data are disaggregated by purpose of travel and geographical region.
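The combination step can be illustrated with ordinary least squares reconciliation on a toy two-level hierarchy (total = A + B). The numbers are made up, and plain OLS is used here as the simplest special case of the regression-based combination the abstract describes:

```python
import numpy as np

# Summing matrix S for a two-level hierarchy: total = A + B.
# Rows: [total, A, B]; columns: bottom-level series [A, B].
S = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# Independently produced ("base") forecasts for every node. They need
# not add up: the total forecast is 102, but A + B = 60 + 45 = 105.
y_hat = np.array([102.0, 60.0, 45.0])

# OLS reconciliation: project the base forecasts onto the space of
# forecast vectors that are consistent with the hierarchy.
beta = np.linalg.solve(S.T @ S, S.T @ y_hat)  # reconciled bottom level
y_tilde = S @ beta                            # reconciled full hierarchy
# y_tilde[0] == y_tilde[1] + y_tilde[2] by construction.
```

Here the reconciled forecasts come out as (103, 59, 44): every node is adjusted, and the hierarchy adds up exactly.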
The synthesis of linear and planar acoustical arrays of desired omnidirectional intensity patterns is considered. By relating the far-field sound intensity pattern to the correlation function of the feeding sequence or code used, it is shown that an almost omnidirectional pattern results when a code with a sharp autocorrelation function is used as the feeding sequence for the elements of the array. Huffman impulse-equivalent codes enjoy very sharp correlation properties, and are shown in this paper to result in good omnidirectional linear and planar acoustical arrays. The results have direct applications in underwater acoustical communication systems, public address systems, and acoustical measurements.
Visual inspection remains the most frequently applied method for detecting treatment effects in single-case designs. The advantages and limitations of visual inference are discussed here in relation to other procedures for assessing intervention effectiveness. The first part of the paper reviews previous research on visual analysis, paying special attention to the validation of visual analysts' decisions, inter-judge agreement, and false alarm and omission rates. The most relevant factors affecting visual inspection (i.e., effect size, autocorrelation, data variability, and analysts' expertise) are highlighted and incorporated into an empirical simulation study with the aim of providing further evidence about the reliability of visual analysis. Our results concur with previous studies that have reported the relationship between serial dependence and increased Type I error rates. Participants with greater experience appeared to be more conservative and used more consistent criteria when assessing graphed data. Nonetheless, the decisions made by both professionals and students did not match the simulated data features sufficiently, and we also found low intra-judge agreement, thus suggesting that visual inspection should be complemented by other methods when assessing treatment effectiveness.
In the Earth's history, periods of relatively stable climate have often been interrupted by sharp transitions to a contrasting state. One explanation for such events of abrupt change is that they happened when the earth system reached a critical tipping point. However, this remains hard to prove for events in the remote past, and it is even more difficult to predict if and when we might reach a tipping point for abrupt climate change in the future. Here, we analyze eight ancient abrupt climate shifts and show that they were all preceded by a characteristic slowing down of the fluctuations starting well before the actual shift. Such slowing down, measured as increased autocorrelation, can be mathematically shown to be a hallmark of tipping points. Therefore, our results imply independent empirical evidence for the idea that past abrupt shifts were associated with the passing of critical thresholds. Because the mechanism causing slowing down is fundamentally inherent to tipping points, it follows that our way to detect slowing down might be used as a universal early warning signal for upcoming catastrophic change. Because tipping points in ecosystems and other complex systems are notoriously hard to predict in other ways, this is a promising perspective.
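A minimal illustration of "slowing down" measured as increased autocorrelation (a generic sketch, not the paper's analysis; a drifting AR(1) coefficient stands in for a system approaching a tipping point, and all parameter values are arbitrary):

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 sample autocorrelation of a 1D array."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

rng = np.random.default_rng(2)
n = 4000
# AR(1) whose coefficient drifts toward 1, mimicking critical slowing
# down: perturbations take longer and longer to die out.
phi = np.linspace(0.2, 0.97, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.standard_normal()

window = 500
early = lag1_autocorr(x[:window])   # far from the "tipping point"
late = lag1_autocorr(x[-window:])   # close to it
# The windowed lag-1 autocorrelation rises markedly as the system slows
# down -- the early warning signal the abstract describes.
```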
This paper derives several Lagrange Multiplier statistics and the corresponding likelihood ratio statistics to test for spatial autocorrelation in a fixed effects panel data model. These tests allow discriminating between the two main types of spatial autocorrelation which are relevant in empirical applications, namely an endogenous spatial lag versus spatially autocorrelated errors. In this paper, five different statistics are suggested. The first one, the joint test, detects the presence of spatial autocorrelation whatever its type. Hence, it indicates whether specific econometric estimation methods should be implemented to account for the spatial dimension. In case they need to be implemented, the other four tests support the choice between the different specifications, i.e. endogenous spatial lag, spatially autocorrelated errors or both. The first two are simple hypothesis tests as they detect one kind of spatial autocorrelation assuming the other one is absent. The last two take into account the presence of one type of spatial autocorrelation when testing for the presence of the other one. We use the methodology developed in to set up and estimate the general likelihood function. Monte Carlo experiments show the good performance of our tests. Finally, they are applied to the Feldstein-Horioka puzzle. They indicate a misspecification of the investment-saving regression due to the omission of spatial autocorrelation. The traditional saving-retention coefficient is shown to be upward biased. In contrast, our results favor capital mobility. JEL Classification: C12, C21, C23. LM ρ|λ and LR ρ|λ test for the presence of an endogenous spatial lag when spatially autocorrelated errors are included in the specification.
Two stochastic models of the rainfall process, belonging to different categories, are compared in terms of how well they reproduce certain hyetograph characteristics. The first is the scaling model of storm hyetograph, which belongs to the category of storm-based models.
- by Demetris Koutsoyiannis and one co-author
We discuss robust methods of time series analysis which use only comparisons of values and not their actual size. Local and global order structure are defined as matrices or by rank numbers. Local ranks, autocorrelation by Kendall's tau, and permutation entropy as a complexity measure are introduced in such a way that they contain a scale parameter which allows one to study time series on different scales.
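A sketch of rank-based autocorrelation via Kendall's tau (illustrative only; the AR(1) coefficient, sample size and seed are arbitrary). Because tau uses only order comparisons, it is invariant under any monotone transformation of the data, which value-based autocorrelation is not:

```python
import numpy as np
from scipy.stats import kendalltau

def tau_autocorr(x, lag):
    """Rank-based autocorrelation: Kendall's tau between x_t and x_{t+lag}.
    Uses only comparisons of values, never their actual size."""
    tau, _ = kendalltau(x[:-lag], x[lag:])
    return tau

rng = np.random.default_rng(3)
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()

t1 = tau_autocorr(x, 1)
t1_transformed = tau_autocorr(np.exp(x), 1)  # monotone transform of the data
# t1 and t1_transformed agree, because exp() preserves all order
# comparisons; the ordinary (moment-based) ACF would change.
```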
Process capability analysis when observations are autocorrelated is addressed using time series modelling and regression analysis. Through a numerical example, it is shown that the variance estimate obtained from the original data is no longer an appropriate estimate for conducting process capability analyses.
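To see the effect numerically (a generic AR(1) illustration with hypothetical specification limits, not the paper's example), compare the raw-data standard deviation with the standard deviation of the AR(1)-filtered residuals; the capability index Cp = (USL - LSL) / (6σ) changes substantially depending on which one is used:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
phi = 0.8
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

# Hypothetical specification limits, for illustration only.
LSL, USL = -10.0, 10.0

sigma_raw = np.std(x)                # inflated by autocorrelation
resid = x[1:] - phi * x[:-1]         # residuals after AR(1) filtering
sigma_resid = np.std(resid)          # close to the innovation sd

cp_raw = (USL - LSL) / (6 * sigma_raw)
cp_filtered = (USL - LSL) / (6 * sigma_resid)
# The two sigma estimates differ by roughly 1/sqrt(1 - phi^2) ~ 1.67,
# so the resulting capability indices disagree markedly. Which variance
# is the right yardstick depends on the question being asked; the point
# here is only how far apart the estimates are.
```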
The analysis and forecasting of extreme climatic events has become increasingly relevant to plan effective financial and food-related interventions in third-world countries. Natural disasters and climate change, both large and small scale, have a great impact on non-industrialized populations who rely exclusively on crop production, fishing, and similar livelihood activities. It is important to identify the extent of the areas prone to severe drought conditions in order to study the possible consequences of the drought on annual crop production. In this paper, we aim to identify such areas within the South Tigray zone, Ethiopia, using a transformation of the Normalized Difference Vegetation Index (NDVI) called the Absolute Difference NDVI (ADVI). Negative NDVI shifts from the historical average can generally be linked to a reduction in the vigor of local vegetation. Drought is more likely to increase in areas where negative shifts occur more frequently and with high magnitude, making it possible to spot critical situations. We propose a new methodology for the assessment of drought risk in areas where crop production represents a primary source of livelihood for the inhabitants. We estimate ADVI return levels pixel by pixel by fitting extreme value models to independent monthly minima. The study is conducted using SPOT-Vegetation (VGT) ten-day composite (S10) images from April 1998 to March 2009. In all short-term and long-term predictions, we found that central and southern areas of the South Tigray zone are prone to a higher drought risk compared to other areas.
This study addresses the question of whether the adaptive market hypothesis provides a better description of the behaviour of an emerging stock market such as India's. We employed linear and nonlinear methods to evaluate the hypothesis empirically. The linear tests show a cyclical pattern in linear dependence, suggesting that the Indian stock market switched between periods of efficiency and inefficiency. In contrast, the results from nonlinear tests reveal strong evidence of nonlinearity in returns throughout the sample period, with a sign of tapering magnitude of nonlinear dependence in the recent period. The findings suggest that the Indian stock market is moving towards efficiency. The results provide additional insights into the association between financial crises, foreign portfolio investments and inefficiency.
The responses obtained from response surface designs that are run sequentially often exhibit serial correlation or time trends. The order in which the runs of the design are performed then has an impact on the precision of the parameter estimators. This article proposes the use of a variable-neighbourhood search algorithm to compute run orders that guarantee a precise estimation of the effects of the experimental factors. The importance of using good run orders is demonstrated by seeking D-optimal run orders for a central composite design in the presence of an AR(1) autocorrelation pattern.
One of the most powerful speech analysis techniques is the method of linear predictive analysis. This method has become the predominant technique for representing speech for low bit rate transmission or storage. The importance of this method lies both in its ability to provide extremely accurate estimates of the speech parameters and in its relative speed of computation. The basic idea behind linear predictive analysis is that the speech sample can be approximated as a linear combination of past samples. The linear predictor model provides a robust, reliable and accurate method for estimating parameters that characterize the linear, time varying system. In this project, we implement a voice excited LPC vocoder for low bit rate speech compression.
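The idea that a speech sample can be approximated as a linear combination of past samples can be sketched with the autocorrelation method of linear prediction and the Levinson-Durbin recursion (a generic implementation for illustration, not the project's vocoder code; the AR(2) test signal and its coefficients are arbitrary):

```python
import numpy as np

def levinson_durbin(r, order):
    """Levinson-Durbin recursion: solve the autocorrelation normal
    equations for predictor coefficients a such that
    x[n] ~ sum_k a[k] * x[n-1-k]. Returns (a, prediction_error)."""
    a = np.zeros(order)
    err = r[0]
    for i in range(order):
        # Reflection coefficient for model order i+1.
        k = (r[i + 1] - np.dot(a[:i], r[i:0:-1])) / err
        a_new = a.copy()
        a_new[i] = k
        a_new[:i] = a[:i] - k * a[:i][::-1]
        a = a_new
        err *= (1 - k * k)
    return a, err

# Synthetic "signal": an AR(2) process with known coefficients.
rng = np.random.default_rng(4)
n = 20000
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):
    x[t] = 1.3 * x[t - 1] - 0.4 * x[t - 2] + e[t]

# Sample autocorrelations r[0..order], then solve for the predictor.
r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(3)])
a, err = levinson_durbin(r, 2)
# a recovers the true predictor coefficients (1.3, -0.4), and err
# approximates the residual (excitation) power -- the quantities an LPC
# vocoder transmits.
```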
This Master's thesis applies the GARCH model and fat-tailed distributions to the calculation of market risk, using the parametric VaR methodology, for an equity portfolio over the period 1/1/2014 to 31/12/2018, covering the main banking and financial sector stocks listed on the Spanish stock market and belonging to the IBEX-35. The listed shares of Santander, BBVA, CaixaBank, Sabadell and Bankia are selected. The empirical methodology combines mathematical, statistical and econometric techniques. Results are compared between the normal distribution and the Student's t distribution (fat tails), and according to how volatility is computed (standard or GARCH). Finally, backtesting is performed over the period 1/1/2019 to 30/6/2019 for the portfolio results, and the final conclusions of the study are drawn.
If correlation is present, the data are said to have an autocorrelation problem. Autocorrelation is in most cases found in regressions whose data form a time series, i.e. are observed at regular intervals such as monthly or yearly, because...
We consider the asymptotic behavior of posterior distributions and Bayes estimators based on observations which are required to be neither independent nor identically distributed. We give general results on the rate of convergence of the posterior measure relative to distances derived from a testing criterion. We then specialize our results to independent, nonidentically distributed observations, Markov processes, stationary Gaussian time series and the white noise model. We apply our general results to several examples of infinite-dimensional statistical models including nonparametric regression with normal errors, binary regression, Poisson regression, an interval censoring model, Whittle estimation of the spectral density of a time series and a nonlinear autoregressive model. ...addressed by Amewou-Atisso, Ghosal, Ghosh and Ramamoorthi [1] and Choudhuri, Ghosal and Roy. The main purpose of the present paper is to obtain a theorem on rates of convergence of posterior distributions in a general framework not restricted to the setup of i.i.d. observations. We specialize this theorem to several classes of non-i.i.d. models including i.n.i.d. observations, Gaussian time series, Markov processes and the white noise model. The theorem applies in every situation where it is possible to test the true parameter versus balls of alternatives with exponential error probabilities, and it is not restricted to any particular structure on the joint distribution. The existence of such tests has been proven in many special cases by Le Cam and Birgé, who used them to construct estimators with optimal rates of convergence, determined by the (local) metric entropy or "Le Cam dimension" of the model.
Our main theorem uses the same metric entropy measure of the complexity of the model and combines this with a measure of prior concentration around the true parameter to obtain a bound on the posterior rate of convergence, generalizing the corresponding result of Ghosal, Ghosh and van der Vaart. We apply these results to obtain posterior convergence rates for linear regression, nonparametric regression, binary regression, Poisson regression, interval censoring, spectral density estimation and nonlinear autoregression. van der Meulen, van der Vaart and van Zanten have extended the approach of this paper to several types of diffusion models.
Seismic interferometry involves the crosscorrelation of responses at different receivers to obtain the Green's function between these receivers. For the simple situation of an impulsive plane wave propagating along the x-axis, the crosscorrelation of the responses at two receivers along the x-axis gives the Green's function of the direct wave between these receivers. When the source function of the plane wave is a transient (as in exploration seismology) or a noise signal (as in passive seismology), then the crosscorrelation gives the Green's function, convolved with the autocorrelation of the source function. Direct-wave interferometry also holds for 2D and 3D situations, assuming the receivers are surrounded by a uniform distribution of sources. In this case, the main contributions to the retrieved direct wave between the receivers come from sources in Fresnel zones around stationary points. The main application of direct-wave interferometry is the retrieval of seismic surface-wave responses from ambient noise and the subsequent tomographic determination of the surface-wave velocity distribution of the subsurface. Seismic interferometry is not restricted to retrieving direct waves between receivers. In a classic paper, Claerbout shows that the autocorrelation of the transmission response of a layered medium gives the plane-wave reflection response of that medium. This is essentially 1D reflected-wave interferometry. Similarly, the crosscorrelation of the transmission responses, observed at two receivers, of an arbitrary inhomogeneous medium gives the 3D reflection response of that medium. One of the main applications of reflected-wave interferometry is retrieving the seismic reflection response from ambient noise and imaging of the reflectors in the subsurface.
A common aspect of direct- and reflected-wave interferometry is that virtual sources are created at positions where there are only receivers, without requiring knowledge of the subsurface medium parameters or of the positions of the actual sources.
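The plane-wave case above can be mimicked in a few lines (a schematic 1D sketch; a circularly shifted noise trace stands in for the second receiver, and the sample counts and delay are arbitrary). The crosscorrelation of the two recordings peaks at the inter-receiver traveltime, i.e. it retrieves the direct-wave Green's function convolved with the source autocorrelation:

```python
import numpy as np

rng = np.random.default_rng(5)
nt = 2048
src = rng.standard_normal(nt)   # noise source signature (passive case)
delay = 40                       # inter-receiver traveltime in samples

# The plane wave passes receiver A, then receiver B `delay` samples later.
rec_a = src
rec_b = np.roll(src, delay)      # circular shift keeps the example simple

# Circular crosscorrelation via the FFT: cc[tau] = sum_t rec_b[t+tau]*rec_a[t].
cc = np.fft.ifft(np.fft.fft(rec_b) * np.conj(np.fft.fft(rec_a))).real
lag = np.argmax(cc)
# The peak sits at lag == delay: the traveltime between the receivers
# has been retrieved without knowing where the source is.
```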
Seasonal and annual rainfall data of the Akluj, Baramati, Bhor and Malsiras stations, located in the Nira Basin, Central India, were analyzed for trend and periodicity using 104 years' rainfall data. The analysis was carried out using the Mann-Kendall (MK), Modified Mann-Kendall (MMK) and Theil-Sen slope estimator tests, which indicate a rising trend at all the stations. However, the trend is statistically significant only at the Akluj and Bhor stations at the 10% significance level. Bhor station showed the maximum percentage increase, 0.28%, in annual rainfall. Monsoon and post-monsoon seasonal rainfall shows a rising trend, while summer and winter seasonal rainfall shows a falling trend. Wavelet analysis showed prominent annual rainfall periods ranging from 2 to 8 years at all the stations after the 1960s, indicating greater changes in the rainfall patterns after the 1960s.
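The Mann-Kendall test used in studies like this is straightforward to state in code (a generic implementation without the tie correction or the serial-correlation adjustment of the modified MK test; the synthetic rising rainfall series is for illustration only and is not the study's data):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic, normal-approximation Z,
    and two-sided p-value (no tie correction, for simplicity)."""
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

rng = np.random.default_rng(6)
years = np.arange(104)  # a 104-year record, as in the study
rain = 800 + 1.5 * years + rng.normal(0, 50, size=104)  # synthetic rising trend
s, z, p = mann_kendall(rain)
# s > 0 and a small p-value indicate a statistically significant
# rising trend.
```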
This note obtains the theoretical autocorrelation function of an ARMA model with multiplicative seasonality. It is shown that this function can be interpreted as the result of the interaction between the seasonal and regular autocorrelation patterns of the ARMA model. This result makes the identification of the structure of the model easier, is helpful in choosing between a multiplicative and an additive seasonal component, and leads to a better understanding of the properties of the estimated autocorrelation function of scalar ARMA processes.
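The interaction structure can be checked numerically for the simplest case, a purely MA multiplicative model (1 - θB)(1 - ΘB^s)ε_t (a sketch with arbitrary θ, Θ and s = 12, not the note's general derivation). The theoretical ACF is nonzero only at lags 1, s-1, s and s+1, and the "satellite" correlations at s ± 1 are exactly the product of the regular and seasonal correlations:

```python
import numpy as np

def ma_acf(psi, max_lag):
    """Theoretical ACF of an MA process with impulse-response coefficients psi."""
    psi = np.asarray(psi, float)
    g = [np.dot(psi[:len(psi) - k], psi[k:]) for k in range(max_lag + 1)]
    return np.array(g) / g[0]

theta, Theta, s = 0.4, 0.6, 12
# Multiplicative seasonal MA: (1 - theta*B)(1 - Theta*B^s) epsilon_t.
psi = np.zeros(s + 2)
psi[0], psi[1], psi[s], psi[s + 1] = 1.0, -theta, -Theta, theta * Theta

rho = ma_acf(psi, s + 1)
# Nonzero autocorrelations only at lags 1, s-1, s, s+1, and the
# interaction satellites satisfy rho[s-1] == rho[s+1] == rho[1] * rho[s].
```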
Tracking speakers in multiparty conversations constitutes a fundamental task for automatic meeting analysis. In this paper, we present a novel probabilistic approach to jointly track the location and speaking activity of multiple speakers in a multisensor meeting room, equipped with a small microphone array and multiple uncalibrated cameras. Our framework is based on a mixed-state dynamic graphical model defined on a multiperson state-space, which includes the explicit definition of a proximity-based interaction model.
The book, Orthogonal Designs: Quadratic Forms and Hadamard Matrices, Marcel Dekker, New York-Basel, 1979, by A.V. Geramita and Jennifer Seberry, has now been out of print for almost two decades. Many of the results on weighing matrices presented therein have been greatly improved. Here we review the theory, restate some results which are no longer available and expand on the
This paper presents a new robust method to estimate the parameters of ARMA models. The method uses autocorrelation estimates based on the ratio of medians, together with a robust filter cleaner able to reject a large fraction of outliers, and a Gaussian maximum likelihood estimation which handles missing values. The main advantages of the procedure are its simplicity, robustness and fast execution. Its effectiveness is demonstrated on an example of forecasting French daily electricity consumption.
- by Yacine Chakhchoukh and one co-author
Beaver (Castor sp.) can change the riverine environment through dam-building and other activities. The European beaver (Castor fiber) was extirpated in Poland by the nineteenth century, but populations are again present as a result of reintroductions that began in 1974. The goal of this paper is to assess the impact of beaver activity on montane fluvial system development by identifying and analysing changes in channel and valley morphology following expansion of beaver into a 7.5 km-long headwater reach of the upper Wisłoka River in southeast Poland. We document the distribution of beaver in the reach, the change in river profile, sedimentation type and storage in beaver ponds, and assess how beaver dams and ponds have altered channel and valley bottom morphology. The upper Wisłoka River fluvial system underwent a series of anthropogenic disturbances during the last few centuries. The rapid spread of C. fiber in the upper Wisłoka River valley was promoted by the valley's morphology, including a low-gradient channel and silty-sand deposits in the valley bottom. At the time of our survey (2011), beaver ponds occupied 17% of the length of the study reach channel. Two types of beaver dams were noted: in-channel dams and valley-wide dams. The primary effect of dams, investigated in an intensively studied 300-m long subreach (Radocyna Pond), was a change in the longitudinal profile from smooth to stepped, a local reduction of the water surface slope, and an increase in the variability of both the thalweg profile and surface water depths. We estimate the current rate of sedimentation in beaver ponds to be about 14 cm per year. A three-stage scheme of fluvial processes in the longitudinal and transverse profile of the river channel is proposed. C. fiber reintroduction may be considered as another important stage of the upper Wisłoka fluvial system development.
Pectines are unique, midventral sensory appendages that help direct mating and food-finding behaviors in scorpions. Dense two-dimensional arrays of bimodally sensitive (chemical and mechanical) peg sensilla form the primary sensory structures on pectines. Several qualities of peg sensilla make them well suited to electrophysiological investigation, including accessibility, stability of extracellular recordings, and the ease with which spiking cells can be identified and categorized. Cross-correlations of spontaneous neural activity show signs of synaptic interactions between sensillar neurons in all species examined to date (Paruroctonus mesaensis, Hadrurus arizonensis, Centruroides vittatus), representing three families and two superfamilies. Both excitatory and inhibitory interactions have been observed, as well as possible dyadic synaptic arrangement. Computer simulations of cross-correlograms are consistent with experimental data and may help provide additional insight into fun...
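A cross-correlogram of the kind described can be sketched in a few lines. The sketch below is purely illustrative (the spike trains, the ~4 ms coupling delay, and the bin settings are invented, not taken from the recordings above): it histograms spike-time differences between two trains, so an excitatory-like coupling appears as a peak at a short positive lag.

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, max_lag=0.05, bin_width=0.002):
    """Histogram of spike-time differences (b minus a), in seconds, within +/- max_lag."""
    diffs = []
    for t in spikes_a:
        nearby = spikes_b[(spikes_b >= t - max_lag) & (spikes_b <= t + max_lag)]
        diffs.extend(nearby - t)
    bins = np.arange(-max_lag, max_lag + bin_width, bin_width)
    counts, edges = np.histogram(diffs, bins=bins)
    return counts, edges

# Toy trains: neuron B tends to fire ~4 ms after neuron A (excitatory-like coupling)
rng = np.random.default_rng(0)
spikes_a = np.sort(rng.uniform(0.0, 10.0, 500))
spikes_b = np.sort(spikes_a + 0.004 + rng.normal(0.0, 0.0002, 500))
counts, edges = cross_correlogram(spikes_a, spikes_b)
peak_lag = edges[int(np.argmax(counts))] + 0.001  # centre of the tallest bin
```

An inhibitory interaction would instead show as a trough near zero lag, and a dyadic arrangement as symmetric structure on both sides of zero.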
In this study we examine the claim that momentum in stock returns is not due to positive autocorrelation, as behavioral models suggest. Using portfolio-specific data, we find that the 6-month/6-month and 12-month/12-month momentum strategies are profitable for the industry, size, and B/M portfolios. Like Lewellen, we find the autocovariance component of the momentum profit to be negative, suggesting no return continuation. However, we also find that the autocorrelations calculated from short-term (e.g., monthly) returns are quite different from long-horizon (e.g., annual) autocorrelations.
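The mechanics of comparing short-horizon and long-horizon autocorrelations can be shown in a small sketch (the return series below is simulated, not the portfolio data used in the study): monthly returns are aggregated into non-overlapping annual sums before the lag-1 autocorrelation is re-estimated.

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a return series."""
    z = np.asarray(x, dtype=float)
    z = z - z.mean()
    return float(np.dot(z[:-1], z[1:]) / np.dot(z, z))

rng = np.random.default_rng(42)
monthly = rng.normal(0.01, 0.05, 600)           # 50 years of simulated monthly returns
annual = monthly.reshape(-1, 12).sum(axis=1)    # non-overlapping annual (log-)returns

rho_monthly = lag1_autocorr(monthly)
rho_annual = lag1_autocorr(annual)
```

With i.i.d. inputs both estimates hover near zero, but the annual estimate is far noisier (50 observations versus 600), one reason monthly and annual autocorrelation estimates can look quite different in practice.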
The sampling of epiphytes is fraught with methodological difficulties. We present a protocol to sample and analyse vascular epiphyte richness and abundance in forests of different structure (SVERA). Epiphyte abundance is estimated as biomass by recording the number of plant components in a range of size cohorts. Epiphyte species biomass is estimated on 35 sample trees, evenly distributed over six trunk diameter-size cohorts (10 trees with dbh > 30 cm). Tree height, dbh and number of forks (diameter > 5 cm) yield a dimensionless estimate of the size of the tree. Epiphyte dry weight and species richness between forests are compared with ANCOVA that controls for tree size. The Chao1 estimator (S_Chao1) is used as an estimate of the total number of species at the sites. The relative dependence of the distribution of the epiphyte communities on environmental and spatial variables may be assessed using multivariate analysis and the Mantel test. In a case study, we compared the epiphyte vegetation of six Mexican oak forests and one Colombian oak forest at similar elevation. We found a strongly significant positive correlation between tree size and epiphyte richness or biomass at all sites. In forests with a higher diversity of host trees, more trees must be sampled. Epiphyte biomass at the Colombian site was lower than in any of the Mexican sites; without correction for tree size, no significant differences in terms of epiphyte biomass could be detected. The occurrence of spatial dependence, at both the landscape level and the tree level, shows that the inclusion of spatial descriptors in SVERA is justified.
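The Chao1 estimator mentioned above has a simple closed form based on the counts of singleton and doubleton species. A minimal sketch (the abundance vector is a made-up example; the bias-corrected variant is used so a zero doubleton count cannot cause division by zero):

```python
def chao1(abundances):
    """Bias-corrected Chao1 richness estimate from per-species abundances:
    S_obs + f1*(f1 - 1) / (2*(f2 + 1)), where f1 = singletons, f2 = doubletons."""
    counts = [a for a in abundances if a > 0]
    s_obs = len(counts)
    f1 = sum(1 for a in counts if a == 1)  # species seen exactly once
    f2 = sum(1 for a in counts if a == 2)  # species seen exactly twice
    return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

# 6 observed species, 2 singletons, 1 doubleton -> 6 + 2*1/(2*2) = 6.5
estimate = chao1([10, 4, 2, 1, 1, 7])
```

When no rare species are present, the estimate collapses to the observed richness, which is the intended behaviour: rare species are the evidence that more remain unseen.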
This paper fits a univariate time series model to the average amount of electricity generated in Nigeria between 1970 and 2009 and provides a ten-year forecast of expected electricity generation in Nigeria. Box-Jenkins Autoregressive Integrated Moving Average (ARIMA) models are estimated and the best-fitting ARIMA model is used to obtain the post-sample forecasts. The fitted model was ARIMA(3,2,1), with a Normalized Bayesian Information Criterion (BIC) of 13.906, stationary R² = 0.69 and a maximum likelihood estimate of 411.55. The model was further validated by the Ljung-Box test (Q14 = 6.404, p > .10), which showed no significant autocorrelation between residuals at different lag times. Finally, a ten-year forecast was made, which showed a peak in average electricity generation, with an estimated value of 3088.22 in the year 2011.
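The Ljung-Box statistic used for that residual check has a compact closed form, Q = n(n+2) · Σ_{k=1..h} ρ̂_k² / (n − k). A sketch of the computation (the residual series below is simulated white noise standing in for the ARIMA residuals, not the paper's data):

```python
import numpy as np

def ljung_box_q(residuals, lags):
    """Ljung-Box statistic Q = n*(n+2) * sum_{k=1..h} rho_k^2 / (n - k)."""
    x = np.asarray(residuals, dtype=float)
    x = x - x.mean()
    n = len(x)
    denom = float(np.dot(x, x))
    s = 0.0
    for k in range(1, lags + 1):
        rho_k = float(np.dot(x[:-k], x[k:])) / denom  # lag-k autocorrelation
        s += rho_k ** 2 / (n - k)
    return n * (n + 2) * s

rng = np.random.default_rng(1)
residuals = rng.normal(size=200)   # stand-in for model residuals
q14 = ljung_box_q(residuals, 14)
# Under the white-noise null, Q is approximately chi-square with 14 degrees of
# freedom; the 5% critical value is about 23.7, so a small value (like the
# paper's Q14 = 6.404) is consistent with no residual autocorrelation.
```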
Severe injuries of lower extremities often lead to chronic pain and reduced walking abilities. We postulated that measuring free-living gait can provide further information about walking ability, in complement to clinical evaluations. We sought to validate a method that characterizes free-living gait with a wearable sensor. Over one week, 81 healthy controls (HC) and 66 chronic lower limb pain patients (CLLPP) hospitalized for multidisciplinary rehabilitation wore a simple accelerometer (Actigraph). In the acceleration signals, 7,835 steady one-minute walks were detected (5,085 in CLLPP and 2,750 in HC). Five gait quality measures were assessed: movement intensity, cadence, stride regularity, and short-term and long-term local dynamic stability. Gait quality variables differed significantly between CLLPP and HC (4% to 26%). Intraclass correlation coefficients (ICC) revealed moderate to high repeatability (0.71–0.91), which suggests that seven days of measurement are sufficient to assess average gait patterns. Regression analyses showed a significant association (R² = 0.44) between gait quality variables and a clinical evaluation of walking ability (the 6-minute walk test). Overall, the results show that the method is easy to implement, valid (high concurrent validity), and reliable for assessing walking ability ecologically.
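One of the gait-quality measures above, stride regularity, is commonly derived from the autocorrelation of the acceleration signal: the height of the autocorrelation peak at roughly one stride period indexes how reproducible consecutive strides are. A sketch on synthetic data (sampling rate, stride frequency, noise level, and the search window for the stride period are all assumptions for illustration, not the study's parameters):

```python
import numpy as np

def stride_regularity(accel, fs, min_period=0.4, max_period=2.0):
    """Peak of the unbiased autocorrelation within a plausible stride-period range.
    Returns (peak autocorrelation ~ regularity, estimated stride period in s)."""
    x = np.asarray(accel, dtype=float)
    x = x - x.mean()
    n = len(x)
    var = np.dot(x, x) / n
    lo, hi = int(min_period * fs), int(max_period * fs)
    ac = np.array([np.dot(x[:n - k], x[k:]) / ((n - k) * var) for k in range(lo, hi)])
    best = int(np.argmax(ac))
    return float(ac[best]), (lo + best) / fs

# Synthetic quasi-periodic signal: ~1 Hz stride frequency sampled at 50 Hz for 60 s
fs = 50
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(3)
accel = np.sin(2 * np.pi * 1.0 * t) + 0.2 * rng.normal(size=t.size)
regularity, period = stride_regularity(accel, fs)
```

A regularity near 1 indicates near-identical successive strides; irregular, painful gait pushes the peak lower.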
This article describes a linear modeling approach for the analysis of single-case designs (SCDs). Effect size measures in SCDs have been defined and studied for the situation where there is a level change without a time trend. However, when there are level and trend changes, effect size measures are either defined in terms of changes in R² or defined separately for changes in slopes and intercept coefficients. We propose an alternate effect size measure that takes into account changes in slopes and intercepts in the presence of serial dependence and provides an integrated procedure for the analysis of SCDs through estimation and inference based directly on the effect size measure. A Bayesian procedure is described to analyze the data and draw inferences in SCDs. A multilevel model that is appropriate when several subjects are available is integrated into the Bayesian procedure to provide a standardized effect size measure comparable to effect size measures in a between-subjects design. The applicability of the Bayesian approach for the analysis of SCDs is demonstrated through an example.
This paper focuses on using an integrated methodological approach and analysis to study macroseismic damage effects in urban areas. A case study concerning the seismic risk analysis of the old town centre of Potenza, a hilltop town, is discussed, with particular attention to the evaluation of possible local amplifying factors (stratigraphy and geomorphology) deduced from comparison with data studied with a geostatistical approach (kernel density, Moran index, Getis-Ord index). Geostatistics have been applied to the seismic damage scenario of the 1980 Irpinia earthquake macroseismic data, offering a new point of view in seismic risk assessment.
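Of the geostatistical tools listed, the Moran index has the most compact definition: I = (n / ΣΣ w_ij) · Σ_ij w_ij z_i z_j / Σ_i z_i², where z are the centred values and W is a spatial weight matrix. A toy sketch (the four-site weight matrix and damage-like values are invented for illustration, not the Irpinia data):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: (n / sum(w)) * (z' W z) / (z' z), with z the centred values."""
    z = np.asarray(values, dtype=float)
    z = z - z.mean()
    w = np.asarray(weights, dtype=float)
    return float(len(z) / w.sum() * (z @ w @ z) / np.dot(z, z))

# Four sites along a line with rook-contiguity weights (illustrative only)
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
i_gradient = morans_i([1.0, 2.0, 3.0, 4.0], w)     # smooth gradient -> positive I
i_alternating = morans_i([1.0, 4.0, 1.0, 4.0], w)  # checkerboard -> negative I
```

Positive I means neighbouring sites share similar damage levels (clustering, as expected under local amplification); negative I means neighbours tend to differ.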
Fitness landscape analysis techniques are used to better understand the influence of genetic representations and associated variation operators when solving a combinatorial optimization problem. Five representations are investigated for the multidimensional knapsack problem. Common mutation operators, such as bit-flip mutation, are employed to generate fitness landscapes. Measures such as fitness distance correlation and autocorrelation are applied to examine the landscapes associated with the tested genetic encodings. Furthermore, additional experiments are conducted to observe the effects of adding heuristics and local optimization to the representations. Encodings with a strong heuristic bias are more efficient, and the addition of local optimization techniques further enhances their performance.
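The random-walk autocorrelation measure can be sketched directly: perform a bit-flip random walk over candidate solutions and estimate the lag-1 autocorrelation of the resulting fitness series; high autocorrelation indicates a smooth landscape. Everything below (instance size, the linear penalty for infeasible solutions, walk length, single-dimension knapsack) is an illustrative assumption, not the paper's setup:

```python
import numpy as np

def knapsack_fitness(x, values, weights, capacity):
    """Penalized knapsack fitness: total value, minus a linear penalty if overweight."""
    v = float(np.dot(x, values))
    w = float(np.dot(x, weights))
    return v if w <= capacity else v - 10.0 * (w - capacity)

def walk_autocorrelation(n_items=50, n_steps=2000, lag=1, seed=0):
    """Lag-k autocorrelation of fitness along a bit-flip random walk."""
    rng = np.random.default_rng(seed)
    values = rng.integers(1, 100, n_items)
    weights = rng.integers(1, 100, n_items)
    capacity = int(weights.sum() * 0.5)
    x = rng.integers(0, 2, n_items)
    fitness = []
    for _ in range(n_steps):
        fitness.append(knapsack_fitness(x, values, weights, capacity))
        x[rng.integers(n_items)] ^= 1      # neighbour: flip one random bit
    f = np.array(fitness) - np.mean(fitness)
    return float(np.dot(f[:-lag], f[lag:]) / np.dot(f, f))

rho1 = walk_autocorrelation()
```

Because a single bit flip changes the objective by at most one item's contribution, the walk's fitness series is strongly autocorrelated; rougher encodings would drive this value toward zero.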
Plant-wide oscillations are common in many processes. Their effects propagate to many units and may impact the overall process performance. It is important to detect and diagnose the cause of such oscillations in order to rectify the situation. This paper proposes a new procedure to detect and diagnose plant-wide oscillations using routine operating data. A technique called spectral envelope is used to detect oscillations. Then the variables that have common oscillations can be identified and categorized accurately by a statistical hypothesis test. A new index called the oscillation contribution index (OCI) is proposed to isolate the key variables as the root cause candidates of the common oscillation(s). Two successful industrial case studies are presented to demonstrate the applicability of the proposed procedure.
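The spectral-envelope machinery itself is beyond a short sketch, but the core step, locating an oscillation frequency in routine operating data, can be illustrated for a single measurement with an ordinary periodogram (the signal frequency, noise level, and sampling rate below are invented, and this simplification is not the paper's multivariate method):

```python
import numpy as np

def dominant_oscillation(x, fs):
    """Frequency of the strongest periodogram peak (DC excluded) and its power share."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    k = 1 + int(np.argmax(spectrum[1:]))   # skip the DC bin
    return float(freqs[k]), float(spectrum[k] / spectrum[1:].sum())

# Stand-in for routine operating data: a 0.05 Hz oscillation buried in noise
fs = 1.0                                   # one sample per second
t = np.arange(3600)
rng = np.random.default_rng(7)
pv = np.sin(2 * np.pi * 0.05 * t) + 0.5 * rng.normal(size=t.size)
freq, share = dominant_oscillation(pv, fs)
```

Applying this per variable and grouping variables whose peaks share a frequency is the intuition behind detecting a plant-wide oscillation; the spectral envelope and the OCI formalize that grouping and the root-cause ranking.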
Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate these using simulation studies. Our comparative analysis involves using methods including generalized least squares, spatial filters, wavelet-revised models, conditional autoregressive models and generalized additive mixed models to estimate regression coefficients from synthetic but realistic data sets, including some which violate standard regression assumptions. We assess the performance of each method using two measures and using statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but had poorly controlled Type I error rates and so, unlike the methods above, did not show improvements in performance under model selection. Removing large-scale spatial trends in the response led to poor performance. These are empirical results; hence extrapolation of these findings to other situations should be performed cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on single or small numbers of data sets, and should be considered a necessary foundation for statements of this type in future.
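The contrast between ordinary and generalized least squares on spatially autocorrelated errors, central to the comparison above, fits in a short simulation sketch (the site coordinates, the exponential error covariance and its range parameter, and the true coefficients are all invented; real analyses would estimate the covariance rather than assume it known):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200
coords = rng.uniform(0, 10, (n, 2))

# Spatially autocorrelated errors from an exponential covariance (range = 2, assumed)
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
cov = np.exp(-d / 2.0)
errors = np.linalg.cholesky(cov + 1e-8 * np.eye(n)) @ rng.normal(size=n)

x = rng.normal(size=n)
y = 1.0 + 0.5 * x + errors                 # true intercept 1.0, true slope 0.5
X = np.column_stack([np.ones(n), x])

# OLS ignores the error covariance; GLS whitens by it
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
cov_inv = np.linalg.inv(cov + 1e-8 * np.eye(n))
beta_gls = np.linalg.solve(X.T @ cov_inv @ X, X.T @ cov_inv @ y)
```

Both estimators are unbiased here; the practical differences the paper documents concern efficiency and, especially, the standard errors and Type I error rates that follow from ignoring the spatial dependence.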
- by Colin Beale and +1
Many finance questions require a full characterization of the distribution of returns. We propose a bivariate model of returns and realized volatility (RV), and explore which features of that time-series model contribute to superior density forecasts over horizons of 1 to 60 days out of sample. This term structure of density forecasts is used to investigate the importance of: the intraday information embodied in the daily RV estimates; the functional form for log(RV) dynamics; the timing of information availability; and the assumed distributions of both return and log(RV) innovations. We find that a joint model of returns and volatility that features two components for log(RV) provides a good fit to S&P 500 and IBM data, and is a significant improvement over an EGARCH model estimated from daily returns.
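The daily RV inputs the model uses are simple to construct: realized variance is the sum of squared intraday log-returns (with RV-as-volatility being its square root). A sketch with simulated five-minute data (the 78-interval day, price level, and volatility are illustrative assumptions, not the S&P 500 or IBM data):

```python
import numpy as np

def realized_variance(intraday_prices):
    """Daily realized variance: sum of squared intraday log-returns."""
    r = np.diff(np.log(np.asarray(intraday_prices, dtype=float)))
    return float(np.sum(r ** 2))

# Simulated day: 78 five-minute returns with ~1% total daily volatility
rng = np.random.default_rng(5)
m = 78
returns = rng.normal(0.0, 0.01 / np.sqrt(m), m)
prices = 100.0 * np.exp(np.concatenate([[0.0], np.cumsum(returns)]))
rv = realized_variance(prices)
log_rv = np.log(rv)                        # the log(RV) series such models work with
```

Modeling log(RV) rather than RV keeps the volatility process positive and makes its innovations closer to Gaussian, which is one reason the functional form for log(RV) dynamics matters for the density forecasts.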