Pareto Distribution Research Papers - Academia.edu
Vilfredo Pareto (1848–1923) studied the inequality of welfare distribution in Italy during the nineteenth century and developed a useful tool known as the "80:20 principle", which was later adopted in many fields to explain that a small number of causes can be responsible for a large percentage of effects. The principle can be applied to prioritize problem solving and determine the direction of business drivers' development. By separating the vital few from the trivial many, management can improve firm performance. In particular, this paper links the postulates of the Pareto principle to decision-making techniques and proposes different points of view for improving the purchasing process in a particular firm. The firm Mix Metal is a small trader in iron scrap, present on the Croatian market since 2004. In this case the Pareto principle is adopted to rationalize the purchasing process and ensure better long-term sales margins. The aim of the paper is to develop several points of view from which the root causes arise and problems can be interpreted. In particular, the paper tries to find the suppliers and types of material for which the purchasing process must be reviewed or even terminated (an illustrative sketch follows this entry). The case is developed and presented using the case study methodology.
- by Felix Famoye and +1
- Statistics, Maximum Likelihood, Hazard Rate, Pareto Distribution
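A minimal sketch of the 80/20 supplier analysis described in the case-study abstract above. The supplier names and spend figures are invented, and the 80% cut-off is only one common convention for separating the "vital few" from the "trivial many", not necessarily the threshold used in the paper.

```python
# Hypothetical 80/20 (Pareto) analysis of purchase spend by supplier.
purchases = {
    "Supplier A": 410_000,
    "Supplier B": 260_000,
    "Supplier C": 95_000,
    "Supplier D": 55_000,
    "Supplier E": 30_000,
    "Supplier F": 12_000,
}

total = sum(purchases.values())
cumulative = 0.0

print(f"{'Supplier':<12}{'Share':>8}{'Cum.':>8}  Class")
for name, spend in sorted(purchases.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += spend
    cum_share = cumulative / total
    # "Vital few": suppliers covering roughly the first 80% of total spend.
    label = "vital few" if cum_share <= 0.80 else "trivial many"
    print(f"{name:<12}{spend / total:>7.1%}{cum_share:>8.1%}  {label}")
```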
During the 2007 rainy season, international institutions (e.g. the WFP) and news agencies reported floods in the Sahel. Especially in August and September, some reports gave the impression that the whole Sahel was flooded, in contrast to the droughts more frequently reported for that region. But it is well known that the precipitation patterns in the Sahel are characterized by a
SUMMARY. Certain conditionally specified joint distributions prove to be convenient conjugate prior families for classical multiparameter problems. Although the number of hyperparameters is large, assessment is shown to be reasonably straightforward, often involving the use of routine regression programs. Examples are provided involving both informative and diffuse prior information.
- by José Sarabia and +1
- Statistics, Bayesian Analysis, Sankhya, Pareto Distribution
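For orientation, the simplest textbook instance of conjugacy for the Pareto distribution (a single unknown shape parameter with known minimum), stated here only as an assumed illustration and not the conditionally specified multiparameter prior families discussed in the abstract above:

```latex
% Classical single-parameter case: for x_1,\dots,x_n i.i.d. Pareto(\alpha, x_m)
% with known minimum x_m, a Gamma prior on the shape \alpha is conjugate.
\[
  f(x_i \mid \alpha) = \frac{\alpha\, x_m^{\alpha}}{x_i^{\alpha+1}}, \qquad
  \alpha \sim \mathrm{Gamma}(a, b)
\]
\[
  \alpha \mid x_1,\dots,x_n \;\sim\;
  \mathrm{Gamma}\!\Bigl(a + n,\; b + \sum_{i=1}^{n} \ln\frac{x_i}{x_m}\Bigr)
\]
```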
- by Erik Vanmarcke and +1
- Climate Change, Statistical Analysis, Risk, Risk assessment
Using a time-varying systematic risk model, the paper estimates risk in a number of stock markets in the Gulf Cooperation Council (GCC) countries, including the Saudi, Kuwait, Dubai and Abu Dhabi markets. The results in the paper indicate that the Saudi market is the most perilous in the group, as it shows a wider range of systematic risk; an illustrative sketch follows this entry. The paper also shows that the effect
- by Bruno Sergi and +1
- Saudi Arabia, Risk assessment, Applied Economics, Stock Market
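One simple way to obtain a time-varying estimate of systematic risk is a rolling-window beta; the sketch below uses simulated return series and is only an illustration, not a reproduction of the model used in the paper.

```python
# Rolling-window OLS beta as a time-varying systematic-risk estimate
# (simulated data; window length and parameters are arbitrary).
import numpy as np

rng = np.random.default_rng(0)
T, window = 500, 60

market = rng.normal(0.0, 0.01, T)                       # simulated market returns
true_beta = 1.0 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, T))
stock = true_beta * market + rng.normal(0.0, 0.005, T)  # simulated stock returns

betas = np.full(T, np.nan)
for t in range(window, T):
    m = market[t - window:t]
    s = stock[t - window:t]
    cov = np.cov(m, s, ddof=1)
    betas[t] = cov[0, 1] / cov[0, 0]                    # OLS slope = Cov(m, s) / Var(m)

print("mean rolling beta:", np.nanmean(betas).round(2))
print("beta range:", np.nanmin(betas).round(2), "to", np.nanmax(betas).round(2))
```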
Undiscovered oil and gas assessments are commonly reported as aggregate estimates of hydrocarbon volumes. Potential commercial value and discovery costs are, however, determined by accumulation size, so engineers, economists, decision makers, and sometimes policy analysts are most interested in projected discovery sizes. The lognormal and Pareto distributions have been used to model exploration target sizes. This note contrasts the outcomes of applying these alternative distributions to the play level assessments of the U.S. Geological Survey's 1995 National Oil and Gas Assessment. Using the same numbers of undiscovered accumulations and the same minimum, medium, and maximum size estimates, substitution of the shifted truncated lognormal distribution for the shifted truncated Pareto distribution reduced assessed undiscovered oil by 16% and gas by 15%. Nearly all of the volume differences resulted because the lognormal had fewer larger fields relative to the Pareto. The lognorma...
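A toy illustration of why the choice of size model matters: matched at the same (hypothetical) median, a Pareto assigns noticeably more probability to very large accumulations than a lognormal. The parameters below are invented, and the distributions are neither shifted nor truncated as in the USGS procedure.

```python
# Upper-tail comparison of a Pareto and a lognormal field-size model
# with the same median (all numbers hypothetical).
from scipy import stats

median = 10.0          # hypothetical median field size
pareto_shape = 1.2     # hypothetical Pareto tail index
sigma = 1.0            # hypothetical lognormal log-standard-deviation

scale = median / 2.0 ** (1.0 / pareto_shape)     # Pareto scale giving this median
pareto = stats.pareto(b=pareto_shape, scale=scale)
lognorm = stats.lognorm(s=sigma, scale=median)   # lognormal median = scale

for threshold in (50.0, 100.0, 500.0):
    print(f"P(size > {threshold:5.0f}):  Pareto {pareto.sf(threshold):.4f}   "
          f"lognormal {lognorm.sf(threshold):.4f}")
```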
The well-established physical and mathematical principle of maximum entropy (ME) is used to explain the distributional and autocorrelation properties of hydrological processes, including the scaling behaviour both in state and in time. In this context, maximum entropy is interpreted as maximum uncertainty. The conditions used for the maximization of entropy are as simple as possible, i.e. that hydrological processes are non-negative with specified coefficients of variation and lag-one autocorrelation. In the first part of the study, the marginal distributional properties of hydrological processes and the state scaling behaviour were investigated. This second part of the study is devoted to joint distributional properties of hydrological processes. Specifically, it investigates the time dependence structure that may result from the ME principle and shows that the time scaling behaviour (or the Hurst phenomenon) may be obtained by this principle under the additional general condition...
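For context, the generic form of a maximum-entropy solution under moment constraints; the paper's specific conditions (non-negativity, a fixed coefficient of variation and a fixed lag-one autocorrelation) are not worked out here.

```latex
% Maximize differential entropy
\[
  H[f] = -\int f(x)\,\ln f(x)\,dx
\]
% subject to \int f(x)\,dx = 1 and moment constraints
% \int g_i(x)\, f(x)\,dx = c_i for i = 1,\dots,k.
% The Lagrange-multiplier solution has the exponential-family form
\[
  f(x) \;\propto\; \exp\!\Bigl(-\sum_{i=1}^{k} \lambda_i\, g_i(x)\Bigr),
\]
% with the multipliers \lambda_i chosen so that the constraints hold.
```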
The Z-value is an attempt to estimate the statistical significance of a Smith-Waterman dynamic alignment score (SW-score) through the use of a Monte-Carlo process. It partly reduces the bias induced by the composition and length of the sequences. This paper is not a theoretical study of the distribution of SW-scores and Z-values. Rather, it presents a statistical analysis of Z-values on large datasets of protein sequences, leading to a law of probability that the experimental Z-values follow. First, we determine the relationships between the computed Z-value, an estimate of its variance and the number of randomizations in the Monte-Carlo process. Next, we illustrate that Z-values are less correlated with sequence length than SW-scores. We then show that pairwise alignments, performed on 'quasi-real' sequences (i.e., randomly shuffled sequences of the same length and amino acid composition as the real ones) lead to Z-value distributions that statistically fit the extreme val...
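A sketch of the Z-value computation described above: score the real pair, rescore against shuffled copies of one sequence (shuffling preserves length and amino acid composition), and standardize. The alignment_score stand-in below is a crude ungapped match count, not a Smith-Waterman implementation, and the example sequences are arbitrary.

```python
# Monte-Carlo Z-value sketch; replace alignment_score with a real
# Smith-Waterman scorer for actual use.
import random
import statistics

def alignment_score(a: str, b: str) -> int:
    """Crude stand-in: best ungapped match count over all offsets."""
    best = 0
    for offset in range(-len(b) + 1, len(a)):
        matches = sum(
            1
            for i, ch in enumerate(b)
            if 0 <= i + offset < len(a) and a[i + offset] == ch
        )
        best = max(best, matches)
    return best

def z_value(seq_a: str, seq_b: str, n_shuffles: int = 100, seed: int = 0) -> float:
    rng = random.Random(seed)
    real = alignment_score(seq_a, seq_b)
    shuffled_scores = []
    for _ in range(n_shuffles):
        letters = list(seq_b)              # same length and composition
        rng.shuffle(letters)
        shuffled_scores.append(alignment_score(seq_a, "".join(letters)))
    mu = statistics.mean(shuffled_scores)
    sigma = statistics.stdev(shuffled_scores)
    return (real - mu) / sigma

# Arbitrary example sequences (not real proteins).
print(round(z_value("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
                    "MKTAYIAKQRDISFVKSHFSRQLEERLGLIEVQ"), 2))
```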