Operations Research and Statistics Research Papers
The growth of wind power generation requires updating the computational tools that support decision making in the operation and planning of electric power systems. This work describes a methodology for modelling the spatial correlations among wind speed probability distributions in probabilistic power flow. The methodology is based on the Nataf transformation and Monte Carlo simulation, and is illustrated in case studies with the IEEE 24-bus test system, highlighting the importance of adequately considering such correlations.
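The core idea behind combining the Nataf transformation with Monte Carlo simulation can be sketched as follows: sample correlated standard normals, map them to uniforms, then invert the target marginal CDFs (Weibull is a common choice for wind speeds). The shape, scale and correlation values below are illustrative, not the paper's; a full Nataf implementation would also adjust the Gaussian-space correlation so the output correlation matches the target exactly.

```python
import numpy as np
from scipy import stats

def sample_correlated_weibull(n, shape, scale, rho_z, seed=0):
    """Draw correlated Weibull wind speeds via a Gaussian copula,
    the mechanism underlying the Nataf transformation."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho_z], [rho_z, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # correlated normals
    u = stats.norm.cdf(z)                                 # map to uniforms
    # invert the Weibull CDF to impose the target marginals
    return stats.weibull_min.ppf(u, c=shape, scale=scale)

speeds = sample_correlated_weibull(20000, shape=2.0, scale=8.0, rho_z=0.8)
```

Each sampled pair can then be fed to a power-flow solver, and the Monte Carlo loop yields the output distributions.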
This study examined whether the share prices of companies listed on the Zimbabwe Stock Exchange follow the Random Walk Hypothesis. The research was motivated by the fact that investors are interested in knowing whether past share prices have a propensity to forecast future share prices. The period covered by the research was January 2014 to December 2014. The main objective of the study was to investigate whether share prices follow the Random Walk Hypothesis. The data were analysed using the chi-square test, the runs test and the autocorrelation test. The findings showed that changes in share prices on the ZSE refute the Random Walk Hypothesis. The study concluded that share price shifts follow some pattern or trend and that historical price changes can be used to predict future price movements. The study also concluded that the ZSE provides an opportunity for investors to create wealth as they take advantage of its weak-form inefficiency.
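The runs test the study applies can be sketched in a few lines. This is a generic Wald-Wolfowitz runs test on the signs of returns, not the study's exact procedure or data: under randomness the number of runs is approximately normal, so a large |z| rejects the random-walk behaviour.

```python
import numpy as np

def runs_test(returns):
    """Wald-Wolfowitz runs test for randomness of a return series.
    Returns the z-statistic; |z| > 1.96 rejects randomness at the 5% level."""
    signs = np.sign(returns)
    signs = signs[signs != 0]                    # drop zero returns
    n_pos = int(np.sum(signs > 0))
    n_neg = int(np.sum(signs < 0))
    runs = 1 + int(np.sum(signs[1:] != signs[:-1]))
    n = n_pos + n_neg
    mean = 2.0 * n_pos * n_neg / n + 1.0
    var = (mean - 1.0) * (mean - 2.0) / (n - 1.0)
    return (runs - mean) / np.sqrt(var)

rng = np.random.default_rng(1)
z_iid = runs_test(rng.normal(size=2000))   # an i.i.d. series: |z| stays small
```

A strongly patterned series (for example, strictly alternating gains and losses) produces a large z-statistic and is rejected.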
A Markov chain model is widely used in business, manpower planning, the share market and many other areas, and predictions based on a Markov chain need to be efficient. The COVID-19 coronavirus infection now poses a major challenge for people as well as governments. This paper focuses on a tool for predicting coronavirus infection with a Markov chain model. A Markov chain model was used to predict COVID-19 cases based on secondary data as of 13 March 2020. First-order Markov models were used to predict the impact of the coronavirus using probability matrices and Monte Carlo simulation. To present the applications of this model, the 2020 coronavirus pandemic in India, by state and union territory, was used as a case study. The approach will be useful for future prediction of COVID-19.
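A first-order Markov model with Monte Carlo simulation, the combination the abstract describes, can be sketched like this. The states and transition probabilities below are illustrative placeholders, not the paper's fitted values:

```python
import numpy as np

states = ["susceptible", "infected", "recovered"]
P = np.array([[0.95, 0.05, 0.00],   # transition matrix, rows sum to 1
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])

def simulate(start, steps, n_paths=10000, seed=0):
    """Monte Carlo simulation of the chain: fraction of paths in each
    state after `steps` transitions."""
    rng = np.random.default_rng(seed)
    state = np.full(n_paths, start)
    for _ in range(steps):
        new_state = np.empty_like(state)        # update from the old state only
        for s in range(len(states)):
            mask = state == s
            new_state[mask] = rng.choice(len(states), size=int(mask.sum()), p=P[s])
        state = new_state
    return np.bincount(state, minlength=len(states)) / n_paths

dist = simulate(start=0, steps=30)            # distribution after 30 steps
exact = np.linalg.matrix_power(P, 30)[0]      # the same quantity analytically
```

The simulated distribution converges to the analytic one (the first row of the 30-step transition matrix) as the number of paths grows.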
This study compares the SPSS ordinary least squares (OLS) regression and ridge regression procedures in dealing with multicollinear data. The LS regression method is one of the most frequently applied statistical procedures in practice. It is well documented that the LS method is extremely unreliable in parameter estimation when the independent variables are correlated (the multicollinearity problem). The ridge regression procedure deals with the multicollinearity problem by introducing a small bias into the parameter estimation. The application of ridge regression involves the selection of a bias parameter, and it is not clear whether it works better in applications. This study uses a Monte Carlo method to compare the results of the OLS procedure with the ridge regression procedure in SPSS.
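One replication of such a Monte Carlo comparison looks like this. This sketch uses synthetic near-collinear data and a fixed bias parameter, not the study's design; note that, in practice, variables are usually standardized and the intercept left unpenalized before ridge is applied.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 200, 5.0
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)            # nearly collinear regressors
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)

# OLS: (X'X)^{-1} X'y ; ridge: (X'X + lam*I)^{-1} X'y
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
```

Under multicollinearity the OLS coefficients blow up, while ridge shrinks them: the norm of the ridge estimate is always smaller than that of the OLS estimate for any positive bias parameter.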
The stock market in developing countries, especially in Kenya, has become one of the markets that supports the economic growth of a country. Due to the political instabilities in the Kenyan context, stock prices have been affected. As a consequence of the instabilities in the financial markets, this paper models the volatility associated with the stock price for a one-day-ahead volatility forecast, which will help in risk control in the market. This is accomplished by using asymmetric GARCH and ANN-asymmetric GARCH models. The residuals obtained from the artificial neural network are used when fitting the ANN-asymmetric GARCH models. It was found that returns on the selected companies in the NSE are characterized by volatility clustering, leptokurtosis and asymmetry. In the modelling, we further examine the performance of the leading alternatives with the daily log-return residuals of the leading companies in the Kenyan stock market (PAFR, PORT and EGAD) from January 2006 to November 2017, for trading days excluding weekends and holidays. The root mean squared error indicated that, among the available models, i.e. the ANN-EGARCH, GJR-GARCH and EGARCH models, the ANN-GJR-GARCH model performed better in modelling and forecasting stock price volatility in the Kenyan context. The paper demonstrates that combined machine learning and statistical models can effectively model stock price volatility and make reliable forecasts.
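The volatility clustering the abstract reports is the defining feature of the GARCH family. The sketch below simulates a plain symmetric GARCH(1,1), a simpler cousin of the EGARCH and GJR-GARCH variants the paper fits, with illustrative parameters:

```python
import numpy as np

def simulate_garch11(n, omega, alpha, beta, seed=0):
    """Simulate returns from a GARCH(1,1) process:
    r_t = sigma_t * z_t,  sigma_t^2 = omega + alpha*r_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    z = rng.normal(size=n)
    r = np.empty(n)
    var = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(var) * z[t]
        var = omega + alpha * r[t] ** 2 + beta * var
    return r

r = simulate_garch11(100_000, omega=0.1, alpha=0.1, beta=0.8)
```

The simulated series has unconditional variance omega/(1 - alpha - beta) = 1 and positively autocorrelated squared returns, i.e. volatility clustering; the asymmetric variants additionally let negative shocks raise volatility more than positive ones.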
Various types of regression models, with examples in R
The prevalence of diabetes mellitus has risen faster in low- and middle-income countries than in high-income countries. Diabetes mellitus is a cardiometabolic condition characterized by chronically high blood glucose levels and a high risk of complications such as damage to the eyes, kidneys and nervous system, hearing deficiency, Alzheimer's disease and cardiovascular disease. There were 1.16 million cases of diabetes in Sri Lanka in 2016. There are numerous risk factors for diabetes mellitus, but the majority of people are unaware of the factors behind its prevalence. Therefore, the main purpose of this study was to determine the factors affecting diabetes mellitus. The data were obtained from a cross-sectional survey conducted through a structured questionnaire with 100 participants chosen by cluster sampling and simple random sampling. Descriptive statistics, including the mean, standard deviation, frequency, proportion and percentage, and inferential statistics, comprising the χ² test, factor analysis and discriminant analysis, were used to analyse the data using SPSS and Excel. The prevalence of diabetes mellitus in females was higher than in males. The more highly educated population had a lower prevalence of the disease. Diabetes mellitus showed a positive relationship with age, low physical activity and BMI. Among the diabetic patients, 56.9% had a family history of diabetes and 47.10% performed sedentary work. There was a significant association between diabetic status and BMI, χ²(3) = 31.041, p < .001. Therefore, measures must be taken to implement health policies that make society aware of diabetes mellitus.
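The χ²(3) association test reported above is a standard chi-square test of independence on a contingency table. The counts below are hypothetical, chosen only to match the table shape (two diabetes statuses by four BMI categories, giving 3 degrees of freedom); they are not the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x4 table: diabetes status (rows) x BMI category (columns)
table = np.array([[12, 18, 25, 9],    # diabetic
                  [30, 22, 10, 4]])   # non-diabetic

chi2, p, dof, expected = chi2_contingency(table)
# dof = (rows - 1) * (cols - 1) = 3, matching the chi-square(3) in the abstract
```

A p-value below the chosen significance level rejects independence between diabetic status and BMI category.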
From the bestselling author of Blink and The Tipping Point, Malcolm Gladwell's Outliers: The Story of Success overturns conventional wisdom about genius to show us what makes an ordinary person an extreme overachiever. Why do some people achieve so much more than others? Can they lie so far out of the ordinary? In this provocative and inspiring book, Malcolm Gladwell looks at everyone from rock stars to professional athletes, software billionaires to scientific geniuses, to show that the story of success is far more surprising, and far more fascinating, than we could ever have imagined. He reveals that it's as much about where we're from and what we do, as who we are, and that no one, not even a genius, ever makes it alone. Outliers will change the way you think about your own life story, and about what makes us all unique.
Neural networks are being used for prediction and classification, areas where statistical methods have traditionally been used. Both the traditional statistical methods and neural networks are regarded in the literature as competing model-building techniques. This paper carries out a comprehensive review of articles that involve a comparative study of feed-forward neural networks and statistical techniques used for prediction and classification problems in various areas of application. Tabular presentations highlighting the important features of these articles are also provided. This study aims to give useful insight into the capabilities of neural networks and statistical methods used in different kinds of applications.
For a long time, one of my dreams was to describe the nature of uncertainty axiomatically, and it looks like I've finally done it in my co∼eventum mechanics! Now it remains for me to explain co∼eventum mechanics to everyone in the most approachable way, which is what I am trying to do in this work. Co∼eventum mechanics is another name for the co∼event theory, i.e., for the theory of experience and chance which I axiomatized in 2016 [1, 2]. In my opinion, this name best reflects the co∼event-based idea of the new dual theory of uncertainty, which combines probability theory, as a theory of chance, with its dual half, believability theory, as a theory of experience. In addition, I like that this new name indicates a direct connection between the co∼event theory and quantum mechanics, which is intended for the physical explanation and description of the conflict between quantum observers and quantum observations [4]. Since my theory of uncertainty satisfies the Kolmogorov axioms of probability theory, to explain co∼eventum mechanics I will use a way analogous to the already tested one that explains probability theory as a theory of chance describing the results of a random experiment. The simplest example of a random experiment in probability theory is "tossing a coin". Therefore, I decided to use this simplest random experiment itself, as well as two analogies of it, "flipping a coin" and "spinning a coin", to explain co∼eventum mechanics, which describes the results of a combined experienced random experiment. I would like to resort to the "coin-based" analogy that is usual in probability theory to explain, first of all to myself, the logic of co∼eventum mechanics as a logic of experience and chance. Of course, this analogy may seem strange, if not crazy.
But I did not come up with a better way of tying the explanations of the logic of co∼eventum mechanics to the coin-based explanations commonly used in probability theory: a simple visual "coin-based" model that clarifies what occurs as a result of a combined experienced random experiment, in which the experience of an observer faces the chance of an observation. I hope this analogy can be useful not only to me in understanding co∼eventum mechanics.
You yourself, or what is the same, your experience, is such a ``coin'' that, while you aren't being questioned, rotates all the time in ``free flight''. And only when you answer the question does the ``coin'' fall on one of its sides, ``Yes'' or ``No'', with the believability that your experience tells you.
The main aim of this study was to empirically assess the main microeconomic factors that affect a bank's performance. The objectives were to ascertain whether there is a relationship between the performance variables and the microeconomic variables, and to determine those that are statistically significant and their impact on the performance of banks in Zimbabwe. This study is unique in that it employs an econometric model built from balanced panel data; to the author's knowledge, no study has so far applied the Arellano-Bond estimation procedure to Zimbabwean banks' data. The empirical analysis was carried out on a sample of 17 banks that were operational in Zimbabwe in the years 2010 to 2017. Return on Assets (ROA), Return on Equity (ROE) and Net Interest Margin (NIM) were used as the performance indicators in the analysis, whereas most studies on Zimbabwean banks focus on just one or two of these variables. The results indicate that the main microeconomic factors are those attributed to growth, credit risk, capitalisation, managerial efficiency, liquidity and diversification in Zimbabwean financial institutions. Performance in these institutions is generally good as measured by positive persistent profits, that is, ROA, ROE and NIM. These returns reflect the extent to which the institutions are resilient to the economic crisis. The nature of the relationships among the microeconomic variables can be explored in future research, as can the use of emerging machine learning techniques to better understand the relationships and critical points of the variables that have a significant impact on the performance of banks.
STATISTICS BOOK
This book is organized into 17 chapters and is intended as a handbook for the multivariate analysis course for third-year undergraduate (S1) students or first-year graduate (S2) students majoring in statistics at the Department of Statistics, Faculty of Mathematics and Natural Sciences, IPB. Undergraduate students must understand the material up to Chapter 3 thoroughly before studying the remaining chapters. Chapters 12 to 17 cover topics that graduate students are required to know, while for undergraduates they are good to know but are offered as supplementary material only. Although some single-variable material is revisited where necessary, readers of this book are expected, for best results, to have completed one semester each of statistical methods, sampling, and calculus, especially partial derivatives and integrals, which are needed to compute maxima and expected values. Prior study of matrix algebra is better still.
OR consultants aspire to provide their clients with good reasoning. OR methods are designed to support the process of developing good reasoning. Logic examines what counts as good reasoning. The aim of the paper is to investigate the foundations of OR in terms of the relationship between logic and the activities and methods of OR inquiry. The approach taken is: (i) to identify the types of activities that OR practitioners typically engage in during an OR project; (ii) to introduce some of the main areas of logical reasoning, particularly those relevant to OR practice such as deductive, inductive, modal, and informal logic; and (iii) to bring these two lines of thinking together in order to examine the relationship between logic and OR practice and methods. The key finding is that although the core activities and methods of gathering data, analysing it and developing models are logically well founded, the management of uncertainty in the line of argument could benefit from the insights and methods provided by logic. From this analysis, some practical suggestions are made to improve the process of OR, the development of the line of argument, and the honing of advice given to clients.
This research examined factors affecting students' academic performance in tertiary institutions, taking the Department of Statistics, ABU, Zaria as a case study. Five factors, namely family income, adequacy of lecture halls, availability of reading materials, interest in course of study, and reading habits, were considered. A sample of 100 students from the Department of Statistics was selected purposively, and data were obtained through a structured questionnaire with responses from 90 students. Multiple regression analysis revealed a model which proved significant at the 0.05 level by analysis of variance, with a multiple correlation value of 0.551, which indicates a strong positive relationship between CGPA and the predictors. The coefficient of determination showed that 30.4% of the variation in academic performance was explained by the predictor variables. The result of the analysis indicated that availability of reading materials, interest in course of study and reading habits contribute to students' academic performance (CGPA).
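The multiple correlation and coefficient of determination above are related by R² = R·R (0.551² ≈ 0.304). A minimal multiple-regression sketch computing both, on synthetic data rather than the study's survey responses, looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 90                                   # same sample size as the study
X = np.column_stack([np.ones(n), rng.normal(size=(n, 5))])  # intercept + 5 factors
beta_true = np.array([2.5, 0.3, 0.0, 0.2, 0.4, 0.1])        # illustrative effects
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# ordinary least squares fit and R^2
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
```

An F-test on the fitted model (the "analysis of variance" step in the abstract) would then assess whether the predictors jointly explain a significant share of the variation.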
Mathematics plays a critical role in many modeling and simulation applications, wherever detailed representations of combat phenomena are required (e.g., attrition, detection, etc.). The required mathematics can seem daunting, but a wealth of problems relevant to the M&S community can be addressed with efficient numerical algorithms and heuristics. We shall present a small sampling of these methods and relevant examples of their usage. Examples include combat attrition, target engagement, detection, and missile defense. Most of these can be implemented with a spreadsheet for demonstration.
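Combat attrition of the kind mentioned above is classically modelled with the Lanchester square law, which is simple enough to run in a spreadsheet or a few lines of code. The force sizes and effectiveness coefficients below are illustrative, and simple Euler integration stands in for the closed-form solution:

```python
def lanchester_square(x0, y0, a, b, dt=0.01, t_max=50.0):
    """Euler integration of the Lanchester square law, a standard model of
    aimed-fire combat attrition:  dx/dt = -a*y,  dy/dt = -b*x."""
    x, y = float(x0), float(y0)
    t = 0.0
    while t < t_max and x > 0 and y > 0:
        x, y = x - a * y * dt, y - b * x * dt   # simultaneous update
        t += dt
    return max(x, 0.0), max(y, 0.0), t

x_left, y_left, t_end = lanchester_square(x0=100, y0=80, a=0.05, b=0.05)
```

With equal effectiveness, the square law's invariant b·x² - a·y² predicts the larger force wins with sqrt(100² - 80²) = 60 survivors, which the numerical integration reproduces closely.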
This study aims to analyse the application of the Statistical Process Control (SPC) method in quality control, as an effort to reduce the number of defective products, and to identify the factors that cause defects in the products manufactured by PT. Asera Tirta Posidonia. This is a quantitative study using secondary data. The data-analysis steps of the SPC method are: (1) collect production and defect data (check sheet); (2) build a histogram; (3) construct a u control chart (u-chart); (4) find the causes of defects with a cause-and-effect diagram; and (5) make recommendations for quality improvement. From the u-chart, the study concludes that product quality is out of statistical control: there are still points outside the control limits, indicating that the process still experiences deviations. The fishbone-diagram analysis identifies the factors causing defects in the production process as the workers, production machines, work methods, raw materials and work environment. Therefore, all causes of defects should be corrected, or prevented as early as possible, so that greater losses do not occur.
Keywords: Statistical Process Control and Product Quality Control
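The u-chart at the heart of step (3) uses Poisson-based control limits on defects per unit: centre line u-bar and limits u-bar ± 3·sqrt(u-bar/nᵢ). A sketch with illustrative daily counts, not the company's data:

```python
import numpy as np

def u_chart(defects, units):
    """Control limits for a u-chart (defects per unit, Poisson-based):
    CL = u-bar, limits = u-bar +/- 3*sqrt(u-bar / n_i)."""
    defects = np.asarray(defects, dtype=float)
    units = np.asarray(units, dtype=float)
    u = defects / units                       # defects per unit per sample
    u_bar = defects.sum() / units.sum()       # centre line
    ucl = u_bar + 3.0 * np.sqrt(u_bar / units)
    lcl = np.maximum(u_bar - 3.0 * np.sqrt(u_bar / units), 0.0)
    out = (u > ucl) | (u < lcl)               # points out of statistical control
    return u_bar, ucl, lcl, out

# Illustrative data: defect counts and units inspected per day
u_bar, ucl, lcl, out = u_chart([4, 6, 3, 20, 5], [100, 120, 90, 110, 100])
```

The fourth day's rate (20/110) exceeds its upper control limit, which is exactly the kind of out-of-control signal the study reports and then traces with a fishbone diagram.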
This work aims to identify the most relevant economic and financial indicators for predicting the insolvency of Brazilian electricity distribution utilities. The work was carried out based on 18 indicators from accounting statements published on the website of the Brazilian National Electric Energy Agency (ANEEL) for 62 distribution utilities in the period 2011-2018. The most relevant indicators were identified through an exhaustive search, in which logistic regression models for panel data classify the utilities as solvent or insolvent, with different indicators in the explanatory variable set. The analyses identified the most important indicators as net profit, financial dependence/independence, general liquidity and non-technical losses.
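The classification step can be sketched with a plain logistic regression fitted by gradient descent. This is an illustrative stand-in (synthetic data, a single hypothetical "liquidity" indicator, no panel structure) rather than the paper's panel-data models:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Gradient-descent logistic regression on the log-loss."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # predicted insolvency probability
        w -= lr * X.T @ (p - y) / len(y)      # average log-loss gradient
    return w

rng = np.random.default_rng(0)
n = 400
liquidity = rng.normal(size=n)                 # hypothetical indicator
X = np.column_stack([np.ones(n), liquidity])
y = (liquidity + 0.5 * rng.normal(size=n) < 0).astype(float)  # low liquidity -> insolvent

w = fit_logistic(X, y)
pred = 1.0 / (1.0 + np.exp(-X @ w)) > 0.5
accuracy = (pred == (y == 1)).mean()
```

An exhaustive search, as in the paper, would repeat such a fit for every candidate subset of the 18 indicators and keep the best-classifying set.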
Troops made up of soldiers of different nations in blue helmets and military uniform, presumed to work mostly for humanitarian purposes in the world's conflict zones such as the Balkans, the Middle East and Africa. Or, according to a different view, the insidious subcontractors of imperialism, always with a hidden agenda. The picture that forms in the average person's mind at the mention of peacekeeping goes no further than these views. United Nations peacekeeping began in 1948, when the UN Security Council authorized the deployment of UN military observers to the Middle East. The first operation, known as the United Nations Truce Supervision Organization (UNTSO), was to monitor the Armistice Agreement between Israel and its Arab neighbours. [UN Photo: UN peacekeepers on an earlier mission in Cyprus, 1967.] Since then, 69 peacekeeping operations have been deployed by the UN in 56 countries. Hundreds of thousands of military personnel, as well as tens of thousands of UN police and other civilians from more than 120 countries, have participated in UN peacekeeping operations. Some 3,326 UN peacekeeping personnel from about 120 countries have died while serving under the UN flag. Examining the history of peacekeeping in three parts: The early years. UN peacekeeping was born at a time when Cold War rivalries frequently paralysed the Security Council. In those years, peacekeeping was limited to maintaining ceasefires and stabilizing situations on the ground, providing important support for political efforts to resolve conflicts by peaceful means. These missions consisted of unarmed military observers and lightly armed troops, and had primarily monitoring, reporting and confidence-building roles. The first two peacekeeping operations deployed by the United Nations were the UN Truce Supervision Organization (UNTSO) and the UN Military Observer Group in India and Pakistan (UNMOGIP). Both of these missions, which remain active to this day, exemplified the observation-and-monitoring type and had low-mandated forces of a few hundred. In those years, UN military observers were unarmed.
The earliest armed peacekeeping operation was the First UN Emergency Force (UNEF I), successfully deployed in 1956 to deal with the Suez Crisis. The first large-scale mission, launched in 1960 with nearly 20,000 military personnel at its peak, was the UN Operation in the Congo (ONUC). ONUC demonstrated the risks involved in trying to bring stability to war-torn regions: 250 UN personnel died while serving on that mission, including Secretary-General Dag Hammarskjold. In the 1960s and 1970s, the UN established the short-term Mission of the Representative of the Secretary-General in the Dominican Republic (DOMREP), the United Nations Security Force in West New Guinea (West Irian) (UNSF), and the UN Yemen Observation
In this increasingly developing era, many fields of knowledge such as economics, social science, the environment, health, meteorology, climatology and geology use data related to the location or geographic position of a place. Data that contain information about the location or geographic position of a region, obtained from measurements, are often called spatial data. This book provides a complete discussion of spatial statistical methods and their application to real problems. The first chapter covers the definition of spatial statistics; chapters two to four cover Geographically Weighted Regression (GWR), Geographically Weighted Logistic Regression (GWLR), Semiparametric Geographically Weighted Logistic Regression (GWLRS) and Geographically Weighted Poisson Regression (GWPR); and chapter five covers the OpenGeoDa and ArcView GIS applications. The book provides guidance on analysing and interpreting these methods, particularly using the R, ArcView and OpenGeoDa software. R is a programming language for statistical computing and graphics.
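The central mechanism of GWR is a separate weighted least-squares fit at each location, with weights from a spatial kernel. The book works in R; this is an analogous sketch in Python on synthetic data in which the slope varies smoothly from west to east (all names and values are illustrative):

```python
import numpy as np

def gwr_local_fit(X, y, coords, point, bandwidth):
    """One local fit of Geographically Weighted Regression: weighted least
    squares at `point`, with Gaussian kernel weights based on distance."""
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)       # Gaussian spatial kernel
    W = np.diag(w)
    # beta(point) = (X' W X)^{-1} X' W y
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

rng = np.random.default_rng(0)
n = 300
coords = rng.uniform(0, 10, size=(n, 2))
x1 = rng.normal(size=n)
# the true slope varies over space: beta1 = east-coordinate / 10
y = 1.0 + (coords[:, 0] / 10.0) * x1 + 0.1 * rng.normal(size=n)
X = np.column_stack([np.ones(n), x1])

beta_west = gwr_local_fit(X, y, coords, np.array([0.0, 5.0]), bandwidth=2.0)
beta_east = gwr_local_fit(X, y, coords, np.array([10.0, 5.0]), bandwidth=2.0)
```

The local fits recover the spatial variation a single global regression would average away: the estimated slope in the east is clearly larger than in the west.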
Exploratory Data Analysis (EDA) is one of the most important stages of an improvement project under the DMAIC (Define, Measure, Analyze, Improve, Control) model of the Six Sigma methodology. This article presents a brief review of EDA as a powerful tool to identify and solve quality problems in processes of any type. For this purpose, an open-source statistical package that is among the most widely used worldwide, R, together with its RStudio interface, was used to carry out an exploratory analysis of a fictitious problem. Based on the results obtained, it was possible to identify the underlying problem and to propose some solutions.
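A first EDA pass of the kind described typically starts with a five-number summary and the IQR outlier rule (the basis of a boxplot). The article uses R; here is an equivalent sketch in Python with a made-up measurement series containing one bad batch:

```python
import numpy as np

def eda_summary(x):
    """Minimal exploratory summary: five-number summary plus IQR-rule
    outlier flags, the kind of check an EDA pass starts with."""
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr      # the boxplot whisker rule
    outliers = x[(x < lo) | (x > hi)]
    return {"min": x.min(), "q1": q1, "median": med, "q3": q3,
            "max": x.max(), "outliers": outliers}

x = np.array([9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 14.5])  # one bad batch
summary = eda_summary(x)
```

The flagged point is where the investigation into the underlying quality problem would begin.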
This paper aims to present an implementation of the DEA cross-efficiency models in the R environment. R is free, open-source and highly extensible software that offers a variety of functions and graphical routines for data analysis. We develop R codes for both the aggressive and benevolent formulations of the DEA cross-efficiency models. To illustrate a practical application of the R codes, we use inputs and outputs from Brazilian electricity distribution utilities.
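Cross-efficiency models are built on repeated solutions of the basic CCR linear program. The paper implements this in R; the sketch below shows the underlying CCR multiplier LP in Python with toy data (the numbers are illustrative, not the utilities' inputs and outputs):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o` (multiplier form).
    X: inputs (m x n_dmus), Y: outputs (s x n_dmus).
    max u'y_o  s.t.  v'x_o = 1,  u'Y_j - v'X_j <= 0 for all j,  u, v >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.concatenate([-Y[:, o], np.zeros(m)])        # linprog minimizes
    A_ub = np.hstack([Y.T, -X.T])                      # u'Y_j - v'X_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[:, o]]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Toy data: 2 inputs, 1 equal output, 4 units
X = np.array([[2.0, 4.0, 8.0, 4.0],
              [3.0, 1.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
eff = [dea_ccr_efficiency(X, Y, o) for o in range(4)]
```

Cross-efficiency then re-evaluates every unit with every other unit's optimal weights (choosing among alternative optima aggressively or benevolently) and averages the results.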
This study reviews the theoretical foundations of, and the literature on, the technical analysis approach, in order to examine its ability to predict future stock values in both emerging and developed financial markets. A review of several studies shows that in emerging markets, as a result of financial inefficiency, even the simplest technical trading rules are able to predict future stock values. In contrast, applying the same simple technical-analysis rules (models) yields inaccurate predictions in developed financial markets; however, when complex models such as neural networks, genetic algorithms (GA), genetic programming (GP) and CAST are applied to technical analysis, the ability of the approach to predict future stock values in developed markets becomes clear. It can therefore be concluded that technical analysis is profitable in both emerging and developed financial markets. The study recommends further research to find the best and most accurate technical-analysis models that can be applied in both emerging and developed financial markets, and then generalized.
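The "simplest technical trading rules" discussed above are typically moving-average rules. A minimal sketch of a moving-average crossover signal on a simulated price path (window lengths and price parameters are illustrative):

```python
import numpy as np

def sma_crossover_signal(prices, short=5, long=20):
    """Simple moving-average crossover rule: hold the stock (signal 1)
    while the short moving average is above the long one."""
    prices = np.asarray(prices, dtype=float)
    def sma(w):
        return np.convolve(prices, np.ones(w) / w, mode="valid")
    s, l = sma(short), sma(long)
    s = s[len(s) - len(l):]              # align the two series
    return (s > l).astype(int)           # 1 = long, 0 = out of the market

rng = np.random.default_rng(0)
prices = 100 * np.cumprod(1 + rng.normal(0.0005, 0.01, size=250))
signal = sma_crossover_signal(prices)
```

Whether such a rule earns excess returns after transaction costs is precisely the weak-form efficiency question the reviewed studies test in each market.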
In general, the undergraduate curriculum in statistics offers a one-semester basic course on linear programming (LP). The conventional LP course covers the simplex algorithm, duality theory, sensitivity analysis, and some applications such as the feed-mix and transportation problems. This outline seems distant from the mainstream statistics curriculum. To partially overcome this shortcoming, we propose introducing Data Envelopment Analysis (DEA), a technique based on linear programming, into LP courses for statistics students. DEA has gained momentum as a powerful complement to statistical tools, both in the research agenda and in the practice of economic efficiency evaluation.
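To give a first concrete taste of DEA without solving a full LP, the sketch below (my own illustration, not the authors' material) computes CCR efficiency scores for the special single-input, single-output case, where the model reduces to scaling each unit's output/input ratio by the best observed ratio:

```python
def dea_single_ratio_efficiency(inputs, outputs):
    """CCR efficiency scores for the single-input, single-output case:
    each decision-making unit's output/input ratio, scaled so the
    best ratio gets score 1. (With multiple inputs/outputs this
    becomes a linear program per unit.)"""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical data: three decision-making units.
inputs = [2.0, 4.0, 5.0]    # e.g. operating cost
outputs = [4.0, 6.0, 10.0]  # e.g. units of service delivered
print(dea_single_ratio_efficiency(inputs, outputs))  # -> [1.0, 0.75, 1.0]
```

Units 1 and 3 achieve the best ratio (2.0 output per unit of input) and are efficient; unit 2 scores 0.75, meaning it could in principle produce the same output with 75% of its input.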
DR. WILBERT GOBBO, RESEARCH METHODOLOGY IN ACADEMIC AND PASTORAL CONTEXTS
In this thesis, we consider the two-parameter extreme value distribution because of its appearance in many fields of statistical application. Mathematical and statistical properties of the distribution, such as moments and higher moments, are collected and unified, and the reliability and hazard functions of the distribution are illustrated.
The chi-square goodness-of-fit test is used to check whether samples generated from the standardized extreme value distribution by Monte Carlo simulation are acceptable for use.
These samples are used to estimate the distribution parameters by four estimation methods: the method of moments, maximum likelihood, order statistics, and least squares.
These methods are discussed theoretically and assessed practically in estimating the reliability and hazard functions. Properties of the estimators of the reliability and hazard functions, such as bias, variance, skewness, kurtosis, and mean square error, are tabulated.
The computer programs are listed in three appendices, and the runs were made using MathCAD 14.
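The simulation-and-estimation setup can be sketched in miniature. The Python snippet below (an illustration only; the thesis itself uses MathCAD) draws samples from the Gumbel, i.e. type-I extreme value, distribution by inverting its CDF and recovers the parameters by the method of moments; the true values mu = 2.0 and beta = 1.5 are arbitrary choices:

```python
import math
import random
import statistics

EULER_GAMMA = 0.5772156649015329

def sample_gumbel(mu, beta, n, seed=0):
    """Draw n values from the Gumbel distribution by inverting its CDF
    F(x) = exp(-exp(-(x - mu)/beta)): x = mu - beta*log(-log(u))."""
    rng = random.Random(seed)
    return [mu - beta * math.log(-math.log(rng.random())) for _ in range(n)]

def gumbel_moment_estimates(xs):
    """Method-of-moments estimators: the Gumbel has standard deviation
    beta*pi/sqrt(6) and mean mu + EULER_GAMMA*beta, so invert these."""
    beta_hat = statistics.stdev(xs) * math.sqrt(6) / math.pi
    mu_hat = statistics.fmean(xs) - EULER_GAMMA * beta_hat
    return mu_hat, beta_hat

xs = sample_gumbel(mu=2.0, beta=1.5, n=20000)
mu_hat, beta_hat = gumbel_moment_estimates(xs)
print(mu_hat, beta_hat)  # close to the true values 2.0 and 1.5
```

The same samples could then feed a chi-square goodness-of-fit check or plug-in estimates of the reliability and hazard functions, as in the thesis.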
Stock markets in developing countries, especially in Kenya, have become markets that support a country's economic growth. Owing to political instabilities in the Kenyan context, stock markets have been affected. As a consequence of the instabilities in the financial markets, this paper models the volatility associated with stock prices to produce a one-day-ahead volatility forecast that can help with risk control in the market. This is accomplished using asymmetric GARCH and ANN-asymmetric GARCH models; the residuals obtained from the artificial neural network are used when fitting the ANN-asymmetric GARCH models. It was found that the returns of the selected companies on the NSE are characterized by volatility clustering, leptokurtosis, and asymmetry. In the modelling, we further examine the performance of the leading alternatives on the daily log-return residuals of leading companies in the Kenyan stock market (PAFR, PORT and EGAD) from January 2006 to November 2017, for trading days excluding weekends and holidays. The root mean squared error indicated that, among the available models (ANN-EGARCH, GJR-GARCH and EGARCH), the ANN-GJR-GARCH model performed best in modelling and forecasting stock price volatility in the Kenyan context. The paper demonstrates that combined machine learning and statistical models can effectively model stock price volatility and make reliable forecasts.
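The GARCH(1,1) recursion underlying the models compared above can be sketched as follows (an illustration with fixed, hand-picked parameters and hypothetical returns; the paper fits asymmetric variants such as EGARCH and GJR-GARCH by maximum likelihood):

```python
import math

def garch11_forecast(returns, omega, alpha, beta):
    """One-day-ahead GARCH(1,1) volatility forecast with fixed parameters:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    Illustrative only -- in practice omega, alpha, beta are fitted by
    maximum likelihood rather than supplied by hand."""
    sigma2 = sum(r * r for r in returns) / len(returns)  # start at sample variance
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return math.sqrt(sigma2)

# Hypothetical daily log returns: a calm spell, then a volatile spell.
returns = [0.001, -0.002, 0.001, 0.03, -0.04, 0.05]
print(garch11_forecast(returns, omega=1e-6, alpha=0.1, beta=0.85))
```

Because recent squared returns feed the recursion, the forecast after the volatile spell is higher than it would be for a calm series, which is exactly the volatility-clustering behaviour the paper reports for NSE returns.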
How will AI affect OR practice? In OR we aspire to be logical, and therefore our behaviours should be relatively easy to replicate in logic, the basis of computer systems. We also pride ourselves on our attention to context, our project management skills, and our pragmatic approach. To think about the issues, we can turn to our experience of practising OR and to the insights of mathematics, philosophy, sociology, and economics. Mathematicians and philosophers have widened the scope of logic to cover many aspects of decision-making; sociologists have conducted research into the social context and consequences of new technologies; and economists have analysed their rates of penetration. Some OR jobs will be destroyed and others created, giving rise to new, more varied career paths. The paper concludes that the centre of gravity of OR practice will move from analysis to those aspects difficult to computerize, the 'residuals'. When AI does finally displace OR practitioners, it may come in the form of 'AI strong enough for OR': strong enough to satisfy potential OR clients in terms of efficacy and cost. The OR community needs to get more deeply involved in AI; it has the relevant expertise to do so.
The main aim of this study was to empirically assess the main microeconomic factors that affect a bank's performance. The objectives were to ascertain whether there is a relationship between the performance variables and the microeconomic variables, and to determine which of these are significant and what impact they have on the performance of banks in Zimbabwe. An econometric model was built from balanced panel data, and the Arellano-Bond estimation procedure was employed. The empirical analysis was carried out on a sample of 17 banks that were operational in Zimbabwe between 2010 and 2017. Return on Assets (ROA), Return on Equity (ROE) and Net Interest Margin (NIM) were used as the performance indicators in the analysis. The results indicate the main microeconomic factors to be those attributed to growth, credit risk, capitalisation, managerial efficiency, liquidity and diversification in Zimbabwean financial institutions. Performance in these institutions is generally good as measured by posi...
The ability to make comparisons and to perceive differences and similarities between objects, phenomena, and people is an important cognitive skill that allows us to organize information, recognize patterns, and establish relationships between variables. This ability is natural to human beings and has been developed and applied since time immemorial, for example in the taxonomy of species. Cluster analysis is the algorithmization of this cognitive skill, with the goal of automatically identifying groups of objects with similar characteristics in a mass of data. This short course presents the main hierarchical algorithms (linkage methods and Ward's method) and non-hierarchical algorithms (K-means) for cluster analysis, as well as approaches based on machine learning concepts such as the self-organizing map, the fuzzy c-means method, and DBSCAN (density-based spatial clustering of applications with noise). Some real applications are also illustrated with the aid of the R software.
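A minimal K-means, the non-hierarchical method mentioned above, can be sketched in a few lines (toy 2-D data of my own; the short course itself works in R):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means on 2-D points: repeatedly assign each point to the
    nearest centroid, then move each centroid to its cluster's mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)

    def nearest(p):
        return min(range(k),
                   key=lambda c: (p[0] - centroids[c][0]) ** 2
                               + (p[1] - centroids[c][1]) ** 2)

    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[nearest(p)].append(p)
        for j, cl in enumerate(clusters):
            if cl:  # keep an empty cluster's centroid where it was
                centroids[j] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return [nearest(p) for p in points], centroids

# Two well-separated hypothetical groups of points.
pts = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 5.2), (4.9, 5.1)]
labels, cents = kmeans(pts, 2)
print(labels)
```

The first three points end up sharing one label and the last three the other, recovering the two visually obvious groups.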
http://www.semest.uff.br/index.php/minicursos
Short-term wind power forecasts are fundamental for the safe and economic integration of wind farms into an electric power system. In this work we present a Generalized Additive Model to predict wind power quantiles (quantile regression), from which we obtain a prediction of the probability density function of wind power production in a wind farm. The methodology was implemented in the VENTOS program. To illustrate the application of the methodology and the VENTOS program, this work presents the results of a computational experiment based on real data from a wind farm located in Galicia, Spain.
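Quantile regression rests on the pinball (quantile) loss. The sketch below (illustrative only, with made-up power values; not the VENTOS implementation) shows the key fact: the constant prediction that minimizes total pinball loss is the empirical tau-quantile of the data, which is why minimizing this loss per covariate value yields conditional quantiles:

```python
def pinball_loss(q, y, tau):
    """Quantile ("pinball") loss: tau*(y - q) if y >= q,
    else (1 - tau)*(q - y). Quantile regression minimizes its sum."""
    return tau * (y - q) if y >= q else (1 - tau) * (q - y)

def best_constant_quantile(ys, tau):
    """Among the observed values, the constant prediction minimizing
    total pinball loss -- it is an empirical tau-quantile of ys."""
    return min(ys, key=lambda q: sum(pinball_loss(q, y, tau) for y in ys))

ys = [1.0, 2.0, 3.0, 4.0, 100.0]  # hypothetical wind power outputs
print(best_constant_quantile(ys, tau=0.5))  # -> 3.0 (the median)
print(best_constant_quantile(ys, tau=0.9))  # -> 100.0 (upper tail)
```

Note how the tau = 0.9 solution chases the large value while the median ignores it: sweeping tau from low to high traces out the predictive distribution, which is how a density forecast is assembled from quantile predictions.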
This paper presents a plan for installing a network of electric vehicle charging stations in the city of Rio de Janeiro. It is important to underline that this text results from a research and development project, Planning and Analysis Methodology for the Implementation of Electric Vehicles in Transport Activities, conducted under the Brazilian electric energy regulatory agency, ANEEL, and carried out by the power distribution company Light S.E.S.A. in partnership with the Vehicle Propulsion Systems and Electrochemical Sources Laboratory of the Rio de Janeiro State University (UERJ).
For claims actuaries, claim modeling is crucial, since a good understanding and interpretation of the loss distribution is the backbone of all decisions made in the insurance industry regarding premium loading, expected profits, the reserves necessary to ensure profitability, and the impact of reinsurance. Using claim data obtained from the Central Bank of Nigeria 2015 bulletin and the National Insurers Association digest book, this paper determines the best fit for motor, fire, and general accident claim data and ascertains whether the composite models suggested in the literature perform better than the basic parametric models. The results show that the basic parametric models (lognormal and Pareto) performed better than the composite models in fitting the motor claim data; similarly, a parametric model (lognormal) performed better than the composite models in fitting the fire and general accident claim data. However, insurance companies should acknowledge that external factors, such as trends in claim response and reporting, and changes in future claim amounts due to the carelessness or carefulness of drivers, need to be considered before acting on this result. In conclusion, the modeling process is a paramount step before any decision can be made regarding future policies in Nigerian insurance companies, and more effort must therefore be channeled towards ensuring that the adopted process produces accurate and reliable forecasts.
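The basic parametric comparison can be sketched as follows (hypothetical claim amounts; the actual study uses Nigerian claim data): closed-form maximum likelihood fits for the lognormal and the Pareto (scale fixed at the sample minimum), compared directly by log-likelihood:

```python
import math
import statistics

def lognormal_mle(claims):
    """Closed-form lognormal MLE: mean and population std of the log-claims."""
    logs = [math.log(c) for c in claims]
    return statistics.fmean(logs), statistics.pstdev(logs)

def lognormal_loglik(claims, mu, sigma):
    return sum(-math.log(c * sigma * math.sqrt(2 * math.pi))
               - (math.log(c) - mu) ** 2 / (2 * sigma ** 2) for c in claims)

def pareto_mle(claims):
    """Pareto MLE with scale xm fixed at the sample minimum:
    alpha = n / sum(log(x_i / xm))."""
    xm = min(claims)
    return xm, len(claims) / sum(math.log(c / xm) for c in claims)

def pareto_loglik(claims, xm, alpha):
    n = len(claims)
    return (n * math.log(alpha) + n * alpha * math.log(xm)
            - (alpha + 1) * sum(math.log(c) for c in claims))

# Hypothetical claim amounts whose logs are hump-shaped around 0.
claims = [math.exp(z) for z in (-1.0, 0.0, 0.0, 0.0, 1.0)]
mu_hat, sigma_hat = lognormal_mle(claims)
xm_hat, alpha_hat = pareto_mle(claims)
ll_lognormal = lognormal_loglik(claims, mu_hat, sigma_hat)
ll_pareto = pareto_loglik(claims, xm_hat, alpha_hat)
print(ll_lognormal, ll_pareto)  # lognormal fits this toy data better
```

On this toy data the lognormal achieves the higher log-likelihood, mirroring the kind of head-to-head comparison the paper performs (there extended to composite models and formal fit criteria).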
There are various methods of imparting knowledge to students so that they quickly understand what is being passed on to them, including teaching aids such as film shows and materials related to the lesson. Storytelling, however, is rarely used to teach statistics. This research therefore uses the story of a man, his son, and a donkey to introduce students in the Urban and Regional Planning and Teacher Education departments to some basic terms in statistics. The methodology employed is to use every event and comment in the story to explain statistical terms (content analysis). Findings revealed that a large number of students in pure science and engineering departments find it difficult to understand basic statistical terms, as reflected in their final project writing and their results in statistics courses; this can largely be attributed to the way they were taught. It is recommended that these basic statistical terms be taught using every possible means, so that students can understand them and apply them in their projects and other report writing.
Backtracking line-search is an old yet powerful strategy for finding a better step size to use in proximal gradient algorithms. The main principle is to locally find a simple convex upper bound of the objective function, which in turn controls the step size that is used. In the case of inertial proximal gradient algorithms, the situation becomes much more difficult and usually leads to very restrictive rules on the extrapolation parameter. In this paper, we show that the extrapolation parameter can be controlled by locally finding a simple concave lower bound of the objective function as well. This gives rise to a double convex-concave backtracking procedure which allows for an adaptive and optimal choice of both the step size and the extrapolation parameter. We apply this procedure to the class of inertial Bregman proximal gradient methods, and prove that any sequence generated converges globally to critical points of the function at hand. Numerical experiments on a number of challenging non-convex problems in image processing and machine learning were conducted and show the power of combining the inertial step and the double backtracking strategy in achieving improved performance.
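The classical (non-inertial) backtracking the paper builds on can be sketched on a one-dimensional lasso-type toy problem of my own choosing: halve the step until the local quadratic upper bound on the smooth part holds, then take a proximal gradient step.

```python
def soft_threshold(v, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def prox_gradient_backtracking(grad, smooth, lam, x0, t0=1.0, iters=100):
    """1-D proximal gradient (ISTA) with backtracking line-search:
    shrink the step t until the quadratic upper bound
    smooth(x+) <= smooth(x) + grad(x)*(x+ - x) + (x+ - x)^2 / (2t) holds."""
    x, t = x0, t0
    for _ in range(iters):
        g = grad(x)
        while True:
            x_new = soft_threshold(x - t * g, t * lam)
            d = x_new - x
            if smooth(x_new) <= smooth(x) + g * d + d * d / (2 * t) + 1e-12:
                break
            t *= 0.5  # backtrack: the bound failed, so the step was too long
        x = x_new
    return x

# Toy problem: minimize 0.5*(x - 3)^2 + 1*|x|;
# the closed-form minimizer is x* = 3 - 1 = 2.
smooth = lambda x: 0.5 * (x - 3.0) ** 2
grad = lambda x: x - 3.0
print(prox_gradient_backtracking(grad, smooth, lam=1.0, x0=0.0))  # -> 2.0
```

The paper's contribution is the second, concave-lower-bound test, which plays the analogous role for the extrapolation parameter of the inertial variant; that test is omitted here.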
Singular Spectrum Analysis (SSA) is a non-parametric technique that allows the decomposition of a time series into signal and noise. It is thus a useful technique for trend extraction, smoothing, and filtering of a time series. This paper investigates the effect on the performance of Box-Jenkins and Holt-Winters models when they are applied to time series filtered by SSA. Three methodologies are evaluated within the SSA approach: Principal Component Analysis (PCA), cluster analysis, and graphical analysis of singular vectors. To illustrate and compare the methodologies, we also present the main results of a computational experiment with the monthly residential consumption of electricity in Brazil.
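Basic SSA, the decomposition step described above, can be sketched with a Hankel embedding and a truncated SVD (illustrative Python with synthetic data; the paper's experiments use Brazilian consumption data, and its component grouping is more elaborate than the fixed rank used here):

```python
import numpy as np

def ssa_reconstruct(series, window, rank):
    """Basic SSA: embed the series in a Hankel trajectory matrix,
    keep the leading `rank` SVD components (the "signal"), and map
    back to a series by diagonal averaging."""
    x = np.asarray(series, dtype=float)
    n, L = len(x), window
    K = n - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])  # L x K trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]            # rank-truncated matrix
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(K):                                   # diagonal averaging
        recon[j:j + L] += Xr[:, j]
        counts[j:j + L] += 1
    return recon / counts

rng = np.random.default_rng(0)
t = np.arange(100)
signal = 0.05 * t                                 # hypothetical linear trend
noisy = signal + rng.normal(0.0, 0.1, t.size)     # trend plus noise
denoised = ssa_reconstruct(noisy, window=20, rank=2)
```

A linear trend occupies at most two components of the trajectory matrix, so a rank-2 truncation recovers it while discarding most of the noise; the filtered series is then what would be handed to a Box-Jenkins or Holt-Winters model.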
Impacts of Globalization on Local Governments. Globalization is the process of connection and integration among individuals, organizations, and states worldwide. It has accelerated because of advances in transportation and communication technology (Kohn, 2016). This expansion in worldwide interaction has driven growth in global trade and in the exchange of ideas, beliefs, and culture. Globalization has likewise changed the way state and local governments govern and carry out their administrative tasks. This paper focuses on the changes globalization brings to local governments, including its challenges. Challenges Faced by Local Governments in the Age of Globalization. Globalization presents difficulties for worldwide organizations in terms of capital investment and leadership. Setting up a business in another country, particularly an emerging one, requires significant up-front capital, and the required infrastructure may not be available.