Stochastic Optimization Research Papers - Academia.edu

For a very long time the deterministic view of Nature prevailed in Physics. Starting from the Enlightenment there was the belief that Mechanics was completely described by Newton's laws. If the initial conditions of a mechanical system were known, then its state could be completely determined in the future as well as in the past. This was precisely the viewpoint of Lagrange, and it ruled from the 18th century to around the middle of the 20th century.

However, this Panglossian view of Nature was disputed, especially with the advent of Thermodynamics, which challenged the belief in reversibility and started the study of complex random systems. The discovery of simple deterministic systems with chaotic behaviour also showed that mechanical problems are far from simple. Quantum Theory, on the other hand, showed that the processes in Nature are not deterministic but stochastic. This change of paradigm is not yet fully accepted, as is well described in the book of Thomas Kuhn, which discusses the sociological aspects of science.

The main ideas of probability were slow to develop, perhaps because chance events were interpreted as the will of the gods and hence were not believed to be random. The first theoretical works on probability were connected with games of chance; since the set of possibilities is finite, those studies were combinatorial. The big breakthrough came when the continuous case was studied.

Stochastic modelling is, of course, harder than deterministic modelling, and the implementation of the model is more costly. Let us see this in a simple example. In the simplest deterministic continuous model the parameters of the model are constants. The stochastic equivalent is to transform the constants into random variables. For each realisation of a random variable its value is again a constant, but that constant may change from realisation to realisation following a certain probability distribution. That is, the simplest object of a stochastic model, the random variable, is a function, hence an object of infinite dimension. Also, in a coherent deterministic model one expects a solution once suitable conditions are fixed. In a stochastic model this is not the case. For each realisation of the stochastic data the parameters are chosen and a deterministic problem is solved. Then, with the set of solutions obtained, statistics are computed, and the main result of the problem is the probability distribution of the results. What matters is how the several possible solutions are distributed; the values obtained from a single realisation are of no importance. The understanding of this fact was very slow to come in the manufacturing industry, for example, and only after long years of applying quality control in manufacturing came the realisation that the underlying process is stochastic in nature. Since then, stochastic modelling has been essential in Engineering. Today Reliability represents a new way of design, one that takes into account the inevitable uncertainties of the processes.
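The idea of turning a constant parameter into a random variable can be sketched numerically. The book's examples use Matlab; the following Python fragment is only a minimal illustration, and the linear model u = f/k and the lognormal law for the stiffness k are assumptions chosen for the sake of the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Deterministic model: a spring of stiffness k loaded by a force f.
# The solution of the deterministic problem is the displacement u = f / k.
def solve_deterministic(k, f=1.0):
    return f / k

# Stochastic model: the constant k becomes a random variable.
# Each realisation fixes a value of k, for which a deterministic
# problem is solved; the result of interest is the distribution of u.
n_samples = 100_000
k_samples = rng.lognormal(mean=0.0, sigma=0.1, size=n_samples)
u_samples = solve_deterministic(k_samples)

# A single realisation is of no importance; the statistics are.
print(f"mean(u) = {u_samples.mean():.4f}, std(u) = {u_samples.std():.4f}")
```

Note that each sample merely reuses the deterministic solver; the stochastic character lies entirely in the ensemble of results.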

The solution of a stochastic model can be decomposed into three steps: generation of the stochastic data following its probability distribution; solution of a deterministic problem for each sample generated; and, finally, computation of statistics with the results (for example, the construction of a histogram) until they show a certain persistence (do not change according to an error criterion). Histograms are the best approximation of the solution of a stochastic problem. If they are hard to find, or costly to compute, one makes do with the mean and dispersion, or with a certain number of moments. In some situations, such as the ones described in Chapter 8, one computes only a certain probability, since even the moments are too hard to compute.
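The three steps above can be sketched as follows (again in Python rather than the book's Matlab; the quadratic model, the uniform law of the data, and the tolerance of the persistence criterion are assumptions made only for this sketch):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical deterministic solver: the positive root of x**2 = a.
def solve_deterministic(a):
    return np.sqrt(a)

# Steps 1 and 2: generate stochastic data in batches and solve a
# deterministic problem for each sample.
# Step 3: rebuild the histogram after each batch and stop once it shows
# persistence, i.e. it no longer changes beyond a tolerance.
bins = np.linspace(0.5, 2.0, 31)
results = np.array([])
prev_hist = np.zeros(len(bins) - 1)
tol, batch = 1e-3, 20_000

for _ in range(50):                          # cap on the number of batches
    a = rng.uniform(0.5, 3.0, size=batch)    # stochastic data (assumed law)
    results = np.concatenate([results, solve_deterministic(a)])
    hist, _ = np.histogram(results, bins=bins, density=True)
    if np.max(np.abs(hist - prev_hist)) < tol:   # persistence criterion
        break
    prev_hist = hist

print(f"{results.size} samples, mean = {results.mean():.3f}")
```

The final histogram approximates the probability distribution of the solution; the mean and standard deviation are cheap summaries of it.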

This book intends to present the main ideas of Stochastic Modelling and Uncertainty Quantification. The main tool used is Functional Analysis, more specifically Hilbert spaces. With this tool, very complex ideas, such as conditional probability, can be developed in a systematic way.

Chapter 1 discusses the main ideas of Probability Theory in a Hilbertian context. This chapter is designed as a reference, but the main concepts of random variables and random processes are developed there. Chapter 2 discusses how to construct a stochastic model using the Maximum Entropy Principle. When the data consists only of independent random variables, their generation is simple; but when one has dependent random variables or a random process, the generation is harder and more tools are needed. The main ideas of generation are discussed here. Chapter 3 discusses the problem of approximating a random variable. The following chapters deal with applications: Chapter 4 discusses linear systems with uncertainties; Chapter 5, nonlinear systems with uncertainties; Chapter 6 deals with differential equations; and the next two chapters, 7 and 8, deal with optimisation with uncertainties.

Description
Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does not lead to a unified presentation of the methods. Moreover, this description considers neither deterministic problems nor infinite-dimensional ones.
This book gives a unified, practical and comprehensive presentation of the main techniques used for the characterization of the effect of uncertainty on numerical models and on their exploitation in numerical problems. In particular, applications to linear and nonlinear systems of equations, differential equations, optimization and reliability are presented. Applications of stochastic methods to deal with deterministic numerical problems are also discussed.
Matlab® illustrates the implementation of these methods and makes the book suitable as a textbook and for self-study.
Contents
1. Elements of Probability Theory and Stochastic Processes.
2. Maximum Entropy and Information.
3. Representation of Random Variables.
4. Linear Algebraic Equations Under Uncertainty.
5. Nonlinear Algebraic Equations Involving Random Parameters.
6. Differential Equations Under Uncertainty.
7. Optimization Under Uncertainty.
8. Reliability-Based Optimization.

ISBN: 978178540058