Ali Mohammad-djafari - Profile on Academia.edu
Papers by Ali Mohammad-djafari
An alternative inference tool to total probability formula and its applications
AIP Conference Proceedings, 2004
Classical methods for inverse problems are mainly based on regularization theory, in particular those based on the optimization of a criterion with two parts: a data-model matching term and a regularization term. Different choices for these two terms and a great number of optimization algorithms have been proposed. When these two terms are distance or divergence measures, the criterion has a Bayesian Maximum A Posteriori (MAP) interpretation in which the two terms correspond, respectively, to the likelihood and the prior probability models.
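As a one-line illustration of this correspondence (generic notation, not the paper's own symbols): with a linear forward model, a Gaussian likelihood, and a Gaussian prior, the MAP estimate reduces exactly to the two-term criterion described above.

```latex
% Hedged sketch, generic notation (y: data, H: forward operator, x: unknown,
% D: a regularization operator); not the paper's own symbols.
% Gaussian likelihood  p(y|x) \propto \exp(-\|y - Hx\|^2 / 2\sigma^2)
% Gaussian prior       p(x)   \propto \exp(-\|Dx\|^2 / 2\sigma_x^2)
\hat{x}_{\mathrm{MAP}}
  = \arg\max_x \, p(x \mid y)
  = \arg\min_x \underbrace{\|y - Hx\|^2}_{\text{data-model matching}}
             + \lambda \underbrace{\|Dx\|^2}_{\text{regularization}},
  \qquad \lambda = \sigma^2 / \sigma_x^2 .
```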
International Image Processing, Applications and Systems Conference, 2014
Nucleation and Atmospheric Aerosols, 2011
In the event of an accidental or intentional release of chemical or biological (CB) agents into the atmosphere, first responders and decision makers need to rapidly locate and characterize the source of dispersion events using limited information from sensor networks. In this study the stochastic event reconstruction tool (SERT) is applied to a subset of the Fusing Sensor Information from Observing Networks (FUSION) Field Trial 2007 (FFT 07) database. Inference in SERT is Bayesian, with Markov chain Monte Carlo (MCMC) sampling. SERT adopts a probability model that takes into account both positive and zero-reading sensors. In addition to the location and strength of the dispersion event, empirical parameters in the forward model are also estimated to establish a data-driven plume model. Results demonstrate the effectiveness of the Bayesian inference approach in characterizing the source of a short-range atmospheric release with uncertainty quantification.
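A minimal sketch of the kind of MCMC source inversion SERT performs is given below; the toy plume model, priors, and parameter names are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of MCMC source reconstruction: random-walk Metropolis-Hastings
# over source location (x, y) and emission rate q, given noisy sensor data.
# The plume model and all numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sensor layout and "true" source (x, y, emission rate q).
sensors = rng.uniform(0.0, 100.0, size=(20, 2))
true_theta = np.array([40.0, 60.0, 5.0])

def forward(theta, xy):
    """Toy plume: concentration decays with distance from the source."""
    x, y, q = theta
    d2 = (xy[:, 0] - x) ** 2 + (xy[:, 1] - y) ** 2
    return q / (1.0 + d2 / 50.0)

sigma = 0.05  # measurement noise std
data = forward(true_theta, sensors) + rng.normal(0.0, sigma, len(sensors))

def log_post(theta):
    """Gaussian likelihood plus a flat prior on a bounded domain."""
    if not (0 < theta[0] < 100 and 0 < theta[1] < 100 and 0 < theta[2] < 50):
        return -np.inf
    r = data - forward(theta, sensors)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

# Random-walk Metropolis-Hastings over (x, y, q).
theta = np.array([50.0, 50.0, 1.0])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [1.0, 1.0, 0.2])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])  # discard burn-in
print("posterior mean (x, y, q):", post.mean(axis=0))
print("posterior std  (x, y, q):", post.std(axis=0))
```

The posterior spread over (x, y, q) is what provides the uncertainty quantification mentioned in the abstract.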
Nucleation and Atmospheric Aerosols, 2011
At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being a description of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. What is more, small clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper and that Cox's approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science.
Inverse problems in imaging and vision, in two inseparable volumes (Traité Signal et Image, IC2)
HAL (Le Centre pour la Communication Scientifique Directe), 2009
Microwave imaging and its application to the detection of buried objects
HAL (Le Centre pour la Communication Scientifique Directe), Feb 1, 2009
Nucleation and Atmospheric Aerosols, 2011
The information-theoretic representation of quantum systems, which complements the familiar energy description of the density-functional and wave-function-based theories, is discussed here. According to it, the internal disorder of non-relativistic quantum-mechanical systems can be quantified by various single (Fisher information, Shannon entropy) and composite (e.g. Cramér-Rao, LMC shape, and Fisher-Shannon complexity) functionals of the Schrödinger probability density ρ(r). First, we examine these concepts and their application to quantum systems with central potentials. Then, we calculate these measures for hydrogenic systems, emphasizing their predictive power for various physical phenomena. Finally, some recent open problems are pointed out.
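The sketch below shows, under assumptions of my own (a 1D Gaussian test density and one common convention for the Fisher-Shannon product), how the single functionals named above can be evaluated numerically for a density on a grid.

```python
# Hedged sketch: numerical Shannon entropy and Fisher information of a
# 1D probability density. The Gaussian test density and grid are
# illustrative, not taken from the paper.
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

# Test density: standard normal, rho(x) integrates to 1.
rho = np.exp(-0.5 * x ** 2) / np.sqrt(2.0 * np.pi)

# Shannon entropy S = -int rho ln rho dx  (exact: 0.5*ln(2*pi*e) ~ 1.4189).
S = -np.trapz(rho * np.log(rho), dx=dx)

# Fisher information I = int (rho')^2 / rho dx  (exact: 1 for unit variance).
drho = np.gradient(rho, dx)
I = np.trapz(drho ** 2 / rho, dx=dx)

print(f"Shannon entropy    S = {S:.4f} (exact ~ 1.4189)")
print(f"Fisher information I = {I:.4f} (exact = 1)")

# Fisher-Shannon product in one common 1D convention,
# C_FS = I * exp(2S) / (2*pi*e); equals 1 for a Gaussian.
C_FS = I * np.exp(2.0 * S) / (2.0 * np.pi * np.e)
print(f"Fisher-Shannon complexity C_FS = {C_FS:.4f} (Gaussian -> 1)")
```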
Nucleation and Atmospheric Aerosols, 2011
We introduce a novel approach for the evaluation of neuronal tuning functions, which can be expressed by the conditional probability of observing a spike given any combination of independent variables. This probability can be estimated from experimentally available data. By maximizing the mutual information between the probability distribution of the spike occurrence and that of the variables, the dependence of the spike on the input variables is maximized as well. We used this method to analyze the dependence of neuronal activity in cortical area MSTd on signals related to eye movement and retinal image movement.
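A minimal sketch of the core computation, on synthetic data of my own invention: a plug-in (histogram) estimate of the mutual information between a binary spike event and one stimulus variable.

```python
# Hedged sketch: histogram estimate of I(spike; variable). The sigmoidal
# tuning function and binning choices are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic experiment: an eye-velocity-like variable v, with spike
# probability depending on v through a sigmoidal tuning function.
v = rng.normal(0.0, 1.0, 50000)
p_spike = 1.0 / (1.0 + np.exp(-3.0 * v))
spike = (rng.uniform(size=v.size) < p_spike).astype(int)

def mutual_information(var, spk, n_bins=20):
    """I(spike; var) in bits, from the joint histogram (quantile bins)."""
    edges = np.quantile(var, np.linspace(0.0, 1.0, n_bins + 1))
    joint, _, _ = np.histogram2d(var, spk, bins=[edges, [-0.5, 0.5, 1.5]])
    p_joint = joint / joint.sum()
    p_v = p_joint.sum(axis=1, keepdims=True)
    p_s = p_joint.sum(axis=0, keepdims=True)
    nz = p_joint > 0
    return np.sum(p_joint[nz] * np.log2(p_joint[nz] / (p_v @ p_s)[nz]))

print(f"I(spike; v) = {mutual_information(v, spike):.3f} bits")
```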
We propose a fusion method for obtaining a three-dimensional image of an object from gamma-ray and ultrasound data. These two data sets provide complementary information and are related to the object through independent parameters: attenuation for gamma-ray imaging and reflectivity for ultrasound. Within the Bayesian framework adopted here, we show that the heart of the fusion process lies in the introduction of a joint prior model for attenuation and reflectivity. The construction of this model and the joint estimation that follows from it therefore constitute the main contribution of this work. Results on real data show the advantage of this method over a sequential approach.
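The structure of the fusion can be sketched as a joint MAP criterion (notation assumed here, not the paper's own):

```latex
% Hedged sketch: gamma-ray data y_g depends only on the attenuation map a,
% ultrasound data y_u only on the reflectivity map r; the joint prior
% p(a, r) is what couples the two modalities and drives the fusion.
(\hat{a}, \hat{r})_{\mathrm{MAP}}
  = \arg\max_{a,\,r} \; p(y_g \mid a)\, p(y_u \mid r)\, p(a, r)
```

A sequential approach would estimate a and r separately with independent priors; the joint prior is what lets each modality inform the other.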
A Bayesian approach to Fourier Synthesis inverse problem with application in SAR imaging
Nucleation and Atmospheric Aerosols, 2011
Nucleation and Atmospheric Aerosols, 2011
This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and selecting which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.
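A minimal sketch of the learning step, with invented state features and actions: conditional probability tables P(action | situation) estimated by counting over human-played logs, with the avatar's action then drawn from that distribution.

```python
# Hedged sketch of learning conditional probabilities from played sessions.
# The situation features (low_health, n_foes) and the action set are
# invented for illustration, not the paper's actual model.
from collections import Counter, defaultdict
import random

# Logged (situation, action) pairs from human-played sessions.
log = [
    ((True, 2), "flee"), ((True, 2), "heal"), ((True, 1), "heal"),
    ((False, 2), "attack"), ((False, 1), "attack"), ((False, 2), "attack"),
    ((True, 2), "flee"), ((False, 1), "attack"), ((True, 1), "heal"),
]

counts = defaultdict(Counter)
for situation, action in log:
    counts[situation][action] += 1

def p_action_given_situation(situation):
    """P(action | situation) from counts, with add-one smoothing over the
    actions observed anywhere in the log."""
    actions = {a for c in counts.values() for a in c}
    c = counts[situation]
    total = sum(c.values()) + len(actions)
    return {a: (c[a] + 1) / total for a in actions}

def choose_action(situation):
    """Draw the avatar's action from the learned conditional distribution."""
    dist = p_action_given_situation(situation)
    acts, probs = zip(*dist.items())
    return random.choices(acts, weights=probs)[0]

print(p_action_given_situation((True, 2)))
print("chosen:", choose_action((True, 2)))
```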
Medical Physics, Oct 22, 2013
Purpose: Dual-energy computed tomography (DECT) makes it possible to obtain two fractions of basis materials without segmentation: the soft-tissue-equivalent water fraction and the hard-matter-equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). Existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and observation variance are estimated using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem; this transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and across materials. Accurate spectrum information about the source-detector system is nevertheless necessary; when dealing with experimental data, the spectrum can be predicted by a Monte Carlo simulator. For materials between water and bone, separation errors of less than 5% are observed on the estimated decomposition fractions. Conclusions: The proposed approach is a statistical reconstruction approach based on a nonlinear forward model accounting for the full beam polychromaticity and applied directly to the projections without taking the negative log. Compared to approaches based on linear forward models and to BHA correction approaches, it has advantages in noise robustness and reconstruction accuracy.
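The nonlinear forward model at the heart of such a full-spectral approach can be sketched in its generic textbook form; the spectrum, attenuation values, and discretization below are illustrative, not the paper's calibrated data.

```python
# Hedged sketch of a polychromatic DECT projection: the detector reading is
# a sum over energy bins, making the model nonlinear in the material
# thicknesses. All numbers are made up with plausible qualitative behavior.
import numpy as np

# Discretized spectrum S(E_k) for one source setting (normalized).
E = np.array([40.0, 60.0, 80.0, 100.0])          # keV bins
S = np.array([0.2, 0.4, 0.3, 0.1])               # bin intensities

# Linear attenuation coefficients of the two basis materials per bin
# (illustrative values with the right energy dependence).
mu_water = np.array([0.27, 0.21, 0.18, 0.17])    # 1/cm
mu_bone = np.array([1.30, 0.60, 0.43, 0.38])     # 1/cm

def polychromatic_projection(A_w, A_b):
    """Expected reading for line integrals A_w, A_b (cm) of the water and
    bone fraction images: y = sum_k S_k exp(-(A_w mu_w,k + A_b mu_b,k)).
    Nonlinear in (A_w, A_b); linear models drop this sum over energies."""
    return np.sum(S * np.exp(-(A_w * mu_water + A_b * mu_bone)))

# Beam hardening in action: the effective attenuation per cm decreases with
# thickness, which a monochromatic (linear) model cannot reproduce.
for t in (1.0, 5.0, 10.0):
    y = polychromatic_projection(t, 0.0)
    print(f"water thickness {t:4.1f} cm: -ln(y)/t = {-np.log(y) / t:.4f}")
```

Note how the Gaussian noise model in the paper is placed on y itself, avoiding the noise amplification that the negative-log transform would introduce.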
Maximum entropy and Bayesian methods : Paris, France, 1992 : proceedings of the twelfth International Workshop on Maximum Entropy and Bayesian Methods
Kluwer Academic eBooks, 1993
Preface. 1. Bayesian Inference and Maximum Entropy. 2. Quantum Physics and Quantum Information. 3. Time Series. 4. Inverse Problems. 5. Applications. 6. Image Restoration and Reconstruction. Key Words Index. Authors' Index.
arXiv (Cornell University), May 18, 2007
In this paper, first a great number of inverse problems which arise in instrumentation, in computer imaging systems, and in computer vision are presented. Then a common general forward model for them is given and the corresponding inversion problem is presented. After showing the inadequacy of the classical analytical and least-squares methods for these ill-posed inverse problems, a Bayesian estimation framework is presented which can handle all these problems in a coherent way. One of the main steps in the Bayesian inversion framework is the prior modeling of the unknowns. For this reason, a great number of such models, and in particular the compound hidden Markov models, are presented. Then the main computational tools of Bayesian estimation are briefly presented. Finally, some particular cases are studied in detail and new results are presented.
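A minimal sketch of the simplest instance of this framework, under assumptions of my own (a 1D circular deconvolution operator and a Gaussian smoothness prior): the linear-Gaussian case, where the MAP estimate has a closed form.

```python
# Hedged sketch: Bayesian inversion of a linear forward model g = H f + e
# with Gaussian noise and a Gaussian smoothness prior. The blur kernel,
# sizes, and regularization weight are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n = 100

# Forward model H: circular 1D convolution with a short blur kernel.
kernel = np.array([0.25, 0.5, 0.25])
H = np.zeros((n, n))
for i in range(n):
    for j, k in enumerate(kernel):
        H[i, (i + j - 1) % n] = k

# Ground-truth signal and noisy data g = H f + e.
f_true = np.zeros(n)
f_true[30:50] = 1.0
sigma = 0.05
g = H @ f_true + rng.normal(0.0, sigma, n)

# Smoothness prior through a circular finite-difference operator D.
D = np.eye(n) - np.roll(np.eye(n), 1, axis=1)
lam = 0.5  # noise-to-prior variance ratio

# MAP / posterior mean: f_hat = (H^T H + lam D^T D)^{-1} H^T g.
f_hat = np.linalg.solve(H.T @ H + lam * (D.T @ D), H.T @ g)
err = np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)
print(f"relative reconstruction error: {err:.3f}")
```

The compound hidden Markov priors discussed in the paper replace the simple Gaussian prior here, at the cost of losing the closed-form solution.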
A Bayesian approach to inversion in 3D microwave imaging
HAL (Le Centre pour la Communication Scientifique Directe), Mar 19, 2008
The 3D tomography problem is modeled by two coupled integral equations that express the electric fields observed and existing inside the object to be imaged. Discretizing these two equations by a method of moments leads to a set of two algebraic matrix equations with two unknowns: the contrast of the object and the total field inside the object. These two equations are of very large dimensions. Moreover, there are three sources of error: i) the measurement noise itself, ii) the discretization error of the object, and iii) the error due to the approximations made in computing the elements of the two matrices. The idea behind probabilistic approaches is to model these errors so as to account for them when computing the solution. Furthermore, the Bayesian approach also allows us to take prior information about the unknowns of the problem into account. In NDT applications, the object under study is often composed of a finite number of materials, which implies that the sought image consists of a finite number of homogeneous, compact regions; this justifies modeling its distribution by a mixture of Gaussians with a hidden variable representing the region labels. We have already used this approach successfully in 2D, and the purpose of this thesis is its extension to 3D. In this talk we present above all the computational difficulties that can be encountered and the ideas we propose for resolving them.
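The two coupled equations can be sketched in their standard generic (contrast-source) form; the notation below is assumed, not the talk's own.

```latex
% Hedged sketch: contrast \chi, incident field E^{inc}, total field E inside
% the object domain D, Green's functions G_m (domain -> measurement points)
% and G_d (domain -> domain). The two unknowns are \chi and E.
y(r)   = \int_{D} G_m(r, r')\, \chi(r')\, E(r')\, \mathrm{d}r',
         \qquad r \in \text{measurement domain (data equation)}
E(r)   = E^{inc}(r) + \int_{D} G_d(r, r')\, \chi(r')\, E(r')\, \mathrm{d}r',
         \qquad r \in D \quad \text{(state equation)}
```

Discretizing both integrals by the method of moments yields the two large matrix equations mentioned above, bilinear in the pair (χ, E).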
Hyperparameter estimation in a Bayesian approach to solving linear inverse problems