J. Nunnink - Academia.edu
Papers by J. Nunnink
Proceedings of the IEEE/WIC/ACM International Conference on Intelligent Agent Technology (IAT 2004), 2004
This paper introduces a multi-agent-based approach to the fusion of heterogeneous data and information, which supports arbitrary distribution of fusion processes throughout a system of networked devices. In particular, the presented approach provides mechanisms for the dynamic generation of well-structured models that can cope with complex fusion problems in dynamic environments.
The 2005 IEEE/WIC/ACM International Conference on Web Intelligence (WI'05), 2005
Distributed Perception Networks (DPN) are a multi-agent system (MAS) approach to large-scale fusion of heterogeneous and noisy information. DPN agents can establish meaningful information filtering channels between the relevant information sources and the decision makers. Through specification of high-level concepts, DPN agent organizations generate distributed Bayesian networks, which provide mappings between the observed symptoms and the hypotheses relevant to decision making. In addition, DPNs support robust distributed inference as well as decentralized probabilistic resource allocation.
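To make the symptom-to-hypothesis mapping concrete, here is a minimal sketch of the underlying fusion step, assuming a naive-Bayes style structure in which conditionally independent sensor agents report likelihood messages to a fusion agent. The class names, the fire/no-fire hypothesis and all numbers are illustrative and are not part of the DPN implementation.

```python
# Minimal illustration (not the DPN implementation): several "sensor agents"
# each own a local likelihood model P(symptom | hypothesis) and report a
# likelihood vector for their observed symptom; a "fusion agent" combines
# these messages with a prior, which is exact for a naive-Bayes structure.
import numpy as np

HYPOTHESES = ["fire", "no_fire"]          # hypothetical two-state hypothesis

class SensorAgent:
    def __init__(self, likelihood):
        # likelihood[symptom_value][h] = P(symptom = value | hypothesis = h)
        self.likelihood = likelihood

    def message(self, observed_value):
        # Likelihood message for the observed symptom value.
        return np.asarray(self.likelihood[observed_value], dtype=float)

class FusionAgent:
    def __init__(self, prior):
        self.prior = np.asarray(prior, dtype=float)

    def fuse(self, messages):
        post = self.prior.copy()
        for m in messages:          # multiply independent likelihood messages
            post *= m
        return post / post.sum()    # normalize to a posterior over hypotheses

# Example: two heterogeneous symptom sources (smoke detector, temperature).
smoke = SensorAgent({"smoke": [0.8, 0.1], "no_smoke": [0.2, 0.9]})
temp = SensorAgent({"high": [0.7, 0.2], "normal": [0.3, 0.8]})
fusion = FusionAgent(prior=[0.01, 0.99])

posterior = fusion.fuse([smoke.message("smoke"), temp.message("high")])
print(dict(zip(HYPOTHESES, posterior.round(3))))
```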
ABSTRACT Modern situation assessment and control applications often require efficient fusion of large amounts of heterogeneous and uncertain information. In addition, fusion results are often mission critical. Bayesian networks (BNs) turn out to be suitable for a significant class of such applications, since they facilitate modeling of very heterogeneous types of uncertain information and support efficient belief propagation techniques. BNs are based on a rigorous theory which facilitates (i) analysis of the robustness of fusion systems and (ii) monitoring of the fusion quality.
We assume domains where situations can be described through sets of discrete random variables. A situation corresponds to a set of hidden and observed states that nature 'sampled' from some true distribution over the combinations of possible states. Thus, in a particular situation certain states materialized while others did not, which corresponds to a point-mass distribution over the possible states. Consequently, state estimation can be reduced to a classification of the possible combinations of relevant states. We assume that there exist mappings between hidden states of interest and optimal decisions/actions. In this context, we consider a classification of the states accurate if it is equivalent to the truth in the sense that knowing the truth would not change the action based on the classification.
BNs provide a mapping between the observed symptoms and hypotheses about hidden events, and consequently have a critical impact on the fusion accuracy. We emphasize a fundamental difference between model accuracy and fusion (i.e., classification) accuracy. A BN is a generalization over many possible situations that captures probability distributions over the possible events in the observed domain. However, even a perfect generalization does not necessarily support accurate classification in a particular situation.
We address this problem with the help of the Inference Meta Model (IMM), which describes information fusion in BNs from a coarse, runtime perspective. The IMM is based on a few realistic assumptions and exposes properties of BNs that are relevant for the construction of inherently robust fusion systems. With the help of the IMM we show that in BNs featuring many conditionally independent network fragments, inference can be very insensitive to the modeling parameter values. This implies that fusion can be robust, which is especially relevant in many real-world applications where we cannot obtain precise models due to the lack of sufficient training data or expertise. In addition, the IMM introduces a reinforcement propagation algorithm that can be used as an alternative to the common approaches to inference in BNs. We show that the classification accuracy of this propagation algorithm asymptotically approaches 1 as the number of conditionally independent network fragments increases. Because of these properties, the propagation algorithm can be used as a basis for effective detection of misleading fusion results as well as discovery of inadequate modeling components and erroneous information sources.
Gregor Pavlin, Jan Nunnink and Frans Groen. Robustness and Accuracy of Bayesian Information Fusion Systems. In: Form and Content in Sensor Networks (eds. Leonidas Guibas, Uwe D. Hanebeck and Thomas C. Henderson), Dagstuhl Seminar Proceedings 05381, ISSN 1862-4405, Internationales Begegnungs- und Forschungszentrum für Informatik (IBFI), Schloss Dagstuhl, Germany, 2006. http://drops.dagstuhl.de/opus/volltexte/2006/756
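The asymptotic accuracy claim can be illustrated with a small Monte-Carlo experiment (our own sketch, not the paper's IMM or reinforcement propagation code): the true fragment reliability p_true and the value p_model used during fusion deliberately disagree, yet MAP classification over many conditionally independent fragments still becomes almost always correct.

```python
# Monte-Carlo sketch (not the paper's IMM code) of the claim that MAP
# classification with many conditionally independent evidence fragments
# becomes insensitive to modeling-parameter errors: the true sensor
# accuracy p_true and the (misspecified) model value p_model differ, yet
# the fused classification accuracy still approaches 1 as the number of
# fragments grows.
import numpy as np

rng = np.random.default_rng(0)
p_true, p_model = 0.65, 0.80     # assumed values, for illustration only
trials = 2000

for n_fragments in (1, 5, 20, 80):
    correct = 0
    for _ in range(trials):
        h = rng.integers(2)                                # hidden state 0/1
        obs = np.where(rng.random(n_fragments) < p_true, h, 1 - h)
        # log-likelihood of each hypothesis under the misspecified model
        ll = [np.sum(np.where(obs == k, np.log(p_model), np.log(1 - p_model)))
              for k in (0, 1)]
        correct += int(np.argmax(ll) == h)
    print(f"{n_fragments:3d} fragments: accuracy = {correct / trials:.3f}")
```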
Accelerated Greedy Mixture Learning. Jan R. J. Nunnink, Jakob J. Verbeek, Nikos Vlassis. ... that offers a speedup without compromising stability. As in (Moore, 1999), we first partition the data and cache some statistics in each partition cell. ...
Mixture densities constitute a rich family of models that can be used in several data mining and machine learning applications, for instance, clustering. Although practical algorithms exist for learning such models from data, these algorithms typically do not scale very well with large datasets. Our approach, which builds on previous work by other authors, offers an acceleration of the EM algorithm for Gaussian mixtures by precomputing and storing sufficient statistics of the data in the nodes of a kd-tree. Contrary to other works, we obtain algorithms that strictly increase a lower bound on the data log-likelihood in every learning step. Experimental results illustrate the validity of our approach.
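A simplified sketch of the caching idea, assuming a 1-D Gaussian mixture and an equal-width grid in place of the paper's kd-tree, and omitting the lower-bound bookkeeping that distinguishes the actual algorithm:

```python
# Simplified sketch of the cached-sufficient-statistics idea behind the
# accelerated EM algorithm: the data are partitioned once, each cell caches
# (count, sum, sum of squares), and every EM iteration touches only the
# cells instead of the individual points. For brevity this uses a 1-D
# Gaussian mixture and an equal-width grid rather than a kd-tree, and it
# omits the paper's lower-bound bookkeeping.
import numpy as np

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.5, 5000), rng.normal(3, 1.0, 5000)])

# One-time pass: cache sufficient statistics per partition cell.
edges = np.linspace(data.min(), data.max(), 65)          # 64 cells
cell = np.clip(np.digitize(data, edges) - 1, 0, 63)
n_c = np.bincount(cell, minlength=64).astype(float)
s_c = np.bincount(cell, weights=data, minlength=64)
ss_c = np.bincount(cell, weights=data**2, minlength=64)
keep = n_c > 0
n_c, s_c, ss_c = n_c[keep], s_c[keep], ss_c[keep]
x_c = s_c / n_c                                          # cell representatives

# EM iterations operate on cells only.
K = 2
pi, mu, var = np.full(K, 1 / K), np.array([-1.0, 1.0]), np.ones(K)
for _ in range(50):
    # E-step: responsibilities evaluated at the cell representatives.
    dens = pi * np.exp(-0.5 * (x_c[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)        # shape (cells, K)
    # M-step: closed-form updates from the cached statistics.
    Nk = resp.T @ n_c
    mu = (resp.T @ s_c) / Nk
    var = (resp.T @ ss_c) / Nk - mu ** 2
    pi = Nk / n_c.sum()

print("weights:", pi.round(3), "means:", mu.round(3), "stds:", np.sqrt(var).round(3))
```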
This paper introduces design principles for modular Bayesian fusion systems which can (i) cope with large quantities of heterogeneous information and (ii) adapt to changing constellations of information sources on the fly. The presented approach exploits the locality of relations in causal probabilistic processes, which facilitates decentralized modeling and information fusion. Observed events resulting from stochastic causal processes can be modeled with the help of Bayesian networks, compact and mathematically rigorous probabilistic models. With the help of the theory of Bayesian networks and factor graphs we derive design and organization rules for modular fusion systems which implement exact belief propagation without centralized configuration and fusion control. These rules are applied in distributed perception networks (DPN), a multi-agent systems approach to distributed Bayesian information fusion. While each DPN agent has limited fusion capabilities, multiple DPN agents can autonomously collaborate to form complex modular fusion systems. Such self-organizing systems of agents can adapt to the available information sources at runtime and can infer critical hidden events through interpretation of complex patterns consisting of many heterogeneous observations.
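The compositional idea can be sketched as follows, assuming a tree-structured model in which every module owns one conditional probability table over a shared variable; the FusionModule class and the example numbers are illustrative and do not correspond to the DPN code.

```python
# Minimal sketch (illustrative only, not the DPN code) of the modular-fusion
# idea: each module owns one local conditional model P(child | parent) plus
# optional local evidence, and composite systems are built by wiring modules
# into a tree. Exact fusion then reduces to each module sending its parent a
# likelihood message that already summarizes its whole subtree, so no
# central component ever sees the full model.
import numpy as np

class FusionModule:
    def __init__(self, cpt, evidence=None):
        # cpt[child_state, parent_state] = P(child = child_state | parent)
        self.cpt = np.asarray(cpt, dtype=float)
        self.evidence = evidence          # observed child state, or None
        self.children = []                # sub-modules attached to the child

    def attach(self, module):
        self.children.append(module)
        return self

    def message_to_parent(self):
        # Likelihood over this module's child variable, combining local
        # evidence with the messages of all attached sub-modules ...
        like = np.ones(self.cpt.shape[0])
        if self.evidence is not None:
            e = np.zeros_like(like)
            e[self.evidence] = 1.0
            like *= e
        for sub in self.children:
            like *= sub.message_to_parent()
        # ... then marginalize the child out: lambda(parent) = sum_c P(c | parent) * like(c)
        return like @ self.cpt

# Example: a top-level hypothesis H with two independent observation chains,
# one of them two modules deep.
cpt_noisy = [[0.8, 0.3],     # P(obs = 0 | H = 0), P(obs = 0 | H = 1)
             [0.2, 0.7]]
chain_a = FusionModule(cpt_noisy, evidence=1)
chain_b = FusionModule(cpt_noisy, evidence=1).attach(FusionModule(cpt_noisy, evidence=1))

prior = np.array([0.5, 0.5])
posterior = prior * chain_a.message_to_parent() * chain_b.message_to_parent()
print("P(H | evidence) =", (posterior / posterior.sum()).round(3))
```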
We investigate properties of Bayesian networks (BNs) in the context of robust state estimation. We focus on problems where state estimation can be viewed as a classification of the possible states, which in turn is based on the fusion of heterogeneous and noisy information. We introduce a coarse perspective on the inference processes and show that classification with BNs can be very robust, even if we use models and evidence associated with significant uncertainties. By making coarse and realistic assumptions we can (i) formulate asymptotic properties of the classification performance, (ii) identify situations in which Bayesian fusion supports robust inference and (iii) introduce techniques that support detection of potentially misleading inference results at runtime. The presented coarse-grained analysis from the runtime perspective is relevant for an important class of real-world fusion problems where it is difficult to obtain domain models that precisely describe the true probability distributions over different states.
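As one illustration of a runtime check in this spirit (our own sketch, not the technique developed in the paper), the fused MAP state can be compared against the votes of the individual conditionally independent fragments and flagged when too few of them reinforce it:

```python
# Illustrative sketch only (not the method from the paper): each
# conditionally independent evidence fragment contributes a likelihood
# vector over the hypothesis states; we fuse them, take the MAP state, and
# flag the result as potentially misleading when too few fragments
# individually reinforce that state.
import numpy as np

def fuse_and_check(fragment_likelihoods, prior, min_support=0.6):
    L = np.asarray(fragment_likelihoods, dtype=float)   # shape (fragments, states)
    posterior = np.asarray(prior, dtype=float) * L.prod(axis=0)
    posterior /= posterior.sum()
    map_state = int(np.argmax(posterior))
    # Fraction of fragments whose own likelihood already favors the MAP state.
    support = float(np.mean(L.argmax(axis=1) == map_state))
    return posterior, map_state, support, support < min_support

# Example: five fragments over two hypothesis states; three of them weakly
# contradict the fused MAP state, so the result gets flagged for inspection.
fragments = [[0.9, 0.1], [0.8, 0.2], [0.3, 0.7], [0.4, 0.6], [0.45, 0.55]]
posterior, state, support, flagged = fuse_and_check(fragments, prior=[0.5, 0.5])
print(posterior.round(3), state, f"support={support:.2f}", "FLAG" if flagged else "ok")
```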