Statistical Relational Learning with Soft Quantifiers

Soft quantification in statistical relational learning

Machine Learning, 2017

We present a new statistical relational learning (SRL) framework that supports reasoning with soft quantifiers, such as "most" and "a few." We define the syntax and the semantics of this language, which we call PSL^Q, and present a most probable explanation inference algorithm for it. To the best of our knowledge, PSL^Q is the first SRL framework that combines soft quantifiers with first-order logic rules for modelling uncertain relational data. Our experimental results for two real-world applications, link prediction in social trust networks and user profiling in social networks, demonstrate that the use of soft quantifiers not only allows for a natural and intuitive formulation of domain knowledge, but also improves inference accuracy.
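One way to picture a soft quantifier such as "most" is as a function that maps the observed proportion of satisfying instances to a truth value in [0, 1]. The sketch below uses a piecewise-linear mapping; the threshold values are illustrative choices of ours, not taken from the PSL^Q paper:

```python
def most(proportion, lo=0.3, hi=0.8):
    """Piecewise-linear soft quantifier: fully false below `lo`,
    fully true above `hi`, linear in between. The thresholds are
    illustrative assumptions, not the paper's actual mapping."""
    if proportion <= lo:
        return 0.0
    if proportion >= hi:
        return 1.0
    return (proportion - lo) / (hi - lo)

# "Most of Alice's friends trust Bob": 7 of 10 friends trust Bob.
print(round(most(0.7), 2))  # → 0.8
```

A rule such as "if most of a user's friends trust someone, the user tends to trust them" can then be grounded with this soft truth value in place of a hard universal or existential quantifier.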

Statistical relational learning: Four claims and a survey

2003

Statistical relational learning (SRL) research has made significant progress over the last 5 years. We have successfully demonstrated the feasibility of a number of probabilistic models for relational data, including probabilistic relational models, Bayesian logic programs, and relational probability trees, and the interest in SRL is growing. However, in order to sustain and nurture the growth of SRL as a subfield we need to refocus our efforts on the science of machine learning—moving from demonstrations to comparative and ablation studies.

Social networks and statistical relational learning: a survey

International Journal of Social Network Mining, 2012

One of the most appreciated functionalities of computers today is serving as a means of communication and information sharing among people. With the spread of the internet, many complex interactions have taken place among people, giving rise to huge information networks based on these interactions. Social networks potentially represent an invaluable source of information that can be exploited for scientific and commercial purposes. On the other hand, their distinguishing peculiarities (huge size and an inherently relational setting) set them apart from previous information extraction tasks faced in computer science, so they require new techniques to gather this information. Social network mining (SNM) is the corresponding research area, aimed at extracting information about network objects and behaviour that cannot be obtained from the explicit/implicit description of the objects alone, ignoring their explicit/implicit relationships. Statistical relational learning (SRL) is a very promising approach to SNM, since it combines expressive representation formalisms, able to model complex relational networks, with statistical methods able to handle uncertainty about objects and relations. This paper is a survey of some SRL formalisms and techniques adopted to solve some SNM tasks.

Statistical Relational Learning: A State-Of-The-Art Review

Journal of Engineering and Technology, 2019

The objective of this paper is to review the state of the art in statistical relational learning (SRL) models developed for machine learning and data mining in relational domains in the presence of missing, partially observed, and/or noisy data. It starts by giving a general overview of conventional graphical models, first-order logic, and inductive logic programming approaches as needed for background. The historical development of each key SRL model is critically reviewed. The study also focuses on the practical application of SRL techniques to a broad variety of areas and their limitations.

Research on Statistical Relational Learning

2007

This paper presents an overview of the research on learning statistical models of relational data being carried out at the University of Washington. Our work falls into five main directions: learning models of social networks; learning models of sequential relational processes; scaling up statistical relational learning to massive data sources; learning for knowledge integration; and learning programs in procedural languages. We describe some of the common themes and research issues arising from this work.

Statistical Relational Learning with Formal Ontologies

Lecture Notes in Computer Science, 2009

We propose a learning approach for integrating formal knowledge into statistical inference by exploiting ontologies as a semantically rich and fully formal representation of prior knowledge. The logical constraints deduced from ontologies can be utilized to enhance and control the learning task by enforcing description logic satisfiability in a latent multi-relational graphical model. To demonstrate the feasibility of our approach we provide experiments using real-world social network data in the form of a SHOIN(D) ontology. The results illustrate two main practical advancements: First, entities and entity relationships can be analyzed via the latent model structure. Second, enforcing the ontological constraints guarantees that the learned model does not predict inconsistent relations. In our experiments, this leads to improved predictive performance.

A short introduction to probabilistic soft logic

2012

Probabilistic soft logic (PSL) is a framework for collective, probabilistic reasoning in relational domains. PSL uses first-order logic rules as a template language for graphical models over random variables with soft truth values from the interval [0, 1]. Inference in this setting is a continuous optimization task, which can be solved efficiently. This paper provides an overview of the PSL language and its techniques for inference and weight learning. An implementation of PSL is available at http://psl.umiacs.umd.edu/.
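The abstract's central idea, relaxing Boolean connectives into [0, 1] truth values so that rule satisfaction becomes a piecewise-linear function amenable to continuous optimization, can be sketched with the Łukasiewicz relaxations that PSL is known to use. This is an illustrative sketch with our own function names, not PSL's actual implementation:

```python
# Łukasiewicz relaxations of the logical connectives over soft
# truth values in [0, 1]. These make a ground rule's satisfaction
# a piecewise-linear function, which is why PSL-style inference
# reduces to efficient continuous optimization.

def soft_and(a, b):
    return max(0.0, a + b - 1.0)

def soft_or(a, b):
    return min(1.0, a + b)

def soft_not(a):
    return 1.0 - a

def distance_to_satisfaction(body, head):
    # A ground rule body -> head is satisfied when head >= body;
    # otherwise it is penalized by how far the head falls short.
    return max(0.0, body - head)

# Example rule: Friends(a,b) & Trusts(b,c) -> Trusts(a,c)
friends_ab, trusts_bc, trusts_ac = 0.9, 0.8, 0.4
body = soft_and(friends_ab, trusts_bc)
print(round(distance_to_satisfaction(body, trusts_ac), 2))  # → 0.3
```

In a full PSL model, each rule's weighted distance to satisfaction contributes a term to the objective, and inference minimizes the weighted sum over all ground rules.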

Introduction to statistical relational learning

2007

Handling inherent uncertainty and exploiting compositional structure are fundamental to understanding and designing large-scale systems. Statistical relational learning builds on ideas from probability theory and statistics to address uncertainty while incorporating tools from logic, databases, and programming languages to represent structure.

A Comparison between Two Statistical Relational Models

Lecture Notes in Computer Science

Statistical Relational Learning has received much attention over the last decade. In the ILP community, several models have emerged for modelling and learning uncertain knowledge expressed in subsets of first-order logic. Nevertheless, no deep comparisons have been made among them and, given an application, determining which model should be chosen is difficult. In this paper, we compare two of them, namely Markov Logic Networks and Bayesian Programs, especially with respect to their representation ability and inference methods. The comparison shows that the two models are substantially different from the point of view of the user, so that choosing one really means choosing a different philosophy to look at the problem. In order to make the comparison more concrete, we have used a running example, which shows most of the interesting points of the approaches while remaining exactly tractable.