The Logical Expressiveness of Graph Neural Networks
Related papers
Graph Neural Networks as the Copula Mundi between Logic and Machine Learning: a Roadmap
2021
Combining machine learning (ML) and computational logic (CL) is hard, mostly because of the inherently different ways in which they represent knowledge. While ML relies on fixed-size numeric representations such as vectors, matrices, or tensors of real numbers, CL relies on logic terms and clauses, which are unbounded in size and structure. Graph neural networks (GNN) are a recent development in the ML world for dealing with graph-structured data in a sub-symbolic way. In other words, GNN pave the way towards the application of ML to logic clauses and knowledge bases. However, there are several ways to encode logic knowledge into graphs, and which one is best heavily depends on the specific task at hand. Accordingly, in this paper, we (i) elicit a number of problems from the field of CL that may benefit from the many graph-related problems where GNN have proved effective; (ii) exemplify the application of GNN to logic theories via an end-to-end toy example, to demonstrate ...
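As a concrete illustration of the encoding problem this abstract raises, here is a minimal sketch of one possible clause-to-graph encoding. The representation is hypothetical (the paper surveys several; this is not its proposal): each functor, constant, and variable occurrence becomes a node, and argument positions become labelled edges.

```python
# Hypothetical term-graph encoding of a logic clause: nodes carry functor,
# constant, or variable names; edges carry the argument position they fill.

def clause_to_graph(clause):
    """clause: nested tuples, e.g. ('parent', ('f', 'X'), 'Y')."""
    nodes, edges = [], []

    def visit(term):
        idx = len(nodes)
        nodes.append(term[0] if isinstance(term, tuple) else term)
        if isinstance(term, tuple):                  # compound term: recurse on args
            for pos, arg in enumerate(term[1:]):
                edges.append((idx, visit(arg), pos))  # edge labelled by position
        return idx

    visit(clause)
    return nodes, edges

# Example: parent(f(X), Y) -> 4 nodes, 3 position-labelled edges.
print(clause_to_graph(('parent', ('f', 'X'), 'Y')))
```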
A Gaze into the Internal Logic of Graph Neural Networks, with Logic
2022
Graph Neural Networks share with Logic Programming several key relational inference mechanisms. The datasets on which they are trained and evaluated can be seen as database facts containing ground terms. This makes it possible to model their inference mechanisms with equivalent logic programs, to better understand not just how they propagate information between the entities involved in the machine learning process, but also to infer limits on what can be learned from a given dataset and how well that might generalize to unseen test data. This leads us to the key idea of this paper: modeling, with the help of a logic program, the information flows involved in learning to infer properties of new nodes from the link structure of a graph and the information content of its nodes, given their known connections to nodes with possibly similar properties. The problem is known as graph node property prediction, and our approach will consist in emulating, with the help of a Prolog...
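A toy version of the information flow described above, written here as plain majority-vote propagation in Python rather than the paper's Prolog emulation: known node labels spread along edges until unlabelled nodes are decided.

```python
# Toy node property prediction by label propagation (an assumed stand-in for
# the logic-program emulation the paper describes): each unlabelled node takes
# the majority label of its already-labelled neighbours, repeated for a few rounds.

from collections import Counter

def propagate_labels(edges, labels, rounds=3):
    """edges: undirected (u, v) pairs; labels: {node: label} for known nodes."""
    neighbours = {}
    for u, v in edges:
        neighbours.setdefault(u, []).append(v)
        neighbours.setdefault(v, []).append(u)
    labels = dict(labels)
    for _ in range(rounds):
        for node in neighbours:
            if node in labels:
                continue
            seen = [labels[n] for n in neighbours[node] if n in labels]
            if seen:
                labels[node] = Counter(seen).most_common(1)[0][0]
    return labels

edges = [(0, 1), (1, 2), (2, 3)]
print(propagate_labels(edges, {0: "a", 3: "b"}))  # nodes 1 and 2 get inferred labels
```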
Graph Neural Networks and Boolean Satisfiability
ArXiv, 2017
In this paper we explore whether or not deep neural architectures can learn to classify Boolean satisfiability (SAT). We devote considerable time to discussing the theoretical properties of SAT. Then, we define a graph representation for Boolean formulas in conjunctive normal form, and train neural classifiers over general graph structures, called Graph Neural Networks (GNNs), to recognize features of satisfiability. To the best of our knowledge, this has never been tried before. Our preliminary findings are potentially profound. In a weakly-supervised setting, that is, without problem-specific feature engineering, Graph Neural Networks can learn features of satisfiability.
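For context, one common way to turn a CNF formula into a graph pairs literal nodes with clause nodes. The construction below is an assumed example in this spirit; the paper defines its own representation.

```python
# Assumed literal-clause graph for a CNF formula in DIMACS-style integer form:
# one node per literal, one per clause, membership edges between them, and
# edges pairing each variable's two complementary literals.

def cnf_to_graph(clauses, num_vars):
    """clauses: iterable of clauses, each a list of non-zero ints."""
    literal_nodes = [l for v in range(1, num_vars + 1) for l in (v, -v)]
    clause_nodes = list(range(len(clauses)))
    membership = [(lit, c) for c, clause in enumerate(clauses) for lit in clause]
    complement = [(v, -v) for v in range(1, num_vars + 1)]
    return literal_nodes, clause_nodes, membership, complement

# (x1 or not x2) and (x2 or x3)
print(cnf_to_graph([[1, -2], [2, 3]], num_vars=3))
```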
Graph Neural Networks for Reasoning 2-Quantified Boolean Formulas
2019
It is valuable yet remains challenging to apply neural networks to logical reasoning tasks. Despite some successes in learning SAT (Boolean Satisfiability) solvers for propositional logic via Graph Neural Networks (GNN), there have not been any successes in learning solvers for more complex predicate logic. In this paper, we target the QBF (Quantified Boolean Formula) satisfiability problem, whose complexity lies between propositional logic and predicate logic, and investigate the feasibility of learning GNN-based solvers and GNN-based heuristics for the cases with a universal-existential quantifier alternation (so-called 2QBF problems). We conjecture, with empirical support, that GNNs have certain limitations in learning 2QBF solvers, primarily due to the inability to reason about a set of assignments. Then we show the potential of GNN-based heuristics in CEGAR-based solvers, and explore the interesting challenges to generalize them to larger problem instances. In ...
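To make the CEGAR setting concrete, here is a compact brute-force sketch of a counterexample-guided loop for an exists-forall 2QBF (the dual view of the alternation discussed above). Exhaustive enumeration stands in for the SAT-oracle calls a real solver, or a learned GNN heuristic, would make; all names are illustrative.

```python
# Toy CEGAR loop for: exists X . forall Y . phi(x, y), with boolean tuples x, y.
# Counterexamples to candidate assignments accumulate as constraints that
# prune later candidates, mirroring the refinement step of real CEGAR solvers.

from itertools import product

def cegar_2qbf(phi, n_x, n_y):
    constraints = []                        # counterexample assignments to Y
    for x in product([False, True], repeat=n_x):
        # Candidate check: x must satisfy phi against all known counterexamples.
        if not all(phi(x, y) for y in constraints):
            continue
        # Verification: search for a new counterexample y.
        cex = next((y for y in product([False, True], repeat=n_y)
                    if not phi(x, y)), None)
        if cex is None:
            return True, x                  # no counterexample: x is a witness
        constraints.append(cex)             # refine and try the next candidate
    return False, None

# exists x . forall y . (x0 or y0) and (x0 or not y0)  -> x0=True is a witness
print(cegar_2qbf(lambda x, y: (x[0] or y[0]) and (x[0] or not y[0]), 1, 1))
```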
Analyzing the Expressive Power of Graph Neural Networks in a Spectral Perspective
2021
In the recent literature on Graph Neural Networks (GNN), the expressive power of models has been studied through their capability to distinguish whether two given graphs are isomorphic or not. Since the graph isomorphism problem is NP-intermediate, and the Weisfeiler-Lehman (WL) test gives a polynomial-time criterion that is necessary but not sufficient for isomorphism, the theoretical power of GNNs is usually evaluated by their equivalence to a WL test of a given order, followed by an empirical analysis of the models on some reference inductive and transductive datasets. However, such analysis does not account for the signal processing pipeline, whose capability is generally evaluated in the spectral domain. In this paper, we argue that a spectral analysis of GNNs' behavior can provide a complementary point of view to go one step further in the understanding of GNNs. By bridging the gap between the spectral and spatial design of graph convolutions, we theoretically demonstrate some equivalence of the graph convolution process regardless...
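The flavour of spectral analysis argued for here can be seen in a few lines of numpy (an assumed toy setup, not the paper's experiments): diagonalise the normalised Laplacian of a small graph and read off the frequency response that a GCN-style propagation matrix applies to each eigenmode.

```python
# Spectral view of a GCN-style graph convolution on a 4-cycle: the propagation
# matrix I - L acts as a low-pass filter with gain (1 - lambda) per frequency.

import numpy as np

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)    # adjacency of a 4-cycle
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt  # normalised Laplacian

eigvals, U = np.linalg.eigh(L)               # graph frequencies, Fourier basis
P = np.eye(4) - L                            # GCN-style propagation (no self-loops)
response = np.diag(U.T @ P @ U)              # filter gain per frequency

for lam, g in zip(eigvals, response):
    print(f"frequency {lam:.2f} -> gain {g:.2f}")
```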
Improving Graph Neural Network Representations of Logical Formulae with Subgraph Pooling
ArXiv, 2019
Recent advances in the integration of deep learning with automated theorem proving have centered around the representation of logical formulae as inputs to deep learning systems. In particular, there has been a growing interest in adapting structure-aware neural methods to work with the underlying graph representations of logical expressions. While more effective than character and token-level approaches, graph-based methods have often made representational trade-offs that limited their ability to capture key structural properties of their inputs. In this work we propose a novel approach for embedding logical formulae that is designed to overcome the representational limitations of prior approaches. Our architecture works for logics of different expressivity; e.g., first-order and higher-order logic. We evaluate our approach on two standard datasets and show that the proposed architecture achieves state-of-the-art performance on both premise selection and proof step classification.
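For a sense of the pipeline being improved on, the sketch below is a bare-bones formula-graph embedder using mean aggregation and global mean pooling. It is an assumed baseline, deliberately omitting the subgraph pooling the paper actually contributes.

```python
# Minimal formula-graph embedder (assumed baseline): a few rounds of symmetric
# message passing with a shared weight matrix, then global mean pooling.

import numpy as np

def embed_formula(edges, features, rounds=2, rng=np.random.default_rng(0)):
    """edges: (u, v) pairs over n nodes; features: (n, d) initial node features."""
    n, d = features.shape
    W = rng.standard_normal((d, d)) / np.sqrt(d)   # shared message weights
    h = features
    for _ in range(rounds):
        msg = np.zeros_like(h)
        cnt = np.zeros((n, 1))
        for u, v in edges:                         # messages flow both ways
            msg[u] += h[v]; msg[v] += h[u]
            cnt[u] += 1;   cnt[v] += 1
        h = np.tanh((h + msg / np.maximum(cnt, 1)) @ W)
    return h.mean(axis=0)                          # global mean pooling

edges = [(0, 1), (1, 2)]                           # e.g. a tiny formula graph
print(embed_formula(edges, np.eye(3)))
```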
Graph Neural Reasoning for 2-Quantified Boolean Formula Solvers
ArXiv, 2019
In this paper, we investigate the feasibility of learning GNN (Graph Neural Network) based solvers and GNN-based heuristics for specified QBF (Quantified Boolean Formula) problems. We design and evaluate several GNN architectures for 2QBF formulae, and conjecture that GNN has limitations in learning 2QBF solvers. Then we show how to learn a heuristic CEGAR 2QBF solver. We further explore generalizing GNN-based heuristics to larger unseen instances, and uncover some interesting challenges. In summary, this paper provides a comprehensive survey of applying GNN embeddings to specified QBF solvers, and aims to offer guidance in applying ML to more complicated symbolic reasoning problems.
On the equivalence between graph isomorphism testing and function approximation with GNNs
2019
Graph neural networks (GNNs) have achieved considerable success on graph-structured data. In light of this, there has been increasing interest in studying their representation power. One line of work focuses on the universal approximation of permutation-invariant functions by certain classes of GNNs, and another demonstrates the limitations of GNNs via graph isomorphism tests. Our work connects these two perspectives and proves their equivalence. We further develop a framework for the representation power of GNNs in the language of sigma-algebras, which incorporates both viewpoints. Using this framework, we compare the expressive power of different classes of GNNs as well as other methods on graphs. In particular, we prove that order-2 Graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree. We then extend them to a new architecture, Ring-GNNs, which succeeds in distinguishing these graphs and provides improvements on real-world social network...
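The 1-WL colour refinement test these results are phrased against fits in a few lines. The example below also reproduces the failure mode mentioned in the abstract: a 6-cycle and two disjoint triangles are non-isomorphic regular graphs of the same degree that 1-WL (and hence any 1-WL-bounded GNN) cannot separate.

```python
# 1-WL colour refinement: repeatedly relabel each node by the multiset of its
# neighbours' colours. Differing final colour multisets prove non-isomorphism;
# matching multisets leave the test (and 1-WL-bounded GNNs) inconclusive.

def wl_colours(adj, rounds=None):
    """adj: {node: iterable of neighbours}; returns the final colour multiset."""
    colours = {v: 0 for v in adj}
    for _ in range(rounds or len(adj)):
        sig = {v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
               for v in adj}
        relabel = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        colours = {v: relabel[sig[v]] for v in adj}
    return sorted(colours.values())

c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}           # one 6-cycle
two_c3 = {i: [(i - 1) % 3, (i + 1) % 3] for i in range(3)}       # two triangles
two_c3.update({i + 3: [(i - 1) % 3 + 3, (i + 1) % 3 + 3] for i in range(3)})
print(wl_colours(c6) == wl_colours(two_c3))                      # True: WL cannot tell
```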
Expressiveness and complexity of graph logic
Information and Computation, 2007
We investigate the complexity and expressive power of the spatial logic for querying graphs introduced by Cardelli, Gardner and Ghelli (ICALP 2002). We show that the model-checking complexity of versions of this logic with and without recursion is PSPACE-complete. In terms of expressive power, the version without recursion is a fragment of the monadic second-order logic of graphs, and we show that it can express complete problems at every level of the polynomial hierarchy. We also show that it can define all regular languages when interpretation is restricted to strings. The expressive power of the logic with recursion is much greater, as it can express properties that are PSPACE-complete and therefore unlikely to be definable in second-order logic. A graph is given by a triple (V, E, L), where V is a set of vertices, E a set of edges, and L a set of labels; these sets are all finite and mutually disjoint. Moreover, vertices and labels are drawn from a fixed infinite set of names and a fixed infinite set of labels, respectively (names do not actually name anything; they are just the universe of constants from which elements of the graph are drawn).
Representing first-order logic using graphs
Graph Transformations, 2004
We show how edge-labelled graphs can be used to represent first-order logic formulae. This gives rise to recursively nested structures, in which each level of nesting corresponds to the negation of a set of existentials. The model is a direct generalisation of the negative application conditions used in graph rewriting, which have a single level of nesting and are thereby shown to correspond to the fragment ∃¬∃ of first-order logic. Vice versa, this generalisation may be used to strengthen the notion of application conditions. We then proceed to show how these nested models may be flattened to (sets of) plain graphs, by allowing some structure on the labels. The resulting formulae-as-graphs may form the basis of a unification of the theories of graph transformation and predicate transformation.
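One way to picture the recursively nested structures described here, as an illustrative data model only and not the paper's formalism: a graph at each level carries a list of negated existential extensions, themselves nested graphs.

```python
# Illustrative data model for nested graph conditions: one level of nesting
# recovers the exists-not-exists flavour of negative application conditions.

from dataclasses import dataclass, field

@dataclass
class NestedGraph:
    nodes: list                                   # plain graph at this level
    edges: list                                   # (src, label, tgt) triples
    negated_extensions: list = field(default_factory=list)  # nested NestedGraphs

# "there is an a-edge x -a-> y such that no b-edge y -b-> z exists":
inner = NestedGraph(nodes=["y", "z"], edges=[("y", "b", "z")])
outer = NestedGraph(nodes=["x", "y"], edges=[("x", "a", "y")],
                    negated_extensions=[inner])
print(outer)
```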