In this paper, we study GNNs' generalization ability through the lens of Vapnik–Chervonenkis (VC) dimension theory in two settings, focusing on graph-level predictions.
Here, the 1-WL is a well-studied heuristic for the graph isomorphism problem, which iteratively colors or partitions a graph's vertex set. While this connection has led to significant advances in understanding GNNs' expressive power, it does not by itself explain their generalization ability.
We show that GNNs' VC dimension depends tightly on the number of equivalence classes computed by the 1-WL over a set of graphs; see Propositions 1 and 2.
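To make the quantity in this bound concrete, the following minimal Python sketch implements 1-WL color refinement and counts the number of 1-WL equivalence classes over a set of graphs, i.e., how many of the graphs the 1-WL can tell apart. The adjacency-list representation and the helper names wl_colors and num_wl_classes are illustrative assumptions, not code from the paper.

```python
from collections import Counter


def wl_colors(adj, rounds=None):
    """Run 1-WL color refinement on a graph given as an adjacency list
    (dict: node -> list of neighbours); return the final node -> color map."""
    nodes = list(adj)
    colors = {v: 0 for v in nodes}                      # uniform initial coloring
    rounds = len(nodes) if rounds is None else rounds   # enough rounds to stabilize
    for _ in range(rounds):
        # New signature = (own color, sorted multiset of neighbour colors).
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v]))) for v in nodes}
        # Compress signatures to small integers so colors stay cheap to compare.
        palette = {sig: i for i, sig in enumerate(sorted(set(sigs.values())))}
        new_colors = {v: palette[sigs[v]] for v in nodes}
        if new_colors == colors:                        # stable partition reached
            break
        colors = new_colors
    return colors


def num_wl_classes(graphs):
    """Count 1-WL equivalence classes over a set of graphs: refine colors on
    the disjoint union (so colors are comparable across graphs) and compare
    the resulting per-graph color histograms."""
    union = {(i, v): [(i, u) for u in adj[v]]
             for i, adj in enumerate(graphs) for v in adj}
    colors = wl_colors(union)
    histograms = {frozenset(Counter(colors[(i, v)] for v in adj).items())
                  for i, adj in enumerate(graphs)}
    return len(histograms)


# Example: 1-WL distinguishes a triangle from a path on three vertices,
# so these two graphs fall into different equivalence classes.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(num_wl_classes([triangle, path]))  # -> 2
```

Running refinement on the disjoint union keeps the color labels consistent across graphs, so two graphs fall into the same class exactly when 1-WL assigns them the same color histogram.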
WL meet VC (slides): We show a tight connection between GNNs' expressivity and their generalization ability.
WL meet VC. Morris, Christopher (corresponding author); Geerts, Floris; Tönshoff, Jan Martin; Grohe, Martin. ML Research Press (2023).
The aim of our work is to extend this analysis of the VC dimension of GNNs to other commonly used activation functions, such as the sigmoid and the hyperbolic tangent.