FSW-GNN: A Bi-Lipschitz WL-Equivalent Graph Neural Network

Yonatan Sverdlov, Yair Davidson, Nadav Dym, Tal Amir

arXiv preprint arXiv:2410.09118 (2024, under review)

Abstract

Many of the most popular graph neural networks fall into the category of Message Passing Neural Networks (MPNNs). Famously, the ability of MPNNs to distinguish between graphs is limited to graphs separable by the Weisfeiler-Leman (WL) graph isomorphism test, so the strongest MPNNs in terms of separation power are those that are WL-equivalent.

Recently, it was shown that the quality of separation provided by standard WL-equivalent MPNNs can be very low, resulting in WL-separable graphs being mapped to very similar, hardly distinguishable features.

This paper addresses this issue by seeking bi-Lipschitz continuity guarantees for MPNNs. We demonstrate that, in contrast with standard summation-based MPNNs, which lack bi-Lipschitz properties, our proposed model provides a bi-Lipschitz graph embedding with respect to two standard graph metrics. Empirically, we show that our MPNN is competitive with standard MPNNs for several graph learning tasks, and is far more accurate on long-range tasks.
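To make the separation notion concrete: the WL test iteratively recolors each node by hashing its own color together with the multiset of its neighbors' colors, and two graphs are WL-separable if this refinement ends with different color multisets. The sketch below is an illustrative, self-contained implementation of 1-WL refinement (not the paper's FSW-GNN model; the function and graph names are ours), showing that a triangle and a 3-node path are separated:

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL color refinement on an adjacency list {node: [neighbors]}.
    Returns the multiset (Counter) of final node colors."""
    colors = {v: 0 for v in adj}  # uniform initial coloring (no node features)
    for _ in range(rounds):
        # A node's new color encodes its old color plus the sorted
        # multiset of its neighbors' colors.
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in adj}
    return Counter(colors.values())

# Triangle vs. 3-node path: equal node counts, but different color multisets,
# so the two graphs are WL-separable.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path3 = {0: [1], 1: [0, 2], 2: [1]}
assert wl_colors(triangle) != wl_colors(path3)
```

A bi-Lipschitz embedding, as sought in the paper, additionally guarantees that graphs which this test separates are mapped to features whose distance is comparable to the graphs' distance, rather than merely being unequal.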



BibTeX:

@misc{sverdlov2024fswgnn,
title={{FSW-GNN}: {A} Bi-{L}ipschitz {WL}-Equivalent Graph Neural Network},
author={Sverdlov, Yonatan and Davidson, Yair and Dym, Nadav and Amir, Tal},
year={2024},
eprint={2410.09118},
archivePrefix={arXiv},
primaryClass={cs.LG},
url={https://arxiv.org/abs/2410.09118}
}