TIDE is described in the paper "TIDE: Time Derivative Diffusion for Deep Learning on Graphs" by Maysam Behmanesh*, Maximilian Krahn*, and Maks Ovsjanikov.
Abstract
A prominent paradigm for graph neural networks is the message-passing framework, in which information is exchanged only between neighboring nodes. The challenge for approaches based on this paradigm is to ensure efficient and accurate long-distance communication between nodes, since deep convolutional networks are prone to over-smoothing. In this paper, we present a novel method based on time derivative graph diffusion (TIDE), with a learnable time parameter. Our approach adapts the spatial extent of diffusion across different tasks and network channels, thus enabling both medium- and long-distance communication efficiently. Furthermore, we show that our architecture directly enables local message-passing and thus inherits the expressive power of local message-passing approaches. On widely used graph benchmarks we achieve comparable performance, and on a synthetic mesh dataset we outperform state-of-the-art methods such as GCN and GRAND by a significant margin.
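The role of the diffusion time can be sketched with a toy example. The following is a minimal numpy/scipy illustration, not the authors' implementation: it diffuses a one-hot node signal with the heat operator exp(-tL) on a small path graph, where the time t (a learnable parameter in TIDE) controls how far information spreads.

```python
import numpy as np
from scipy.linalg import expm

# Toy path graph with 5 nodes (illustrative only, not the TIDE code).
A = np.diag(np.ones(4), 1)
A = A + A.T                        # symmetric adjacency matrix
L = np.diag(A.sum(1)) - A          # combinatorial graph Laplacian

x = np.zeros(5)
x[0] = 1.0                         # unit signal placed on the first node

# Small t keeps communication local; large t spreads information
# across the whole graph, approaching a uniform signal.
for t in (0.1, 1.0, 10.0):
    y = expm(-t * L) @ x           # heat diffusion: y = exp(-tL) x
    print(t, np.round(y, 3))
```

Because L annihilates the constant vector, the total signal mass is preserved for every t; only its spatial spread changes, which is exactly the knob that a learnable t exposes per channel.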
Note that Equation (10) of the paper has been simplified in the implementation.
Requirements
- ogb>=1.3.3
- torch>=1.10.0
- torch-geometric>=2.0.4
@InProceedings{pmlr-v202-behmanesh23a,
title = {{TIDE}: Time Derivative Diffusion for Deep Learning on Graphs},
author = {Behmanesh, Maysam and Krahn, Maximilian and Ovsjanikov, Maks},
booktitle = {Proceedings of the 40th International Conference on Machine Learning},
pages = {2015--2030},
year = {2023},
editor = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
volume = {202},
series = {Proceedings of Machine Learning Research},
month = {23--29 Jul},
publisher = {PMLR},
pdf = {https://proceedings.mlr.press/v202/behmanesh23a/behmanesh23a.pdf},
url = {https://proceedings.mlr.press/v202/behmanesh23a.html},
}
Sharp et al., "DiffusionNet: Discretization Agnostic Learning on Surfaces," ACM TOG 2022.