Limits of Depth: Over-Smoothing and Over-Squashing in GNNs
Main Authors: Aafaq Mohi ud din; Shaima Qureshi
Format: Article
Language: English
Published: Tsinghua University Press, 2024-03-01
Series: Big Data Mining and Analytics
Subjects: graph neural networks (gnns); learning on graphs; over-smoothing; over-squashing; isotropic-gnns; anisotropic-gnns
Online Access: https://www.sciopen.com/article/10.26599/BDMA.2023.9020019
author | Aafaq Mohi ud din; Shaima Qureshi |
collection | DOAJ |
description | Graph Neural Networks (GNNs) have become a widely used tool for learning and analyzing data on graph structures, largely due to their ability to preserve graph structure and properties via graph representation learning. However, the effect of depth on the performance of GNNs, particularly isotropic and anisotropic models, remains an active area of research. This study presents a comprehensive exploration of the impact of depth on GNNs, with a focus on the phenomena of over-smoothing and the bottleneck effect in deep graph neural networks. Our research investigates the tradeoff between depth and performance, revealing that increasing depth can lead to over-smoothing and a decrease in performance due to the bottleneck effect. We also examine the impact of node degrees on classification accuracy, finding that nodes with low degrees can pose challenges for accurate classification. Our experiments use several benchmark datasets and a range of evaluation metrics to compare isotropic and anisotropic GNNs of varying depths, and also explore the scalability of these models. Our findings provide valuable insights into the design of deep GNNs and offer potential avenues for future research to improve their performance. |
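The over-smoothing phenomenon the abstract describes can be illustrated with a toy sketch (this is not code from the paper, just a minimal standalone demonstration): repeated neighbor averaging, the linear core of an isotropic message-passing layer, drives all node representations toward a single common value as depth grows, so nodes become indistinguishable.

```python
# Toy sketch of over-smoothing (illustrative; hypothetical graph and features,
# not from the paper): each "layer" replaces a node's feature with the mean
# over its neighborhood, as an isotropic GNN layer does in its linear part.
import statistics

# Undirected 4-node path graph; each node's neighborhood includes itself.
neighbors = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2, 3], 3: [2, 3]}

# One scalar feature per node, initially well separated.
x = [1.0, -2.0, 3.0, 0.5]

def smooth(x):
    """One isotropic message-passing step: mean over the neighborhood."""
    return [sum(x[j] for j in neighbors[i]) / len(neighbors[i])
            for i in range(len(x))]

spread0 = statistics.pstdev(x)   # how distinguishable the nodes are initially
for _ in range(30):              # simulate 30 stacked layers
    x = smooth(x)
spread30 = statistics.pstdev(x)  # near zero: features have collapsed together

print(f"feature spread before: {spread0:.4f}, after 30 layers: {spread30:.6f}")
```

The spread across nodes shrinks geometrically with depth (at a rate set by the graph's second eigenvalue), which is why very deep isotropic models lose the ability to separate nodes; anisotropic models attenuate this by weighting neighbors unequally.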
format | Article |
id | doaj-art-eb0fb22d25674c1eb7ff4c7f1d475726 |
institution | Kabale University |
issn | 2096-0654 |
language | English |
publishDate | 2024-03-01 |
publisher | Tsinghua University Press |
record_format | Article |
series | Big Data Mining and Analytics |
spelling | Aafaq Mohi ud din; Shaima Qureshi. Limits of Depth: Over-Smoothing and Over-Squashing in GNNs. Big Data Mining and Analytics, vol. 7, no. 1, pp. 205-216, 2024-03-01. Tsinghua University Press, ISSN 2096-0654, doi: 10.26599/BDMA.2023.9020019. Author affiliation (both authors): Department of Computer Science and Engineering, National Institute of Technology Srinagar, Srinagar 190006, India. |
title | Limits of Depth: Over-Smoothing and Over-Squashing in GNNs |
topic | graph neural networks (gnns); learning on graphs; over-smoothing; over-squashing; isotropic-gnns; anisotropic-gnns |
url | https://www.sciopen.com/article/10.26599/BDMA.2023.9020019 |