Ensemble graph auto-encoders for clustering and link prediction

Graph auto-encoders are a crucial research area within graph neural networks, commonly employed to generate graph embeddings in unsupervised learning. Traditional graph auto-encoders encode neighborhood information for each node by minimizing the graph reconstruction loss, yielding node embedding representations. However, existing graph auto-encoder models often overlook node representations and fail to capture contextual node information within the graph, resulting in poor embeddings. Accordingly, this study proposes the ensemble graph auto-encoders (E-GAE) model. It uses the ensemble random walk graph auto-encoder, the random walk graph auto-encoder of the ensemble network, and the graph attention auto-encoder to generate three node embedding matrices Z, which are then combined with adaptive weights to reconstruct a new node embedding matrix, addressing the problem of low-quality embeddings. The model is evaluated on three publicly available datasets (Cora, Citeseer, and PubMed), and multiple experiments demonstrate its effectiveness: it achieves up to a 2.0% improvement on the link prediction task and a 9.4% improvement on the clustering task. Our code for this work can be found at https://github.com/xcgydfjjjderg/graphautoencoder.
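
The adaptive-weight fusion described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation (see the linked repository for that): it assumes three precomputed node embedding matrices and fuses them with softmax-normalized learnable weights; all names and shapes are illustrative.

import torch
import torch.nn as nn

class AdaptiveEnsemble(nn.Module):
    # Fuses K node-embedding matrices with learnable softmax weights.
    # Illustrative sketch only; the actual E-GAE combination rule is in
    # the authors' repository and may differ.
    def __init__(self, num_views: int = 3):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_views))  # one logit per view

    def forward(self, embeddings):
        # embeddings: list of K tensors, each of shape (num_nodes, dim)
        w = torch.softmax(self.logits, dim=0)           # adaptive weights, sum to 1
        stacked = torch.stack(embeddings, dim=0)        # (K, num_nodes, dim)
        return (w.view(-1, 1, 1) * stacked).sum(dim=0)  # weighted sum -> (num_nodes, dim)

# Example with three hypothetical encoder outputs for a Cora-sized graph.
z1, z2, z3 = (torch.randn(2708, 16) for _ in range(3))
fused = AdaptiveEnsemble(num_views=3)([z1, z2, z3])
print(fused.shape)  # torch.Size([2708, 16])

With zero-initialized logits the fusion starts as a uniform average of the three embeddings and can learn, during training, to up-weight whichever encoder produces the stronger representation.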

Bibliographic Details
Main Authors: Chengxin Xie, Jingui Huang, Yongjiang Shi, Hui Pang, Liting Gao, Xiumei Wen
Format: Article
Language: English
Published: PeerJ Inc., 2025-01-01
Series: PeerJ Computer Science (Vol. 11, article e2648)
Subjects: Graph auto-encoders; Low embedding; Ensemble; Link prediction; Clustering
ISSN: 2376-5992
DOI: 10.7717/peerj-cs.2648
Author Affiliations: Hebei University of Architecture, Zhangjiakou, China (Chengxin Xie, Yongjiang Shi, Hui Pang, Liting Gao, Xiumei Wen); Hunan Normal University, Changsha, China (Jingui Huang)
Institution: Kabale University
Collection: DOAJ
Online Access: https://peerj.com/articles/cs-2648.pdf