Few-Shot Graph Anomaly Detection via Dual-Level Knowledge Distillation
Graph anomaly detection is crucial to many high-impact applications across diverse fields. In anomaly detection tasks, collecting sufficient annotated data is often costly and laborious. As a result, few-shot learning has been explored to address this issue by requiring only a few labeled samples t...
Main Authors: Xuan Li, Dejie Cheng, Luheng Zhang, Chengfang Zhang, Ziliang Feng
Format: Article
Language: English
Published: MDPI AG, 2025-01-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/27/1/28
Similar Items
- A Few-Shot Knowledge Graph Completion Model With Neighbor Filter and Affine Attention, by: Hongfang Gong, et al. Published: (2025-01-01)
- GNN-EADD: Graph Neural Network-Based E-Commerce Anomaly Detection via Dual-Stage Learning, by: Zhouhang Shao, et al. Published: (2025-01-01)
- Measuring the Inferential Values of Relations in Knowledge Graphs, by: Xu Zhang, et al. Published: (2024-12-01)
- Knowledge Distillation in Object Detection for Resource-Constrained Edge Computing, by: Arief Setyanto, et al. Published: (2025-01-01)
- Graph-Attention Diffusion for Enhanced Multivariate Time-Series Anomaly Detection, by: Vadim Lanko, et al. Published: (2024-01-01)