Hybrid generative adversarial network based on frequency and spatial domain for histopathological image synthesis

Bibliographic Details
Main Authors: Qifeng Liu, Tao Zhou, Chi Cheng, Jin Ma, Marzia Hoque Tania
Format: Article
Language: English
Published: BMC 2025-01-01
Series: BMC Bioinformatics
Online Access: https://doi.org/10.1186/s12859-025-06057-9
Description
Summary: Background: Due to the complexity and cost of preparing histopathological slides, deep learning-based methods have been developed to generate high-quality histological images. However, existing approaches primarily focus on spatial-domain information, neglecting the periodic information in the frequency domain and the complementary relationship between the two domains. In this paper, we propose a generative adversarial network that employs a cross-attention mechanism to extract and fuse features across the spatial and frequency domains. The method optimizes frequency-domain features using spatial-domain guidance and refines spatial features with frequency-domain information, preserving key details while eliminating redundancy to generate high-quality histological images.
Results: Our model incorporates a variable-window mixed attention module that dynamically adjusts attention window sizes to capture both local details and global context. A spectral filtering module enhances the extraction of repetitive textures and periodic structures, while a cross-attention fusion module dynamically weights features from both domains, focusing on the most critical information to produce realistic and detailed images.
Conclusions: The proposed method achieves efficient spatial-frequency domain fusion, significantly improving image generation quality. Experiments on the Patch Camelyon dataset show superior performance over eight state-of-the-art models across five metrics. This approach advances automated histopathological image generation with potential for clinical applications.
ISSN: 1471-2105
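
The following is a minimal sketch, in PyTorch, of two ideas summarized above: a spectral filter that reweights the FFT coefficients of a feature map to emphasize periodic structure, and a cross-attention block in which spatial features query frequency-derived features. It is not the published model; the module names (SpectralFilter, CrossDomainFusion), the learnable real-valued mask, and all channel and head sizes are assumptions made for illustration only.

```python
# Illustrative sketch only (not the authors' code): spectral filtering of a
# feature map plus cross-attention fusion of spatial and frequency features.
import torch
import torch.nn as nn


class SpectralFilter(nn.Module):
    """Reweight feature maps in the frequency domain with a learnable mask."""

    def __init__(self, channels: int, height: int, width: int):
        super().__init__()
        # Learnable per-frequency weights (assumed design, real-valued mask
        # over the rfft2 output shape C x H x (W//2 + 1)).
        self.mask = nn.Parameter(torch.ones(channels, height, width // 2 + 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) spatial feature map.
        freq = torch.fft.rfft2(x, norm="ortho")        # to frequency domain
        freq = freq * self.mask                        # emphasize periodic structure
        return torch.fft.irfft2(freq, s=x.shape[-2:], norm="ortho")  # back to spatial


class CrossDomainFusion(nn.Module):
    """Fuse spatial and frequency-derived features with cross-attention."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, spatial: torch.Tensor, frequency: torch.Tensor) -> torch.Tensor:
        b, c, h, w = spatial.shape
        q = spatial.flatten(2).transpose(1, 2)     # (B, H*W, C) queries from spatial path
        kv = frequency.flatten(2).transpose(1, 2)  # keys/values from frequency path
        fused, _ = self.attn(q, kv, kv)            # spatial tokens attend to frequency tokens
        fused = self.norm(fused + q)               # residual connection
        return fused.transpose(1, 2).reshape(b, c, h, w)


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)                 # toy feature map
    freq_feats = SpectralFilter(64, 32, 32)(x)
    out = CrossDomainFusion(64)(x, freq_feats)
    print(out.shape)                               # torch.Size([2, 64, 32, 32])
```

In this sketch the spatial path supplies the attention queries and the frequency path supplies keys and values. The paper's method additionally refines frequency features under spatial guidance and uses a variable-window mixed attention module; both are omitted here for brevity.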