Direct Data Domain Sparsity-Based STAP Utilizing Subaperture Smoothing Techniques

Bibliographic Details
Main Authors: Zhaocheng Yang, Rui Fa, Yuliang Qin, Xiang Li, Hongqiang Wang
Format: Article
Language: English
Published: Wiley 2015-01-01
Series:International Journal of Antennas and Propagation
Online Access: http://dx.doi.org/10.1155/2015/171808
Description
Summary: We propose a novel direct data domain (D3) sparsity-based space-time adaptive processing (STAP) algorithm utilizing subaperture smoothing techniques for airborne radar applications. Unlike both conventional sparsity-based STAP and existing D3 sparsity-based STAP, the proposed algorithm first uses only the snapshot in the cell under test (CUT) to generate multiple subsnapshots, exploiting the space-time structure of the steering vector and the uncorrelated nature of the components of the interference covariance matrix. Since the interference spectrum is sparse in the whole angle-Doppler plane, the generated subsnapshots are used jointly, under a sparse regularization, to recover the interference spectrum. The interference covariance matrix is then estimated from the recovered spectrum, followed by space-time filtering and target detection. Simulation results show that the proposed algorithm outperforms the generalized forward/backward method, the conventional D3 least-squares STAP algorithm, and the existing D3 sparsity-based STAP algorithm. Furthermore, compared with conventional sparsity-based STAP using multiple training snapshots, the proposed algorithm avoids the performance degradation caused by discrete interferers that appear only in the CUT.
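The two steps the summary describes — carving overlapping subsnapshots out of the single CUT snapshot, then jointly estimating the angle-Doppler interference spectrum from them — can be sketched in a toy NumPy example. This is not the authors' implementation: the array sizes, frequencies, and grid below are illustrative, and a simple averaged matched-filter spectrum stands in for the paper's sparse-regularized recovery.

```python
import numpy as np

def st_steering(ns, ms, fs, fd):
    """Space-time steering vector (pulse-major stacking) for an ns-element,
    ms-pulse aperture at normalized spatial/Doppler frequencies fs, fd."""
    a = np.exp(2j * np.pi * fs * np.arange(ns))   # spatial steering
    b = np.exp(2j * np.pi * fd * np.arange(ms))   # temporal steering
    return np.kron(b, a)                          # length ns * ms

def subsnapshots(x, n, m, ns, ms):
    """Subaperture smoothing: extract every overlapping ns x ms
    subsnapshot from one n-element, m-pulse CUT snapshot x."""
    X = x.reshape(m, n)                           # rows: pulses, cols: elements
    return np.array([X[p:p + ms, q:q + ns].reshape(-1)
                     for p in range(m - ms + 1)
                     for q in range(n - ns + 1)])

def joint_spectrum(subs, ns, ms, grid):
    """Crude joint angle-Doppler spectrum: average matched-filter power of
    all subsnapshots over an (fs, fd) grid (a stand-in for the paper's
    sparse-regularized joint recovery)."""
    D = np.array([st_steering(ns, ms, fs, fd) for fs, fd in grid]).conj()
    return np.mean(np.abs(D @ subs.T) ** 2, axis=1)

# Toy scenario: 6 x 6 space-time snapshot, one strong interferer plus noise.
N = M = 6
Ns = Ms = 4
rng = np.random.default_rng(0)
fs0, fd0 = 0.20, 0.20                             # true interferer location
x = 10 * st_steering(N, M, fs0, fd0) \
    + (rng.standard_normal(N * M) + 1j * rng.standard_normal(N * M)) / np.sqrt(2)

subs = subsnapshots(x, N, M, Ns, Ms)              # (3*3, 4*4) = (9, 16)
grid = [(fs, fd) for fs in np.arange(-0.5, 0.5, 0.05)
                 for fd in np.arange(-0.5, 0.5, 0.05)]
p = joint_spectrum(subs, Ns, Ms, grid)
fs_hat, fd_hat = grid[int(np.argmax(p))]          # spectral peak
```

Once the sparse spectrum is in hand, the abstract's remaining steps would reconstruct the interference covariance as a weighted sum of outer products of the recovered steering vectors (plus a noise floor) and apply the resulting space-time filter.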
ISSN: 1687-5869, 1687-5877