Neural Linguistic Steganalysis via Multi-Head Self-Attention

Linguistic steganalysis can indicate the existence of steganographic content in suspicious text carriers, and precise linguistic steganalysis of such carriers is critical for multimedia security. In this paper, we introduce a neural linguistic steganalysis approach based on multi-head self-attention. In the proposed approach, the words of a text are first mapped into a semantic space as hidden representations to better model semantic features. We then use multi-head self-attention to model the interactions between words in the carrier. Finally, a softmax layer classifies the input text as cover or stego. Extensive experiments validate the effectiveness of our approach.
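The abstract describes a three-stage pipeline: embed words into a semantic space, model word-word interactions with multi-head self-attention, and classify the pooled representation as cover or stego via softmax. Below is a minimal NumPy sketch of that pipeline for illustration only; all weights are random, the dimensions and names are invented here, and the paper's actual model is a trained network whose details are not given in this record.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, n_heads):
    """X: (seq_len, d_model) word representations; returns same shape."""
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # split each projection into heads: (n_heads, seq_len, d_head)
    split = lambda M: M.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)
    # scaled dot-product attention per head; rows are word-word interaction weights
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores, axis=-1) @ Vh            # (n_heads, seq_len, d_head)
    # concatenate heads back to (seq_len, d_model)
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

# toy example: 5 "words" embedded in an 8-dimensional semantic space
rng = np.random.default_rng(0)
d_model, n_heads, seq_len = 8, 2, 5
X = rng.standard_normal((seq_len, d_model))        # stand-in word embeddings
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) for _ in range(3))
H = multi_head_self_attention(X, Wq, Wk, Wv, n_heads)

# classification head: mean-pool over words, then softmax over {cover, stego}
Wc = rng.standard_normal((d_model, 2))
probs = softmax(H.mean(axis=0) @ Wc)
```

In a real model, the embeddings and the `Wq`/`Wk`/`Wv`/`Wc` matrices would be learned from labeled cover/stego corpora rather than sampled at random.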


Saved in:
Bibliographic Details
Main Authors: Sai-Mei Jiao, Hai-feng Wang, Kun Zhang, Ya-qi Hu
Format: Article
Language:English
Published: Wiley 2021-01-01
Series:Journal of Electrical and Computer Engineering
Online Access:http://dx.doi.org/10.1155/2021/6668369
author Sai-Mei Jiao
Hai-feng Wang
Kun Zhang
Ya-qi Hu
collection DOAJ
description Linguistic steganalysis can indicate the existence of steganographic content in suspicious text carriers, and precise linguistic steganalysis of such carriers is critical for multimedia security. In this paper, we introduce a neural linguistic steganalysis approach based on multi-head self-attention. In the proposed approach, the words of a text are first mapped into a semantic space as hidden representations to better model semantic features. We then use multi-head self-attention to model the interactions between words in the carrier. Finally, a softmax layer classifies the input text as cover or stego. Extensive experiments validate the effectiveness of our approach.
format Article
id doaj-art-60201bf1ba4146188506716f094cfa76
institution Kabale University
issn 2090-0147
2090-0155
language English
publishDate 2021-01-01
publisher Wiley
record_format Article
series Journal of Electrical and Computer Engineering
spelling doaj-art-60201bf1ba4146188506716f094cfa76
2025-02-03T01:05:20Z
eng
Wiley
Journal of Electrical and Computer Engineering
2090-0147
2090-0155
2021-01-01
2021
10.1155/2021/6668369
6668369
Neural Linguistic Steganalysis via Multi-Head Self-Attention
Sai-Mei Jiao: College of Computer Science and Technology, Hainan Tropical Ocean University, Sanya, Hainan 572022, China
Hai-feng Wang: College of Computer Science and Technology, Hainan Tropical Ocean University, Sanya, Hainan 572022, China
Kun Zhang: Education Center of MTA, Hainan Tropical Ocean University, Sanya, Hainan 572022, China
Ya-qi Hu: College of Computer Science and Technology, Hainan Tropical Ocean University, Sanya, Hainan 572022, China
http://dx.doi.org/10.1155/2021/6668369
title Neural Linguistic Steganalysis via Multi-Head Self-Attention
url http://dx.doi.org/10.1155/2021/6668369