Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network

Abstract Aspect Category Sentiment Analysis (ACSA) is a fine-grained sentiment analysis task aimed at predicting the sentiment polarity associated with aspect categories within a sentence. Most existing ACSA methods rely on a given aspect category to locate the sentiment words related to it. When ir...

Bibliographic Details
Main Authors: Guixian Xu, Zhe Chen, Zixin Zhang
Format: Article
Language: English
Published: Nature Portfolio 2025-01-01
Series: Scientific Reports
Subjects:
Online Access: https://doi.org/10.1038/s41598-025-86009-8
author Guixian Xu
Zhe Chen
Zixin Zhang
author_facet Guixian Xu
Zhe Chen
Zixin Zhang
author_sort Guixian Xu
collection DOAJ
description Abstract Aspect Category Sentiment Analysis (ACSA) is a fine-grained sentiment analysis task aimed at predicting the sentiment polarity associated with aspect categories within a sentence. Most existing ACSA methods rely on a given aspect category to locate the sentiment words related to it. When sentiment words that are irrelevant to the given aspect category nevertheless carry semantic weight for it, sentiment words may fail to be matched with the correct aspect categories. To address this issue, this paper proposes a novel approach for ACSA that combines a pre-trained Bidirectional Long Short-Term Memory (BiLSTM) network with a syntax-aware graph attention network. To compensate for the shortage of annotated aspect-level datasets, a transfer-learning method is proposed: first, a BiLSTM model is pre-trained on a document-level sentiment analysis dataset, and the resulting parameters are transferred to the aspect-level task model. Then, a syntax-aware graph attention network model is proposed to fully exploit the syntactic structure and semantic information of the text and to combine them with the knowledge learned during pre-training to perform the ACSA task. The method is evaluated on five user-review datasets, and comprehensive ablation experiments show that it outperforms the baseline models.
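The syntax-aware graph attention network mentioned in the description aggregates each token's representation over its syntactic (dependency-graph) neighbours using learned attention weights. As a rough illustration of that mechanism only, here is a minimal pure-Python sketch of a single GAT-style layer over a toy dependency graph; the function name, dimensions, and values are assumptions for illustration, not the authors' implementation:

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def graph_attention_layer(h, adj, W, a):
    """One illustrative graph-attention layer.

    h   : list of n token feature vectors (each of dim d_in)
    adj : n x n adjacency matrix (1 where a syntactic edge exists,
          including self-loops)
    W   : d_in x d_out weight matrix
    a   : attention vector of length 2 * d_out
    Returns n new token vectors of dim d_out.
    """
    def matvec(W, x):  # x @ W
        return [sum(x[i] * W[i][j] for i in range(len(x)))
                for j in range(len(W[0]))]

    wh = [matvec(W, x) for x in h]
    out = []
    for i in range(len(h)):
        neigh = [j for j in range(len(h)) if adj[i][j]]
        # unnormalized score e_ij = LeakyReLU(a . [Wh_i || Wh_j])
        scores = []
        for j in neigh:
            cat = wh[i] + wh[j]  # concatenation
            e = sum(a[k] * cat[k] for k in range(len(cat)))
            scores.append(e if e > 0 else 0.2 * e)  # LeakyReLU
        alpha = softmax(scores)  # attention over syntactic neighbours
        d_out = len(wh[0])
        out.append([sum(alpha[t] * wh[j][k] for t, j in enumerate(neigh))
                    for k in range(d_out)])
    return out

# Toy example: 3 tokens, chain dependency arcs plus self-loops.
h = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
adj = [[1, 1, 0], [1, 1, 1], [0, 1, 1]]
W = [[0.5, -0.2], [0.1, 0.3]]
a = [0.1, 0.2, -0.1, 0.05]
out = graph_attention_layer(h, adj, W, a)
```

Because the attention weights for each token are a softmax, every output vector is a convex combination of its neighbours' transformed features, which is what lets sentiment words influence only the tokens they are syntactically connected to.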
format Article
id doaj-art-96dbe35a2ce84a9e901eba6c044ca86a
institution Kabale University
issn 2045-2322
language English
publishDate 2025-01-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj-art-96dbe35a2ce84a9e901eba6c044ca86a (2025-02-02T12:16:29Z), eng, Nature Portfolio, Scientific Reports, ISSN 2045-2322, 2025-01-01, vol. 15, iss. 1, pp. 1-15, doi:10.1038/s41598-025-86009-8. "Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network" by Guixian Xu, Zhe Chen, and Zixin Zhang (all: Key Laboratory of Ethnic Language Intelligent Analysis and Security Governance of MOE, Minzu University of China).
spellingShingle Guixian Xu
Zhe Chen
Zixin Zhang
Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network
Scientific Reports
Aspect category sentiment analysis
Aspect category detection
Graph attention network
Bidirectional long short-term memory networks
title Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network
title_full Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network
title_fullStr Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network
title_full_unstemmed Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network
title_short Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network
title_sort aspect category sentiment analysis based on pre trained bilstm and syntax aware graph attention network
topic Aspect category sentiment analysis
Aspect category detection
Graph attention network
Bidirectional long short-term memory networks
url https://doi.org/10.1038/s41598-025-86009-8
work_keys_str_mv AT guixianxu aspectcategorysentimentanalysisbasedonpretrainedbilstmandsyntaxawaregraphattentionnetwork
AT zhechen aspectcategorysentimentanalysisbasedonpretrainedbilstmandsyntaxawaregraphattentionnetwork
AT zixinzhang aspectcategorysentimentanalysisbasedonpretrainedbilstmandsyntaxawaregraphattentionnetwork