Aspect category sentiment analysis based on pre-trained BiLSTM and syntax-aware graph attention network
Main Authors:
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Scientific Reports
Subjects:
Online Access: https://doi.org/10.1038/s41598-025-86009-8
Summary: Aspect Category Sentiment Analysis (ACSA) is a fine-grained sentiment analysis task that aims to predict the sentiment polarity associated with aspect categories in a sentence. Most existing ACSA methods start from a given aspect category and locate the sentiment words related to it. When sentiment words that are irrelevant to the given aspect category are nonetheless semantically related to it, sentiment words may fail to be matched with the correct aspect categories. To address this issue, this paper proposes a novel approach to ACSA using a pre-trained Bidirectional Long Short-Term Memory (BiLSTM) network and a syntax-aware graph attention network. To mitigate the shortage of annotated aspect-level datasets, a transfer learning method is proposed. First, a BiLSTM model is pre-trained on a document-level sentiment analysis dataset, and the resulting parameters are transferred to the aspect-level task model. Then, a syntax-aware graph attention network is proposed to make full use of the syntactic structure and semantic information in the text, combining the knowledge learned during pre-training to perform the ACSA task. The method is evaluated on five user-review datasets, and comprehensive ablation experiments show that it outperforms the baseline models.
ISSN: 2045-2322
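The abstract outlines a two-stage pipeline: pre-train a BiLSTM on document-level sentiment data, transfer its parameters to the aspect-level model, and apply a graph attention layer restricted to the dependency-parse structure. Below is a minimal, hypothetical PyTorch sketch of that pipeline. All class names (BiLSTMEncoder, DocSentimentModel, SyntaxAwareGAT, ACSAModel), dimensions, pooling choices, and training details are assumptions made for illustration; the paper's exact architecture is not given in this record.

```python
# Hedged sketch of the pipeline described in the abstract. Everything here
# is an assumption for illustration, not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMEncoder(nn.Module):
    """Shared BiLSTM encoder; pre-trained first on document-level sentiment."""
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=150):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)

    def forward(self, token_ids):                     # (B, T) int64
        out, _ = self.bilstm(self.embed(token_ids))
        return out                                    # (B, T, 2*hidden_dim)

class DocSentimentModel(nn.Module):
    """Pre-training head: document polarity from mean-pooled hidden states."""
    def __init__(self, encoder, num_classes=2, dim=300):
        super().__init__()
        self.encoder = encoder
        self.clf = nn.Linear(dim, num_classes)

    def forward(self, token_ids):
        return self.clf(self.encoder(token_ids).mean(dim=1))

class SyntaxAwareGAT(nn.Module):
    """One graph-attention layer whose attention is restricted to edges of
    the dependency parse (an assumed reading of 'syntax-aware')."""
    def __init__(self, dim=300):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)
        self.a = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, h, adj):                        # h: (B,T,D), adj: (B,T,T)
        z = self.W(h)
        B, T, D = z.shape
        zi = z.unsqueeze(2).expand(B, T, T, D)        # features of node i
        zj = z.unsqueeze(1).expand(B, T, T, D)        # features of node j
        e = F.leaky_relu(self.a(torch.cat([zi, zj], dim=-1)).squeeze(-1), 0.2)
        # Self-loops guarantee every row keeps at least one finite score.
        adj = adj + torch.eye(T, device=adj.device)
        attn = torch.softmax(e.masked_fill(adj == 0, float('-inf')), dim=-1)
        return attn @ z                               # (B, T, D)

class ACSAModel(nn.Module):
    """Aspect-level model: transferred encoder + syntax-aware GAT +
    category-conditioned attention pooling over token representations."""
    def __init__(self, encoder, num_categories, num_polarities=3, dim=300):
        super().__init__()
        self.encoder = encoder                        # pre-trained BiLSTM
        self.gat = SyntaxAwareGAT(dim)
        self.cat_embed = nn.Embedding(num_categories, dim)
        self.clf = nn.Linear(dim, num_polarities)

    def forward(self, token_ids, adj, category_id):
        h = self.gat(self.encoder(token_ids), adj)    # (B, T, D)
        q = self.cat_embed(category_id).unsqueeze(-1) # (B, D, 1)
        w = torch.softmax(h @ q, dim=1)               # category attention
        return self.clf((h * w).sum(dim=1))           # (B, num_polarities)

# Stage 1: pre-train the encoder on a document-level corpus (loop omitted);
# Stage 2: reuse the same encoder object inside the aspect-level model.
encoder = BiLSTMEncoder(vocab_size=30000)
doc_model = DocSentimentModel(encoder)
acsa_model = ACSAModel(encoder, num_categories=8)     # shares encoder weights
```

Sharing the encoder object (rather than copying a state dict) is one simple way to realize the parameter transfer the abstract describes; whether the transferred weights are then frozen or fine-tuned on the aspect-level task is a further choice this record does not specify.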