A Hybrid Neural Network BERT-Cap Based on Pre-Trained Language Model and Capsule Network for User Intent Classification

Bibliographic Details
Main Authors: Hai Liu, Yuanxia Liu, Leung-Pun Wong, Lap-Kei Lee, Tianyong Hao
Format: Article
Language: English
Published: Wiley 2020-01-01
Series: Complexity
Online Access: http://dx.doi.org/10.1155/2020/8858852
Description
Summary: User intent classification is a vital component of question-answering and task-based dialogue systems. To understand the goal of a user's question or utterance, the system categorizes user text into a set of pre-defined intent categories. User questions and utterances are usually short and lack sufficient context, which makes it difficult to extract deep semantic information and can reduce classification accuracy. To better identify user intents, this paper proposes BERT-Cap, a hybrid neural network model with focal loss, to capture user intents in dialogue. The model encodes user utterances with multiple transformer encoder blocks whose parameters are initialized from a pre-trained BERT, and then extracts essential features with a capsule network using dynamic routing. Experimental results on four publicly available datasets show that BERT-Cap achieves an F1 score of 0.967 and an accuracy of 0.967, outperforming a number of baseline methods and demonstrating its effectiveness for user intent classification.
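The sketch below is a minimal, illustrative reconstruction of the BERT-Cap idea summarized above, not the authors' released code: a pre-trained BERT encodes the utterance, a capsule layer with dynamic routing aggregates token features into one capsule per intent, and focal loss is applied to the capsule lengths. Hyperparameter names such as num_intents, caps_dim, and routing_iters are assumptions introduced for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel


def squash(s, dim=-1, eps=1e-8):
    # Capsule non-linearity: preserves direction, scales length into [0, 1).
    norm2 = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm2 / (1.0 + norm2)) * s / torch.sqrt(norm2 + eps)


class BertCap(nn.Module):
    """Illustrative BERT encoder + capsule layer with dynamic routing."""

    def __init__(self, num_intents=10, caps_dim=16, routing_iters=3):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.routing_iters = routing_iters
        # One transformation matrix per intent capsule, shared across tokens.
        self.W = nn.Parameter(0.01 * torch.randn(num_intents, hidden, caps_dim))

    def forward(self, input_ids, attention_mask):
        # Token representations from the pre-trained encoder: (batch, tokens, hidden).
        tokens = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # Prediction vectors u_hat: every token votes for every intent capsule.
        u_hat = torch.einsum("bth,nhd->btnd", tokens, self.W)   # (B, T, N, D)
        b = torch.zeros(u_hat.shape[:3], device=u_hat.device)   # routing logits (B, T, N)
        for _ in range(self.routing_iters):
            c = F.softmax(b, dim=-1)                             # coupling coefficients
            v = squash((c.unsqueeze(-1) * u_hat).sum(dim=1))     # intent capsules (B, N, D)
            b = b + torch.einsum("btnd,bnd->btn", u_hat, v)      # agreement update
        # Capsule length serves as the probability of each intent.
        return v.norm(dim=-1)                                    # (B, N)


def focal_loss(probs, targets, gamma=2.0, eps=1e-8):
    # Focal loss over capsule lengths: down-weights easy, well-classified examples.
    p_t = probs.gather(1, targets.unsqueeze(1)).squeeze(1).clamp(eps, 1.0 - eps)
    return (-(1.0 - p_t) ** gamma * torch.log(p_t)).mean()
```

In this sketch, routing is performed over all token positions for simplicity; a fuller implementation would mask padded tokens and tune gamma, the capsule dimension, and the number of routing iterations on a validation set.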
ISSN: 1076-2787, 1099-0526