A Hybrid Neural Network BERT-Cap Based on Pre-Trained Language Model and Capsule Network for User Intent Classification

Bibliographic Details
Main Authors: Hai Liu, Yuanxia Liu, Leung-Pun Wong, Lap-Kei Lee, Tianyong Hao
Format: Article
Language: English
Published: Wiley 2020-01-01
Series: Complexity
Online Access: http://dx.doi.org/10.1155/2020/8858852
collection DOAJ
description User intent classification is a vital component of a question-answering system or a task-based dialogue system. To understand the goals of users’ questions or discourses, the system categorizes user text into a set of pre-defined user intent categories. User questions or discourses are usually short and lack sufficient context; it is therefore difficult to extract deep semantic information from such text, and the accuracy of user intent classification may suffer. To better identify user intents, this paper proposes BERT-Cap, a hybrid neural network model with focal loss, for capturing user intents in dialogue. The model uses multiple transformer encoder blocks to encode user utterances and initializes the encoder parameters with a pre-trained BERT. After utterance encoding, it extracts essential features using a capsule network with dynamic routing. Experimental results on four publicly available datasets show that BERT-Cap achieves an F1 score of 0.967 and an accuracy of 0.967, outperforming a number of baseline methods and demonstrating its effectiveness in user intent classification.
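The abstract mentions training with focal loss, which down-weights well-classified examples so the model focuses on hard ones. As a minimal sketch, assuming the standard multi-class form FL(p_t) = -(1 - p_t)^γ · log(p_t) with γ = 2 (the paper's exact variant, e.g. any α class weighting, may differ; the function name below is illustrative):

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0):
    """Mean multi-class focal loss: FL(p_t) = -(1 - p_t)^gamma * log(p_t).

    probs: (n, k) array of predicted class probabilities.
    labels: (n,) array of true class indices.
    The (1 - p_t)^gamma factor shrinks the loss of confident, correct
    predictions, concentrating gradient signal on hard examples.
    """
    pt = probs[np.arange(len(labels)), labels]  # probability of the true class
    return float(np.mean(-((1.0 - pt) ** gamma) * np.log(pt)))
```

With gamma = 0 the weighting factor is 1 and the expression reduces to ordinary cross-entropy, so the focal loss of a batch is always at most its cross-entropy.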
id doaj-art-1d88442b22a3414f81389c0ec0cf292e
institution Kabale University
issn 1076-2787
1099-0526
Author affiliations:
Hai Liu: School of Computer Science, South China Normal University, Guangzhou 510000, China
Yuanxia Liu: School of Computer Science, South China Normal University, Guangzhou 510000, China
Leung-Pun Wong: School of Science and Technology, The Open University of Hong Kong, Kowloon, Hong Kong SAR 999077, China
Lap-Kei Lee: School of Science and Technology, The Open University of Hong Kong, Kowloon, Hong Kong SAR 999077, China
Tianyong Hao: School of Computer Science, South China Normal University, Guangzhou 510000, China