Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification

Bibliographic Details
Main Authors: Jinfeng Gao, Xianliang Xia, Ruxian Yao, Junming Zhang, Yu Zhang
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Subjects: Cosine similarity; few-shot text classification; prompt tuning; word vector
Online Access: https://ieeexplore.ieee.org/document/10848120/
_version_ 1832583247300657152
author Jinfeng Gao
Xianliang Xia
Ruxian Yao
Junming Zhang
Yu Zhang
author_facet Jinfeng Gao
Xianliang Xia
Ruxian Yao
Junming Zhang
Yu Zhang
author_sort Jinfeng Gao
collection DOAJ
description Prompt tuning has shown impressive performance in the domain of few-shot text classification tasks, yet the coverage of its crucial module, i.e., the verbalizer, has a considerable effect on the results. Existing methods have not addressed breadth and depth in constructing the verbalizer. Specifically, breadth refers to the cross-granularity issue of label words, while depth refers to the number of elements within a granularity that make a positive contribution to classification. This study proposes a dynamic search tree (DST) method to enhance the coverage of the verbalizer further. The core idea is to utilize the hierarchical relationships within the tree to automatically unearth concealed high-quality words, thereby ensuring that the constructed verbalizer possesses both higher breadth and depth. DST involves amalgamating knowledgeable prompt tuning (KPT) by leveraging the breadth of the KPT's label word space, which encompasses characteristics at various granularities and from various perspectives, thereby addressing the problem of the verbalizer's breadth. Subsequently, a method that is capable of measuring the interrelation between words on a designated feature is proposed by analyzing the word vector, which successfully eradicates the noise introduced by irrelevant dimensions during the process of extending the verbalizer and effectively enhances the quality of the verbalizer in terms of depth. Extensive experiments were conducted on zero- and few-shot text classification tasks to demonstrate the effectiveness of our method. Our source code is publicly available at https://github.com/XianliangXia/VerbalizerConstrucionByDynamicSearchTree.
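The description above refers to measuring the interrelation between words on a designated feature by analyzing word vectors, so that irrelevant dimensions do not add noise when the verbalizer is extended. As an illustration of that general idea only, and not the authors' released DST implementation, the following minimal Python sketch computes cosine similarity restricted to a hypothetical subset of embedding dimensions and uses it to filter candidate label words; the embeddings, dimension mask, threshold, and helper names are all assumptions made for the example.

```python
# Illustrative sketch only: cosine similarity restricted to a designated subset of
# embedding dimensions, so dimensions irrelevant to the chosen feature do not add
# noise. The embeddings, dimension mask, and candidate words are hypothetical
# placeholders, not the authors' DST code.
import numpy as np

def masked_cosine(u: np.ndarray, v: np.ndarray, dims: np.ndarray) -> float:
    """Cosine similarity computed only over the designated dimensions `dims`."""
    u_sub, v_sub = u[dims], v[dims]
    denom = np.linalg.norm(u_sub) * np.linalg.norm(v_sub)
    return float(u_sub @ v_sub / denom) if denom > 0 else 0.0

def expand_label_words(anchor: str,
                       candidates: list[str],
                       embeddings: dict[str, np.ndarray],
                       dims: np.ndarray,
                       threshold: float = 0.6) -> list[str]:
    """Keep candidate label words whose masked similarity to the anchor word
    exceeds a threshold; a toy stand-in for growing one branch of a search tree."""
    anchor_vec = embeddings[anchor]
    return [w for w in candidates
            if w in embeddings
            and masked_cosine(anchor_vec, embeddings[w], dims) >= threshold]

# Toy usage with random vectors standing in for pretrained word embeddings.
rng = np.random.default_rng(0)
vocab = ["science", "physics", "biology", "sports", "football"]
embeddings = {w: rng.normal(size=300) for w in vocab}
dims = np.arange(0, 50)  # hypothetical "designated feature" dimensions
print(expand_label_words("science", vocab[1:], embeddings, dims))
```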
format Article
id doaj-art-abf5927af2354292be96947fe47a6da7
institution Kabale University
issn 2169-3536
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj-art-abf5927af2354292be96947fe47a6da7 2025-01-29T00:01:06Z eng IEEE IEEE Access 2169-3536 2025-01-01 13 16338 16351 10.1109/ACCESS.2025.3532458 10848120 Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification Jinfeng Gao 0 Xianliang Xia 1 https://orcid.org/0009-0005-0634-276X Ruxian Yao 2 Junming Zhang 3 Yu Zhang 4 School of Computer and Artificial Intelligence, Huanghuai University, Zhumadian, China School of Computer and Information Engineering, Shanghai Polytechnic University, Shanghai, China School of Computer and Artificial Intelligence, Huanghuai University, Zhumadian, China School of Computer and Artificial Intelligence, Huanghuai University, Zhumadian, China School of Computer and Artificial Intelligence, Huanghuai University, Zhumadian, China Prompt tuning has shown impressive performance in the domain of few-shot text classification tasks, yet the coverage of its crucial module, i.e., the verbalizer, has a considerable effect on the results. Existing methods have not addressed breadth and depth in constructing the verbalizer. Specifically, breadth refers to the cross-granularity issue of label words, while depth refers to the number of elements within a granularity that make a positive contribution to classification. This study proposes a dynamic search tree (DST) method to enhance the coverage of the verbalizer further. The core idea is to utilize the hierarchical relationships within the tree to automatically unearth concealed high-quality words, thereby ensuring that the constructed verbalizer possesses both higher breadth and depth. DST involves amalgamating knowledgeable prompt tuning (KPT) by leveraging the breadth of the KPT's label word space, which encompasses characteristics at various granularities and from various perspectives, thereby addressing the problem of the verbalizer's breadth. Subsequently, a method that is capable of measuring the interrelation between words on a designated feature is proposed by analyzing the word vector, which successfully eradicates the noise introduced by irrelevant dimensions during the process of extending the verbalizer and effectively enhances the quality of the verbalizer in terms of depth. Extensive experiments were conducted on zero- and few-shot text classification tasks to demonstrate the effectiveness of our method. Our source code is publicly available at https://github.com/XianliangXia/VerbalizerConstrucionByDynamicSearchTree. https://ieeexplore.ieee.org/document/10848120/ Cosine similarity; few-shot text classification; prompt tuning; word vector
spellingShingle Jinfeng Gao
Xianliang Xia
Ruxian Yao
Junming Zhang
Yu Zhang
Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification
IEEE Access
Cosine similarity
few-shot text classification
prompt tuning
word vector
title Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification
title_full Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification
title_fullStr Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification
title_full_unstemmed Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification
title_short Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification
title_sort construction of prompt verbalizer based on dynamic search tree for text classification
topic Cosine similarity
few-shot text classification
prompt tuning
word vector
url https://ieeexplore.ieee.org/document/10848120/
work_keys_str_mv AT jinfenggao constructionofpromptverbalizerbasedondynamicsearchtreefortextclassification
AT xianliangxia constructionofpromptverbalizerbasedondynamicsearchtreefortextclassification
AT ruxianyao constructionofpromptverbalizerbasedondynamicsearchtreefortextclassification
AT junmingzhang constructionofpromptverbalizerbasedondynamicsearchtreefortextclassification
AT yuzhang constructionofpromptverbalizerbasedondynamicsearchtreefortextclassification