SPIRIT: Structural Entropy Guided Prefix Tuning for Hierarchical Text Classification

Bibliographic Details
Main Authors: He Zhu, Jinxiang Xia, Ruomei Liu, Bowen Deng
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/27/2/128
Description
Summary: Hierarchical text classification (HTC) is a challenging task that requires classifiers to solve a series of multi-label subtasks while accounting for hierarchical dependencies among labels. Recent studies have introduced prompt tuning to create closer connections between the language model (LM) and the complex label hierarchy. However, we find that the model's attention to the prompt gradually decreases as the prompt moves from the input to the output layer, revealing the limitations of previous prompt tuning methods for HTC. Given the success of prefix tuning-based studies in natural language understanding tasks, we introduce Structural entroPy guIded pRefIx Tuning (SPIRIT). Specifically, we extract the essential structure of the label hierarchy via structural entropy minimization and decode the abstracted structural information as the prefix to prompt all intermediate layers in the LM. Additionally, a depth-wise reparameterization strategy is developed to enhance optimization and propagate the prefix throughout the LM layers. Extensive evaluation on four popular datasets demonstrates that SPIRIT achieves state-of-the-art performance.
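The summary's structural entropy minimization step builds on the notion of structural entropy for graphs. As a minimal illustrative sketch (not the paper's code), the standard one-dimensional structural entropy of an undirected graph can be computed from its degree distribution; the function name and example graph below are hypothetical:

```python
import math

def structural_entropy_1d(adjacency):
    """One-dimensional structural entropy of an undirected graph:
    H(G) = -sum_i (d_i / 2m) * log2(d_i / 2m), where d_i is the
    degree of node i and m is the number of edges."""
    degrees = [sum(row) for row in adjacency]
    two_m = sum(degrees)  # sum of degrees equals 2m for an undirected graph
    return -sum((d / two_m) * math.log2(d / two_m)
                for d in degrees if d > 0)

# A 4-node path graph: degrees 1, 2, 2, 1, so 2m = 6
path = [
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
]
print(round(structural_entropy_1d(path), 4))  # → 1.9183
```

Higher-dimensional structural entropy, which SPIRIT minimizes to extract the essential label hierarchy, generalizes this quantity over an encoding tree rather than individual node degrees.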
ISSN: 1099-4300