Latent Graph Induction Networks and Dependency Graph Networks for Events Detection
The goal of event detection is to identify instances of various event types within text. In real-world scenarios, multiple events often coexist within the same sentence, making the extraction of these events more challenging than extracting a single event. While graph neural networks operating over...
Main Authors: | Jing Yang, Hu Gao, Depeng Dang |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2025-01-01 |
Series: | IEEE Access |
Subjects: | Event detection; multiple event detection; graph convolutional networks; latent graph induction networks |
Online Access: | https://ieeexplore.ieee.org/document/10818466/ |
_version_ | 1832592929761263616 |
---|---|
author | Jing Yang; Hu Gao; Depeng Dang |
author_facet | Jing Yang; Hu Gao; Depeng Dang |
author_sort | Jing Yang |
collection | DOAJ |
description | The goal of event detection is to identify instances of various event types within text. In real-world scenarios, multiple events often coexist within the same sentence, making the extraction of these events more challenging than extracting a single event. While graph neural networks operating over dependency parsing trees have shown some capability in handling multi-event scenarios and improving event detection effectiveness, their improvement is limited. This limitation arises because dependency trees cannot automatically establish connections between trigger words and other key words, which are crucial for recognizing and classifying trigger words. Additionally, syntax-based methods typically aggregate information for a trigger candidate word only from its closest neighbors in the dependency graph, even though relevant words are often multiple hops away. In this paper, we combine word dependency graphs with an automatically induced latent graph structure for event detection and multiple-event detection. Furthermore, we propose two regularizers to enhance the representations of the dependency graphs and the induced latent graph structure. Experimental results demonstrate the effectiveness of our model for event detection. |
format | Article |
id | doaj-art-68c5e8acfabf4737a9cc4e84e3ca07ac |
institution | Kabale University |
issn | 2169-3536 |
language | English |
publishDate | 2025-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | Record ID: doaj-art-68c5e8acfabf4737a9cc4e84e3ca07ac (indexed 2025-01-21T00:01:15Z). Language: English. Publisher: IEEE. Series: IEEE Access (ISSN 2169-3536). Published 2025-01-01, vol. 13, pp. 10713-10723. DOI: 10.1109/ACCESS.2024.3523895; IEEE article no. 10818466. Title: Latent Graph Induction Networks and Dependency Graph Networks for Events Detection. Authors: Jing Yang (https://orcid.org/0009-0002-2714-8091), Hu Gao (https://orcid.org/0000-0001-8987-3956), Depeng Dang (https://orcid.org/0000-0001-7923-9329), all with the School of Artificial Intelligence, Beijing Normal University, Beijing, China. Abstract as in the description field above. Online access: https://ieeexplore.ieee.org/document/10818466/. Keywords: Event detection; multiple event detection; graph convolutional networks; latent graph induction networks. |
title | Latent Graph Induction Networks and Dependency Graph Networks for Events Detection |
topic | Event detection; multiple event detection; graph convolutional networks; latent graph induction networks |
url | https://ieeexplore.ieee.org/document/10818466/ |
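The description field above outlines the approach only at a high level: a graph network over the word dependency graph is combined with an automatically induced latent graph, with regularizers on both structures. The sketch below is a minimal, non-authoritative illustration of that general idea in PyTorch; the class names (`DepLatentEncoder`, `GraphConv`), the attention-based graph induction, and the toy regularizer comment are assumptions for illustration, not the authors' implementation or the paper's actual regularizers.

```python
# Illustrative sketch only: a GCN over a dependency-parse adjacency combined
# with a GCN over a latent adjacency induced by self-attention. All names and
# design choices here are hypothetical, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphConv(nn.Module):
    """One graph-convolution layer: H' = ReLU(norm(A) H W)."""

    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # Row-normalize the adjacency (self-loops assumed to be included).
        deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
        return F.relu(self.linear((adj / deg) @ h))


class DepLatentEncoder(nn.Module):
    """Runs one GCN over the dependency graph and one over an induced latent
    graph, then fuses the two token representations."""

    def __init__(self, dim):
        super().__init__()
        self.dep_gcn = GraphConv(dim)
        self.latent_gcn = GraphConv(dim)
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.fuse = nn.Linear(2 * dim, dim)

    def induce_latent_adj(self, h):
        # Soft adjacency from scaled dot-product attention over all token
        # pairs, so relevant words multiple hops apart can connect directly.
        scores = self.q(h) @ self.k(h).transpose(-1, -2) / h.size(-1) ** 0.5
        return scores.softmax(dim=-1)

    def forward(self, h, dep_adj):
        latent_adj = self.induce_latent_adj(h)
        h_dep = self.dep_gcn(h, dep_adj)
        h_lat = self.latent_gcn(h, latent_adj)
        fused = self.fuse(torch.cat([h_dep, h_lat], dim=-1))
        return fused, latent_adj


# Toy usage: batch of 1 sentence, 5 tokens, 16-dim contextual embeddings.
if __name__ == "__main__":
    torch.manual_seed(0)
    h = torch.randn(1, 5, 16)
    dep_adj = torch.eye(5).unsqueeze(0)        # dependency adjacency + self-loops
    dep_adj[0, 0, 1] = dep_adj[0, 1, 0] = 1.0  # e.g. a dependency edge between tokens 0 and 1
    model = DepLatentEncoder(16)
    fused, latent_adj = model(h, dep_adj)
    # A sparsity-style penalty on the induced graph, e.g. latent_adj.abs().mean(),
    # could be added to the training loss; the paper's two regularizers are not
    # reproduced here.
    print(fused.shape, latent_adj.shape)       # torch.Size([1, 5, 16]) torch.Size([1, 5, 5])
```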