Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference
Abstract: Gene regulatory network (GRN) inference, the process of reconstructing gene regulatory rules from experimental data, has the potential to discover new regulatory rules. However, existing methods often struggle to generalize across diverse cell types and to account for unseen regulators. Here, this work presents GRNPT, a novel Transformer‐based framework that integrates large language model (LLM) embeddings derived from publicly accessible biological data with a temporal convolutional network (TCN) autoencoder to capture regulatory patterns from single‐cell RNA sequencing (scRNA‐seq) trajectories. GRNPT significantly outperforms both supervised and unsupervised methods in inferring GRNs, particularly when training data is limited. Notably, GRNPT exhibits exceptional generalizability, accurately predicting regulatory relationships in previously unseen cell types and even for unseen regulators. By combining the ability of LLMs to distill biological knowledge from text with deep learning methodologies that capture complex patterns in gene expression data, GRNPT overcomes the limitations of traditional GRN inference methods and enables a more accurate and comprehensive understanding of gene regulatory dynamics.
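The GRNPT implementation itself is not part of this record, but the general idea the abstract describes, combining text-derived gene embeddings (prior knowledge) with features extracted from pseudotime expression trajectories via dilated causal convolutions (the building block of a TCN), can be sketched roughly as follows. All function names, kernel weights, and the scoring rule below are illustrative assumptions, not the authors' implementation:

```python
# Illustrative sketch (not the authors' code): score a TF-target pair by
# combining a text-derived "prior knowledge" embedding with a feature
# extracted from pseudotime expression trajectories by dilated causal
# convolutions, the core operation of a temporal convolutional network.

def causal_conv(series, kernel, dilation):
    """1-D causal convolution: each output uses only current and past values."""
    out = []
    for t in range(len(series)):
        acc = 0.0
        for k, w in enumerate(kernel):
            idx = t - k * dilation  # dilation spaces out the taps into the past
            if idx >= 0:
                acc += w * series[idx]
        out.append(acc)
    return out

def tcn_features(trajectory):
    """Two stacked dilated causal conv layers; return the final activation
    as a crude summary feature of the trajectory (weights are arbitrary)."""
    h = causal_conv(trajectory, kernel=[0.5, 0.3, 0.2], dilation=1)
    h = [max(0.0, x) for x in h]  # ReLU nonlinearity between layers
    h = causal_conv(h, kernel=[0.6, 0.4], dilation=2)
    return h[-1]

def pair_score(tf_traj, tg_traj, tf_emb, tg_emb):
    """Combine trajectory-derived features with text-embedding similarity.
    A hypothetical stand-in for a learned classifier head."""
    expr_feat = tcn_features(tf_traj) * tcn_features(tg_traj)
    text_feat = sum(a * b for a, b in zip(tf_emb, tg_emb))  # dot product
    return expr_feat + text_feat

# Example: score a hypothetical TF-target pair along a 4-step pseudotime.
score = pair_score([1.0, 1.0, 1.0, 1.0], [1.0, 1.0, 1.0, 1.0],
                   [1.0, 0.0], [1.0, 0.0])
```

In the actual framework, per the abstract, the trajectory encoder is a TCN autoencoder and the prior-knowledge embeddings come from an LLM applied to public biological text; this sketch only shows how the two feature sources could be fused into a single edge score.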
Main Authors: | Guangzheng Weng, Patrick Martin, Hyobin Kim, Kyoung Jae Won |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2025-01-01 |
Series: | Advanced Science |
Subjects: | deep learning; gene regulatory networks; inference; large language model; temporal convolutional network; transformer |
Online Access: | https://doi.org/10.1002/advs.202409990 |
_version_ | 1832593487483109376 |
---|---|
author | Guangzheng Weng Patrick Martin Hyobin Kim Kyoung Jae Won |
author_facet | Guangzheng Weng Patrick Martin Hyobin Kim Kyoung Jae Won |
author_sort | Guangzheng Weng |
collection | DOAJ |
description | Abstract: Gene regulatory network (GRN) inference, the process of reconstructing gene regulatory rules from experimental data, has the potential to discover new regulatory rules. However, existing methods often struggle to generalize across diverse cell types and to account for unseen regulators. Here, this work presents GRNPT, a novel Transformer‐based framework that integrates large language model (LLM) embeddings derived from publicly accessible biological data with a temporal convolutional network (TCN) autoencoder to capture regulatory patterns from single‐cell RNA sequencing (scRNA‐seq) trajectories. GRNPT significantly outperforms both supervised and unsupervised methods in inferring GRNs, particularly when training data is limited. Notably, GRNPT exhibits exceptional generalizability, accurately predicting regulatory relationships in previously unseen cell types and even for unseen regulators. By combining the ability of LLMs to distill biological knowledge from text with deep learning methodologies that capture complex patterns in gene expression data, GRNPT overcomes the limitations of traditional GRN inference methods and enables a more accurate and comprehensive understanding of gene regulatory dynamics. |
format | Article |
id | doaj-art-95cf82700a7a4ef6b80b05f52348300d |
institution | Kabale University |
issn | 2198-3844 |
language | English |
publishDate | 2025-01-01 |
publisher | Wiley |
record_format | Article |
series | Advanced Science |
spelling | doaj-art-95cf82700a7a4ef6b80b05f52348300d 2025-01-20T13:04:19Z eng Wiley Advanced Science 2198-3844 2025-01-01 123 n/a n/a 10.1002/advs.202409990 Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference. Guangzheng Weng (Biotech Research and Innovation Centre (BRIC), University of Copenhagen, Ole Maaløes Vej 5, 2200 Copenhagen, Denmark); Patrick Martin, Hyobin Kim, Kyoung Jae Won (Department of Computational Biomedicine, Cedars‐Sinai Medical Center, Los Angeles, CA 90069, USA). Abstract: Gene regulatory network (GRN) inference, the process of reconstructing gene regulatory rules from experimental data, has the potential to discover new regulatory rules. However, existing methods often struggle to generalize across diverse cell types and to account for unseen regulators. Here, this work presents GRNPT, a novel Transformer‐based framework that integrates large language model (LLM) embeddings derived from publicly accessible biological data with a temporal convolutional network (TCN) autoencoder to capture regulatory patterns from single‐cell RNA sequencing (scRNA‐seq) trajectories. GRNPT significantly outperforms both supervised and unsupervised methods in inferring GRNs, particularly when training data is limited. Notably, GRNPT exhibits exceptional generalizability, accurately predicting regulatory relationships in previously unseen cell types and even for unseen regulators. By combining the ability of LLMs to distill biological knowledge from text with deep learning methodologies that capture complex patterns in gene expression data, GRNPT overcomes the limitations of traditional GRN inference methods and enables a more accurate and comprehensive understanding of gene regulatory dynamics. https://doi.org/10.1002/advs.202409990 deep learning; gene regulatory networks; inference; large language model; temporal convolutional network; transformer |
spellingShingle | Guangzheng Weng Patrick Martin Hyobin Kim Kyoung Jae Won Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference Advanced Science deep learning gene regulatory networks inference large language model temporal convolutional network transformer |
title | Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference |
title_full | Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference |
title_fullStr | Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference |
title_full_unstemmed | Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference |
title_short | Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference |
title_sort | integrating prior knowledge using transformer for gene regulatory network inference |
topic | deep learning gene regulatory networks inference large language model temporal convolutional network transformer |
url | https://doi.org/10.1002/advs.202409990 |
work_keys_str_mv | AT guangzhengweng integratingpriorknowledgeusingtransformerforgeneregulatorynetworkinference AT patrickmartin integratingpriorknowledgeusingtransformerforgeneregulatorynetworkinference AT hyobinkim integratingpriorknowledgeusingtransformerforgeneregulatorynetworkinference AT kyoungjaewon integratingpriorknowledgeusingtransformerforgeneregulatorynetworkinference |