Showing 1 - 6 results of 6 for search '"mixture-of-experts"'
  2.

    Mixture of Experts Framework Based on Soft Actor-Critic Algorithm for Highway Decision-Making of Connected and Automated Vehicles by Fuxing Yao, Chao Sun, Bing Lu, Bo Wang, Haiyang Yu

    Published 2025-01-01
    “…This paper proposes a Mixture of Experts (MoE) method based on Soft Actor-Critic (SAC), where the upper-level discriminator dynamically decides whether to activate the lower-level DRL expert or the heuristic expert based on the features of the input state. …”
    Article
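The abstract above describes a hard-gating scheme: a discriminator inspects the input state and activates exactly one expert. A minimal sketch of that routing idea, with entirely hypothetical thresholds and placeholder experts (the paper's actual discriminator is a learned component of the SAC framework):

```python
def heuristic_expert(state):
    """Rule-based fallback: keep lane, track a target speed (toy logic)."""
    return {"steer": 0.0, "accel": 1.0 if state["speed"] < 25.0 else 0.0}

def drl_expert(state):
    """Stand-in for a trained SAC policy (fixed placeholder output here)."""
    return {"steer": -0.1, "accel": 0.5}

def discriminator(state, density_threshold=0.5):
    """Route dense, interactive traffic to the DRL expert,
    sparse traffic to the cheap heuristic expert (hypothetical rule)."""
    return "drl" if state["traffic_density"] > density_threshold else "heuristic"

def moe_policy(state):
    expert = discriminator(state)
    return (drl_expert if expert == "drl" else heuristic_expert)(state)

print(moe_policy({"speed": 20.0, "traffic_density": 0.8}))  # -> DRL expert's action
print(moe_policy({"speed": 20.0, "traffic_density": 0.2}))  # -> heuristic expert's action
```

The key design point the abstract highlights is that only one expert runs per state, so the expensive DRL policy is consulted only when the discriminator deems the scene complex.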
  5.

    Research on Predicting Super-Relational Data Links for Mine Hoists Within Hyper-Relational Knowledge Graphs by Xiaochao Dang, Xiaoling Shu, Fenfang Li, Xiaohui Dong

    Published 2024-12-01
    “…This paper proposes the HyLinker model, designed to improve the representation of entities and relations through modular components, including an entity neighbor aggregator, a relation qualifier aggregator, MoE-LSTM (Mixture of Experts Long Short-Term Memory), and a convolutional bidirectional interaction module. …”
    Article
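The MoE-LSTM component named above follows the standard soft Mixture-of-Experts pattern: a gate produces softmax weights over several experts and their outputs are blended. A toy sketch of that combination rule, with simple linear maps standing in for the paper's LSTM branches (all experts and gate scores here are illustrative, not HyLinker's actual parameters):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of gate scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe(x, experts, gate_scores):
    """Blend expert outputs element-wise with softmax gate weights."""
    weights = softmax(gate_scores)
    outputs = [expert(x) for expert in experts]
    return [sum(w * o[i] for w, o in zip(weights, outputs))
            for i in range(len(outputs[0]))]

experts = [lambda x: [2 * v for v in x],   # expert 0: doubles each feature
           lambda x: [v + 1 for v in x]]   # expert 1: shifts each feature

# Equal gate scores -> the two expert outputs are simply averaged.
print(moe([1.0, 3.0], experts, [0.0, 0.0]))  # [2.0, 5.0]
```

In the full model the gate scores would themselves be a learned function of the input, so different entities or relation qualifiers can lean on different experts.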
  6.

    Enhancing depression recognition through a mixed expert model by integrating speaker-related and emotion-related features by Weitong Guo, Qian He, Ziyu Lin, Xiaolong Bu, Ziyang Wang, Dong Li, Hongwu Yang

    Published 2025-02-01
    “…To tackle this challenge, we propose a Mixture-of-Experts (MoE) method that integrates speaker-related and emotion-related features for depression recognition. …”
    Article
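This last abstract describes fusing two feature streams through an MoE: one expert handles speaker-related features, another emotion-related features, and a gate combines them into a single prediction. A hedged sketch of that two-stream fusion, with made-up feature names and weights purely for illustration (the paper's experts would be learned networks over acoustic features):

```python
def speaker_expert(speaker_feats):
    # Hypothetical: prosody statistic -> logit
    return 0.8 * speaker_feats["pitch_var"] - 0.3

def emotion_expert(emotion_feats):
    # Hypothetical: emotional valence -> logit
    return -1.2 * emotion_feats["valence"] + 0.5

def fuse(speaker_feats, emotion_feats, gate=(0.5, 0.5)):
    """Convex combination of the two experts' logits; the gate weights
    would be learned per-input in a real MoE."""
    ws, we = gate
    return ws * speaker_expert(speaker_feats) + we * emotion_expert(emotion_feats)

score = fuse({"pitch_var": 1.0}, {"valence": -0.5})
print(round(score, 3))  # 0.8
```

Routing through specialized experts lets each stream be modeled on its own terms before fusion, which is the motivation the abstract gives for the mixed-expert design.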