Practice of large language model training optimization based on a large-scale AI cluster with more than 10,000 domestic NPUs
To address the problems of low compute utilization, poor stability, high training-optimization difficulty, and the immature technology ecosystem of domestic accelerators in AI-cluster model training at scales above 10,000 NPUs, a large language model training optimization solution based...
| Main Authors: | LOU Tao, NIU Hongweihua, ZHANG Pengfei, DONG Jiangfan, LI Panpan, LI Daotong, XU Weidong, YAO Chenghui, XUE Lianhao, TANG Ting, XIANG Jie |
|---|---|
| Format: | Article |
| Language: | Chinese (zho) |
| Published: | Beijing Xintong Media Co., Ltd, 2025-07-01 |
| Series: | Dianxin kexue |
| Online Access: | http://www.telecomsci.com/zh/article/doi/10.11959/j.issn.1000-0801.2025166/ |
Similar Items
- The Ladder of More-than-Human Participation: A Framework for Inclusive Design
  by: Roudavski Stanislav
  Published: (2022-12-01)
- Health and more-than-human entanglements
  by: Daniela Calvo
  Published: (2025-06-01)
- A Semantic Approach for Linked Model, Data, and Dataspace Cards
  by: Andy Donald, et al.
  Published: (2025-01-01)
- More-Than-Human Aesthetics: Lessons from Enrichment
  by: Alinta Krauth
  Published: (2023-12-01)
- A Systematic Review of the Comparison of Different Types of Card Sorting
  by: Elinda Tchivi, et al.
  Published: (2025-01-01)