Elemental augmentation of machine learning interatomic potentials

Machine learning interatomic potentials (MLIPs) bridge the gap between the accuracy of ab initio methods and the computational efficiency needed for large-scale simulations. However, custom-trained MLIPs are often limited to specific materials and lack flexibility for incorporating additional elements, while universal potentials (UPots), despite covering a wide range of chemical elements, may sacrifice accuracy for generalization. In this work, we propose an elemental augmentation strategy to efficiently expand MLIPs by incorporating new elements into pre-trained models. Using a Bayesian-optimization-driven active learning framework, we target the configuration space of new elements where the current MLIPs exhibit high uncertainty, and we demonstrate the addition of up to 10 elements to a pre-trained UPot. Sampling shows a strong tendency toward new structures composed of these elements, minimizing sampling requirements, and the strategy reduces computational costs by over an order of magnitude compared to training an MLIP from scratch while preserving accuracy. This offers a scalable pathway to extend MLIP applicability across diverse chemical spaces.

Full description

Bibliographic Details
Main Authors: Haibo Xue, Guanjian Cheng, Wan-Jian Yin
Format: Article
Language: English
Published: Elsevier, 2025-06-01
Series: Computational Materials Today
Subjects: Machine learning interatomic potentials; Elemental augmentation; Potential energy surface; Computational materials science
Online Access: http://www.sciencedirect.com/science/article/pii/S295046352500002X
author Haibo Xue
Guanjian Cheng
Wan-Jian Yin
collection DOAJ
description Machine learning interatomic potentials (MLIPs) bridge the gap between the accuracy of ab initio methods and the computational efficiency needed for large-scale simulations. However, custom-trained MLIPs are often limited to specific materials and lack flexibility for incorporating additional elements, while universal potentials (UPots), despite covering a wide range of chemical elements, may sacrifice accuracy for generalization. In this work, we propose an elemental augmentation strategy to efficiently expand MLIPs by incorporating new elements into pre-trained models. Using a Bayesian-optimization-driven active learning framework, we target the configuration space of new elements where the current MLIPs exhibit high uncertainty, and we demonstrate the addition of up to 10 elements to a pre-trained UPot. Sampling shows a strong tendency toward new structures composed of these elements, minimizing sampling requirements, and the strategy reduces computational costs by over an order of magnitude compared to training an MLIP from scratch while preserving accuracy. This offers a scalable pathway to extend MLIP applicability across diverse chemical spaces.
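The uncertainty-targeted active learning described in the abstract can be illustrated at toy scale. The sketch below is not the authors' implementation: the 1-D "energy" function, the nearest-neighbor ensemble members, and the disagreement-based acquisition are all invented stand-ins, showing only the generic loop of selecting the most uncertain candidate configuration, labeling it, and retraining.

```python
# Toy sketch of uncertainty-driven active learning (illustrative only):
# an ensemble of simple surrogate models scores candidate configurations,
# and the candidate with the largest ensemble disagreement is selected
# for labeling and added to the training set.

def true_energy(x):
    # Hypothetical ground-truth "ab initio" energy of a 1-D configuration.
    return (x - 0.3) ** 2

class NoisyModel:
    """Stand-in for one ensemble member: memorizes training points and
    predicts via the nearest neighbor plus a per-model distance penalty,
    so members disagree more in regions far from training data."""
    def __init__(self, noise):
        self.noise = noise
        self.data = []  # list of (x, energy) pairs

    def fit(self, pairs):
        self.data = list(pairs)

    def predict(self, x):
        nearest_x, nearest_e = min(self.data, key=lambda p: abs(p[0] - x))
        return nearest_e + self.noise * abs(x - nearest_x)

def select_most_uncertain(ensemble, candidates):
    # Ensemble spread (max - min prediction) as a simple uncertainty proxy.
    def spread(x):
        preds = [m.predict(x) for m in ensemble]
        return max(preds) - min(preds)
    return max(candidates, key=spread)

# Sparse initial training data plus a pool of unlabeled candidates.
train = [(0.0, true_energy(0.0)), (1.0, true_energy(1.0))]
pool = [i / 10 for i in range(11)]
ensemble = [NoisyModel(noise=n) for n in (0.5, 1.0, 1.5)]

for _ in range(3):                  # three active-learning iterations
    for m in ensemble:
        m.fit(train)
    x_new = select_most_uncertain(ensemble, pool)
    train.append((x_new, true_energy(x_new)))   # "label" the pick
    pool.remove(x_new)

print(len(train))  # training set grew by 3 labeled configurations
```

Note how the first pick lands midway between the two initial training points, the region where the surrogate ensemble disagrees most; the paper's strategy applies the same principle in the far higher-dimensional configuration space of newly added elements.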
format Article
id doaj-art-387e7c8028734a7e9e1d5b459a687789
institution Kabale University
issn 2950-4635
language English
publishDate 2025-06-01
publisher Elsevier
record_format Article
series Computational Materials Today
spelling doaj-art-387e7c8028734a7e9e1d5b459a687789 (2025-08-20T03:48:27Z)
Language: English
Publisher: Elsevier
Series: Computational Materials Today, ISSN 2950-4635
Published: 2025-06-01, vol. 6, article 100026
DOI: 10.1016/j.commt.2025.100026
Title: Elemental augmentation of machine learning interatomic potentials
Authors: Haibo Xue, Guanjian Cheng, Wan-Jian Yin (corresponding author)
Affiliation (all authors): College of Energy, Soochow Institute for Energy and Materials InnovationS (SIEMIS), and Jiangsu Provincial Key Laboratory for Advanced Carbon Materials and Wearable Energy Technologies, Soochow University, Suzhou 215006, China
Abstract: Machine learning interatomic potentials (MLIPs) bridge the gap between the accuracy of ab initio methods and the computational efficiency needed for large-scale simulations. However, custom-trained MLIPs are often limited to specific materials and lack flexibility for incorporating additional elements, while universal potentials (UPots), despite covering a wide range of chemical elements, may sacrifice accuracy for generalization. In this work, we propose an elemental augmentation strategy to efficiently expand MLIPs by incorporating new elements into pre-trained models. Using a Bayesian-optimization-driven active learning framework, we target the configuration space of new elements where the current MLIPs exhibit high uncertainty, and we demonstrate the addition of up to 10 elements to a pre-trained UPot. Sampling shows a strong tendency toward new structures composed of these elements, minimizing sampling requirements, and the strategy reduces computational costs by over an order of magnitude compared to training an MLIP from scratch while preserving accuracy. This offers a scalable pathway to extend MLIP applicability across diverse chemical spaces.
Online Access: http://www.sciencedirect.com/science/article/pii/S295046352500002X
Subjects: Machine learning interatomic potentials; Elemental augmentation; Potential energy surface; Computational materials science
title Elemental augmentation of machine learning interatomic potentials
topic Machine learning interatomic potentials
Elemental augmentation
Potential energy surface
Computational materials science
url http://www.sciencedirect.com/science/article/pii/S295046352500002X