1. XGBoost-enhanced ensemble model using discriminative hybrid features for the prediction of sumoylation sites
Article, published 2025-02-01
“…By fusing word embeddings with evolutionary descriptors, it applies the SHapley Additive exPlanations (SHAP) algorithm for optimal feature selection and uses eXtreme Gradient Boosting (XGBoost) for classification. …”
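The entry above describes a select-then-classify pipeline: SHAP-ranked features feeding an XGBoost classifier. A minimal sketch of that shape on synthetic data, using scikit-learn's GradientBoostingClassifier and impurity-based importances as stand-ins for XGBoost and SHAP (all data, feature counts, and parameters here are illustrative, not from the paper):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for hybrid sequence features (embeddings + evolutionary descriptors).
X, y = make_classification(n_samples=600, n_features=40, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Rank features with a fitted booster; impurity importances stand in for SHAP values here.
ranker = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
top = np.argsort(ranker.feature_importances_)[::-1][:15]  # keep the 15 strongest features

# Refit the classifier on the selected subset and score it on held-out data.
clf = GradientBoostingClassifier(random_state=0).fit(X_tr[:, top], y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te[:, top])[:, 1])
print(round(auc, 3))
```

In the paper's actual setup, the ranking step would use per-feature SHAP value magnitudes rather than impurity importances; the pipeline structure is the same.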
2. Forecasting mental states in schizophrenia using digital phenotyping data
Article, published 2025-02-01
“…Moreover, it remains unclear which machine learning algorithm is best suited for forecasting tasks, the eXtreme Gradient Boosting (XGBoost) and long short-term memory (LSTM) algorithms being two popular choices in digital phenotyping studies. …”
3. Constructing a machine learning model for systemic infection after kidney stone surgery based on CT values
Article, published 2025-02-01
“…All five machine learning models demonstrated strong discrimination on the validation set (AUC: 0.690–0.858). The eXtreme Gradient Boosting (XGBoost) model was the best performer [AUC: 0.858; sensitivity: 0.877; specificity: 0.981; accuracy: 0.841; positive predictive value: 0.629; negative predictive value: 0.851]. …”
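The entry above reports sensitivity, specificity, accuracy, and the two predictive values. These all fall out of a binary confusion matrix; a small self-contained example with toy labels (the numbers are illustrative, not the study's data):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Toy labels/predictions standing in for a fitted classifier's validation-set output.
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 0])

# sklearn's binary confusion matrix flattens as [[tn, fp], [fn, tp]].
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

sensitivity = tp / (tp + fn)  # recall on the positive (infection) class
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)          # positive predictive value
npv = tn / (tn + fn)          # negative predictive value
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(sensitivity, specificity, ppv, npv, accuracy)
```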
4. Multiple PM Low-Cost Sensors, Multiple Seasons’ Data, and Multiple Calibration Models
Article, published 2023-02-01
“…The ML models included (i) Decision Tree, (ii) Random Forest (RF), (iii) eXtreme Gradient Boosting, and (iv) Support Vector Regression (SVR). …”
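The entry above frames sensor calibration as regression: map raw low-cost readings (plus meteorology) onto reference values and compare model families. A hedged sketch of that comparison on synthetic data, with GradientBoostingRegressor standing in for eXtreme Gradient Boosting (the feature set and coefficients are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in: [raw PM reading, temperature, relative humidity] -> reference PM value.
X = rng.uniform(low=[5.0, 0.0, 20.0], high=[80.0, 40.0, 95.0], size=(500, 3))
y = 0.8 * X[:, 0] + 0.005 * X[:, 1] * X[:, 2] + rng.normal(0.0, 2.0, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
models = {
    "DecisionTree": DecisionTreeRegressor(random_state=0),
    "RF": RandomForestRegressor(random_state=0),
    "GBR": GradientBoostingRegressor(random_state=0),  # stand-in for eXtreme Gradient Boosting
    "SVR": make_pipeline(StandardScaler(), SVR(C=100.0)),  # SVR needs scaled inputs
}
r2 = {name: r2_score(y_te, m.fit(X_tr, y_tr).predict(X_te)) for name, m in models.items()}
print(r2)
```

Reporting held-out R² per model is one common way such calibration studies rank candidates; RMSE or MAE would slot into the same loop.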
5. Combining machine learning algorithms for bridging gaps in GRACE and GRACE Follow-On missions using ERA5-Land reanalysis
Article, published 2025-06-01
“…Unlike previous studies, we use a combination of Machine Learning (ML) methods—Random Forest (RF), Support Vector Machine (SVM), eXtreme Gradient Boosting (XGB), Deep Neural Network (DNN), and Stacked Long Short-Term Memory (SLSTM)—to identify and efficiently bridge the gap between GRACE and GFO by using the best-performing ML model to estimate TWSA at each grid cell. …”
6. Using Deep Learning to Identify High-Risk Patients with Heart Failure with Reduced Ejection Fraction
Article, published 2021-07-01
“…For comparison, we also tested multiple traditional machine learning models including logistic regression, random forest, and eXtreme Gradient Boosting (XGBoost). Model performance was assessed by area under the curve (AUC) values, precision, and recall on an independent testing dataset. …”
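The entry above benchmarks several baseline classifiers by AUC, precision, and recall on a held-out test set. A minimal scikit-learn sketch of that evaluation protocol on synthetic data (GradientBoostingClassifier stands in for XGBoost; nothing here reproduces the paper's models or cohort):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=800, n_features=20, n_informative=6, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

results = {}
for name, model in [
    ("logreg", LogisticRegression(max_iter=1000)),
    ("rf", RandomForestClassifier(random_state=1)),
    ("gb", GradientBoostingClassifier(random_state=1)),  # stand-in for XGBoost
]:
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]   # scores for AUC
    pred = model.predict(X_te)                # hard labels for precision/recall
    results[name] = (roc_auc_score(y_te, proba),
                     precision_score(y_te, pred),
                     recall_score(y_te, pred))
print(results)
```

Keeping the test split untouched until this final scoring step is what makes the comparison "independent" in the snippet's sense.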
7. RCE-IFE: recursive cluster elimination with intra-cluster feature elimination
Article, published 2025-02-01
“…Furthermore, RCE-IFE surpasses several state-of-the-art FS methods, such as Minimum Redundancy Maximum Relevance (MRMR), Fast Correlation-Based Filter (FCBF), Information Gain (IG), Conditional Mutual Information Maximization (CMIM), SelectKBest (SKB), and eXtreme Gradient Boosting (XGBoost), obtaining an average AUC of 0.76 on five gene expression datasets. …”
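One of the baselines named above, SelectKBest, is a simple filter-style feature selector. A hedged sketch of how such a baseline is typically scored by cross-validated AUC on high-dimensional data (the dataset is synthetic; k and the downstream classifier are arbitrary choices, not the paper's):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# High-dimensional toy data standing in for a gene-expression matrix.
X, y = make_classification(n_samples=200, n_features=500, n_informative=10, random_state=2)

# Baseline: logistic regression on all 500 features.
auc_full = cross_val_score(LogisticRegression(max_iter=2000), X, y,
                           cv=5, scoring="roc_auc").mean()

# Filter selection lives inside the pipeline, so each CV fold picks its own top-20
# features; fitting the selector on the full data first would leak test information.
pipe = make_pipeline(SelectKBest(mutual_info_classif, k=20),
                     LogisticRegression(max_iter=2000))
auc_sel = cross_val_score(pipe, X, y, cv=5, scoring="roc_auc").mean()
print(round(auc_full, 3), round(auc_sel, 3))
```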
8. Exploring cement production's role in GDP using explainable AI and sustainability analysis in Nepal
Article, published 2025-06-01
“…Utilizing regression models such as the Extra Trees (Extremely Randomized Trees) Regressor, CatBoost (Categorical Boosting) Regressor, XGBoost (eXtreme Gradient Boosting) Regressor, Random Forest, and an Ensemble of Sparse Embedded Trees (SET), machine learning is used to examine the demand, supply, and Gross Domestic Product (GDP) performance of cement manufacturing in India, which shares common cement-related infrastructure with Nepal. …”
9. Establishing a radiomics model using contrast-enhanced ultrasound for preoperative prediction of neoplastic gallbladder polyps exceeding 10 mm
Article, published 2025-02-01
“…This model, derived from machine learning frameworks including Support Vector Machine (SVM), Logistic Regression (LR), Multilayer Perceptron (MLP), k-Nearest Neighbors (KNN), and eXtreme Gradient Boosting (XGBoost) with fivefold cross-validation, showed AUCs of 0.95 (95% CI: 0.90–0.99) and 0.87 (95% CI: 0.72–1.0) in internal validation. …”
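The final entry compares SVM, LR, MLP, KNN, and XGBoost under fivefold cross-validation. A minimal scikit-learn sketch of that protocol on synthetic data (GradientBoostingClassifier stands in for XGBoost; scaling pipelines and parameters are illustrative choices, not the study's configuration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=25, n_informative=8, random_state=3)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=3)

# Scale-sensitive models get a StandardScaler step; trees do not need one.
models = {
    "SVM": make_pipeline(StandardScaler(), SVC(random_state=3)),
    "LR": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1500, random_state=3)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "GB": GradientBoostingClassifier(random_state=3),  # stand-in for XGBoost
}
mean_auc = {name: cross_val_score(m, X, y, cv=cv, scoring="roc_auc").mean()
            for name, m in models.items()}
print(mean_auc)
```

Averaging fold AUCs, as here, is the usual way such studies summarize each framework before picking one for the final radiomics model.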