Showing 1 - 15 results of 15 for search '((( fact OR face) search random tree algorithm ) OR ( east search random three algorithm ))', query time: 0.26s
  1.

    A stacked ensemble machine learning model for the prediction of pentavalent 3 vaccination dropout in East Africa by Meron Asmamaw Alemayehu, Shimels Derso Kebede, Agmasie Damtew Walle, Daniel Niguse Mamo, Ermias Bekele Enyew, Jibril Bashir Adem

    Published 2025-04-01
    “…The objective is to identify predictors of dropout and enhance intervention strategies. Methods: The study utilized seven base machine learning algorithms to create a stacked ensemble model with three meta-learners: Random Forest (RF), Generalized Linear Model (GLM), and Extreme Gradient Boosting (XGBoost). …”
    Article
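The stacking approach this abstract describes (several base learners whose out-of-fold predictions feed a meta-learner) can be sketched with scikit-learn. This is a generic illustration, not the paper's pipeline: the synthetic data, the particular base models, and the use of logistic regression as a GLM-style meta-learner are assumptions.

```python
# Sketch of a stacked ensemble: base learners produce out-of-fold
# predictions that a meta-learner (here a logistic-regression GLM)
# combines. Generic illustration only, not the paper's actual model.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("knn", KNeighborsClassifier()),
    ],
    final_estimator=LogisticRegression(),  # GLM-style meta-learner
    cv=5,  # out-of-fold predictions for the meta-learner
)
stack.fit(X_tr, y_tr)
print(round(stack.score(X_te, y_te), 3))
```

Swapping `final_estimator` lets the same scaffold try the RF, GLM, and XGBoost meta-learners the abstract names.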
  2.
  3.

    Securing IoT Communications via Anomaly Traffic Detection: Synergy of Genetic Algorithm and Ensemble Method by Behnam Seyedi, Octavian Postolache

    Published 2025-06-01
    “…In the final phase, an ensemble classifier combines the strengths of the Decision Tree, Random Forest, and XGBoost algorithms to achieve the accurate and robust detection of anomalous behaviors. …”
    Article
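The ensemble combination this abstract mentions (Decision Tree, Random Forest, and XGBoost) can be approximated with a soft-voting classifier; a minimal sketch, assuming synthetic data and with scikit-learn's GradientBoostingClassifier standing in for XGBoost, which is a separate package:

```python
# Soft-voting ensemble in the spirit of the abstract: average the
# predicted class probabilities of several tree-based classifiers.
# GradientBoosting stands in for XGBoost (a separate package).
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=12, random_state=1)

vote = VotingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(random_state=1)),
        ("rf", RandomForestClassifier(n_estimators=50, random_state=1)),
        ("gb", GradientBoostingClassifier(random_state=1)),
    ],
    voting="soft",  # average class probabilities across members
)
scores = cross_val_score(vote, X, y, cv=3)
print(round(scores.mean(), 3))
```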
  4.
  5.

    Improving Surgical Site Infection Prediction Using Machine Learning: Addressing Challenges of Highly Imbalanced Data by Salha Al-Ahmari, Farrukh Nadeem

    Published 2025-02-01
    “…Seven machine learning algorithms were created and tested: Decision Tree (DT), Gaussian Naive Bayes (GNB), Support Vector Machine (SVM), Logistic Regression (LR), Random Forest (RF), Stochastic Gradient Boosting (SGB), and K-Nearest Neighbors (KNN). …”
    Article
  6.
  7.

    Path planning algorithm based on the improved Informed-RRT* using the sea-horse optimizer by YAN Guiseng, YANG Jie

    Published 2025-02-01
    “…Objective: In order to solve the problems of random sampling, inefficient search, and difficulty in providing optimal paths in complex environments faced by traditional Informed-RRT* algorithms, an improved Informed-RRT* path planning algorithm based on the sea-horse optimizer (SHO) was proposed. Methods: This algorithm combined the strengths of Informed-RRT* and SHO. …”
    Article
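The "informed" part of Informed-RRT* that this abstract builds on is its sampling step: once a path of cost c_best is known, new samples are drawn only from the ellipse with foci at the start and goal, the set of points that could still shorten the path. A minimal 2D sketch of that step (the SHO coupling from the abstract is not shown; the start, goal, and cost values are made up):

```python
# Sketch of informed sampling in Informed-RRT*: draw uniformly from the
# ellipse whose foci are start and goal and whose transverse diameter is
# the current best path cost c_best. Generic illustration only.
import numpy as np

def informed_sample(start, goal, c_best, rng):
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    c_min = np.linalg.norm(goal - start)      # distance between the foci
    centre = (start + goal) / 2.0
    # Rotation aligning the x-axis with the start->goal direction.
    a1 = (goal - start) / c_min
    theta = np.arctan2(a1[1], a1[0])
    C = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    # Semi-axes of the informed ellipse.
    r = np.array([c_best / 2.0,
                  np.sqrt(c_best**2 - c_min**2) / 2.0])
    # Uniform sample from the unit disc, then scale, rotate, translate.
    ang = rng.uniform(0, 2 * np.pi)
    rad = np.sqrt(rng.uniform())
    x_ball = rad * np.array([np.cos(ang), np.sin(ang)])
    return C @ (r * x_ball) + centre

rng = np.random.default_rng(0)
start, goal, c_best = (0.0, 0.0), (10.0, 0.0), 12.0
pts = np.array([informed_sample(start, goal, c_best, rng)
                for _ in range(200)])
# Every sample satisfies dist(start, p) + dist(p, goal) <= c_best.
d = (np.linalg.norm(pts - start, axis=1)
     + np.linalg.norm(pts - goal, axis=1))
print(bool((d <= c_best + 1e-9).all()))
```

As c_best shrinks over iterations, the ellipse tightens around the optimum, which is what makes the search efficient.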
  8.
  9.
  10.
  11.
  12.

    Reducing bias in coronary heart disease prediction using Smote-ENN and PCA. by Xinyi Wei, Boyu Shi

    Published 2025-01-01
    “…To address the data imbalance issue, SMOTE-ENN is utilized, and five machine learning algorithms (Decision Trees, KNN, SVM, XGBoost, and Random Forest) are applied for classification tasks. …”
    Article
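The oversampling half of the SMOTE-ENN technique this abstract applies can be sketched in a few lines: synthesize new minority-class points by interpolating between a minority sample and one of its minority-class nearest neighbours. This is a minimal sketch only; the full SMOTE-ENN additionally cleans the result with Edited Nearest Neighbours, and the real implementation lives in the imbalanced-learn package. The toy data and parameters are made up.

```python
# Minimal sketch of SMOTE interpolation (the oversampling half of
# SMOTE-ENN): a synthetic point lies on the segment between a minority
# sample and a randomly chosen minority-class neighbour.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_oversample(X_min, n_new, k=5, seed=0):
    rng = np.random.default_rng(seed)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)        # idx[:, 0] is the point itself
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = idx[i, rng.integers(1, k + 1)]   # a random true neighbour
        lam = rng.uniform()                  # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(synthetic)

X_min = np.random.default_rng(1).normal(size=(20, 3))  # toy minority class
X_new = smote_oversample(X_min, n_new=30)
print(X_new.shape)
```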
  13.
  14.

    A deep neural network framework for estimating coastal salinity from SMAP brightness temperature data by Yidi Wei, Qing Xu, Xiaobin Yin, Yan Li, Kaiguo Fan

    Published 2025-06-01
    “…Despite advancements in satellite-based radiometry such as NASA’s Soil Moisture Active Passive (SMAP), significant challenges persist in coastal SSS retrieval due to radio frequency interference (RFI), land-sea contamination, and complex interactions of nearshore dynamic processes. Method: This study proposes a deep neural network (DNN) framework that integrates SMAP L-band brightness temperature data with ancillary oceanographic and geographic parameters, such as sea surface temperature and the shortest distance to the coastline (dis), to enhance SSS estimation accuracy in the Yellow and East China Seas. The framework leverages machine learning interpretability tools (Shapley Additive Explanations, SHAP) to optimize input feature selection and employs a grid search strategy for hyperparameter tuning. Results and discussion: Systematic validation against independent in-situ measurements demonstrates that the baseline DNN model constructed for the entire region and time period outperforms conventional algorithms including K-Nearest Neighbors, Random Forest, and XGBoost, as well as the standard SMAP SSS product, achieving a reduction of 36.0%, 33.4%, 40.1%, and 23.2%, respectively, in root mean square error (RMSE). …”
    Article
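The grid-search hyperparameter tuning step this abstract mentions can be sketched with scikit-learn, using a small MLP regressor as a stand-in for the paper's DNN. The data, the grid, and the network size here are assumptions for illustration only.

```python
# Sketch of grid-search hyperparameter tuning: exhaustively cross-validate
# every combination in a small parameter grid. A tiny MLP stands in for
# the paper's DNN; data and grid values are made up.
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=200, n_features=6, noise=0.1,
                       random_state=0)

grid = GridSearchCV(
    MLPRegressor(max_iter=2000, random_state=0),
    param_grid={
        "hidden_layer_sizes": [(16,), (32, 16)],
        "alpha": [1e-4, 1e-2],      # L2 regularization strength
    },
    cv=3,
    scoring="neg_root_mean_squared_error",
)
grid.fit(X, y)
print(grid.best_params_)
```

Each of the 2 x 2 grid points is fit 3 times (once per fold); `grid.best_params_` holds the combination with the lowest cross-validated RMSE.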
  15.