Showing results 1,061 - 1,080 of 1,276 for search 'support (vector OR sector) regression algorithm', query time: 0.13s
  1. 1061

    Prediction of early postoperative complications and transfusion risk after lumbar spinal stenosis surgery in geriatric patients: machine learning approach based on comprehensive ge... by Wounsuk Rhee, Sam Yeol Chang, Bong-Soon Chang, Hyoungmin Kim

    Published 2025-07-01
    “…A total of 48 features, including demographics, comorbidity, nutrition, and perioperative status, were collected. Logistic regression, support vector machine (SVM), random forest, XGBoost, and LightGBM were trained using five-fold cross-validation. …”
    Get full text
    Article
  2. 1062

    Ensemble Learning-Driven and UAV Multispectral Analysis for Estimating the Leaf Nitrogen Content in Winter Wheat by Yu Han, Jiaxue Zhang, Yan Bai, Zihao Liang, Xinhui Guo, Yu Zhao, Meichen Feng, Lujie Xiao, Xiaoyan Song, Meijun Zhang, Wude Yang, Guangxin Li, Sha Yang, Xingxing Qiao, Chao Wang

    Published 2025-07-01
    “…Correlation analysis identified highly relevant indices with LNC. Support Vector Regression (SVR), Random Forest (RF), Ridge Regression (RR), K-Nearest Neighbors (K-NN), and ensemble learning algorithms (Voting and Stacking) were employed to model the relationship between selected vegetation indices and LNC. …”
    Get full text
    Article
  3. 1063

    Evaluation of machine learning methods for forecasting turbidity in river networks using Sentinel-2 remote sensing data by Victor Oliveira Santos, Paulo Alexandre Costa Rocha, Jesse Van Griensven Thé, Bahram Gharabaghi

    Published 2025-12-01
    “…Spectral bands from Sentinel-2 were analyzed using machine learning algorithms, namely XGBoost, Random Forests, GMDH, Support Vector Regression, k-Nearest Neighbors and Least Absolute Shrinkage and Selection Operator to model turbidity, using data from twelve monitoring stations across the Mississippi River, USA. …”
    Get full text
    Article
  4. 1064

    Explainable machine learning model for predicting compressive strength of CO2-cured concrete by Jia Chu, Bingbing Guo, Taotao Zhong, Qinghao Guan, Yan Wang, Ditao Niu

    Published 2025-07-01
    “…A comprehensive database comprising 198 datasets was collected from published experimental investigations, and four ML algorithms were employed, i.e., RF (random forest), SVR (support vector regression), GBRT (gradient boosting regression tree), and XGB (extreme gradient boosting). …”
    Get full text
    Article
  5. 1065

    Unmanned Aerial Vehicle Remote Sensing for Monitoring Fractional Vegetation Cover in Creeping Plants: A Case Study of Thymus mongolicus Ronniger by Hao Zheng, Wentao Mi, Kaiyan Cao, Weibo Ren, Yuan Chi, Feng Yuan, Yaling Liu

    Published 2025-02-01
    “…FVC estimation models were developed using four algorithms: multiple linear regression (MLR), random forest (RF), support vector regression (SVR), and artificial neural network (ANN). …”
    Get full text
    Article
  6. 1066

    Evaluation of Machine Learning Models for Estimating Grassland Pasture Yield Using Landsat-8 Imagery by Linming Huang, Fen Zhao, Guozheng Hu, Hasbagan Ganjurjav, Rihan Wu, Qingzhu Gao

    Published 2024-12-01
    “…These data, combined with field-measured pasture yields, were employed to construct models using four machine learning algorithms: elastic net regression (Enet), Random Forest (RF), Extreme Gradient Boosting (XGBoost), and Support Vector Machine (SVM). …”
    Get full text
    Article
  7. 1067

    Machine learning-driven development of a stratified CES-D screening system: optimizing depression assessment through adaptive item selection by Ruo-Fei Xu, Zhen-Jing Liu, Shunan Ouyang, Qin Dong, Wen-Jing Yan, Dong-Wu Xu

    Published 2025-03-01
    “…Model performance was systematically evaluated through discrimination (ROC analysis), calibration (Brier score), and clinical utility analyses (decision curve analysis), with additional validation using random forest and support vector machine algorithms across independent samples. …”
    Get full text
    Article
  8. 1068

    Comparative Analysis of Machine Learning and Deep Learning Models for Classification and Prediction of Liver Disease in Patients with Hepatitis C by Shalem Preetham Gandu, M. Roshni Thanka, E. Bijolin Edwin, Ebenezer Veemaraj, S. Stewart Kirubakaran

    Published 2025-06-01
    “…The classifiers used in model assessment included K-Nearest Neighbors (KNN), Support Vector Machine (SVM), Decision Tree (DT), Random Forest (RF), Logistic Regression (LGR), XGBoost (XGB), and Gaussian Naive Bayes (GNB). …”
    Get full text
    Article
  9. 1069

    Machine learning for detection of diffusion abnormalities-related respiratory changes among normal, overweight, and obese individuals based on BMI and pulmonary ventilation paramet... by Xin-Yue Song, Xin-Peng Xie, Wen-Jing Xu, Yu-Jia Cao, Bin-Miao Liang

    Published 2025-07-01
    “…We applied several supervised ML algorithms and feature selection strategies to distinguish between DN and DA, including Support Vector Machine (SVM), Random Forest (RF), Adaptive Boosting (AdaBoost), Naive Bayes (BAYES), K-Nearest Neighbors (KNN), SelectKBest, Recursive Feature Elimination with Cross-Validation (RFECV), and SelectFromModel. …”
    Get full text
    Article
  10. 1070

    Machine learning-based real-time prediction of duodenal stump leakage from gastrectomy in gastric cancer patients by Jae Hun Chung, Yushin Kim, Dongjun Lee, Dongwon Lim, Sun-Hwi Hwang, Si-Hak Lee, Woohwan Jung

    Published 2025-05-01
    “…One hundred eighty-nine features were extracted from each patient record, including demographic data, preoperative comorbidities, and blood test outcomes from the subsequent seven postoperative days (POD). Six ML algorithms were evaluated: Logistic Regression (LR), K-nearest neighbors (KNN), Support Vector Machine (SVM), Random Forest (RF), Extreme Gradient Boosting (XGB), and Neural Network (NN). …”
    Get full text
    Article
  11. 1071

    A machine learning based radiomics approach for predicting No. 14v station lymph node metastasis in gastric cancer by Tingting Ma, Mengran Zhao, Xiangli Li, Xiangchao Song, Lingwei Wang, Zhaoxiang Ye

    Published 2024-10-01
    “…Seven machine learning (ML) algorithms including naïve Bayes (NB), k-nearest neighbor (KNN), decision tree (DT), logistic regression (LR), random forest (RF), eXtreme gradient boosting (XGBoost) and support vector machine (SVM) were trained for development of optimal radiomics signature. …”
    Get full text
    Article
  12. 1072

    Noninvasive imaging biomarker reveals invisible microscopic variation in acute ischaemic stroke (≤ 24 h): a multicentre retrospective study by Kui Sun, Rongchao Shi, Xinxin Yu, Ying Wang, Wei Zhang, Xiaoxia Yang, Mei Zhang, Jian Wang, Shu Jiang, Haiou Li, Bing Kang, Tong Li, Shuying Zhao, Yu Ai, Jianfeng Qiu, Haiyan Wang, Ximing Wang

    Published 2025-01-01
    “…Multiple ML models (random forest, RF; support vector machine, SVM; logistic regression, LR; multilayer perceptron, MLP) were used to discriminate microscopic AIS and non-AIS. …”
    Get full text
    Article
  13. 1073

    Recognition of pivotal immune genes NR1H4 and IL4R as diagnostic biomarkers in distinguishing ovarian clear cell cancer from high-grade serous cancer by Yumin Ke, Meili Liang, Zhimei Zhou, Yajing Xie, Li Huang, Liying Sheng, Yueli Wang, Xinyan Zhou, Zhuna Wu

    Published 2025-06-01
    “…Least Absolute Shrinkage and Selection Operator (LASSO) regression model and Multiple Support Vector Machine Recursive Feature Elimination (mSVM-RFE) methods were applied to identify predictive genes. …”
    Get full text
    Article
  14. 1074

    Development and validation of a radiomics-based nomogram for predicting pathological grade of upper urinary tract urothelial carcinoma by Yanghuang Zheng, Hongjin Shi, Shi Fu, Haifeng Wang, Xin Li, Zhi Li, Bing Hai, Jinsong Zhang

    Published 2024-12-01
    “…The maximum relevance minimum redundancy algorithm, least absolute shrinkage and selection operator, and various machine learning (ML) algorithms—including random forest, support vector machine, and eXtreme gradient boosting—were employed to select radiomics features and calculate radiomics scores. …”
    Get full text
    Article
  15. 1075

    Machine Learning-Based Methodologies for Cyber-Attacks and Network Traffic Monitoring: A Review and Insights by Filippo Genuario, Giuseppe Santoro, Michele Giliberti, Stefania Bello, Elvira Zazzera, Donato Impedovo

    Published 2024-11-01
    “…The proposed work compares both shallow learning algorithms, such as decision trees, random forests, Naïve Bayes, logistic regression, XGBoost, and support vector machines, and deep learning algorithms, such as DNNs, CNNs, and LSTM, whose approach is relatively new in the literature. …”
    Get full text
    Article
  16. 1076

    Comparing the Effectiveness of Artificial Intelligence Models in Predicting Ovarian Cancer Survival: A Systematic Review by Farkhondeh Asadi, Milad Rahimi, Nahid Ramezanghorbani, Sohrab Almasi

    Published 2025-03-01
    “…Notably, most publications emerged after 2021. Commonly used algorithms for survival prediction included random forest, support vector machines, logistic regression, XGBoost, and various deep learning models. …”
    Get full text
    Article
  17. 1077

    A Novel Ensemble Classifier Selection Method for Software Defect Prediction by Xin Dong, Jie Wang, Yan Liang

    Published 2025-01-01
    “…The experimental results demonstrate that the DFD ensemble learning-based software defect prediction model outperforms the ten other models, including five common machine learning (ML) classification algorithms (logistic regression (LR), naïve Bayes (NB), K-nearest neighbor (KNN), decision tree (DT), and support vector machine (SVM)), two deep learning (DL) algorithms (multi-layer perceptron (MLP) and convolutional neural network (CNN)), and three ensemble learning algorithms (random forest (RF), extreme gradient boosting (XGB), and stacking). …”
    Get full text
    Article
  18. 1078

    Examining Car Accident Prediction Techniques and Road Traffic Congestion: A Comparative Analysis of Road Safety and Prevention of World Challenges in Low-Income and High-Income Cou... by Yetay Berhanu, Esayas Alemayehu, Dietrich Schröder

    Published 2023-01-01
    “…The study evaluates various approaches such as logistic regression, decision tree, random forest, deep neural network, support vector machine, K-nearest neighbors, Naïve Bayes, empirical Bayes, geospatial analysis methods, and UIMA, NSGA-II, and MOPS algorithms. …”
    Get full text
    Article
  19. 1079

    Application of Machine Learning in the Prediction of the Acute Aortic Dissection Risk Complicated by Mesenteric Malperfusion Based on Initial Laboratory Results by Zhechuan Jin, Jiale Dong, Jian Yang, Chengxiang Li, Zequan Li, Zhaofei Ye, Yuyu Li, Ping Li, Yulin Li, Zhili Ji

    Published 2025-06-01
    “…Key preoperative predictive variables were identified through the least absolute shrinkage and selection operator (LASSO) regression. Subsequently, six machine learning algorithms were used to develop and validate an MMP risk identification model: logistic regression (LR), support vector classification (SVC), random forest (RF), extreme gradient boosting (XGBoost), naive Bayes (NB), and multilayer perceptron (MLP). …”
    Get full text
    Article
  20. 1080

    Crowd Evacuation in Stadiums Using Fire Alarm Prediction by Afnan A. Alazbah, Osama Rabie, Abdullah Al-Barakati

    Published 2025-04-01
    “…A comparative analysis of six machine learning models—Logistic Regression, Support Vector Machines (SVM), Random Forest, and proposed EvacuNet—demonstrates that EvacuNet outperforms all other models, achieving an accuracy of 99.99%, precision of 1.00, recall of 1.00, and an AUC-ROC score close to 1.00. …”
    Get full text
    Article