Showing 901 - 920 results of 1,276 for search 'support (vector OR sector) regression algorithm', query time: 0.16s
  1. 901

    Comparing Models and Performance Metrics for Lung Cancer Prediction using Machine Learning Approaches. by Ruqiya, Noman Khan, Saira Khan

    Published 2024-12-01
    “…The models included Logistic Regression (LR), Random Forest (RF), Naive Bayes (NB), and Support Vector Classifier (SVC). …”
    Get full text
    Article
  2. 902

    Assessment of Landslide Susceptibility Based on the Two-Layer Stacking Model—A Case Study of Jiacha County, China by Zhihan Wang, Tao Wen, Ningsheng Chen, Ruixuan Tang

    Published 2025-03-01
    “…These landslide conditioning factors were integrated into a total of 4660 stacking ensemble learning models, randomly combined from 10 base algorithms, including AdaBoost, Decision Tree (DT), Gradient Boosting Decision Tree (GBDT), k-Nearest Neighbors (kNN), LightGBM, Multilayer Perceptron (MLP), Random Forest (RF), Ridge Regression, Support Vector Machine (SVM), and XGBoost. …”
    Get full text
    Article
  3. 903

    AI-Driven Optimization of Breakwater Design: Predicting Wave Reflection and Structural Dimensions by Mohammed Loukili, Soufiane El Moumni, Kamila Kotrasova

    Published 2025-01-01
    “…Two datasets of 32,000 data points were generated for underwater and free-surface breakwaters, with an additional 10,000 data points for validation, totaling 42,000 data points per case. Five ML algorithms—Random Forest, Support Vector Regression, Artificial Neural Network, Decision Tree, and Gaussian Process—were applied and evaluated. …”
    Get full text
    Article
  4. 904

    Nitrogen content estimation of apple trees based on simulated satellite remote sensing data by Meixuan Li, Xicun Zhu, Xinyang Yu, Cheng Li, Dongyun Xu, Ling Wang, Dong Lv, Yuyang Ma

    Published 2025-07-01
    “…Support Vector Machine (SVM) and Backpropagation Neural Network (BPNN) algorithms were used to construct and screen the optimal models for apple tree nitrogen content estimation. Results showed that visible light, red edge, near-infrared, and yellow edge bands were sensitive bands for estimating apple tree nitrogen content. …”
    Get full text
    Article
  5. 905

    High-precision prediction of non-resonant high-order harmonics energetic particle modes via stacking ensemble strategies by Sheng Liu, Zhenzhen Ren, Weihua Wang, Kai Zhong, Jinhong Yang, Hongwei Ning

    Published 2025-01-01
    “…This ensemble model has a two-layer structure of base learners and a meta-learner: in the first layer, the base learners include K-nearest neighbor regression, Extreme Gradient Boosting, gradient boosting regression (GBR), decision tree (DT), and support vector regression (SVR); in the second layer, the meta-learner outputs the final result via GBR. …”
    Get full text
    Article
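
A minimal sketch of the two-layer stacking setup described in the record above, assuming the scikit-learn API: synthetic data stands in for the paper's energetic-particle-mode dataset, and scikit-learn's GradientBoostingRegressor is substituted for XGBoost to keep the example dependency-light. This is an illustrative reconstruction, not the authors' code.

```python
# Hedged sketch: two-layer stacking regressor (base learners + GBR meta-learner).
# Synthetic data and model choices are assumptions, not the paper's setup.
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor, GradientBoostingRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=500, n_features=8, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# First layer: base learners (XGBoost replaced by GradientBoostingRegressor here)
base_learners = [
    ("knn", KNeighborsRegressor()),
    ("gbr", GradientBoostingRegressor(random_state=0)),
    ("dt", DecisionTreeRegressor(random_state=0)),
    ("svr", SVR(kernel="rbf")),
]
# Second layer: a GBR meta-learner combines the base learners' predictions
stack = StackingRegressor(estimators=base_learners,
                          final_estimator=GradientBoostingRegressor(random_state=0))
stack.fit(X_train, y_train)
print("held-out R^2:", round(r2_score(y_test, stack.predict(X_test)), 3))
```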
  6. 906

    Building a composition-microstructure-performance model for C–V–Cr–Mo wear-resistant steel via the thermodynamic calculations and machine learning synergy by Shuaiwu Tong, Shuaijun Zhang, Chong Chen, Tao Jiang, Peng Li, Shizhong Wei

    Published 2025-05-01
    “…By using phase content and experimental parameters as input features, the Gradient Boosted Tree model and Support Vector Regression model demonstrated strong applicability in predicting frictional performance and wear, respectively. …”
    Get full text
    Article
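
Several records on this page apply Support Vector Regression to tabular experimental data (here, frictional performance and wear). For reference, below is a minimal, hedged sketch of an SVR fit with feature scaling and a small hyperparameter search, assuming the scikit-learn API; the synthetic data and grid values are illustrative assumptions, not the authors' setup.

```python
# Hedged sketch: SVR with standardization and a small grid search (illustrative only).
from sklearn.datasets import make_regression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=300, n_features=5, noise=0.1, random_state=0)

# SVR is sensitive to feature scales, so standardize before fitting
pipe = Pipeline([("scale", StandardScaler()), ("svr", SVR(kernel="rbf"))])
grid = GridSearchCV(pipe,
                    param_grid={"svr__C": [1, 10, 100],
                                "svr__gamma": ["scale", 0.1],
                                "svr__epsilon": [0.01, 0.1]},
                    cv=5, scoring="r2")
grid.fit(X, y)
print("best params:", grid.best_params_, "cv R^2:", round(grid.best_score_, 3))
```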
  7. 907

    Prognostic correlation analysis of colorectal cancer patients based on monocyte to lymphocyte ratio and folate receptor-positive circulating tumor cells and construction of a machi... by Siying Pan, Chi Lu, Hongda Lu, Hongfeng Zhang

    Published 2025-05-01
    “…Progression-Free Survival (PFS) and Overall Survival (OS) were analyzed using Cox analysis and the Kaplan-Meier survival curve. Three ML algorithms, namely random forest (RF), support vector machine (SVM), and logistic regression (LR), were utilized to construct the predictive models, and their performance metrics, including accuracy, sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), precision, recall, F1 value, AUC, and calibration curve, were compared. Results: MLR, FR+CTCs, and T stage independently predicted PFS (P<0.05), with higher MLR and FR+CTC levels both indicating a significantly shorter PFS (P=0.004). …”
    Get full text
    Article
  8. 908

    Diagnosis of Malignant Endometrial Lesions from Ultrasound Radiomics Features and Clinical Variables Using Machine Learning Methods by Shanshan Li, Jiali Wang, Li Zhou, Hui Wang, Xiangyu Wang, Jian Hu, Qingxiu Ai

    Published 2025-01-01
    “…Six common machine learning algorithms, including Support Vector Machine (SVM), Logistic Regression, Decision Tree, Random Forest, Gradient Boosting Tree, and k-Nearest Neighbors, were employed to identify benign and malignant changes in endometrial tissue. …”
    Get full text
    Article
  9. 909

    Research and analysis of differential gene expression in CD34 hematopoietic stem cells in myelodysplastic syndromes. by Min-Xiao Wang, Chang-Sheng Liao, Xue-Qin Wei, Yu-Qin Xie, Peng-Fei Han, Yan-Hui Yu

    Published 2025-01-01
    “…After comprehensive evaluation, we ultimately selected three algorithms, namely Lasso regression, random forest, and support vector machine (SVM), as our core predictive models. …”
    Get full text
    Article
  10. 910

    Predicting the risk of lean non-alcoholic fatty liver disease based on interpretable machine models in a Chinese T2DM population by Shixue Bao, Qiankai Jin, Tieqiao Wang, Yushan Mao, Guoqing Huang

    Published 2025-07-01
    “…Linear discriminant analysis (LDA), logistic regression (LR), Naive Bayes (NB), random forest (RF), support vector machine (SVM), and extreme gradient boosting (XGBoost) were used to construct risk prediction models for lean NAFLD in T2DM patients. …”
    Get full text
    Article
  11. 911

    CECT-Based Radiomic Nomogram of Different Machine Learning Models for Differentiating Malignant and Benign Solid-Containing Renal Masses by Qian L, Fu B, He H, Liu S, Lu R

    Published 2025-01-01
    “…Four mainstream machine learning models, namely support vector machine (SVM), k-nearest neighbour (kNN), light gradient boosting machine (LightGBM), and logistic regression (LR), were constructed to determine the best classifier. …”
    Get full text
    Article
  12. 912

    Predicting bearing capacity of gently inclined bauxite pillar based on numerical simulation and machine learning by Deyu WANG, Defu ZHU, Biaobiao YU, Chen WANG

    Published 2025-03-01
    “…A coupled FLAC3D-3DEC simulation method, based on rock mass and joint parameters calibrated by the trial-and-error method, was employed to test the bearing characteristics of a gently inclined pillar, to build and monitor a machine learning dataset of gently inclined pillar strength, and to verify its reliability. Support Vector Machine (SVM), Extreme Learning Machine (ELM) and Light Gradient Boosting Machine (LightGBM) were used to construct the model for predicting the strength of gently inclined pillars. …”
    Get full text
    Article
  13. 913

    Machine Learning for Predicting Zearalenone Contamination Levels in Pet Food by Zhenlong Wang, Wei An, Jiaxue Wang, Hui Tao, Xiumin Wang, Bing Han, Jinquan Wang

    Published 2024-12-01
    “…Additionally, the “AIR PEN 3” E-nose, equipped with 10 metal oxide sensors, was employed to identify volatile compounds in the pet food samples, categorized into 10 different groups. Machine learning algorithms, including linear regression, k-nearest neighbors, support vector machines, random forests, XGBoost, and multi-layer perceptron (MLP), were used to classify the samples based on their volatile profiles. …”
    Get full text
    Article
  14. 914

    Interpretable machine learning for depression recognition with spatiotemporal gait features among older adults: a cross-sectional study in Xiamen, China by Shaowu Lin, Sicheng Li, Ya Fang

    Published 2025-07-01
    “…Four machine learning techniques including Logistic Regression, Support Vector Machine, Gradient Boosting Decision Tree, and Random Forest were employed to develop predictive models for depression. …”
    Get full text
    Article
  15. 915

    Stacking data analysis method for Langmuir multi-probe payload by Jin Wang, Duan Zhang, Qinghe Zhang, Xinyao Xie, Fangye Zou, Qingfu Du, V. Manu, Yanjv Sun

    Published 2025-08-01
    “…The integrated characteristics of the stacking model make full use of the advantages of various models such as multilayer perceptron (MLP), support vector regression (SVR), K-nearest neighbors (KNN), and light gradient boosting machine (LightGBM). …”
    Get full text
    Article
  16. 916

    Potential of Multi-Source Multispectral vs. Hyperspectral Remote Sensing for Winter Wheat Nitrogen Monitoring by Xiaokai Chen, Yuxin Miao, Krzysztof Kusnierek, Fenling Li, Chao Wang, Botai Shi, Fei Wu, Qingrui Chang, Kang Yu

    Published 2025-08-01
    “…Three variable selection strategies (one-dimensional (1D) spectral reflectance, optimized two-dimensional (2D), and three-dimensional (3D) spectral indices) were combined with Random Forest Regression (RFR), Support Vector Machine Regression (SVMR), and Partial Least Squares Regression (PLSR) to build PNC prediction models. …”
    Get full text
    Article
  17. 917

    Predicting High-Cost Healthcare Utilization Using Machine Learning: A Multi-Service Risk Stratification Analysis in EU-Based Private Group Health Insurance by Eslam Abdelhakim Seyam

    Published 2025-07-01
    “…The research applied three machine learning algorithms, namely logistic regression with elastic net regularization, random forest, and support vector machines. …”
    Get full text
    Article
  18. 918

    Supervised Machine Learning Models for Predicting SS304H Welding Properties Using TIG, Autogenous TIG, and A-TIG by Subhodwip Saha, Barun Haldar, Hillol Joardar, Santanu Das, Subrata Mondal, Srinivas Tadepalli

    Published 2025-06-01
    “…A total of 80% of the collected dataset was used for training the models, while the remaining 20% was reserved for testing their performance. Six ML algorithms—Artificial Neural Network (ANN), K-Nearest Neighbors (KNN), Support Vector Regression (SVR), Random Forest (RF), Gradient Boosting Regression (GBR), and Extreme Gradient Boosting (XGBoost)—were implemented to assess their predictive accuracy. …”
    Get full text
    Article
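
Record 18 above describes an 80%/20% train/test split and a comparison of several regressors, including SVR. A minimal sketch of that workflow, assuming the scikit-learn API, follows; synthetic data replaces the welding dataset, MLPRegressor stands in for the ANN, and XGBoost is omitted to keep the example dependency-light.

```python
# Hedged sketch: 80/20 split and multi-regressor comparison (illustrative only).
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score

X, y = make_regression(n_samples=200, n_features=6, noise=0.2, random_state=0)
# 80% of the data for training, 20% held out for testing
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "ANN (MLP)": MLPRegressor(max_iter=2000, random_state=0),
    "KNN": KNeighborsRegressor(),
    "SVR": SVR(kernel="rbf"),
    "Random Forest": RandomForestRegressor(random_state=0),
    "Gradient Boosting": GradientBoostingRegressor(random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_train, y_train).predict(X_test)
    print(f"{name}: R2={r2_score(y_test, pred):.3f}  MSE={mean_squared_error(y_test, pred):.3f}")
```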
  19. 919

    Development and validation of a hypoxemia prediction model in middle-aged and elderly outpatients undergoing painless gastroscopy by Leilei Zheng, Xinyan Wu, Wei Gu, Rui Wang, Jing Wang, Hongying He, Zhao Wang, Bin Yi, Yi Zhang

    Published 2025-05-01
    “…Five machine learning algorithm models, including logistic regression (LR), support vector machine (SVM), random forest (RF), extreme gradient boosting (XGB), and light gradient boosting machine (LightGBM), were selected. …”
    Get full text
    Article
  20. 920

    Predicting the risk of postoperative gastrointestinal bleeding in patients with Type A aortic dissection based on an interpretable machine learning model by Lin Li, Xing Yang, Wei Guo, Wenxian Wu, Meixia Guo, Huanhuan Li, Xueyan Wang, Siyu Che

    Published 2025-05-01
    “…Predictors were screened using LASSO regression, and four ML algorithms—Random Forest (RF), K-nearest neighbor (KNN), Support Vector Machines (SVM), and Decision Tree (DT)—were employed to construct models for predicting postoperative GIB risk. …”
    Get full text
    Article
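
Record 20 above screens predictors with LASSO regression before fitting several classifiers. A minimal sketch of that two-step pattern, assuming the scikit-learn API, is given below; synthetic data replaces the clinical dataset, plain LassoCV is used for screening (the study may instead have used a LASSO-penalized logistic model), and only RF and SVM are shown.

```python
# Hedged sketch: LASSO-based predictor screening, then classifier comparison.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LassoCV
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=400, n_features=30, n_informative=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Step 1: keep predictors with non-zero LASSO coefficients
lasso = LassoCV(cv=5, random_state=0).fit(X_train, y_train)
keep = np.flatnonzero(lasso.coef_)
print(f"LASSO retained {keep.size} of {X.shape[1]} predictors")

# Step 2: fit candidate classifiers on the screened predictors and compare AUC
for name, clf in [("RF", RandomForestClassifier(random_state=0)),
                  ("SVM", SVC(probability=True, random_state=0))]:
    clf.fit(X_train[:, keep], y_train)
    auc = roc_auc_score(y_test, clf.predict_proba(X_test[:, keep])[:, 1])
    print(f"{name}: AUC={auc:.3f}")
```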