Showing 1,701 - 1,720 results of 2,852 for search 'support (vector OR sector) machine algorithm', query time: 0.18s
  1. 1701

    Utilizing machine learning techniques to identify severe sleep disturbances in Chinese adolescents: an analysis of lifestyle, physical activity, and psychological factors by Lirong Zhang, Shaocong Zhao, Wei Yang, Zhongbing Yang, Zhi’an Wu, Hua Zheng, Mingxing Lei

    Published 2024-11-01
    “…Participants in the training set were used to establish the models; logistic regression (LR) and five machine learning algorithms, including the eXtreme Gradient Boosting Machine (XGBM), Naïve Bayes (NB), Support Vector Machine (SVM), Decision Tree (DT), and CatBoosting Machine (CatBM), were used to develop the models. …”
    Get full text
    Article
  2. 1702

    Supervised Machine Learning Models for Predicting SS304H Welding Properties Using TIG, Autogenous TIG, and A-TIG by Subhodwip Saha, Barun Haldar, Hillol Joardar, Santanu Das, Subrata Mondal, Srinivas Tadepalli

    Published 2025-06-01
    “…A total of 80% of the collected dataset was used for training the models, while the remaining 20% was reserved for testing their performance. Six ML algorithms—Artificial Neural Network (ANN), K-Nearest Neighbors (KNN), Support Vector Regression (SVR), Random Forest (RF), Gradient Boosting Regression (GBR), and Extreme Gradient Boosting (XGBoost)—were implemented to assess their predictive accuracy. …”
    Get full text
    Article
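
    The entry above describes an 80/20 train/test split over six regressors, one of which is Support Vector Regression. Below is a minimal sketch of that split-and-fit step, assuming scikit-learn and synthetic placeholder data rather than the paper's welding dataset; the kernel and C value are illustrative assumptions.

```python
# A minimal sketch (not the paper's code): 80/20 train/test split with Support
# Vector Regression; data are synthetic placeholders for the welding dataset.
from sklearn.compose import TransformedTargetRegressor
from sklearn.datasets import make_regression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X, y = make_regression(n_samples=200, n_features=6, noise=5.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# SVR is scale-sensitive, so standardize both the inputs and the target.
svr = TransformedTargetRegressor(
    regressor=make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    transformer=StandardScaler(),
)
svr.fit(X_train, y_train)
print("Test R^2:", r2_score(y_test, svr.predict(X_test)))
```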
  3. 1703

    Interpretable machine learning models for predicting in-hospital mortality in patients with chronic critical illness and heart failure: A multicenter study by Min He, Yongqi Lin, Siyu Ren, Pengzhan Li, Guoqing Liu, Liangbo Hu, Xueshuang Bei, Lingyan Lei, Yue Wang, Qianghong Zhang, Xiaocong Zeng

    Published 2025-06-01
    “…Key predictive variables were identified through recursive feature elimination. A range of ML algorithms, including random forest, K-nearest neighbors, and support vector machine (SVM), were evaluated alongside four other models. …”
    Get full text
    Article
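
    The entry above pairs recursive feature elimination with an SVM among the evaluated models. Below is a minimal sketch of RFE wrapped around a linear-kernel SVM, assuming scikit-learn; the dataset and feature counts are stand-ins, not the study's clinical variables.

```python
# A minimal sketch (not the study's code): recursive feature elimination with a
# linear-kernel SVM; the dataset here is a synthetic stand-in, not clinical data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=30, n_informative=8, random_state=0)

# RFE needs an estimator exposing coef_ or feature_importances_, hence the linear kernel.
selector = RFE(estimator=SVC(kernel="linear"), n_features_to_select=10, step=1)
selector.fit(X, y)
print("Kept features:", selector.support_.sum(), "of", X.shape[1])
```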
  4. 1704
  5. 1705

    Machine learning and response surface methodology forecasting comparison for improved spray dry scrubber performance with brine sludge-derived sorbent by B.J. Chepkonga, L. Koech, R.S. Makomere, H.L. Rutto

    Published 2025-03-01
    “…The effects of key process parameters in spray drying (sorbent particle size, inlet gas phase temperature, and Ca:S ratio) on desulfurization efficiency were investigated using central composite design (CCD). Three machine learning (ML) models, multilayer perceptron (MLP), support vector regressor (SVR), and light gradient boosting machine (LightGBM), were assessed for their output estimation accuracy and compared to the CCD prediction model. …”
    Get full text
    Article
  6. 1706

    Machine learning-based predictive modeling of angina pectoris in an elderly community-dwelling population: Results from the PoCOsteo study. by Shahrokh Mousavi, Zahrasadat Jalalian, Sima Afrashteh, Akram Farhadi, Iraj Nabipour, Bagher Larijani

    Published 2025-01-01
    “…We developed the following models: logistic regression (LR), multilayer perceptron (MLP), support vector machine (SVM), k-nearest neighbors (KNN), linear and quadratic discriminant analysis (LDA, QDA), decision tree (DT), and two ensemble models: random forest (RF) and adaptive boosting (AdaBoost). …”
    Get full text
    Article
  7. 1707

    Establishing a preoperative predictive model for gallbladder adenoma and cholesterol polyps based on machine learning: a multicentre retrospective study by Yubing Wang, Chao Qu, Jiange Zeng, Yumin Jiang, Ruitao Sun, Changlei Li, Jian Li, Chengzhi Xing, Bin Tan, Kui Liu, Qing Liu, Dianpeng Zhao, Jingyu Cao, Weiyu Hu

    Published 2025-01-01
    “…Results Among the 110 combination predictive models, the Support Vector Machine + Random Forest (SVM + RF) model demonstrated the highest AUC values of 0.972 and 0.922 in the training and internal validation sets, respectively, indicating an optimal predictive performance. …”
    Get full text
    Article
  8. 1708

    Color-Sensitive Sensor Array Combined with Machine Learning for Non-Destructive Detection of AFB₁ in Corn Silage by Daqian Wan, Haiqing Tian, Lina Guo, Kai Zhao, Yang Yu, Xinglu Zheng, Haijun Li, Jianying Sun

    Published 2025-07-01
    “…Five machine learning models were constructed: Light Gradient Boosting Machine (LightGBM), XGBoost, Support Vector Regression (SVR), RF, and K-Nearest Neighbor (KNN). …”
    Get full text
    Article
  9. 1709

    Explainable machine learning model and nomogram for predicting the efficacy of Traditional Chinese Medicine in treating Long COVID: a retrospective study by Jisheng Zhang, Yang Chen, Aijun Zhang, Yi Yang, Liqian Ma, Hangqi Meng, Jintao Wu, Kean Zhu, Jiangsong Zhang, Ke Lin, Xianming Lin

    Published 2025-03-01
    “…Data from 1,204 patients served as the training set, while 127 patients formed the testing set. Results: We employed five ML algorithms: Support Vector Machine (SVM), Random Forest (RF), K-Nearest Neighbors (KNN), Extreme Gradient Boosting (XGBoost), and Neural Network (NN). …”
    Get full text
    Article
  10. 1710

    Prediction of obesity levels based on physical activity and eating habits with a machine learning model integrated with explainable artificial intelligence by Yasin Görmez, Fatma Hilal Yagin, Burak Yagin, Yalin Aygun, Hulusi Boke, Georgian Badicu, Matheus Santos De Sousa Fernandes, Abedalrhman Alkhateeb, Mahmood Basil A. Al-Rawi, Mohammadreza Aghaei

    Published 2025-07-01
    “…The inclusion of XAI methodologies facilitates a comprehensive understanding of the risk factors influencing the model predictions and thus increases transparency in the identification of obesity risk factors. Methods: Six ML models were used: Bernoulli Naive Bayes, CatBoost, Decision Tree, Extra Trees Classifier, Histogram-based Gradient Boosting and Support Vector Machine. For each model, hyperparameters were tuned by random search methodology and model effectiveness was evaluated by repeated holdout testing. …”
    Get full text
    Article
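
    The entry above tunes each model's hyperparameters by random search and evaluates effectiveness by repeated holdout testing. Below is a minimal sketch of random-search tuning for an SVM classifier, assuming scikit-learn; the parameter ranges, split ratio, and data are illustrative assumptions, not the study's settings.

```python
# A minimal sketch (not the study's code): random-search hyperparameter tuning
# of an RBF-kernel SVM, followed by evaluation on a held-out portion.
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=16, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Sample C and gamma on a log scale; ranges here are assumptions for illustration.
param_distributions = {"C": loguniform(1e-2, 1e2), "gamma": loguniform(1e-4, 1e0)}
search = RandomizedSearchCV(SVC(kernel="rbf"), param_distributions,
                            n_iter=25, cv=5, random_state=0)
search.fit(X_train, y_train)
print("Best params:", search.best_params_)
print("Held-out accuracy:", search.score(X_test, y_test))
```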
  11. 1711

    Stratified allocation method for water injection based on machine learning: A case study of the Bohai A oil and gas field by Changlong Liu, Pingli Liu, Qiang Wang, Lu Zhang, Zechao Huang, Yuande Xu, Shaojiu Jiang, Le Zhang, Changxiao Cao

    Published 2025-04-01
    “…Second, the training and prediction effects of three machine learning prediction models—support vector machine, BP neural network, and random forest—were compared, and the BP neural network was selected as the machine learning mathematical model for injection allocation optimization. …”
    Get full text
    Article
  12. 1712

    Machine learning-based prognostic prediction for acute ischemic stroke using whole-brain and infarct multi-PLD ASL radiomics by Zhenyu Wang, Chaojun Jiang, Xianxian Zhang, Tianchi Mu, Qingqing Li, Shu Wang, Congsong Dong, Yuan Shen, Zhenyu Dai, Fei Chen

    Published 2025-07-01
    “…Among all models, the comprehensive model based on a support vector machine achieved the highest predictive performance (AUC = 0.904). …”
    Get full text
    Article
  13. 1713

    Using machine learning for mortality prediction and risk stratification in atezolizumab‐treated cancer patients: Integrative analysis of eight clinical trials by Yougen Wu, Wenyu Zhu, Jing Wang, Lvwen Liu, Wei Zhang, Yang Wang, Jindong Shi, Ju Xia, Yuting Gu, Qingqing Qian, Yang Hong

    Published 2023-02-01
    “…The whole cohort was randomly split into development and validation cohorts in a 7:3 ratio. Machine‐learning algorithms (extreme gradient boosting, random forest, logistic regression with lasso regularization, support vector machine, and K‐nearest neighbor) were applied to develop prediction models. …”
    Get full text
    Article
  14. 1714
  15. 1715

    Min3GISG: A Synergistic Feature Selection Framework for Industrial Control System Security with the Integrating Genetic Algorithm and Filter Methods by Saiprasad Potharaju, Swapnali N. Tambe, G. Madhukar Rao, M. V. V. Prasad Kantipudi, Kalyan Devappa Bamane, Mininath Bendre

    Published 2025-05-01
    “…These features were used to train classification models (Naive Bayes (NB), Random Forest (RF), and Support Vector Machine (SVM)) with a 70:30 train-test split and tenfold cross-validation. …”
    Get full text
    Article
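
    The entry above trains NB, RF, and SVM classifiers with a 70:30 train-test split and tenfold cross-validation. Below is a minimal sketch of that evaluation scheme for the SVM alone, assuming scikit-learn and synthetic placeholder data rather than the industrial control system features.

```python
# A minimal sketch (not the paper's code): 70:30 split plus tenfold
# cross-validation of an SVM on the training portion.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=20, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=7)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
cv_scores = cross_val_score(clf, X_train, y_train, cv=10)  # tenfold CV
clf.fit(X_train, y_train)
print("Mean CV accuracy:", cv_scores.mean())
print("Held-out (30%) accuracy:", clf.score(X_test, y_test))
```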
  16. 1716

    Machine learning models for predicting metabolic dysfunction-associated steatotic liver disease prevalence using basic demographic and clinical characteristics by Gangfeng Zhu, Yipeng Song, Zenghong Lu, Qiang Yi, Rui Xu, Yi Xie, Shi Geng, Na Yang, Liangjian Zheng, Xiaofei Feng, Rui Zhu, Xiangcai Wang, Li Huang, Yi Xiang

    Published 2025-03-01
    “…Using eight demographic and clinical characteristics (age, educational level, height, weight, waist and hip circumference, and history of hypertension and diabetes), we built predictive models for MASLD (classified as none or mild: controlled attenuation parameter (CAP) ≤ 269 dB/m; moderate: 269–296 dB/m; severe: CAP > 296 dB/m) employing 10 machine learning algorithms: logistic regression (LR), multilayer perceptron (MLP), extreme gradient boosting (XGBoost), bootstrap aggregating, decision tree, K-nearest neighbours, light gradient boosting machine, naive Bayes, random forest, and support vector machine. …”
    Get full text
    Article
  17. 1717

    Development and validation of a machine learning-based risk prediction model for stroke-associated pneumonia in older adult hemorrhagic stroke by Yi Cao, Haipeng Deng, Shaoyun Liu, Xi Zeng, Yangyang Gou, Weiting Zhang, Yixinyuan Li, Hua Yang, Min Peng

    Published 2025-06-01
    “…Advanced age [OR = 1.064, 95% CI (1.024, 1.104)], smoking [OR = 2.488, 95% CI (1.460, 4.24)], low GCS score [OR = 0.675, 95% CI (0.553, 0.825)], low Braden score [OR = 0.741, 95% CI (0.640, 0.858)], and nasogastric tube [OR = 1.761, 95% CI (1.048, 2.960)] were identified as risk factors for SAP. Among the four machine learning algorithms evaluated [XGBoost, Logistic Regression (LR), Support Vector Machine (SVM), and Naive Bayes], the LR model demonstrated robust and consistent performance in predicting SAP among older adult patients with hemorrhagic stroke across multiple evaluation metrics. …”
    Get full text
    Article
  18. 1718

    Service quality evaluation of integrated health and social care for older Chinese adults in residential settings based on factor analysis and machine learning by Zhihan Liu, Caini Ouyang, Nian Gu, Jiaheng Zhang, Xiaojiao He, Qiuping Feng, Chunguyu Chang

    Published 2024-12-01
    “…Objective: To evaluate the service quality of integrated health and social care institutions for older adults in residential settings in China, addressing a critical gap in the theoretical and empirical understanding of service quality assurance in this rapidly expanding sector. Methods: This study employs three machine learning algorithms—Backpropagation Neural Networks (BPNN), Feedforward Neural Networks (FNN), and Support Vector Machines (SVM)—to train and validate an evaluative item system. …”
    Get full text
    Article
  19. 1719

    A novel method to predict the haemoglobin concentration after kidney transplantation based on machine learning: prediction model establishment and method optimization by Songping He, Xiangxi Li, Fangyu Peng, Jiazhi Liao, Xia Lu, Hui Guo, Xin Tan, Yanyan Chen

    Published 2025-07-01
    “…Finally, five kinds of machine learning methods, random forest, extreme gradient boosting, light gradient boosting machine, linear support vector classifier and support vector machine, were used to establish classification prediction models, and error-correcting output codes were used to optimize each model. …”
    Get full text
    Article
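
    The entry above optimizes each classifier, including a linear support vector classifier, with error-correcting output codes (ECOC). Below is a minimal sketch of ECOC wrapped around a linear SVC, assuming scikit-learn's OutputCodeClassifier and synthetic multi-class data, not the transplantation cohort.

```python
# A minimal sketch (not the study's code): error-correcting output codes around
# a linear support vector classifier for a multi-class problem.
from sklearn.datasets import make_classification
from sklearn.multiclass import OutputCodeClassifier
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=12, n_informative=6,
                           n_classes=4, random_state=0)

# code_size > 1 gives each class a longer binary codeword than one-vs-rest would,
# which is what lets ECOC correct some individual binary-classifier errors.
ecoc = OutputCodeClassifier(LinearSVC(), code_size=2, random_state=0)
ecoc.fit(X, y)
print("Training accuracy:", ecoc.score(X, y))
```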
  20. 1720

    Applying machine learning to predict bowel preparation adequacy in elderly patients for colonoscopy: development and validation of a web-based prediction tool by Jianying Liu, Wei Jiang, Yahong Yu, Jiali Gong, Guie Chen, Yuxing Yang, Chao Wang, Dalong Sun, Xuefeng Lu

    Published 2025-12-01
    “…Clinical data from 471 elderly patients collected between February and December 2023 were utilized for developing and internally validating the model, while 221 patients’ data from March to June 2024 were used for external validation. The Boruta algorithm was applied for feature selection. Models including logistic regression, light gradient boosting machines, support vector machines (SVM), decision trees, random forests, and extreme gradient boosting were evaluated using metrics such as AUC, accuracy, sensitivity, and specificity. …”
    Get full text
    Article
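
    The entry above applies the Boruta algorithm for feature selection before comparing models that include an SVM. Below is a minimal sketch, assuming the third-party BorutaPy package (pip install Boruta) with a compatible NumPy version and synthetic placeholder data; the downstream SVM fit is illustrative only, not the paper's pipeline.

```python
# A minimal sketch (not the paper's code): Boruta feature selection ahead of an
# SVM classifier, assuming the BorutaPy package and synthetic placeholder data.
from boruta import BorutaPy
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=25, n_informative=6, random_state=1)

# Boruta compares real features against shuffled "shadow" copies using a
# tree-based estimator's feature importances.
rf = RandomForestClassifier(n_jobs=-1, max_depth=5, random_state=1)
boruta = BorutaPy(rf, n_estimators="auto", random_state=1)
boruta.fit(X, y)

# Train the downstream SVM on the confirmed features only.
X_selected = X[:, boruta.support_]
svm = SVC(kernel="rbf").fit(X_selected, y)
print("Confirmed features:", int(boruta.support_.sum()))
```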