Showing 41 - 60 results of 100 for search 'Random binary tree'
  41.

    Lightweight Deepfake Detection Based on Multi-Feature Fusion by Siddiqui Muhammad Yasir, Hyun Kim

    Published 2025-02-01
    “…Moreover, the features extracted with a histogram of oriented gradients (HOG), local binary pattern (LBP), and KAZE bands were integrated and evaluated using random forest, extreme gradient boosting, extra trees, and support vector classifier algorithms. …”
    Article
  42.

    Pembentukan Model Pohon Keputusan pada Database Car Evaluation Menggunakan Statistik Chi-Square [Decision Tree Model Construction on the Car Evaluation Database Using Chi-Square Statistics] by Retno Maharesi

    Published 2022-03-01
    “…The resulting decision tree should have a minimal structure like a binary tree. …”
    Article
  46.

    An Empirical Evaluation of Supervised Learning Methods for Network Malware Identification Based on Feature Selection by C. Manzano, C. Meneses, P. Leger, H. Fukuda

    Published 2022-01-01
    “…The empirical results show that random forest obtains an average accuracy of 96% and an AUC-ROC of 0.98 in binary classification. …”
    Article
  47.

    Optimizing Cardiovascular Risk Assessment with a Soft Voting Classifier Ensemble by Ammar Oad, Zulfikar Ahmed Maher, Imtiaz Hussain Koondhar, Karishima Kumari, Hammad Bacha

    Published 2024-12-01
    “…The proposed soft voting classifier employs an ensemble of seven machine learning algorithms for binary classification: Naïve Bayes, K-Nearest Neighbor, SVM Kernel, Decision Tree, Random Forest, Logistic Regression, and Support Vector Classifier. …”
    Article
  48.

    Predicting chronic kidney disease progression using small pathology datasets and explainable machine learning models by Sandeep Reddy, Supriya Roy, Kay Weng Choy, Sourav Sharma, Karen M Dwyer, Chaitanya Manapragada, Zane Miller, Joy Cheon, Bahareh Nakisa

    Published 2024-01-01
    “…Results: Internal validation achieved exceptional predictive accuracy, with the area under the receiver operating characteristic curve (ROC-AUC) reaching 0.94 and 0.98 on the binary task of predicting kidney failure for decision tree and random forest, respectively. …”
    Article
  49.

    Using machine learning for the assessment of ecological status of unmonitored waters in Poland by Andrzej Martyszunis, Małgorzata Loga, Karol Przeździecki

    Published 2024-10-01
    “…The pivotal solution was the implementation of ML techniques that enable processing of seemingly unrelated information concerning pressures in the catchment. Decision Tree, Random Forest, KNN, Support Vector Machine, Multinomial Naive Bayes, and XGBoost models were tested, and the results indicated the most suitable techniques. …”
    Article
  50.

    A feature selection and scoring scheme for dimensionality reduction in a machine learning task by Philemon Uten Emmoh, Christopher Ifeanyi Eke, Timothy Moses

    Published 2025-02-01
    “…The experimental results of the proposed technique on the lung cancer dataset show that logistic regression, decision tree, AdaBoost, gradient boost, and random forest produced predictive accuracies of 0.919, 0.935, 0.919, 0.935, and 0.935, respectively; on the happiness classification dataset, random forest, k-nearest neighbor, decision tree, gradient boost, and CatBoost produced predictive accuracies of 0.758, 0.689, 0.724, 0.655, and 0.689, respectively, outperforming the existing techniques. …”
    Article
  51.

    Tachyon: Enhancing stacked models using Bayesian optimization for intrusion detection using different sampling approaches by T. Anitha Kumari, Sanket Mishra

    Published 2024-09-01
    “…This paper introduces Tachyon, a combination of various statistical and tree-based Artificial Intelligence (AI) techniques, such as Extreme Gradient Boosting (XGBoost), Random Forest (RF), Bidirectional Auto-Regressive Transformers (BART), Logistic Regression (LR), Multivariate Adaptive Regression Splines (MARS), Decision Tree (DT), and a top k stack ensemble to distinguish between normal and malicious attacks in a binary classification setting. …”
    Article
  52.

    Towards precision oncology: a multi-level cancer classification system integrating liquid biopsy and machine learning by Amr Eledkawy, Taher Hamza, Sara El-Metwally

    Published 2025-04-01
    “…A majority vote feature selection process is employed by combining six feature selectors: Information Value, Chi-Square, Random Forest Feature Importance, Extra Tree Feature Importance, Recursive Feature Elimination, and L1 Regularization. …”
    Article
  53.

    Frobenius deep feature fusion architecture to detect diabetic retinopathy by C. Priyadharsini, Y. Asnath Victy Phamila

    Published 2025-03-01
    “…The proposed approach comprises several phases: data collection and pre-processing, feature extraction from VGG16 and DenseNet201, feature selection using Random Forest, feature fusion using the Frobenius norm, and classification using stacked ensembling of the XGBoost classifier and ExtraTreeClassifier with SVC as meta-learner. …”
    Article
  54.

    A Comparison of Machine Learning Algorithms for Predicting Alzheimer’s Disease Using Neuropsychological Data by Zakaria Mokadem, Mohamed Djerioui, Bilal Attallah, Youcef Brik

    Published 2024-12-01
    “…This study investigates the predictive performance of nine supervised machine learning algorithms—Logistic Regression, Decision Tree, Random Forest, K-Nearest Neighbors, Support Vector Machine, Gaussian Naïve Bayes, Multi-Layer Perceptron, eXtreme Gradient Boost, and Gradient Boosting—using neuropsychological assessment data. …”
    Article
  55.

    Comparison of Various Feature Extractors and Classifiers in Wood Defect Detection by Kenan Kiliç, Kazım Kiliç, İbrahim Alper Doğru, Uğur Özcan

    Published 2025-01-01
    “…The findings show that the most effective features in detecting defective wood are extracted by the Local Binary Pattern (LBP) method and the most effective classifier is the Random Forest Algorithm. …”
    Article
  56.

    Fingernail analysis management system using microscopy sensor and blockchain technology by Shih Hsiung Lee, Chu Sing Yang

    Published 2018-03-01
    “…It uses support vector machine and random forest for classification. The performance of each feature extraction algorithm was analyzed for the two classifiers, and a deep neural network algorithm was used for comparison. …”
    Article
  57.

    Detection and Analysis of Malicious Software Using Machine Learning Models by Selman Hızal, Ahmet Öztürk

    Published 2024-08-01
    “…The evaluated algorithms include Random Tree (RT), Random Forest (RF), J-48 (C4.5), Naive Bayes (NB), and XGBoost. …”
    Article
  58.

    Recognition Method of Corn and Rice Crop Growth State Based on Computer Image Processing Technology by Li Tian, Chun Wang, Hailiang Li, Haitian Sun

    Published 2022-01-01
    “…According to the contour information of the images, the block wavelet transform method is used for adaptive feature matching. A binary tree structure is used to divide the growth period of corn and rice crops. …”
    Article
  59.

    Shape Penalized Decision Forests for Imbalanced Data Classification by Rahul Goswami, Aindrila Garai, Payel Sadhukhan, Palash Ghosh, Tanujit Chakraborty

    Published 2025-01-01
    “…While traditional machine learning models and modern deep learning techniques struggle with such imbalances, decision trees and random forests combined with data sampling strategies have shown effectiveness, especially for tabular datasets. …”
    Article
  60.

    Assessment of Machine Learning Algorithms in Short-term Forecasting of PM10 and PM2.5 Concentrations in Selected Polish Agglomerations by Bartosz Czernecki, Michał Marosz, Joanna Jędruszkiewicz

    Published 2021-03-01
    “…We tested four ML models: AIC-based stepwise regression, two tree-based algorithms (random forests and XGBoost), and neural networks. …”
    Article