41. Lightweight Deepfake Detection Based on Multi-Feature Fusion
Published 2025-02-01. “…Moreover, the features extracted with a histogram of oriented gradients (HOG), local binary pattern (LBP), and KAZE bands were integrated to evaluate using random forest, extreme gradient boosting, extra trees, and support vector classifier algorithms. …” (An illustrative feature-fusion sketch follows this entry.)
Article (full text available)
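The fusion pipeline described in entry 41 can be approximated with open-source tools. Below is a minimal sketch, assuming scikit-image for the HOG and LBP descriptors and random placeholder images and labels in place of the paper's deepfake data; the KAZE features and the XGBoost model named in the abstract are omitted, and all hyperparameters are illustrative guesses rather than the authors' settings.

```python
# Minimal sketch: concatenate HOG and uniform-LBP descriptors per image and score the
# fused features with several classifiers. Placeholder data and settings, not the paper's.
import numpy as np
from skimage.feature import hog, local_binary_pattern
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def fused_features(gray_u8):
    """Concatenate a HOG descriptor with a uniform-LBP histogram for one grayscale image."""
    hog_vec = hog(gray_u8, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    lbp = local_binary_pattern(gray_u8, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=np.arange(11), density=True)  # 10 uniform-LBP bins
    return np.concatenate([hog_vec, lbp_hist])

# Placeholder data: random 64x64 grayscale "frames" with random real/fake labels.
rng = np.random.default_rng(0)
frames = (rng.random((60, 64, 64)) * 255).astype(np.uint8)
labels = rng.integers(0, 2, size=60)

X = np.stack([fused_features(f) for f in frames])
for name, clf in [("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
                  ("extra trees", ExtraTreesClassifier(n_estimators=200, random_state=0)),
                  ("SVC", SVC())]:
    print(name, "CV accuracy:", cross_val_score(clf, X, labels, cv=3).mean().round(3))
```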
42. Pembentukan Model Pohon Keputusan pada Database Car Evaluation Menggunakan Statistik Chi-Square [Construction of a Decision Tree Model on the Car Evaluation Database Using the Chi-Square Statistic]
Published 2022-03-01. “…The resulting decision tree should have a minimal structure like a binary tree. …”
Article (full text available)
43. A stacked ensemble approach to detect cyber attacks based on feature selection techniques
Published 2024-01-01.
Article (full text available)
44. Identification of direct and indirect drivers of land use and land cover changes from agriculture to Eucalyptus plantation using the DPSIR framework in Sinan and Mecha Districts of...
Published 2025-03-01. “…We used purposive and simple random sampling to select study areas and households. …”
Article (full text available)
45.
46. An Empirical Evaluation of Supervised Learning Methods for Network Malware Identification Based on Feature Selection
Published 2022-01-01. “…The empirical results show that random forest obtains an average accuracy of 96% and an AUC-ROC of 0.98 in binary classification. …”
Article (full text available)
47. Optimizing Cardiovascular Risk Assessment with a Soft Voting Classifier Ensemble
Published 2024-12-01. “…The proposed soft voting classifier employs an ensemble of seven machine learning algorithms to provide binary classification: Naïve Bayes, K-Nearest Neighbor, SVM Kernel, Decision Tree, Random Forest, Logistic Regression, and Support Vector Classifier. …” (A minimal soft-voting sketch follows this entry.)
Article (full text available)
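A soft-voting ensemble of the kind entry 47 describes can be sketched with scikit-learn's VotingClassifier. This is only an illustration: the dataset (load_breast_cancer), the scaling step, and the hyperparameters are assumptions, and a single RBF-kernel SVC stands in for both the "SVM Kernel" and "Support Vector Classifier" members named in the abstract.

```python
# Sketch of a soft-voting ensemble: average the class probabilities of several classifiers.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)        # stand-in binary dataset, not the study's data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
scaler = StandardScaler().fit(X_tr)               # scale once for KNN, SVC, and logistic regression
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("knn", KNeighborsClassifier()),
        ("svc", SVC(kernel="rbf", probability=True)),   # probability=True is required for soft voting
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    voting="soft",                                # average the predicted class probabilities
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", round(ensemble.score(X_te, y_te), 3))
```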
48. Predicting chronic kidney disease progression using small pathology datasets and explainable machine learning models
Published 2024-01-01. “…Results: Internal validation achieved exceptional predictive accuracy, with the area under the receiver operating characteristic curve (ROC-AUC) reaching 0.94 and 0.98 on the binary task of predicting kidney failure for decision tree and random forest, respectively. …” (A small ROC-AUC comparison sketch follows this entry.)
Article (full text available)
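The decision-tree versus random-forest ROC-AUC comparison reported in entry 48 can be reproduced in form, though not in numbers, with a few lines of scikit-learn. The synthetic dataset and model settings below are placeholders, not the study's pathology data.

```python
# Sketch: compare ROC-AUC of a decision tree and a random forest on a binary task.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=15, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

for name, model in [("decision tree", DecisionTreeClassifier(max_depth=4, random_state=0)),
                    ("random forest", RandomForestClassifier(n_estimators=300, random_state=0))]:
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]      # probability of the positive class
    print(f"{name}: ROC-AUC = {roc_auc_score(y_te, proba):.3f}")
```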
49. Using machine learning for the assessment of ecological status of unmonitored waters in Poland
Published 2024-10-01. “…The pivotal solution was implementation of ML techniques which enable processing of seemingly unrelated information concerning pressures in the catchment. Decision Tree, Random Forest, KNN, Support Vector Machine, Multinomial Naive Bayes, and XGBoost models have been tested and the results indicated the most suitable techniques. …”
Article (full text available)
50. A feature selection and scoring scheme for dimensionality reduction in a machine learning task
Published 2025-02-01. “…The experimental results of the proposed technique on the lung cancer dataset show that logistic regression, decision tree, AdaBoost, gradient boost, and random forest produced a predictive accuracy of 0.919, 0.935, 0.919, 0.935, and 0.935 respectively, and the happiness classification dataset produced a predictive accuracy of 0.758, 0.689, 0.724, 0.655, and 0.689 for random forest, k-nearest neighbor, decision tree, gradient boost, and cat boost respectively, which outperformed the existing techniques. …”
Article (full text available)
51. Tachyon: Enhancing stacked models using Bayesian optimization for intrusion detection using different sampling approaches
Published 2024-09-01. “…This paper introduces Tachyon, a combination of various statistical and tree-based Artificial Intelligence (AI) techniques, such as Extreme Gradient Boosting (XGBoost), Random Forest (RF), Bidirectional Auto-Regressive Transformers (BART), Logistic Regression (LR), Multivariate Adaptive Regression Splines (MARS), Decision Tree (DT), and a top k stack ensemble to distinguish between normal and malicious attacks in a binary classification setting. …” (A minimal stacking sketch follows this entry.)
Article (full text available)
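The core idea in entry 51, stacking several base learners whose out-of-fold predictions feed a meta-learner, can be sketched with scikit-learn's StackingClassifier. The sketch below omits BART, MARS, XGBoost, the sampling strategies, and the Bayesian-optimization/top-k selection step, and uses a synthetic dataset; it is an assumption-laden illustration rather than the Tachyon implementation.

```python
# Minimal stacking sketch: base learners feed out-of-fold probabilities to a logistic meta-learner.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=30, weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("gb", GradientBoostingClassifier(random_state=0)),    # stand-in for XGBoost
        ("rf", RandomForestClassifier(n_estimators=300, random_state=0)),
        ("dt", DecisionTreeClassifier(max_depth=6, random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,                                 # out-of-fold predictions for the meta-learner
    stack_method="predict_proba",
)
stack.fit(X_tr, y_tr)
print("held-out accuracy:", round(stack.score(X_te, y_te), 3))
```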
52. Towards precision oncology: a multi-level cancer classification system integrating liquid biopsy and machine learning
Published 2025-04-01. “…A majority vote feature selection process is employed by combining six feature selectors: Information Value, Chi-Square, Random Forest Feature Importance, Extra Tree Feature Importance, Recursive Feature Elimination, and L1 Regularization. …” (A majority-vote feature-selection sketch follows this entry.)
Article (full text available)
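Majority-vote feature selection of the kind entry 52 describes can be sketched by letting each selector nominate a fixed number of features and keeping those nominated by most selectors. The sketch below assumes scikit-learn selectors, omits the Information Value selector, and uses a synthetic dataset and an arbitrary nomination count and vote threshold.

```python
# Sketch: five selectors each nominate k features; keep features nominated by a majority.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.feature_selection import RFE, SelectFromModel, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=500, n_features=40, n_informative=8, random_state=0)
Xs = MinMaxScaler().fit_transform(X)             # chi2 requires non-negative inputs
k = 10                                           # features each selector may nominate
votes = np.zeros(X.shape[1], dtype=int)

# 1) chi-square scores: top-k features by score
votes[np.argsort(chi2(Xs, y)[0])[-k:]] += 1
# 2) random-forest impurity importance
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xs, y)
votes[np.argsort(rf.feature_importances_)[-k:]] += 1
# 3) extra-trees importance
et = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(Xs, y)
votes[np.argsort(et.feature_importances_)[-k:]] += 1
# 4) recursive feature elimination around a logistic model
rfe = RFE(LogisticRegression(max_iter=2000), n_features_to_select=k).fit(Xs, y)
votes[rfe.support_] += 1
# 5) L1-regularized logistic regression: features with non-zero coefficients vote
l1 = SelectFromModel(LogisticRegression(penalty="l1", solver="liblinear", C=0.5)).fit(Xs, y)
votes[l1.get_support()] += 1

selected = np.where(votes >= 3)[0]               # kept if a majority of the five selectors agree
print("selected feature indices:", selected)
```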
53. Frobenius deep feature fusion architecture to detect diabetic retinopathy
Published 2025-03-01. “…The proposed approach delves into various phases: data collection and data pre-processing, feature extraction from VGG16 and DenseNet201, feature selection using Random Forest, feature fusion using the Frobenius norm, and classification using stacked ensembling of an XGBoost classifier and ExtraTreeClassifier with SVC as meta-learner. …”
Article (full text available)
54. A Comparison of Machine Learning Algorithms for Predicting Alzheimer’s Disease Using Neuropsychological Data
Published 2024-12-01. “…This study investigates the predictive performance of nine supervised machine learning algorithms—Logistic Regression, Decision Tree, Random Forest, K-Nearest Neighbors, Support Vector Machine, Gaussian Naïve Bayes, Multi-Layer Perceptron, eXtreme Gradient Boost, and Gradient Boosting—using neuropsychological assessment data. …”
Article (full text available)
55. Comparison of Various Feature Extractors and Classifiers in Wood Defect Detection
Published 2025-01-01. “…The findings show that the most effective features in detecting defective wood are extracted by the Local Binary Pattern (LBP) method and the most effective classifier is the Random Forest Algorithm. …”
Article (full text available)
56. Fingernail analysis management system using microscopy sensor and blockchain technology
Published 2018-03-01. “…It uses support vector machine and random forest tree for classification. The performance of each feature extraction algorithm was analyzed for the two classifiers and the deep neural network algorithm was used comparatively. …”
Article (full text available)
57. Detection and Analysis of Malicious Software Using Machine Learning Models
Published 2024-08-01. “…The evaluated algorithms include Random Tree (RT), Random Forest (RF), J-48 (C4.5), Naive Bayes (NB), and XGBoost. …”
Article (full text available)
58. Recognition Method of Corn and Rice Crop Growth State Based on Computer Image Processing Technology
Published 2022-01-01. “…According to contour information of images, the block wavelet transform method is used for feature adaptive matching. The binary tree structure is used to divide the growth period of corn and rice crops. …”
Article (full text available)
59. Shape Penalized Decision Forests for Imbalanced Data Classification
Published 2025-01-01. “…While traditional machine learning models and modern deep learning techniques struggle with such imbalances, decision trees and random forests combined with data sampling strategies have shown effectiveness, especially for tabular datasets. …” (A class-weighting and oversampling baseline sketch follows this entry.)
Article (full text available)
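Entry 59 contrasts its shape-penalized forests with the common baselines of cost-sensitive learning and data resampling. The sketch below shows only those baselines, a class-weighted random forest and naive random oversampling on a synthetic imbalanced dataset, and is not the paper's penalized method.

```python
# Two common baselines for imbalanced tabular classification:
# (a) a class-weighted random forest, (b) random oversampling of the minority class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

# (a) class weighting: minority-class errors cost more during tree induction
rf_weighted = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
rf_weighted.fit(X_tr, y_tr)

# (b) random oversampling: duplicate minority samples until the classes are balanced
rng = np.random.default_rng(0)
minority, majority = np.where(y_tr == 1)[0], np.where(y_tr == 0)[0]
extra = rng.choice(minority, size=len(majority) - len(minority), replace=True)
idx = np.concatenate([majority, minority, extra])
rf_oversampled = RandomForestClassifier(n_estimators=300, random_state=0)
rf_oversampled.fit(X_tr[idx], y_tr[idx])

for name, model in [("class-weighted RF", rf_weighted), ("oversampled RF", rf_oversampled)]:
    print(name, "balanced accuracy:", round(balanced_accuracy_score(y_te, model.predict(X_te)), 3))
```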
60. Assessment of Machine Learning Algorithms in Short-term Forecasting of PM10 and PM2.5 Concentrations in Selected Polish Agglomerations
Published 2021-03-01. “…We tested four ML models: AIC-based stepwise regression, two tree-based algorithms (random forests and XGBoost), and neural networks. …” (A lag-feature forecasting sketch follows this entry.)
Article (full text available)
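Short-term forecasting of the kind entry 60 evaluates is typically framed as supervised regression on lagged concentrations. The sketch below does this with a synthetic hourly series and scikit-learn's RandomForestRegressor and GradientBoostingRegressor (the latter standing in for XGBoost); the series, lag count, and settings are assumptions, not the study's setup.

```python
# Sketch: predict the next hourly PM10 value from the previous 24 observations.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
t = np.arange(2000)
pm10 = 30 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, t.size)   # synthetic hourly series

def lagged_dataset(series, n_lags=24):
    """Build (X, y) where each row of X holds the previous n_lags values and y is the next value."""
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

X, y = lagged_dataset(pm10)
split = int(0.8 * len(y))                      # chronological split, no shuffling
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

for name, model in [("random forest", RandomForestRegressor(n_estimators=300, random_state=0)),
                    ("gradient boosting", GradientBoostingRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    print(f"{name}: MAE = {mean_absolute_error(y_te, model.predict(X_te)):.2f}")
```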