Showing results 661 - 680 of 1,673 for the search 'forest errors' (query time: 0.10 s)
  1. 661

    Ensemble Machine Learning Model Prediction and Metaheuristic Optimisation of Oil Spills Using Organic Absorbents: Supporting Sustainable Maritime by Le Quang Dung, Pham Duc, Bui Thi Anh Em, Nguyen Lan Huong, Nguyen Phuoc Quy Phong, Dang Thanh Nam

    Published 2025-06-01
    “…To close this gap, our work combines metaheuristic algorithms with ensemble machine learning and suggests a hybrid technique for the precise prediction and improvement of oil removal efficiency. Using Random Forest (RF) and XGBoost models, high R² values (RF: 0.9517–0.9559; XGBoost: 0.9760), minimal errors, and strong generalisation were obtained by predictive modelling. …”
    Get full text
    Article
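
A minimal sketch of the predictive-modelling step this entry describes: fitting Random Forest and XGBoost regressors and reporting R² on held-out data. The synthetic dataset and hyperparameters are illustrative assumptions, not the paper's oil-removal data or settings.

```python
# Fit RF and XGBoost regressors on synthetic data and report R² (illustrative only).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

X, y = make_regression(n_samples=500, n_features=6, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "RandomForest": RandomForestRegressor(n_estimators=300, random_state=0),
    "XGBoost": XGBRegressor(n_estimators=300, learning_rate=0.05, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name} R² on held-out data: {r2_score(y_te, model.predict(X_te)):.4f}")
```
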
  2. 662

    Parametric Forecast of Solar Energy over Time by Applying Machine Learning Techniques: Systematic Review by Fernando Venâncio Mucomole, Carlos Augusto Santos Silva, Lourenço Lázaro Magaia

    Published 2025-03-01
    “…The results revealed strong trends towards the adoption of artificial neural network (ANN), random forest (RF), and simple linear regression (SLR) models for a sample taken from the Nipepe station in Niassa, validated by a PF model with errors of 0.10, 0.11, and 0.15. …”
    Get full text
    Article
  3. 663

    A traceability model for upper corner gas in fully mechanized mining faces based on XGBoost-SHAP by SHENG Wu, WANG Lingzi

    Published 2025-06-01
    “…Case analysis results showed that: ① the coefficient of determination (R²), mean absolute error (MAE), and root mean square error (RMSE) of the XGBoost model were 0.93, 0.007, and 0.008, respectively, indicating the highest goodness of fit and the lowest errors compared with random forest (RF), support vector regression (SVR), and gradient boosting decision tree (GBDT). ② The mean relative error of the XGBoost model was 4.478%, demonstrating higher accuracy and better generalization performance compared with the other models. ③ Based on the mean absolute SHAP values of input features, the gas concentration at T1 on the working face had the greatest influence on the gas concentration in the upper corner, followed by the gas concentration in the upper corner extraction pipeline, with the gas content and roof pressure of the mining coal seam following closely. …”
    Get full text
    Article
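
A minimal sketch of the XGBoost-SHAP workflow this entry describes: fit a regressor, report R², MAE, and RMSE, then rank features by mean absolute SHAP value. The data, feature indices, and hyperparameters are illustrative assumptions standing in for the gas-concentration inputs.

```python
# XGBoost regression with SHAP-based global feature ranking (illustrative only).
import numpy as np
import shap
from sklearn.datasets import make_regression
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

X, y = make_regression(n_samples=400, n_features=5, noise=0.1, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

model = XGBRegressor(n_estimators=400, learning_rate=0.05, random_state=1)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R²  :", r2_score(y_te, pred))
print("MAE :", mean_absolute_error(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)

# Mean absolute SHAP value per feature gives a global importance ranking.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
importance = np.abs(shap_values).mean(axis=0)
for idx in np.argsort(importance)[::-1]:
    print(f"feature_{idx}: mean |SHAP| = {importance[idx]:.4f}")
```
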
  4. 664
  5. 665
  6. 666

    A machine learning-based method for predicting the shear behaviors of rock joints by Liu He, Yu Tan, Timothy Copeland, Jiannan Chen, Qiang Tang

    Published 2024-12-01
    “…In this study, machine learning prediction models (MLPMs), including artificial neural network (ANN), support vector regression (SVR), K-nearest neighbors (KNN), and random forest (RF) algorithms, were developed to predict the peak shear stress values and shear stress-displacement curves of rock joints. …”
    Get full text
    Article
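
A minimal sketch of the model comparison this entry describes: the four families named (ANN, SVR, KNN, RF) evaluated with cross-validated R² on a synthetic regression task. The data and settings are illustrative assumptions standing in for the rock-joint shear measurements.

```python
# Cross-validated comparison of MLP, SVR, KNN, and RF regressors (illustrative only).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X, y = make_regression(n_samples=300, n_features=4, noise=10.0, random_state=2)

models = {
    "ANN (MLP)": make_pipeline(StandardScaler(),
                               MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=2)),
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5)),
    "RF": RandomForestRegressor(n_estimators=300, random_state=2),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R² = {scores.mean():.3f}")
```
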
  7. 667

    Predictive Machine Learning Approaches for Supply and Manufacturing Processes Planning in Mass-Customization Products by Shereen Alfayoumi, Amal Elgammal, Neamat El-Tazi

    Published 2025-02-01
    “…This experimentation included K-Nearest Neighbors with regression and Random Forest from the machine learning family, as well as Neural Networks and Ensembles as deep learning approaches. …”
    Get full text
    Article
  8. 668

    Study on the temperature prediction model of residual coal in goaf based on ACO-KELM by ZHAI Xiaowei, WANG Chen, HAO Le, LI Xintian, HOU Qinyuan, MA Teng

    Published 2024-12-01
    “…Compared to the prediction models based on extreme learning machine (ELM) and random forest (RF) algorithms, the ACO-KELM model achieved an average absolute error of 0.0701 ℃ and a root mean square error (RMSE) of 0.0748 ℃ on the test set, reducing these errors by 65% and 195%, respectively, compared to the ELM-based model, and by 53% and 156%, respectively, compared to the RF-based model. …”
    Get full text
    Article
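
A minimal sketch in the spirit of the comparison above: a kernel extreme learning machine (KELM) regressor with an RBF kernel, evaluated against a random forest on MAE and RMSE. The ACO hyperparameter search is omitted; the fixed gamma and regularisation constant, and the synthetic data standing in for the goaf temperature set, are assumptions.

```python
# Closed-form KELM (beta = (K + I/C)^-1 y) vs. random forest on MAE/RMSE (illustrative only).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=400, n_features=5, noise=1.0, random_state=3)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=3)

def kelm_fit_predict(X_tr, y_tr, X_te, gamma=0.1, C=100.0):
    """Kernel ELM: solve (K + I/C) beta = y, then predict with K(test, train) @ beta."""
    K = rbf_kernel(X_tr, X_tr, gamma=gamma)
    beta = np.linalg.solve(K + np.eye(len(X_tr)) / C, y_tr)
    return rbf_kernel(X_te, X_tr, gamma=gamma) @ beta

predictions = {
    "KELM": kelm_fit_predict(X_tr, y_tr, X_te),
    "RF": RandomForestRegressor(n_estimators=300, random_state=3).fit(X_tr, y_tr).predict(X_te),
}
for name, pred in predictions.items():
    mae = mean_absolute_error(y_te, pred)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: MAE = {mae:.4f}, RMSE = {rmse:.4f}")
```
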
  9. 669

    Global Ionospheric TEC Forecasting for Geomagnetic Storm Time Using a Deep Learning‐Based Multi‐Model Ensemble Method by Xiaodong Ren, Pengxin Yang, Dengkui Mei, Hang Liu, Guozhen Xu, Yue Dong

    Published 2023-03-01
    “…In this study, we developed a new deep learning‐based multi‐model ensemble method (DLMEM) to forecast geomagnetic storm‐time ionospheric TEC that combines the Random Forest (RF) model, the Extreme Gradient Boosting (XGBoost) algorithm, and the Gated Recurrent Unit (GRU) network with the attention mechanism. …”
    Get full text
    Article
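
A minimal sketch of a multi-model ensemble in the spirit of the entry above: RF and XGBoost base regressors whose predictions are averaged. The GRU-with-attention component of the published DLMEM is not reproduced; the synthetic data and the equal weights are assumptions.

```python
# Equal-weight ensemble of RF and XGBoost regressors (illustrative only).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

X, y = make_regression(n_samples=600, n_features=8, noise=5.0, random_state=4)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=4)

rf = RandomForestRegressor(n_estimators=300, random_state=4).fit(X_tr, y_tr)
xgb = XGBRegressor(n_estimators=300, learning_rate=0.05, random_state=4).fit(X_tr, y_tr)

# Average the base-model predictions and compare RMSE against each base model.
ensemble_pred = 0.5 * rf.predict(X_te) + 0.5 * xgb.predict(X_te)
for name, pred in [("RF", rf.predict(X_te)), ("XGBoost", xgb.predict(X_te)), ("Ensemble", ensemble_pred)]:
    print(f"{name}: RMSE = {mean_squared_error(y_te, pred) ** 0.5:.3f}")
```
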
  10. 670

    Ammonia and ethanol detection via an electronic nose utilizing a bionic chamber and a sparrow search algorithm-optimized backpropagation neural network. by Yeping Shi, Yunbo Shi, Haodong Niu, Jinzhou Liu, Pengjiao Sun

    Published 2024-01-01
    “…In tests comparing the performance of the SSA-BPNN, support vector machine (SVM), and random forest (RF) models, the SSA-BPNN achieves a 99.1% classification accuracy, better than the SVM and RF models. …”
    Get full text
    Article
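
A minimal sketch of the classifier comparison described above: a backpropagation-style neural network (MLPClassifier) against SVM and random forest on classification accuracy. The sparrow search algorithm (SSA) tuning step is omitted; the synthetic multi-class data standing in for the gas-sensor measurements is an assumption.

```python
# Accuracy comparison of an MLP ("BPNN"), SVM, and RF classifier (illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=10, n_informative=6, n_classes=3, random_state=5)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=5, stratify=y)

models = {
    "BPNN (MLP)": make_pipeline(StandardScaler(),
                                MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=5)),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "RF": RandomForestClassifier(n_estimators=300, random_state=5),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {accuracy_score(y_te, model.predict(X_te)):.3f}")
```
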
  11. 671

    Fuzzy PD control for a quadrotor with experimental results by Anh T. Nguyen, Nam H. Nguyen, Mien L. Trinh

    Published 2025-06-01
    “…A quadrotor is an unmanned aerial vehicle widely used in traffic construction monitoring, volcano monitoring, forest fire surveillance, power line inspection, missing person search, and disaster relief. …”
    Get full text
    Article
  12. 672
  13. 673

    Multi-Fidelity Machine Learning for Identifying Thermal Insulation Integrity of Liquefied Natural Gas Storage Tanks by Wei Lin, Meitao Zou, Mingrui Zhao, Jiaqi Chang, Xiongyao Xie

    Published 2024-12-01
    “…The results of the data experiments demonstrate that the multi-fidelity framework outperforms models trained solely on low- or high-fidelity data, achieving a coefficient of determination of 0.980 and a root mean square error of 0.078 m. Three machine learning algorithms—Multilayer Perceptron, Random Forest, and Extreme Gradient Boosting—were evaluated to determine the optimal implementation. …”
    Get full text
    Article
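
A minimal sketch of one common multi-fidelity scheme: a base model is trained on plentiful low-fidelity data and a second model learns the residual on scarce high-fidelity data. This is a generic illustration, not necessarily the framework used in the paper above; the synthetic fidelity levels and the choice of random forests are assumptions.

```python
# Multi-fidelity regression via low-fidelity base model + high-fidelity residual correction.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(6)
X_lo = rng.uniform(-3, 3, size=(500, 1))   # abundant low-fidelity inputs (assumption)
X_hi = rng.uniform(-3, 3, size=(40, 1))    # scarce high-fidelity inputs (assumption)
X_te = rng.uniform(-3, 3, size=(200, 1))

def high_fidelity(x):
    # "True" expensive response used for illustration.
    return np.sin(3 * x).ravel() + 0.3 * x.ravel() ** 2

def low_fidelity(x):
    # Cheap, biased approximation of the high-fidelity response.
    return 0.8 * high_fidelity(x) + 0.4 * x.ravel() - 0.5

base = RandomForestRegressor(n_estimators=300, random_state=6).fit(X_lo, low_fidelity(X_lo))
residual = RandomForestRegressor(n_estimators=300, random_state=6).fit(
    X_hi, high_fidelity(X_hi) - base.predict(X_hi))

pred = base.predict(X_te) + residual.predict(X_te)
y_true = high_fidelity(X_te)
print("R²  :", r2_score(y_true, pred))
print("RMSE:", mean_squared_error(y_true, pred) ** 0.5)
```
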
  14. 674
  15. 675

    Trust in the machine: How contextual factors and personality traits shape algorithm aversion and collaboration by Vinícius Ferraz, Leon Houf, Thomas Pitz, Christiane Schwieren, Jörn Sickmann

    Published 2025-03-01
    “…We evaluated the impact of Big Five personality traits, locus of control, generalized trust, and demographics alongside the treatment effects using statistical analyses and machine learning models, including Random Forest Classifiers for delegation behavior and Uplift Random Forests for causal effects. …”
    Get full text
    Article
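
A minimal sketch of a two-model ("T-learner") uplift estimate built from random forests: one classifier per treatment arm, with uplift taken as the difference in predicted outcome probabilities. This is a simple stand-in for the Uplift Random Forests mentioned above, run on fully synthetic data; the simulated treatment effect and variable names are assumptions.

```python
# T-learner uplift estimate with two RandomForestClassifiers (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
n = 2000
X = rng.normal(size=(n, 5))
treated = rng.integers(0, 2, size=n)                 # random treatment assignment
base_prob = 1 / (1 + np.exp(-X[:, 0]))               # baseline outcome propensity
prob = np.clip(base_prob + 0.15 * treated, 0, 1)     # treatment adds a simulated lift of 0.15
y = rng.binomial(1, prob)

model_t = RandomForestClassifier(n_estimators=300, random_state=7).fit(X[treated == 1], y[treated == 1])
model_c = RandomForestClassifier(n_estimators=300, random_state=7).fit(X[treated == 0], y[treated == 0])

# Uplift = P(outcome | treated) - P(outcome | control), per individual.
uplift = model_t.predict_proba(X)[:, 1] - model_c.predict_proba(X)[:, 1]
print("Estimated average uplift:", uplift.mean())    # should land near the simulated 0.15
```
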
  16. 676

    Assessment of active fire detection in Serra da Canastra National Park using MODIS and VIIRS sensors by G. S. Pinto, H. Bernini, C. G. Messias, O. A. S. Silva, P. W. Cunha, P. S. Victorino, F. Morelli

    Published 2024-11-01
    “…The use of geographic data on the occurrence of wildfires and forest fires to monitor fire use in vegetation has become increasingly important for generating information that aids decision-making and policy development regarding climate change and its impacts. …”
    Get full text
    Article
  17. 677

    Remaining Useful Life Estimation through Deep Learning Partial Differential Equation Models: A Framework for Degradation Dynamics Interpretation Using Latent Variables by Sergio Cofre-Martel, Enrique Lopez Droguett, Mohammad Modarres

    Published 2021-01-01
    “…A latent space representation can also be used as a health state estimator through a random forest classifier with up to a 90% performance on new unseen data.…”
    Get full text
    Article
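
A minimal sketch of the idea described above: a low-dimensional latent representation fed to a random forest classifier as a health-state estimator evaluated on unseen data. PCA is used here as a simple stand-in for the paper's deep-learning latent space, and the synthetic two-state data is an assumption.

```python
# Latent features (PCA stand-in) feeding a random forest health-state classifier.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=800, n_features=20, n_informative=8, random_state=8)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=8, stratify=y)

clf = make_pipeline(StandardScaler(), PCA(n_components=4),
                    RandomForestClassifier(n_estimators=300, random_state=8))
clf.fit(X_tr, y_tr)
print("Health-state accuracy on unseen data:", accuracy_score(y_te, clf.predict(X_te)))
```
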
  18. 678

    An LSTM neural network prediction model of ultra-short-term transformer winding hotspot temperature by Kun Yan, Jingfu Gan, Yizhen Sui, Hongzheng Liu, Xincheng Tian, Zehan Lu, Ali Mohammed Ali Abdo

    Published 2025-03-01
    “…In addition, backpropagation neural network and random forest prediction models are built to forecast using the same samples, allowing a comparison with the LSTM model enhanced by improved PCA. …”
    Get full text
    Article
  19. 679

    Robust Hybrid Data-Level Approach for Handling Skewed Fat-Tailed Distributed Datasets and Diverse Features in Financial Credit Risk by Musara Keith R, Ranganai Edmore, Chimedza Charles, Matarise Florence, Munyira Sheunesu

    Published 2025-06-01
    “…This approach was coupled with widely employed ensemble algorithms, namely the random forest (RF) and the extreme gradient boost (XGBoost). …”
    Get full text
    Article
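
A minimal sketch of coupling an imbalance-aware step with the two ensemble learners named above (RF and XGBoost). Class weighting is used here as a simple stand-in for the paper's hybrid data-level approach; the skewed synthetic credit-style data and the ROC AUC metric are assumptions.

```python
# RF with balanced class weights and XGBoost with scale_pos_weight on skewed data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=5000, n_features=12, weights=[0.95, 0.05], random_state=9)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=9, stratify=y)

neg, pos = (y_tr == 0).sum(), (y_tr == 1).sum()
models = {
    "RF (balanced)": RandomForestClassifier(n_estimators=400, class_weight="balanced", random_state=9),
    "XGBoost (weighted)": XGBClassifier(n_estimators=400, scale_pos_weight=neg / pos, random_state=9),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: ROC AUC = {roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]):.3f}")
```
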
  20. 680