Showing 1,181 - 1,200 results of 1,673 for search 'forest errors'
  1. 1181

    Tunnel water inflow prediction using explainable machine learning and augmented partially missing dataset by Shengdong Ju, Guangzhao Ou, Tao Peng, Yanning Wang, Quanlin Song, Peng Guan

    Published 2025-04-01
    “…The results indicate that: (1) The constructed BO-XGBoost model exhibited exceptionally high predictive accuracy on the test set, with a root mean square error (RMSE) of 7.5603, mean absolute error (MAE) of 3.2940, mean absolute percentage error (MAPE) of 4.51%, and coefficient of determination (R2) of 0.9755; (2) Compared to the predictive performance of support vector machine (SVR), decision tree (DT), and random forest (RF) models, the BO-XGBoost model demonstrates the highest R2 values and the smallest prediction error; (3) The input feature importance yielded by SHAP is groundwater level (h) > water-producing characteristics (W) > tunnel burial depth (H) > rock mass quality index (RQD). …”
    Get full text
    Article
  2. 1182

    The relative efficiency of time‐to‐progression and continuous measures of cognition in presymptomatic Alzheimer's disease by Dan Li, Samuel Iddi, Paul S. Aisen, Wesley K. Thompson, Michael C. Donohue, Alzheimer's Disease Neuroimaging Initiative

    Published 2019-01-01
    “…Simulated progression events are algorithmically derived from the continuous assessments using a random forest model fit to the same data. Results: We find that power is approximately doubled with models of repeated continuous outcomes compared with the time‐to‐progression analysis. …”
    Get full text
    Article
  3. 1183

    A Comparative Study of Electric Vehicles Battery State of Charge Estimation Based on Machine Learning and Real Driving Data by Salma Ariche, Zakaria Boulghasoul, Abdelhafid El Ouardi, Abdelhadi Elbacha, Abdelouahed Tajer, Stéphane Espié

    Published 2024-12-01
    “…The neural networks consistently show high predictive precision across different scenarios within the datasets, outperforming other models by achieving the lowest mean squared error (MSE) and the highest R² values.…”
    Get full text
    Article
  4. 1184

    Accurate Time-to-Target Forecasting for Autonomous Mobile Robots by Stefan-Alexandru Precup, Arpad Gellert, Alexandru Matei, Bogdan-Constantin Pirvu, Constantin-Bala Zamfirescu

    Published 2025-01-01
    “…This paper addresses this challenge by evaluating the effectiveness of four time forecasting methods: Linear Regression, Random Forest Regression, Transformer and Bidirectional Long Short-Term Memory (BiLSTM) networks. …”
    Get full text
    Article
  5. 1185

    GRU–Transformer Hybrid Model for GNSS/INS Integration in Orchard Environments by Peng Gao, Jinzhen Fang, Junlin He, Shuang Ma, Guanghua Wen, Zhen Li

    Published 2025-05-01
    “…Forest field tests demonstrate that GRU-T significantly improves positioning accuracy. …”
    Get full text
    Article
  6. 1186

    A student academic performance prediction model based on the interval belief rule base by Wenkai Zhou, Yunsong Li, Jiaxing Li, Tianhao Zhang, Xiping Duan, Ning Ma, Yuhe Wang

    Published 2025-08-01
    “…To overcome these challenges, an SPP model using an interval BRB structure based on the random forest (RF) attribute selection method (IBRB-C) is proposed. …”
    Get full text
    Article
  7. 1187

    Lightweight CNC digital process twin framework: IIoT integration with open62541 OPC UA protocol by Arivazhagan Anbalagan, Waqir Yusuf Zanhar, Shone George, Marcos Kauffman, Tengfei Long

    Published 2025-12-01
    “…This data trained five ML models to predict sensor positions with high accuracies (Random-Forest: R² = 0.9994, KNN: R² = 0.9998). Predictions validated key digital twin functions, including error estimation, synthetic data fidelity, and system integrity. …”
    Get full text
    Article
  8. 1188

    Modelling the Temperature of a Data Centre Cooling System Using Machine Learning Methods by Adam Kula, Daniel Dąbrowski, Marcin Blachnik, Maciej Sajkowski, Albert Smalcerz, Zygmunt Kamiński

    Published 2025-05-01
    “…The proposed solution compares two new neural network architectures, namely Time-Series Dense Encoder (TiDE) and Time-Series Mixer (TSMixer), with classical methods such as Random Forest, XGBoost, and AutoARIMA. The obtained results indicate that the lowest prediction error was achieved by the TiDE model, with an N-RMSE of 0.1270, followed by the XGBoost model with an N-RMSE of 0.1275. …”
    Get full text
    Article
  9. 1189

    Predicting hospital outpatient volume using XGBoost: a machine learning approach by Lingling Zhou, Qin Zhu, Qian Chen, Ping Wang, Hao Huang

    Published 2025-05-01
    “…Model performance was assessed using Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), Mean Absolute Percentage Error (MAPE), and R-squared (R2) metrics. …”
    Get full text
    Article
  10. 1190

    Resource Optimization for Grid-Connected Smart Green Townhouses Using Deep Hybrid Machine Learning by Seyed Morteza Moghimi, Thomas Aaron Gulliver, Ilamparithi Thirumarai Chelvan, Hossen Teimoorinia

    Published 2024-12-01
    “…In particular, the Mean Absolute Percentage Error (MAPE) is below 5%, the Root Mean Square Error (RMSE) and Mean Absolute Error (MAE) are within acceptable levels, and R² is consistently above 0.85. …”
    Get full text
    Article
  11. 1191

    Prediction of Highway Tunnel Pavement Performance Based on Digital Twin and Multiple Time Series Stacking by Gang Yu, Shuang Zhang, Min Hu, Y. Ken Wang

    Published 2020-01-01
    “…The prediction accuracy evaluation shows that the mean absolute error (MAE) is 0.1314, the root mean squared error (RMSE) is 0.0386, the mean absolute percentage error (MAPE) is 5.10%, and the accuracy is 94.90%. …”
    Get full text
    Article
  12. 1192

    Application of Extra-Trees Regression and Tree-Structured Parzen Estimators Optimization Algorithm to Predict Blast-Induced Mean Fragmentation Size in Open-Pit Mines by Madalitso Mame, Shuai Huang, Chuanqi Li, Jian Zhou

    Published 2025-07-01
    “…Among the evaluated models, the TPE-ET model exhibits the best performance with a coefficient of determination (R²), root mean squared error (RMSE), mean absolute error (MAE), and max error of 0.93, 0.04, 0.03, and 0.25 during the testing phase. …”
    Get full text
    Article
  13. 1193

    Spatial interpolation of cropland soil bulk density by increasing soil samples with filled missing values by Aiwen Li, Jinli Cheng, Dan Chen, Wendan Li, Yaruo Mao, Xinyi Chen, Bin Zhao, Wenjiao Shi, Tianxiang Yue, Qiquan Li

    Published 2025-03-01
    “…The RBFNN model, tailored for each sub-watershed, yielded the highest accuracy in filling missing BD, with an increase in coefficient of determination (R²) by 19.54–37.36% and reductions in mean absolute error (MAE), mean relative error (MRE) and root mean square error (RMSE) by 8.91–14.81%, 9.02–16.22% and 7.71–13.61%, respectively. …”
    Get full text
    Article
  14. 1194

    Analysis of the mechanism of physical activity enhancing well-being among college students using artificial neural network by Yuxin Cong, Roxana Dev Omar Dev, Shamsulariffin Bin Samsudin, Kaihao Yu

    Published 2025-07-01
    “…The results show that the proposed LSTM + CNN model has achieved significant improvement on the test set. Its mean absolute error is only 0.072, the mean square error is 0.00596, and the root mean square error is 0.077, which is remarkably superior to traditional machine learning methods such as random forest and support vector regression. …”
    Get full text
    Article
  15. 1195

    Smart Agile Prioritization and Clustering: An AI-Driven Approach for Requirements Prioritization by Aya M. Radwan, Manal A. Abdel-Fattah, Wael Mohamed

    Published 2025-01-01
    “…Various machine learning algorithms are tested, with KNN and Random Forest demonstrating the highest accuracy and lowest Mean Squared Error (MSE), outperforming traditional prioritization techniques. …”
    Get full text
    Article
  16. 1196

    Development and validation of machine learning models for predicting post-cesarean pain and individualized pain management strategies: a multicenter study by Shenjuan Lv, Ning Sun, Chunhui Hao, Junqing Li, Yun Li

    Published 2025-04-01
    “…Method: The study analyzed the efficacy of eight ML models, including XGBoost, Random Forest, and Neural Networks, using data from two distinct hospital cohorts. …”
    Get full text
    Article
  17. 1197

    Machine Learning and Multilayer Perceptron-Based Customized Predictive Models for Individual Processes in Food Factories by Byunghyun Lim, Dongju Kim, Woojin Cho, Jae-Hoi Gu

    Published 2025-06-01
    “…Additionally, it proposes a customized predictive model employing four machine learning algorithms—linear regression, decision tree, random forest, and k-nearest neighbor—as well as two deep learning algorithms: long short-term memory and multi-layer perceptron. …”
    Get full text
    Article
  18. 1198

    A Novel Classification of Uncertain Stream Data using Ant Colony Optimization Based on Radial Basis Function by Tahsin Ali Mohammed Amin, Sabah Robitan Mahmood, Rebar Dara Mohammed, Pshtiwan Jabar Karim

    Published 2022-11-01
    “…Error metrics show that our model significantly outperforms the gold standard and other popular ML methods. …”
    Get full text
    Article
  19. 1199

    Prediction Model of Household Carbon Emission in Old Residential Areas in Drought and Cold Regions Based on Gene Expression Programming by Shiao Chen, Yaohui Gao, Zhaonian Dai, Wen Ren

    Published 2025-07-01
    “…., electricity usage and heating energy consumption) were selected using Pearson correlation analysis and the Random Forest (RF) algorithm. Subsequently, a hybrid prediction model was constructed, with its parameters optimized by minimizing the root mean square error (RMSE) as the fitness function. …”
    Get full text
    Article
  20. 1200

    Mapping soil organic carbon stocks of different land use types in the Southern Moscow region by applying machine learning to legacy data by Yury A. Dvornikov, Lukyan A. Mirniy, Ekaterina S. Mukvich, Kristina V. Ivashchenko

    Published 2024-12-01
    “…At the same time, the spectral reflectance in the near infrared band (B5) of Landsat‑5 TM made the greatest contribution in explaining the differences within individual types (among fallow lands and urbanized areas), and the spectral index NDVI explained the spatial variability of soil organic carbon among forest ecosystems. The root mean square error of cross-validation (RMSEcv = 0.67 kg/m²) was chosen to describe the uncertainty of soil organic carbon stock prediction. …”
    Get full text
    Article