Showing 301 - 320 results of 861 for search 'random binary (tree OR three)', query time: 0.13s
  1. 301

    The Work Engagement Among Nurses in an Urban-Based Tertiary Hospital by Ampan Vimonvattana, Nontawat Benjakul

    Published 2025-07-01
    “…Participants were selected through simple random sampling. They completed an online survey including demographic data and the Utrecht Work Engagement Scale (UWES), which assesses three dimensions of engagement: vigor, dedication, and absorption. …”
    Get full text
    Article
  5. 305

    The association between sugar-sweetened beverage consumption, muscle strength, and psychological symptoms among Chinese adolescents: a multicenter cross-sectional survey by Yanjie Zhou, Chunhua Xue, Gulnur Ahmat, Huijuan Lou, Yun Liu, Li Ma

    Published 2025-07-01
    “…The present study may provide theoretical support and assistance for the prevention and intervention of psychological symptoms in Chinese adolescents. Methods: In this study, 42,832 adolescents aged 12–17 years in mainland China were assessed cross-sectionally for SSB consumption, standing long jump reflecting muscle strength, psychological symptoms, and related covariates using a three-stage stratified whole-cluster random sampling method. …”
    Get full text
    Article
  6. 306

    Multi-Class Decoding of Attended Speaker Direction Using Electroencephalogram and Audio Spatial Spectrum by Yuanming Zhang, Jing Lu, Fei Chen, Haoliang Du, Xia Gao, Zhibin Lin

    Published 2025-01-01
    “…Prior research on directional focus decoding, a.k.a. selective Auditory Attention Decoding (sAAD), has primarily focused on binary “left-right” tasks. However, decoding of the attended speaker’s precise direction is desired. …”
    Get full text
    Article
  9. 309

    Comparison of Machine Learning Models to Predict Nighttime Crash Severity: A Case Study in Tyler, Texas, USA by Raja Daoud, Matthew Vechione, Okan Gurbuz, Prabha Sundaravadivel, Chi Tian

    Published 2025-02-01
    “…Then, seven machine learning techniques, namely binary logistic regression, k-nearest neighbors, naïve Bayes, random forest, artificial neural network, Extreme Gradient Boosting (XGBoost), and a Long Short-Term Memory (LSTM) model, were all applied to the unseen test data. …”
    Get full text
    Article
  10. 310

    Robust methoxy-based covalent organic frameworks membranes enable efficient near-molecular-weight selectivity by Yanqing Xu, Jiaqi Xiong, Chenfei Lin, Yixiang Yu, Qite Qiu, Junbin Liao, Huimin Ruan, Arcadio Sotto, Jiangnan Shen

    Published 2025-01-01
    “…The TFB-OMe-TAPA COFs membrane demonstrated sharp rejection profiles, separating solutes of different molecular sizes. A three-stage cascade process was used to fractionate binary molecules with varying charges, achieving a separation factor of 26.7 for heterogeneous charge molecules. …”
    Get full text
    Article
  11. 311

    The Role of Marketing Mix on Voluntary Tax Compliance: Small Taxpayers’ Experience in Dodoma City by Allen Mrindoko, Eunice Nyange

    Published 2024-06-01
    “…The quantitative data were collected from small business managers/owners who were selected using a systematic random sampling technique, while qualitative data were collected from three TRA officers selected purposively. …”
    Get full text
    Article
  12. 312

    Cultivating a thriving agricultural sector: Unveiling the drivers of farmer participation in agricultural development interventions in Ghana by Magdalene Aidoo, Stephen Prah, Irene Serwaa Asante, Charles Kwame Sackey, Bright Owusu Asante

    Published 2025-01-01
    “…We utilized three models – binary probit, multivariate probit and generalized Poisson to achieve the objectives of this paper. …”
    Get full text
    Article
  13. 313

    URM: A Unified RAM Management Scheme for NAND Flash Storage Devices by Xiaochang Li, Jichen Chen, Zhengjun Zhai, Mingchen Feng, Xin Ye

    Published 2022-01-01
    “…Our multivariate classification is transformed into multiple binary classifications (logistic regressions). Finally, we extensively evaluate URM using various realistic workloads, and the experimental results show that, compared with three data buffer management schemes, CFLRU, BPLRU, and VBBMS, URM can improve the hit ratio of the data buffer and save response time by an average of 32% and 18%, respectively.…”
    Get full text
    Article
  15. 315

    ELECTRONIC BAND STRUCTURE OF THE ORDERED Zn0.5Cd0.5Se ALLOY CALCULATED BY THE SEMI-EMPIRICAL TIGHT-BINDING METHOD CONSIDERING SECOND-NEAREST NEIGHBOR by Juan Carlos Salcedo-Reyes

    Published 2008-09-01
    “…Usually, semiconductor ternary alloys are studied via a pseudo-binary approach in which the semiconductor is described as a crystalline array where the cation/anion sub-lattice consists of a random distribution of the cationic/anionic atoms. …”
    Get full text
    Article
  16. 316

    Cheating Detection in Online Exams Using Deep Learning and Machine Learning by Bahaddin Erdem, Murat Karabatak

    Published 2025-01-01
    “…For regression and classification, deep neural network (DNN) from deep learning algorithms and support vector machine (SVM), decision trees (DTs), k-nearest neighbor (KNN), random forest (RF), logistic regression (LR), and extreme gradient boosting (XGBoost) algorithms from machine learning algorithms were used. …”
    Get full text
    Article
  17. 317

    Supervised machine learning applied in nursing notes for identifying the need of childhood cancer patients for psychosocial support by Akseli Reunamo, Hans Moen, Sanna Salanterä, Päivi M. Lähteenmäki

    Published 2025-08-01
    “…Patients with the latter label were identified by having an outpatient appointment reservation in a mental health–related care unit at least 1 year after their primary diagnosis. Results: The random forest classification model trained on both cancer and diabetes patients performed best for the cancer patient population in three-times repeated nested cross-validation with 0.798 mean area under the receiver operating characteristics curve and was better with 99% probability (credibility interval −0.2840 to −0.0422) than the neural network–based model using only cancer patients in training when comparing all classifiers pairwise by using the Bayesian correlated t-test. Conclusions: Using machine learning to predict childhood cancer patients needing psychosocial support was possible using nursing notes with a good area under the receiver operating characteristics curve. …”
    Get full text
    Article
  18. 318

    Personalized diagnosis of radiation pneumonitis in breast cancer patients based on radiomics by Xiaobo Wen, Yutao Zhao, Wen Dong, Congbo Yang, Jinzhi Li, Li Sun, Yutao Xiu, Chang’e Gao, Ming Zhang

    Published 2025-07-01
    “…Eight classifiers [logistic regression (LR), support vector machine (SVM), K-nearest neighbor (KNN), random forest (RF), Extra Tree (ET), XGBoost, LightGBM, and multilayer perceptron (MLP)] were trained and evaluated using accuracy, area under the receiver operating characteristic curve (AUC) with 95% confidence intervals, sensitivity, and specificity. Results: In the independent test cohort, LR achieved the highest performance [accuracy 0.897, AUC 0.929 (95% CI, 0.838–1.000), sensitivity 0.786, and specificity 1.000]. …”
    Get full text
    Article
  19. 319

    Deterministic and Stochastic Machine Learning Classification Models: A Comparative Study Applied to Companies’ Capital Structures by Joseph F. Hair, Luiz Paulo Fávero, Wilson Tarantin Junior, Alexandre Duarte

    Published 2025-01-01
    “…The results indicate that decision trees, random forest, and XGBoost excelled in the training phase but showed higher overfitting when evaluated in the test sample. …”
    Get full text
    Article
  20. 320

    Proposal for Using AI to Assess Clinical Data Integrity and Generate Metadata: Algorithm Development and Validation by Caroline Bönisch, Christian Schmidt, Dorothea Kesztyüs, Hans A Kestler, Tibor Kesztyüs

    Published 2025-06-01
    “…Logistic regression, k-nearest neighbors, a naive Bayes classifier, a decision tree classifier, a random forest classifier, extreme gradient boosting (XGB), and support vector machines (SVM) were selected as machine learning algorithms. …”
    Get full text
    Article