Showing 1 - 7 results of 7 for search '"bandit"'
  4. Nonstationary Stochastic Bandits: UCB Policies and Minimax Regret by Lai Wei, Vaibhav Srivastava

    Published 2024-01-01
    Subjects: “…nonstationary multiarmed bandit…”
    Article
  6. Thompson Sampling for Non-Stationary Bandit Problems by Han Qi, Fei Guo, Li Zhu

    Published 2025-01-01
    Subjects: “…multi-armed bandits…”
    Article