# Multi-armed bandits

I implemented several multi-armed bandit algorithms, such as ε-greedy, annealing ε-greedy, Thompson sampling, and UCB. I also compared the performance of these algorithms and how quickly each one can find the best arm.
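
Below is a minimal sketch of one of these strategies, ε-greedy, run against toy Bernoulli arms. It is illustrative rather than the repository's exact implementation; the class name, parameter values, and the simulated arm rewards are assumptions.

```python
import random

class EpsilonGreedy:
    """Epsilon-greedy bandit: explore a random arm with probability
    epsilon, otherwise exploit the arm with the highest estimated mean."""

    def __init__(self, n_arms, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_arms    # number of pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm

    def select_arm(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))  # explore
        # exploit: arm with the highest estimated value
        return max(range(len(self.values)), key=self.values.__getitem__)

    def update(self, arm, reward):
        self.counts[arm] += 1
        n = self.counts[arm]
        # incremental update of the running mean
        self.values[arm] += (reward - self.values[arm]) / n

# Toy comparison: three Bernoulli arms with hidden success rates.
true_means = [0.2, 0.5, 0.8]
bandit = EpsilonGreedy(n_arms=3, epsilon=0.1)
for _ in range(2000):
    arm = bandit.select_arm()
    reward = 1.0 if random.random() < true_means[arm] else 0.0
    bandit.update(arm, reward)
print("pulls per arm:", bandit.counts)
print("estimated means:", [round(v, 3) for v in bandit.values])
```

After enough pulls, most of the plays should concentrate on the best arm (true mean 0.8), which is the behavior the comparison experiments measure. The other algorithms differ only in how `select_arm` trades off exploration and exploitation: annealing ε-greedy decays ε over time, UCB adds a confidence bonus to each arm's estimate, and Thompson sampling samples from a posterior over each arm's mean.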