shanive / bgu-ailab-bandits

Bandits experiment

Multi-armed-bandits related experiments and algorithms (AI lab, Ben Gurion University of the Negev).

README             --- this file
bandit-experiment/ --- simple bandit experiment on pure exploration (see the sketch below)
SOS/               --- comparison of MCTS algorithms on the Sum-of-Switches game
results-plot/      --- a small matplotlib-based module and a command-line utility to plot experiment results
papers/            --- papers by others worth reading
fuego-1.1/         --- a framework in C++ for experimenting with MCTS and the Go game
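For orientation only, here is a minimal sketch of what a pure-exploration bandit experiment looks like: sample Bernoulli arms under a fixed budget, then recommend the arm with the highest empirical mean and measure how often the recommendation is wrong. The arm means, budget, and round-robin sampling policy below are illustrative assumptions, not the code in bandit-experiment/.

    import random

    def pure_exploration_trial(means, budget, rng):
        """One run: pull arms round-robin for `budget` steps, then
        recommend the arm with the highest empirical mean."""
        counts = [0] * len(means)
        sums = [0.0] * len(means)
        for t in range(budget):
            arm = t % len(means)                                 # uniform (round-robin) sampling
            reward = 1.0 if rng.random() < means[arm] else 0.0   # Bernoulli reward
            counts[arm] += 1
            sums[arm] += reward
        empirical = [s / c if c else 0.0 for s, c in zip(sums, counts)]
        return max(range(len(means)), key=lambda a: empirical[a])

    if __name__ == "__main__":
        rng = random.Random(0)
        means = [0.4, 0.5, 0.6]          # hypothetical arm means
        best = max(range(len(means)), key=lambda a: means[a])
        trials = 1000
        errors = sum(pure_exploration_trial(means, budget=90, rng=rng) != best
                     for _ in range(trials))
        print("error probability ~ %.3f" % (errors / trials))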

More details at: https://github.com/shanive/bgu-ailab-bandits