-
- News
- Deadline
- Interspeech 2022: thank you all for your hard work!
- ICML 22: reviews out (April 7, evening)
- [AI and Intellectual Property White Paper](https://www.kipo.go.kr/ko/kpoBultnDetail.do?ntatcSeq=16558&aprchId=BUT0000048&searchC…
-
Hi everyone,
I am working on solving a peg-in-hole problem. Initially I started with an RL approach, but it seems like it's not the right approach for my problem.
**Task description**
**Setup**: Rob…
-
Hey,
I want to use a multi-armed bandit (MAB) in a learning-to-rank (LTR) problem. Can you help me figure out which algorithm to use, and a little about how to use it?
(I'm a beginner in MAB.)
I'm thinking of using the functionality fo…
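Since the question is truncated, here is just a generic starting point: UCB1 is the textbook first MAB algorithm. The sketch below runs it against a toy click simulator; the arm count and click rates are invented for the example, not taken from the question.

```python
import math
import random

def ucb1(n_arms, pull, n_rounds=1000):
    """UCB1: play each arm once, then pick the arm maximizing
    empirical mean + sqrt(2 ln t / pulls)."""
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for t in range(1, n_rounds + 1):
        if t <= n_arms:
            arm = t - 1  # initialization: try every arm once
        else:
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t) / counts[a]),
            )
        counts[arm] += 1
        sums[arm] += pull(arm)
    return counts, sums

# Toy simulator: reward = 1 if the user clicks the item this arm ranks
# first; the click rates are made up for illustration.
random.seed(0)
click_rate = [0.2, 0.5, 0.8]
counts, sums = ucb1(3, lambda a: 1.0 if random.random() < click_rate[a] else 0.0)
```

After a thousand rounds the best arm should dominate the pull counts; for LTR you would typically run one such bandit per ranking slot, or move to a contextual variant.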
-
Hello,
First of all, I would like to say thank you for your work on that package, which considerably simplified the work on contextual bandits!
I was looking at your package to use in my pet project…
-
Hello @david-cortes, thanks for making contextualbandits!
I'm currently exploring solutions to recommend products based on user preferences and it looks like contextual bandits is an interesting appr…
-
Take a look at [https://github.com/david-cortes/contextualbandits](https://github.com/david-cortes/contextualbandits). Omar is already taking care of installing the package and requirements in the Docker file. Take a look at the example and imple…
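For whoever picks this up: independent of the package's exact API (check its README for the real class names), the core idea can be sketched as disjoint LinUCB, one ridge-regression model per arm. The feature dimensions and reward model below are invented for illustration.

```python
import numpy as np

class LinUCB:
    """Disjoint LinUCB: one ridge-regression model per arm;
    choose the arm with the highest upper confidence bound."""

    def __init__(self, n_arms, dim, alpha=1.0):
        self.alpha = alpha
        self.A = [np.eye(dim) for _ in range(n_arms)]    # X^T X + I per arm
        self.b = [np.zeros(dim) for _ in range(n_arms)]  # X^T r per arm

    def choose(self, x):
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b
            scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x

# Made-up simulation: each "product" (arm) has a hidden preference vector,
# and the click is drawn from a logistic model of the user context.
rng = np.random.default_rng(0)
true_w = rng.normal(size=(3, 5))
bandit = LinUCB(n_arms=3, dim=5)
for _ in range(2000):
    x = rng.normal(size=5)  # user-context features
    a = bandit.choose(x)
    p_click = 1.0 / (1.0 + np.exp(-true_w[a] @ x))
    bandit.update(a, x, float(rng.random() < p_click))
```

The package wraps this kind of loop behind fit/predict-style calls over scikit-learn base classifiers, so in practice you would use its policy classes rather than hand-rolling the updates.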
-
I can't reproduce the example from the command line.
https://github.com/VowpalWabbit/jupyter-notebooks/blob/master/Simulating_a_news_personalization_scenario_using_Contextual_Bandits.ipynb
Previousl…
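In case it helps with debugging: on the command line, VW's plain `--cb` input format is one logged example per line, `action:cost:probability | features` (note the notebook itself uses the ADF variants, whose input format differs). For example:

```
1:2:0.4 | user_time_afternoon article_sports
3:0.5:0.2 | user_time_morning article_politics
```

Training on such a file would look like `vw -d train.dat --cb 4`, where 4 is the number of actions; the feature names here are invented for illustration.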
-
**Important Note: We do not do technical support, nor consulting**, and we don't answer personal questions by email.
Please post your question on the [RL Discord](https://discord.com/invite/xhfNqQv), [R…
-
## Short description
I have a contextual bandit problem with around 10,000 arms. But during exploration, I don't want to explore all the arms. For example, I will explore 1 t…
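One generic way to do this (not specific to any particular package) is to pre-filter a candidate subset of arms each round and run the bandit policy only over that subset. A minimal epsilon-greedy sketch, where the Q-value table and the candidate window are placeholders:

```python
import random

def choose_arm(q_values, candidates, epsilon=0.1, rng=random):
    """Epsilon-greedy restricted to a per-round candidate subset:
    both exploration and exploitation only ever touch `candidates`."""
    candidates = list(candidates)
    if rng.random() < epsilon:
        return rng.choice(candidates)  # explore, but only inside the subset
    return max(candidates, key=lambda a: q_values.get(a, 0.0))

# 10,000 arms in total, but only 100 pre-filtered candidates this round;
# the Q-values and the candidate window are placeholders for the demo.
random.seed(1)
q = {a: a / 10_000 for a in range(10_000)}
candidates = range(400, 500)
arm = choose_arm(q, candidates)
```

The candidate filter itself (top-k by a cheap heuristic, business rules, etc.) is where the per-round "explore only a few arms" constraint lives; the bandit never sees the other arms that round.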
-
Can we still use Importance Weighted Multitask Regression as in this [tutorial](http://hunch.net/~mltf/cb_static.pdf)?
![image](https://user-images.githubusercontent.com/4996067/43047375-15a6533a-8…
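On the reduction itself: the importance-weighted regression in that tutorial fits reward against (context, action) with each logged sample weighted by 1/p(action). A self-contained sketch on synthetic data; the logging policy and the linear reward model are assumptions for the demo, not from the tutorial:

```python
import numpy as np

# Synthetic logged bandit data: contexts X, actions a drawn from a known
# (context-independent) logging policy, noisy linear rewards. All made up.
rng = np.random.default_rng(0)
n, d, k = 5000, 4, 3                   # samples, context dim, actions
X = rng.normal(size=(n, d))
true_w = rng.normal(size=(k, d))
logging_p = np.array([0.6, 0.3, 0.1])  # assumed logging policy
a = rng.choice(k, size=n, p=logging_p)
r = (X * true_w[a]).sum(axis=1) + 0.1 * rng.normal(size=n)

# Importance-weighted multitask regression: one weighted least-squares fit
# of reward on context per action, each sample weighted by 1 / p(action).
# (With an action-only logging policy the weights are constant within each
# arm's fit; they start to matter once the policy depends on the context.)
w = 1.0 / logging_p[a]
theta = np.zeros((k, d))
for j in range(k):
    m = a == j
    sw = np.sqrt(w[m])
    theta[j], *_ = np.linalg.lstsq(X[m] * sw[:, None], r[m] * sw, rcond=None)
```

Here `theta` recovers the per-action reward models from the logged data, which is exactly the regression step the tutorial's reduction needs before arm selection.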