ngruver / llmtime

https://arxiv.org/abs/2310.07820
MIT License
628 stars 139 forks

Basic usage without an LLM key? #24

Closed gsamaras closed 5 months ago

gsamaras commented 5 months ago

I was wondering if I could quickly use your model without an LLM key (e.g., an OpenAI key)?

ngruver commented 5 months ago

You can try running a LLaMA or Mistral model locally, as shown in demo.py.

gsamaras commented 5 months ago

Yes exactly @ngruver, that is what I was trying in a Kaggle notebook, but it was asking for OpenAI and Mistral keys. Can you provide an example?

I am only interested in feeding the air passengers dataset to LLMtime and getting predictions, just as a basic demo.
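For context, the core of LLMTime is serializing a numeric series as text before prompting the model. A simplified sketch of that encoding (per the paper: fixed precision, digits separated by spaces, timesteps by commas; the repo's actual serializer also handles scaling and signs, which this omits):

```python
def serialize(values, precision=0):
    """Encode a numeric series LLMTime-style: digits space-separated,
    timesteps comma-separated, at a fixed decimal precision."""
    def enc(v):
        digits = f"{v:.{precision}f}".replace(".", "").replace("-", "")
        return " ".join(digits)
    return " , ".join(enc(v) for v in values)

# First few monthly totals of the classic air passengers series:
air = [112, 118, 132, 129, 121]
prompt = serialize(air)
print(prompt)  # -> "1 1 2 , 1 1 8 , 1 3 2 , 1 2 9 , 1 2 1"
```

The space-separated digits matter for GPT-style tokenizers, which otherwise merge multi-digit numbers into unpredictable tokens.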

zaizou commented 5 months ago

@gsamaras You just need to comment out the lines here: [code snippet]

Or you can create a dummy OPENAI_API_KEY env variable.
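The env-variable workaround just has to run before any module reads the key at import time. A minimal sketch (the placeholder value only bypasses the startup check; it will not authorize actual OpenAI API calls):

```python
import os

# Set a placeholder key BEFORE importing llmtime modules that look it up.
# This satisfies the "key must exist" check without enabling API access.
os.environ["OPENAI_API_KEY"] = "dummy"

print(os.environ["OPENAI_API_KEY"])  # -> "dummy"
```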

gsamaras commented 5 months ago

@zaizou thank you for your suggestions. Unfortunately they don't work, since it asks for an openai key, as you can see on this Kaggle notebook, in which you are welcome to play around.

zaizou commented 5 months ago

Just comment out the models that you don't want to use in the model_predict_fns variable: [code snippet]

gsamaras commented 5 months ago

@zaizou apologies, I had done that but didn't mention it. When I did this, it would ask for a Mistral key. If I provide a dummy Mistral key, it says unauthorized. If I create a real Mistral key, it seems to be a pay-as-you-go service. If I add the Mistral model via Kaggle, it won't make a difference, because LLMtime still tries to connect to the Mistral API client.

So what I did was use mistral instead of mistral-api-medium and load the model locally. Now the notebook runs out of memory, but I'll see what I can do about it. Thanks!
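If the out-of-memory error happens while loading the weights, half precision and automatic device placement usually help on Kaggle-sized hardware. A minimal sketch of the load options, assuming a Hugging Face transformers checkpoint (the checkpoint id and exact savings are assumptions, not from the llmtime repo):

```python
# Standard transformers keyword arguments that reduce load-time memory:
load_kwargs = {
    "torch_dtype": "float16",  # half precision: roughly halves weight memory vs. float32
    "device_map": "auto",      # lets accelerate place/offload layers across GPU/CPU
}

# Hypothetical usage (commented out to avoid downloading a 7B checkpoint here):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1", **load_kwargs)

print(load_kwargs)
```

Even in float16, a 7B model needs roughly 14 GB for weights alone, so a CPU-only Kaggle session may still struggle; a smaller local checkpoint is another option.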