jlonge4 / local_llama

This repo showcases how to run a model locally and offline, free of OpenAI dependencies.
Apache License 2.0

m2 error upon first running #12

Open SuperCowboyDinosaur opened 1 year ago

SuperCowboyDinosaur commented 1 year ago

`streamlit.errors.StreamlitAPIException: set_page_config() can only be called once per app page, and must be called as the first Streamlit command in your script.` is thrown after following the instructions and filling in the env vars. I looked at the code, and it does seem like you are calling it correctly, at least to my untrained eye, so I'm a bit unsure where that's coming from.
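For context, a common cause of this exception (not confirmed as the cause in this repo) is that some other Streamlit command runs before `set_page_config()`, often via an import side effect, so the call that looks first in the script is not actually the first Streamlit command executed. A minimal sketch of the placement rule, with an assumed file name `app.py` that is not taken from the repo:

```python
# app.py -- hypothetical example, not from local_llama.
# Streamlit requires st.set_page_config() to be the FIRST Streamlit
# command executed in the script, and it may only be called once per page.
import streamlit as st

st.set_page_config(page_title="local_llama", layout="wide")  # must come first

# Any Streamlit call executed before the line above -- including one
# triggered at import time by a helper module (e.g. a module that calls
# st.write() or st.cache at top level) -- raises StreamlitAPIException,
# even though set_page_config() appears first in this file.
st.title("Local Llama")
```

One way to check for this failure mode is to search imported modules for top-level Streamlit calls that run before `set_page_config()`.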

jlonge4 commented 11 months ago

@SuperCowboyDinosaur hmmm...is it still happening to you?

SuperCowboyDinosaur commented 11 months ago

Sorry, my previous response was about a different repo... Yes, this still happens on this one for me. Still unsure what's going on.

jlonge4 commented 6 months ago

@SuperCowboyDinosaur

Check out v3 and follow the new update instructions in the README. You won't be disappointed.