Closed SlavBoi420 closed 2 months ago
Roasts are cached. It makes sense because this project uses the OpenAI API; obviously the author does not want to pay for each roast, as it could get very expensive.
Thank you for the explanation @style77
Thanks for the insight @style77 , I should have thought of that lmao
Sounds reasonable. In that case, a solution based on open-source models could be a game changer!
I agree with @Hussseinkizz but it all depends on the author of the project.
Yeah, I agree with @Hussseinkizz as well. I wonder how long these roasts are cached for, though.
Forever, @SlavBoi420. From what I've seen in the source code, they are saved in the database as rows, not cached with a temporary system like Redis.
@style77 For now, the results are cached forever.
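For anyone curious, the behavior described above boils down to a cache-or-generate pattern. Here is a rough sketch; the table and column names are my guesses, not the project's actual schema:

```python
# Hypothetical sketch of the "cached forever" behavior described above.
# Table/column names are assumptions, not the project's real schema.
import sqlite3


def get_or_create_roast(db: sqlite3.Connection, username: str, generate) -> str:
    """Return the cached roast for `username`, or generate and store one."""
    row = db.execute(
        "SELECT roast FROM roasts WHERE username = ?", (username,)
    ).fetchone()
    if row is not None:
        return row[0]  # cached forever: no TTL, no invalidation
    roast = generate(username)  # the (paid) OpenAI call happens only here
    db.execute(
        "INSERT INTO roasts (username, roast) VALUES (?, ?)", (username, roast)
    )
    db.commit()
    return roast
```

Since the row is never expired or deleted, the paid API call happens at most once per username, which matches what people are observing in this thread.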
Alternatively is there a way for people to provide their own API key?
@style77 Aw man, that's a real shame (but understandable)
Yes, check the .env file; within this file you can add your own OpenAI API key. Hope that helps.
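If it helps anyone self-hosting, a .env file is just KEY=VALUE lines that get loaded into the environment at startup. A dependency-free sketch (the variable name OPENAI_API_KEY is an assumption; check the project's .env.example for the actual name):

```python
# Minimal, dependency-free .env loader sketch. Most projects use a library
# like python-dotenv instead; this just shows what the mechanism does.
import os


def load_env(path: str = ".env") -> None:
    """Read KEY=VALUE lines from `path` into os.environ (skipping comments)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# After load_env(), the key would be read as, e.g.:
# api_key = os.environ["OPENAI_API_KEY"]  # assumed variable name
```

Using `setdefault` means real environment variables (e.g. ones set by your host) take precedence over the file, which is the usual convention.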
As @Kyle8973 mentioned, you can host your own version of the application with your own OpenAI API key and disable caching.
Hi, I think "caching" the responses is a great solution to avoid high bills.
What if the English prompt is updated? I would never get roasted with the new prompt.
I don't think the solution is clearing the db rows on every deployment.
I've thought about having prompt version control (per language?) and storing the version in the database. When someone sends a request and the version differs from the one stored with their roast, regenerate with the new prompt.
Without a check, this could be abused: I could send requests to the backend with a new version each time, even if that version is not actually "available", forcing regeneration. That should be considered when developing this feature.
Does anybody have any other ideas or complement mine? 💡
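One way to close the abuse hole mentioned above is to make the server the only authority on prompt versions, so the client can't force a regeneration by sending an arbitrary number. A hypothetical sketch (names and the version table are made up for illustration):

```python
# Hypothetical sketch of server-side prompt versioning. The server owns the
# current version per language; the client never supplies one, so it cannot
# force regenerations. All names here are assumptions, not the project's code.
PROMPT_VERSIONS = {"en": 2, "es": 1}  # bumped whenever a prompt is edited


def needs_refresh(cached_version: int, language: str) -> bool:
    """A cached roast is stale only if it predates the server's prompt version."""
    current = PROMPT_VERSIONS.get(language)
    if current is None:
        raise ValueError(f"unknown language: {language}")
    return cached_version < current
```

Each cached roast would store the version it was generated with; on a request, the server compares that stored version against its own table and regenerates only when the prompt has genuinely changed.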
Wouldn't it be nice if the hosted live frontend everyone is using had an input where I could just put my own key, so it uses that, plus an option to disable the cache?
My silly ass just made a repo for one of my old goofy projects to see how the roasts would change. I'm pretty disappointed to report that the roasts don't update with new changes like additional repos and stuff (at least for the language you chose before making changes). I chose English, and it returns the exact same roast even after I have new public repos and such.
Valve please fix