Closed dallytaur closed 5 months ago
I second this. Its OpenAI extension makes it plug-and-play with chat front-ends like SillyTavern using a URL, so I don't see why this couldn't be modified to work with it. It'd allow for free testing, albeit on lower-quality models. My main concern would be the context length needed to run this, though.
I tried using the system with a local OpenAI clone. If we were able to set the OpenAI URL and port in the config, it would be an easy fix.
The tutorial is located in the wiki here:
https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API
It would be fun to mix and match models, especially Beyonder or Mistral/Mixtral. We could make it heavily based on vision and Minecraft knowledge. [The wiki would be fun to shove in.]
We'd like to support custom models via http requests eventually. As mentioned, the main problem is that different models/apis require different prompt formatting, so there is no one size fits all solution.
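To illustrate the prompt-formatting problem: each model family expects its own template, so a single hard-coded format won't transfer between models. A rough sketch, where the family names and templates are common community conventions, not anything this project ships:

```js
// Sketch: different local models expect different prompt templates, which is
// why there is no one-size-fits-all solution. Illustrative only.
function formatPrompt(modelFamily, system, user) {
    switch (modelFamily) {
        case 'chatml': // ChatML-style models
            return `<|im_start|>system\n${system}<|im_end|>\n<|im_start|>user\n${user}<|im_end|>\n<|im_start|>assistant\n`;
        case 'llama2': // [INST] style used by Llama-2 / Mistral instruct models
            return `[INST] <<SYS>>\n${system}\n<</SYS>>\n\n${user} [/INST]`;
        case 'alpaca': // Alpaca-style instruction format
            return `${system}\n\n### Instruction:\n${user}\n\n### Response:\n`;
        default:
            throw new Error(`No prompt template for model family: ${modelFamily}`);
    }
}

console.log(formatPrompt('llama2', 'You are a Minecraft bot.', 'Mine some wood.'));
```

Supporting a new backend then means adding (or detecting) the right template, not just swapping the URL.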
As noted in my post above, text-generation-webui has an OpenAI-compatible clone of the OpenAI API; you just need to add settings that let us configure the OpenAI API URL, as described in the wiki.
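A minimal sketch of what a "configurable OpenAI URL" could mean in code, assuming an `OPENAI_API_BASE` environment variable (the variable name, helper name, and dummy-key fallback are assumptions for local use, not the project's actual config):

```js
// Sketch: build an OpenAI-client config from environment variables, falling
// back to the official endpoint when no base URL is set.
function resolveOpenAiConfig(env) {
    const config = {
        // local servers usually ignore the key, but the client wants one
        apiKey: env.OPENAI_API_KEY || 'sk-dummy-key-for-local-use',
    };
    if (env.OPENAI_API_BASE) {
        // e.g. http://127.0.0.1:5000/v1 for text-generation-webui's OpenAI extension
        config.baseURL = env.OPENAI_API_BASE;
    }
    return config;
}

console.log(resolveOpenAiConfig({ OPENAI_API_BASE: 'http://127.0.0.1:5000/v1' }));
```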
From what I remember, oobabooga/text-generation-webui is the A1111 of LLMs.
Not fully sure, but you may need to disable SSL as well.
The edits to gpt.js would probably be as follows; however, I have zero Node.js experience, so this may well contain errors:
```js
constructor(model_name) {
    this.model_name = model_name;
    let openAiConfig = null;
    // my code added as an example: prefer a custom base URL when one is set
    if (process.env.OPENAI_API_BASE) {
        openAiConfig = {
            apiKey: process.env.OPENAI_API_KEY,
            baseURL: process.env.OPENAI_API_BASE,
        };
    }
    // end of my code
    else if (process.env.OPENAI_ORG_ID) {
        openAiConfig = {
            organization: process.env.OPENAI_ORG_ID,
            apiKey: process.env.OPENAI_API_KEY,
        };
    }
    else if (process.env.OPENAI_API_KEY) {
        openAiConfig = {
            apiKey: process.env.OPENAI_API_KEY,
        };
    }
    else {
        throw new Error('OpenAI API key missing! Make sure you set your OPENAI_API_KEY environment variable.');
    }
    this.openai = new OpenAIApi(openAiConfig);
}
```
I did this. Replace `this.openai = new OpenAIApi(openAiConfig);` in gpt.js with:

```js
this.openai = new OpenAIApi({
    apiKey: process.env.OPENAI_API_KEY,
    baseURL: process.env.OPENAI_API_BASE,
});
```
Just like the wiki for text-generation-webui said, I also set the environment variables OPENAI_API_BASE=http://127.0.0.1:5000/v1 and OPENAI_API_KEY=sk-111111111111111111111111111111111111111111111111. I had a little bit of trouble getting the openai extension to work on the text-generation-webui side, but I fixed it by manually installing sentence-transformers.
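A purely illustrative sanity check for that `OPENAI_API_BASE` value (the helper is hypothetical; the `/v1` suffix matches the wiki's example endpoint):

```js
// Hypothetical helper: catch common misconfigurations of OPENAI_API_BASE
// before the client ever makes a request.
function checkApiBase(base) {
    if (!base) return 'OPENAI_API_BASE not set; the official OpenAI endpoint will be used';
    if (!/^https?:\/\//.test(base)) return 'OPENAI_API_BASE should start with http:// or https://';
    if (!base.endsWith('/v1')) return 'OPENAI_API_BASE should usually end with /v1';
    return 'ok';
}

console.log(checkApiBase('http://127.0.0.1:5000/v1')); // → ok
```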
On a side note, I don't know what model to use for this. TinyLlama-1.1B-Chat-v1.0 is definitely not working well.
> I don't know what model to use for this. TinyLlama-1.1B-Chat-v1.0 is definitely not working well.
I'd be impressed if you could get a 7b model to perform decently without additional finetuning.
> text-generation-webui has an OpenAI-compatible clone of its API; you just need to add settings that let us configure the OpenAI API URL
Very cool. I didn't realize you meant that it would work with the openai nodejs package.
If one of you could open a pr with detailed instructions for testing this feature then I can start to review it. I don't know when I'll get to it otherwise.
If that resolves the one item, this should be good to go!!
Amazing work, I couldn't figure that part out.
I think, for organization's sake, that, like you've done with other models, you should have a different model JS file rather than using gpt.js. Maybe call it localLLM.js.
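A separate wrapper along those lines might look like this minimal sketch (the class, method, and default values are illustrative, not the project's actual API):

```js
// Sketch of the suggested separate model wrapper: a LocalLLM class mirroring
// the shape of the GPT wrapper but pointing at a configurable base URL.
class LocalLLM {
    constructor(model_name, baseURL) {
        this.model_name = model_name;
        this.baseURL = baseURL || process.env.OPENAI_API_BASE || 'http://127.0.0.1:5000/v1';
    }

    // The real implementation would POST a chat-completion request here.
    endpoint() {
        return `${this.baseURL}/chat/completions`;
    }
}

const llm = new LocalLLM('mistral-7b-instruct', 'http://127.0.0.1:5000/v1');
console.log(llm.endpoint()); // → http://127.0.0.1:5000/v1/chat/completions
```

Keeping it separate from gpt.js would let each backend carry its own prompt formatting and connection settings.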
I made a pull request where I put it in local.js, as well as altering main.js to allow for its usage, though I don't know if my implementation is much better than a band-aid fix.
Model selection, as well as loading params for oobabooga, may be a nice-to-have.
> I had a little bit of trouble getting the openai extension to work on the text-generation-webui side, but I fixed it by manually installing sentence-transformers.
Could you elaborate? I've manually installed sentence-transformers, but I'm still getting the sentence_transformers module not found error.
What I ended up doing is adding a couple of lines at the top of extensions/openai/embeddings.py:

```python
import sys
import subprocess
subprocess.check_call([sys.executable, "-m", "pip", "install", "-U", "sentence-transformers"])
```

I'm sure there is probably a better way to do it, but just adding those and running the program worked for me. After you've run it once, you can even remove them.
Thanks, I just ended up creating a .bat file in the main directory for oobabooga to install it and it worked.
For anyone who is getting a replace error on the oobabooga side of things, I was able to get around it by replacing the line:

```js
stop: stop_seq
```

with:

```js
stop: stop_seq, user_bio: 'Minecraft Player'
```
The actual functionality barely works for me, though; I can't tell if it's my fix or the model I'm using. The bot keeps thinking it's a Twitter bot or a jQuery bot or something, haha.
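The workaround above boils down to adding one extra field to the request body. A hedged sketch of what that payload might look like (`stop` and `user_bio` come from the thread; the helper itself and the other field names are standard chat-completion parameters, shown only for illustration):

```js
// Sketch of a request body for the oobabooga OpenAI-compatible endpoint.
function buildRequestBody(messages, stopSeq) {
    return {
        model: 'local-model',          // most local servers ignore this
        messages: messages,
        stop: stopSeq,
        user_bio: 'Minecraft Player',  // extra field that avoids the replace error
    };
}

console.log(buildRequestBody([{ role: 'user', content: 'hi' }], '\nUSER:'));
```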
Can you tell me which branch you're using, for further testing?
This is a local LLM. I think, if done right, you could train a model to work with the game system you made.
It has API docs that are mostly a copy of the OpenAI API.