Closed: eschmidbauer closed this issue 6 months ago
Yes, it should be possible, @eschmidbauer. Can you show your code?
Basically just these two changes, and it doesn't seem to work.
Also, in manifest.content_scripts.matches you should add the vLLM site URL. Could you share the URL with me so I can test, and modify chatgpt.js to make it work if necessary?
Sorry, it's a private URL. I will update the manifest.content_scripts.matches field and test again.
Same issue.
What do the errors in the console say? Also, vLLM probably accepts differently formatted payloads than OpenAI's, so I would need to study their API to add a method that uses theirs.
It's the same API spec as OpenAI's. I ended up running Streamlit to implement a chat interface using the openai package.
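Since vLLM serves the same chat-completions spec as OpenAI, pointing a client at it mostly means swapping the base URL. A minimal standard-library sketch, assuming a local vLLM server on its default port 8000 (the URL and model name here are placeholders, not tested values):

```python
import json
import urllib.request

VLLM_BASE_URL = "http://localhost:8000/v1"  # assumed local vLLM server address


def build_chat_payload(model, messages, stream=False):
    """Build an OpenAI-style /v1/chat/completions request body."""
    return {"model": model, "messages": messages, "stream": stream}


def send_chat(prompt, model="my-model"):
    # POST the payload to the OpenAI-compatible endpoint vLLM exposes
    payload = build_chat_payload(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        VLLM_BASE_URL + "/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-spec responses keep the reply under choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

The same shape should work with the openai package by setting its base_url to the vLLM address instead of api.openai.com.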
ChatGPT.js started off as a library of functions to interact with chat.openai.com, so it uses the DOM, as the description states. Only recently have API methods slowly been added (like chatgpt.getChatData()), which rely on the access token provided to the client when logged in to chat.openai.com. A method to send prompts hasn't been implemented yet, but it is in the works. If you go to https://github.com/kudoai/duckduckgpt , it will work like that getShowReply(prompt), except you would type chatgpt.send(prompt).
Can we extend the current solution to be compatible with ChatGPT via an Azure connection? https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
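Azure OpenAI differs from the plain OpenAI endpoint in two ways covered by the linked reference: the URL routes by resource and deployment name (with an api-version query parameter), and authentication uses an api-key header rather than a Bearer token. A hedged sketch of that wiring, where the resource, deployment, and api-version values are placeholders and nothing here has been run against a live Azure endpoint:

```python
import json
import urllib.request


def azure_chat_url(resource, deployment, api_version="2023-05-15"):
    """Azure OpenAI routes by deployment name and requires an api-version query param."""
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )


def send_azure_chat(api_key, resource, deployment, prompt):
    # No "model" field: on Azure the model is chosen by the deployment name in the URL
    payload = {"messages": [{"role": "user", "content": prompt}]}
    req = urllib.request.Request(
        azure_chat_url(resource, deployment),
        data=json.dumps(payload).encode(),
        # Azure uses an api-key header instead of Authorization: Bearer
        headers={"Content-Type": "application/json", "api-key": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Supporting Azure in this project would mostly mean making the endpoint URL and auth header configurable rather than changing the payload shape, since the request and response bodies follow the same chat-completions format.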
vLLM supports an OpenAI-compatible API. Is it possible to configure this code to connect to a service other than OpenAI? I've tried modifying the code, but I am getting errors.