Closed yaroslavyaroslav closed 1 year ago
It would take some time to design this appropriately and then to implement it that way. Meanwhile, there's an absolutely outstanding Telegram bot for ChatGPT that I'm using so far.
All ChatGPT-related work will be done in the chat-gpt branch.
ps: There's just one commit for now, but it works along the golden path: output panel mode and the completion command only.
ps2: It's hardcoded to the gpt-4 model. Please update the source yourself if required.
This is a self-assigned journey of implementing ChatGPT support, and the chat-gpt branch now contains a fully working ChatGPT API plugin version.
It's still kind of a golden path, but it's pretty usable to my taste. I'm willing to release it by the end of April at the soonest.
If you want some additional info about the feature's current state, please take a look at this commit.
UPD: Almost forgot: at this point you have to have the MarkdownEditing plugin installed, since it's hardcoded as the markdown syntax highlighting. Sorry, this will be fixed by release.
@yaroslavyaroslav awesome job👍
Could you please add socks proxy support btw?
I could try, if you throw me a comprehensive code snippet for that.
And thanks.
The next batch of updates can be found here.
It should work across multiple windows, though with a single shared session. It's not that I like this behaviour (it's a dumb one), but I see no simple way to make it better. So for 2.0 there will definitely be just a single session, with a hardcoded scroll to the end of the output in each window; someday, maybe around 2.5, it could get multiple sessions with appropriate session management.
Users should be able to use it without that markdown plugin installed, just without markdown syntax highlighting.
It could be simply done via openai.proxy:
import openai
openai.proxy = "http://<host>:<port>"
Looks like the openai Python lib supports HTTP proxies only, not SOCKS. Good enough already :)
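For the record, an HTTP(S) proxy can also be wired in with only the standard library, which matters here since Sublime packages can't bundle arbitrary dependencies. A minimal sketch; the proxy address is hypothetical:

```python
import urllib.request

# Hypothetical local proxy address, for illustration only.
PROXY = "http://127.0.0.1:8080"

# The stdlib can route requests through an HTTP(S) proxy without any
# third-party package; SOCKS support would need PySocks, which isn't
# available to a Sublime plugin.
handler = urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
opener = urllib.request.build_opener(handler)

# opener.open("https://api.openai.com/v1/models") would now go through
# the proxy (not executed here: it needs a live proxy and an API key).
```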
@hank-cp Unfortunately it can't be done that way. There is a very limited number of Python libraries that can be used as dependencies for a Sublime package, and openai is not on that list.
I see. Here is another code snippet without additional dependencies:
import http.client

conn = http.client.HTTPSConnection("<proxy.host>", <proxy.port>)
conn.set_tunnel("api.openai.com")
...
conn.request("POST", "/v1/completions", json_payload, headers)
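Filling in the elided parts, a self-contained sketch of the tunneling approach could look like this; the proxy coordinates, model name, and key are placeholders, not the plugin's actual values:

```python
import http.client
import json

# Hypothetical proxy coordinates, for illustration only.
PROXY_HOST = "127.0.0.1"
PROXY_PORT = 8080

# Connect to the proxy, then tunnel TLS through it to the real API
# host; this uses only the standard library, no extra dependency.
conn = http.client.HTTPSConnection(PROXY_HOST, PROXY_PORT)
conn.set_tunnel("api.openai.com")

# The request itself is built the same way with or without a proxy.
payload = json.dumps({
    "model": "text-davinci-003",  # assumption: any completions model
    "prompt": "Hello",
})
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer <your-api-key>",  # placeholder
}
# Not executed here: sending requires a live proxy and a real key.
# conn.request("POST", "/v1/completions", payload, headers)
```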
@hank-cp thanks, I'll add it to the plugin settings and ping you if there are any problems with that.
@hank-cp Please check commit https://github.com/yaroslavyaroslav/OpenAI-sublime-text/commit/f2e19ab5070d5ab5168a63813b5a3ef8a227d403 in the gpt branch and get back to me on whether it's working as expected.
@yaroslavyaroslav I tried to test it on my local Sublime but got the error below:
Traceback (most recent call last):
File "/Applications/Sublime Text.app/Contents/MacOS/Lib/python38/sublime_plugin.py", line 1697, in run_
return self.run(edit, **args)
File "/Users/hank/Library/Application Support/Sublime Text/Packages/OpenAI-sublime-text/openai.py", line 48, in run
worker_thread = OpenAIWorker(edit, region, text, self.view, mode, "")
TypeError: __init__() takes 6 positional arguments but 7 were given
Did I miss something?
@hank-cp Nope, it's a bug. It should be fixed now with the latest commit on this branch.
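The traceback above is a plain positional-argument count mismatch: the call site passes one more argument than __init__ declares. A minimal reproduction with a hypothetical stand-in class (not the plugin's actual signature):

```python
class OpenAIWorker:
    """Hypothetical stand-in for the plugin's worker class."""

    # Five parameters besides self, so __init__ accepts 6 positional
    # arguments in total (self included), matching the traceback.
    def __init__(self, edit, region, text, view, mode):
        self.mode = mode

msg = ""
try:
    # The call site passed one extra trailing argument (""), which
    # produces exactly the TypeError seen above.
    OpenAIWorker(None, None, "some text", None, "completion", "")
except TypeError as exc:
    msg = str(exc)

print(msg)  # e.g. "... takes 6 positional arguments but 7 were given"
```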
@hank-cp Please check commit f2e19ab in the gpt branch and get back to me on whether it's working as expected.
Tested, it works :) Please also apply this update to the other API requests.
Implemented in #10
OpenAI released an API for their new models; they should be added here.
https://platform.openai.com/docs/guides/chat/introduction?utm_medium=email&_hsmi=248340301&_hsenc=p2ANqtz-_-za1TjjgJAe2CzALsgxI35o1gymBzWLE0iy3Q-F8V_beJmIK4-cqh4QQTJggsTTk_HrHi0g_s5ZmXyocedEPfsmocawRoimxC1khMSRIbCY7KiwQ&utm_content=248340301&utm_source=hs_email