zbirenbaum / copilot.lua

Fully featured & enhanced replacement for copilot.vim complete with API for interacting with Github Copilot
MIT License
2.54k stars · 72 forks

Copilot is about to start charging. #38

Open CharlesChiuGit opened 2 years ago

CharlesChiuGit commented 2 years ago

As per the letter many Copilot users received today, maybe it's time to look for an open-source AI coding assistant? I found this cool project on GitHub; maybe there's a way to turn it into a Neovim plugin? This is more of a plugin idea, so no need to take it seriously if you're not interested in making one.

Hope this issue won't bother you too much; feel free to close it if you think it's irrelevant.

Best!

zbirenbaum commented 2 years ago

Believe it or not, I am actually way ahead of you on this. I've been looking into different options for a while now (including the repo that you linked) that I could add support for or build a new plugin around.

There are a few issues I have run into, though. GPT-Code-Clippy seems more research-oriented, and to take advantage of the Neovim features I rely on to keep things simple, it would require me to implement an LSP server. I reviewed it again just now, and it seems there is a VSCode extension I missed when I looked into it before that may be of help. I could make a simpler RPC program, but that would require a lot of changes to this codebase and would be more difficult to debug. Plus, adding new features is much more doable with the existing LSP infrastructure in Neovim.

That said, I was actually planning to write my own LSP server for the OpenAI Codex engines that power Copilot, as it would let anyone with an API key for either use this plugin. I thought there would be at least a few months before Copilot announced the paid model, so this caught me off guard and moves up my timeframe quite a bit.

The model you linked is the most promising truly free option I've run across as well, but I've heard GPT-2 is quite a bit behind GPT-3. Given that, I would highly recommend you apply for an OpenAI Codex key in the meantime, as that is likely your best bet for getting completions comparable to Copilot's once I've completed an LSP server for it.

I'll go on record and state that I'll do my best to have an LSP implementation for an alternative by August 22nd.

zbirenbaum commented 2 years ago

I'm going to pin this so that people won't miss it and can provide input on potential sources or on my plans to write a more generalized LSP server.

CharlesChiuGit commented 2 years ago

Wow! That's great news! Thanks for your research and effort! And don't push yourself too hard, take your time! 🤣

marcelarie commented 1 year ago

Got my fresh OpenAI key :) How is this going??

zbirenbaum commented 1 year ago

@marcelarie I have a functioning prototype LSP server that implements the completion handlers properly, but it gets rate limited really fast: upon completing it I found out that OpenAI limits free usage to 20 requests per minute (this is documented nowhere, to my knowledge). I'm trying to find a way around that (I implemented request cancellation, but that did nothing) and to integrate open-source models :(

If anyone has thoughts or advice on the rate limiting issue, input would be really appreciated.
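
In the meantime, here is a minimal sketch of one possible client-side mitigation (not the prototype's actual code): debounce completion requests so a request only fires after a pause in typing, rather than on every keystroke. The `request_completion` stub stands in for whatever would actually call the backend.

```lua
-- Sketch only: debounce completion requests so at most one fires per typing pause.
local uv = vim.loop
local timer = uv.new_timer()
local debounce_ms = 300 -- widen this if the 20 req/min ceiling still gets hit

-- Stand-in for whatever actually sends the completion request to the backend.
local function request_completion(bufnr)
  print("requesting completion for buffer " .. bufnr)
end

local function schedule_completion(bufnr)
  timer:stop() -- drop any request pending from earlier keystrokes
  timer:start(debounce_ms, 0, vim.schedule_wrap(function()
    request_completion(bufnr)
  end))
end

vim.api.nvim_create_autocmd("TextChangedI", {
  callback = function(args)
    schedule_completion(args.buf)
  end,
})
```

Debouncing alone doesn't guarantee staying under a fixed quota, but it removes the per-keystroke burst that exhausts a 20 requests/minute limit almost immediately.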

CharlesChiuGit commented 1 year ago

Thanks again for your effort!

marcelarie commented 1 year ago

thanks!

me6262 commented 1 year ago

any further progress on this?

marcelarie commented 1 year ago

Yes, I would love to see this happen. @zbirenbaum, if you need any help, please post it here; maybe I can't help, but someone else can.

rj1 commented 1 year ago

codeium seems like a good alternative

nyngwang commented 1 year ago

@rj1 Your link is incorrect.

McPatate commented 1 year ago

Hi @zbirenbaum, I started writing a plugin when the BigCode project released StarCoder. Technically the plugin can support any model: you either supply a model ID (this will work with our API inference service at Hugging Face) or an HTTP endpoint URL if you want to self-host a model.
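
Purely to illustrate what those two backends boil down to from the editor side (this is not that plugin's actual implementation; the endpoint URLs and request/response shape here are assumptions), a sketch using plenary.nvim's curl wrapper:

```lua
-- Illustrative only: hit either a hosted model ID or a self-hosted endpoint.
local curl = require("plenary.curl") -- plenary.nvim's HTTP wrapper

-- Option 1: a model ID served through the Hugging Face Inference API...
local hosted_url = "https://api-inference.huggingface.co/models/bigcode/starcoder"
-- Option 2: ...or a self-hosted HTTP endpoint (assumed local server).
local selfhosted_url = "http://localhost:8080/generate"

local function complete(url, prompt)
  local res = curl.post(url, {
    headers = {
      ["Content-Type"] = "application/json",
      ["Authorization"] = "Bearer " .. (os.getenv("HF_API_TOKEN") or ""),
    },
    body = vim.json.encode({ inputs = prompt, parameters = { max_new_tokens = 32 } }),
  })
  if res.status ~= 200 then
    return nil, "completion request failed: " .. res.status
  end
  return vim.json.decode(res.body)
end

-- Swap in selfhosted_url to target a local model instead.
print(vim.inspect(complete(hosted_url, "def fizzbuzz(n):")))
```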

It may be worth joining forces, interested in collaborating?

jukefr commented 1 year ago

Tried out https://github.com/fauxpilot/fauxpilot really quickly (and was not all that impressed tbh, but I can only reliably run the 350M-parameter model; afaik Copilot uses OpenAI Codex, which is 12B parameters, or at least was circa 2021, no clue about these days). Still, I made a quick fork of this repo with an action that auto-merges upstream daily at around 1am and sed-patches agent.js to use the local FauxPilot install, if people are interested 🤷 https://github.com/jukefr/fauxpilot.lua

That means you can keep the exact same config, just change the repo name in your nvim config, and it still integrates perfectly well with https://github.com/zbirenbaum/copilot-cmp. I personally just keep my fork commented out for when they improve FauxPilot a bit in the future, or switch to the Salesforce CodeGen2 models, or whatever 🤷
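
For reference, the swap above is just the repo name in the plugin spec; a minimal sketch assuming lazy.nvim as the plugin manager (since the fork only auto-merges upstream, the Lua module name should stay `copilot`):

```lua
-- lazy.nvim spec: point at the fork instead of upstream; everything else,
-- including the copilot-cmp integration, stays the same.
return {
  -- "zbirenbaum/copilot.lua",
  "jukefr/fauxpilot.lua",
  config = function()
    require("copilot").setup({})
  end,
}
```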

But feel free to mess around and try it if you want something self-hosted that isn't sending all your stuff to some third-party server like Hugging Face or whatnot. (FYI, there is still telemetry being sent to GitHub; probably patchable, but I didn't bother.)

Would probably be neat to be able to configure this somehow instead of having to do this botch, but from what I gather this project just syncs the `dist` folder from the official Copilot plugin anyway, so that would imply a rewrite/rethink of some sort; this is quick and dirty but it works 🤷
Tbh I care more about privacy than about the charging, or about using a free service that's still hosted wherever. I allegedly don't allegedly mind allegedly visiting https://github.com/signup allegedly every alleged 30 days allegedly.

edit: fixed the patcher to actually work on new releases here. Just wanted to make this edit to let you know how absolutely cursed the way it works is, if you want to check it out: https://github.com/jukefr/fauxpilot.lua/blob/master/.github/workflows/cursed.yaml. I'm not sure how most plugin managers will vibe with the history rewrite, but eh.