zed-industries / zed

Code at the speed of thought – Zed is a high-performance, multiplayer code editor from the creators of Atom and Tree-sitter.
https://zed.dev

GitHub Copilot Chat Support #4673

Closed 0smboy closed 3 months ago

0smboy commented 1 year ago

Check for existing issues

Describe the feature

Currently only GitHub Copilot is supported in Zed. It would be nice if Copilot Chat were integrated into Zed as well.

If applicable, add mockups / screenshots to help present your vision of the feature

No response

ZombieHarvester commented 1 year ago

Copilot Chat is such a great improvement to Copilot

JosephTLyons commented 1 year ago

While I know this isn't exactly the same, we will be landing a new panel very soon that allows chatting with OpenAI models, with commands to quickly bring code into it, switch the roles in the conversation, etc.

satanworker commented 1 year ago

Copilot with chat is so much better than without it, for sure!

srid commented 1 year ago

The current Assistant Panel will only work if you have a paid OpenAI account.

But I'm one of those users who already has access to Github Copilot Chat (in VSCode), so I'd expect to be able to use that (just like Copilot) in Zed.

hwatkins commented 1 year ago

Any update on @JosephTLyons's comment? Seems like that would be close to what VS Code has for Copilot Chat.

mrcnk commented 1 year ago

Would be great to have it, since Copilot Chat made it to Open Beta 🙏

pczern commented 10 months ago

I'd love to see Copilot Chat soon, loving my first minutes with Zed

websmithcode commented 10 months ago

Add the ai label 😉

serhii-lypko commented 10 months ago

Cannot use Zed without proper copilot chat integration, unfortunately.

Moshyfawn commented 9 months ago

Now that Copilot Chat is generally available, I'd love to see it revisited 😊 https://github.blog/2023-12-29-github-copilot-chat-now-generally-available-for-organizations-and-individuals/

rudyorre commented 9 months ago

This is kind of the one thing holding me back from switching fully to Zed. I love my copilot chat :-)

hmacafee commented 9 months ago

+1

AsynchronousAI commented 9 months ago

I support this!

BenBraunstein commented 9 months ago

This would be amazing. Copilot chat is a gamechanger

PsHohe commented 9 months ago

Having Copilot Chat in Zed would be yet another step towards switching.

half-cto commented 8 months ago

+1

tanmay-a-sharma commented 8 months ago

I just downloaded Zed and I have been using JetBrains products for a while. The one thing keeping me on JetBrains is that I cannot set up Copilot Chat. I would be stoked to see Zed get Copilot Chat.

felixpie commented 8 months ago

+1

MSpiechowicz commented 8 months ago

+1. I can live without plugins for now, but Copilot Chat is mandatory for me to make the switch. It saves me tons of time.

Moshyfawn commented 8 months ago

To show your interest in this feature, please consider adding the 👍 reaction to the original post instead of using generic comments like '+1' or 'I need it to switch'. These comments only make the thread longer and less useful for the maintainers, and they won't boost the priority of the feature.

kldzj commented 8 months ago

Zed seems nice, but I'll put it aside for now until it's more baked. Not having Copilot Chat is a big damper, imho.

Why would I pay for OpenAI tokens when I get basically the same thing already included in GitHub Copilot?

onexbash commented 8 months ago

I agree. I already pay for GitHub Copilot monthly, so I don't see myself spending more money on OpenAI API usage just because the Copilot Chat window is missing. => Not using Zed yet.

teodoradriann commented 8 months ago

This is the only reason I'm not switching to Zed... I'd love to see Copilot Chat here

giovdk21 commented 8 months ago

I'm not replying to anyone in particular, but I'd invite everyone to please just use the thumbs-up emoji on the issue itself rather than commenting, unless you have something new or relevant to add to the conversation. (I'm assuming most people subscribe to issues to monitor updates.)

edit: saw someone already mentioned this just a few days ago 😅

Furthermore, this issue is now listed as top ranking, and AFAIK that's based on the number of 👍

rudyorre commented 8 months ago

@giovdk21 commenting to reiterate your point 😁

harssh commented 8 months ago

Same. I moved back to VS Code due to this missing feature.

caesartrajan commented 8 months ago

Please add this. Loving everything else about Zed and ready to make the switch once you add Copilot chat.

ericgoode commented 8 months ago

If I had this I would make the switch. It is just too convenient to have the chat in the same window with context sharing.

GurYN commented 8 months ago

+1

Copilot Chat is not an asset but a necessity.

widersky commented 8 months ago

AFAIK Chat support depends heavily on GitHub. At the moment, there is no open API that would allow the use of Copilot Chat in any editor. The Copilot extension for JetBrains only recently received official Chat support, and only because the extension is developed by GitHub.

mrcnk commented 7 months ago

GitHub has just released Copilot CLI, so I guess we can revisit this. https://www.youtube.com/watch?v=fHwtrOcLAnI https://docs.github.com/en/copilot/github-copilot-in-the-cli/using-github-copilot-in-the-cli

vuon9 commented 7 months ago

GitHub has just released Copilot CLI, so I guess we can revisit this. https://www.youtube.com/watch?v=fHwtrOcLAnI https://docs.github.com/en/copilot/github-copilot-in-the-cli/using-github-copilot-in-the-cli

How is it related to this Copilot Chat issue?

rudyorre commented 7 months ago

@vuon9 I think it's because the CLI could be used as a sort of API for GitHub Copilot, although this feels like a hacky solution.

vuon9 commented 7 months ago

@vuon9 I think it's because the CLI could be used as a sort of API for GitHub Copilot, although this feels like a hacky solution.

It wouldn't help much with coding tasks: GitHub Copilot CLI is for CLI commands, and it's also part of GitHub CLI as an extension.

rudyorre commented 7 months ago

I'm not saying you're wrong, because it does feel hacky to use the CLI to build a feature like an integrated chat for Zed. But it's definitely doable: I've passed large chunks of code through the Copilot CLI's "explain" command, and it behaves just as it would through the VS Code chat.

Don't get me wrong, though: I feel like there should be a more "official" way of implementing this lol

For example:

```
$ gh copilot explain "create a rust function that uses the bash command 'gh copilot explain <prompt>' to query prompts"

Welcome to GitHub Copilot in the CLI!
version 1.0.0 (2024-03-18)

I'm powered by AI, so surprises and mistakes are possible. Make sure to
verify any generated code or suggestions, and share feedback so that we
can learn and improve. For more information, see
https://gh.io/gh-copilot-transparency

Explanation:

To create a Rust function that utilizes the bash command
`gh copilot explain <prompt>` to query prompts, you can follow these steps:

1. Define a Rust function that takes a prompt as a parameter.
2. Use the std::process::Command struct from the Rust standard library
   to execute the bash command.
3. Set the command to `gh copilot explain <prompt>`, where <prompt> is
   the prompt you want to query.
4. Execute the command using the output() method of the Command struct.
5. Handle any errors that may occur during command execution.
6. Retrieve the output of the command and process it as needed in your
   Rust function.

Here's an example implementation:

use std::process::Command;

fn query_prompt(prompt: &str) {
    let output = Command::new("bash")
        .arg("-c")
        .arg(format!("gh copilot explain '{}'", prompt))
        .output();

    match output {
        Ok(output) => {
            if output.status.success() {
                let stdout = String::from_utf8_lossy(&output.stdout);
                let stderr = String::from_utf8_lossy(&output.stderr);
                // Process the output as needed
                println!("Command executed successfully!");
                println!("stdout: {}", stdout);
                println!("stderr: {}", stderr);
            } else {
                let stderr = String::from_utf8_lossy(&output.stderr);
                println!("Command failed with error: {}", stderr);
            }
        }
        Err(error) => {
            println!("Failed to execute command: {}", error);
        }
    }
}

fn main() {
    let prompt = "Write a Rust function to calculate the factorial of a number.";
    query_prompt(prompt);
}

This implementation uses the Command struct to execute the bash command
`gh copilot explain <prompt>`, where <prompt> is the prompt you want to
query. The output of the command is then processed and printed to the
console.
```
ArnoldSandwich commented 7 months ago

Along with many other users, the absolute only thing stopping me from using Zed full time and ditching VS Code is the lack of Copilot Chat support. If implemented the same way as in VS Code, or perhaps even better, it would make Zed a literal powerhouse.

skyf0cker commented 7 months ago

Users have already subscribed to GitHub Copilot, and I don't think it's a good idea to ask them to pay extra for a feature Zed is missing. I understand that although Zed is open source, it is still a commercial company, and their commercial plans may not favor deepening support for, and coupling with, GitHub Copilot from a product perspective. However, if they focus solely on their commercial plans without listening to the community's voice, this may backfire and go against Zed's decision to take the open-source route.

asesh commented 6 months ago

Any updates on this feature? This is one of the features that I miss from VS Code!!

ArnoldSandwich commented 6 months ago

Nothing as of yet to my knowledge.


BeppeMarnell commented 6 months ago

Along with many other users, the absolute only thing stopping me from using Zed full time and ditching VS Code is the lack of Copilot Chat support. If implemented the same way as in VS Code, or perhaps even better, it would make Zed a literal powerhouse.

I agree! I think zed looks really good, but because of copilot chat I cannot switch yet. I will be waiting!

leftbones commented 6 months ago

Just adding my +1 to this feature request! Copilot chat is 50% of the reason I pay for it, and it's a shame to not be able to use that after switching from VS Code.

wheregmis commented 6 months ago

+1

killer-cf commented 5 months ago

I really need this too +1. please as soon as possible

OmerWow commented 5 months ago

This is one of the biggest reasons I haven't completely made the move over to Zed.

juniuszhou commented 5 months ago

Copilot Chat +1

fabiensebban commented 5 months ago

+1 on this feature. I stay on VS Code because of GH copilot chat.

widersky commented 5 months ago

Once again: Copilot Chat depends on GitHub's support. Even once Zed supports this type of integration in extensions, we will still need a step from GitHub. Both the VS Code and JetBrains extensions are OFFICIAL extensions; it is not (yet) possible to integrate with Chat via an API, so we have to count on GitHub's goodwill.

I think the Zed team knows perfectly well how important this functionality is for users (considering that Zed already supports a lot of LLM models), so please don't spam.

Walter-Hartwell-White commented 5 months ago

Copilot Chat is necessary

After using Copilot in my JetBrains IDE I have fallen in love with it, but JetBrains IDEs use more memory and are comparatively slower than Zed. There is an Assistant Panel in Zed which uses OpenAI, but it's only for paid users; if someone has already paid for GitHub Copilot, why would they pay again for OpenAI? The only feature missing in Zed is Copilot Chat. Now I am regretting switching to Zed. It's been almost a year since this issue was raised and it still isn't fixed 😭😭

appelgriebsch commented 4 months ago

Hmm, someone was able to implement a Neovim plugin that talks to Copilot Chat (https://github.com/CopilotC-Nvim/CopilotChat.nvim). My Lua skills are very limited, but maybe someone can use this as a jump start?

tttffff commented 3 months ago

Hmm, someone was able to implement a Neovim plugin that talks to Copilot Chat (https://github.com/CopilotC-Nvim/CopilotChat.nvim). My Lua skills are very limited, but maybe someone can use this as a jump start?

For anyone looking into this, here's what I found out.

  1. You need a GitHub OAuth token. CopilotChat.nvim does not handle this. It looks for one in a few places, and can use the standard ~/.config/github-copilot/hosts.json, which is what Zed already uses (as does copilot.vim). For example, you can log into Copilot with Zed, which will create the file, then CopilotChat.nvim can use the OAuth token from that same file. As such, this part seems solved.
  2. Get a temporary access token for Copilot. CopilotChat.nvim ultimately uses curl to do this (example below). You get JSON back with "token" and "expires_at" keys.
  3. Use the completion API. CopilotChat.nvim again uses curl (example below). In the payload they use stream=true, but for testing with curl I would turn this off. There is an expected format for the header information and the JSON data that you pass in; I will give some details in the example.

I don't believe it would currently be possible to add any of this to a Zed extension, as extensions look to be only for grammars, languages and themes. I guess it could be integrated into the existing assistant panel. I also ran out of steam when looking through all this and testing it out. Hope this information helps someone, or at least satisfies some curiosity.

Temporary access token example:

```sh
curl -sSL -X GET \
  -H "User-Agent: CopilotChat.nvim/2.0.0" \
  -H "Accept: application/json" \
  -H "Editor-Plugin-Version: CopilotChat.nvim/2.0.0" \
  -H "Editor-Version: Neovim/0.10.0" \
  -H "Authorization: token ghu_XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX" \
  https://api.github.com/copilot_internal/v2/token
```
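Since the token endpoint's response carries an "expires_at" value alongside "token", a client has to refresh the temporary token periodically. A minimal sketch of that check, assuming "expires_at" is a Unix timestamp in seconds (the refresh margin is an arbitrary choice of mine):

```python
import time

# Hypothetical helper: report whether the short-lived Copilot token from
# /copilot_internal/v2/token needs refreshing. Assumes "expires_at" is a
# Unix timestamp in seconds; the 120 s safety margin is arbitrary.
def needs_refresh(token_response: dict, margin_seconds: int = 120) -> bool:
    return time.time() >= token_response["expires_at"] - margin_seconds

fresh = {"token": "fake", "expires_at": time.time() + 1800}
stale = {"token": "fake", "expires_at": time.time() - 10}
print(needs_refresh(fresh), needs_refresh(stale))  # False True
```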
Completion API example:

```sh
curl -sSL -X POST \
  -H "User-Agent: CopilotChat.nvim/2.0.0" \
  -H "Editor-Plugin-Version: CopilotChat.nvim/2.0.0" \
  -H "X-Request-Id: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX" \
  -H "Authorization: Bearer XXX_TEMP_ACCESS_TOKEN_XXX" \
  -H "Content-Type: application/json" \
  -H "Copilot-Integration-Id: vscode-chat" \
  -H "Openai-Intent: conversation-panel" \
  -H "Vscode-Machineid: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX" \
  -H "Editor-Version: Neovim/0.10.0" \
  -H "Openai-Organization: github-copilot" \
  -H "Vscode-Sessionid: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX" \
  -d "@/tmp/FILE_WITH_JSON_DATA" \
  "https://api.githubcopilot.com/chat/completions"
```

Notes:

- X-Request-Id is a UUID
- Vscode-Machineid is randomly generated characters in the range 0-9 and a-f
- Vscode-Sessionid is a UUID with a Unix timestamp in milliseconds appended onto the end

The Vscode-Sessionid is regenerated whenever a new access token is required. The Vscode-Machineid remains the same throughout all requests.

For most operations, CopilotChat.nvim uses the completion API with the following messages: a system message giving instructions to the LLM; a system message with the "Active Selection", which is the file name and some/all of its content; and a user message asking for the selection to be reviewed/optimized/etc. A chat can also be started with a system message of instructions and no "Active Selection" or user message. Then user input messages and LLM replies are added to the "messages" array. The full chat history is given as the payload every time the API is called, until a new chat is started.

FILE_WITH_JSON_DATA may look like the below. This is a basic example without the "Active Selection" or the user prompt generated by CopilotChat.
```js { "temperature":0.1, "model":"gpt-4", "intent":true, "top_p":1, "stream":true, "n":1, "messages": [ { "content":"You are an AI programming assistant.\nWhen asked for your name, you must respond with \"GitHub Copilot\".\nFollow the user's requirements carefully & to the letter.\nFollow Microsoft content policies.\nAvoid content that violates copyrights.\nIf you are asked to generate content that is harmful, hateful, racist, sexist, lewd, violent, or completely irrelevant to software engineering, only respond with \"Sorry, I can't assist with that.\"\nKeep your answers short and impersonal.\nYou can answer general programming questions and perform the following tasks: \n* Ask a question about the files in your current workspace\n* Explain how the code in your active editor works\n* Generate unit tests for the selected code\n* Propose a fix for the problems in the selected code\n* Scaffold code for a new workspace\n* Create a new Jupyter Notebook\n* Find relevant code to your query\n* Propose a fix for the a test failure\n* Ask questions about Neovim\n* Generate query parameters for workspace search\n* Ask how to do something in the terminal\n* Explain what just happened in the terminal\nYou use the GPT-4 version of OpenAI's GPT models.\nFirst think step-by-step - describe your plan for what to build in pseudocode, written out in great detail.\nThen output the code in a single code block. 
This code block should not contain line numbers (line numbers are not necessary for the code to be understood, they are in format number: at beginning of lines).\nMinimize any other prose.\nUse Markdown formatting in your answers.\nMake sure to include the programming language name at the start of the Markdown code blocks.\nAvoid wrapping the whole response in triple backticks.\nThe user works in an IDE called Neovim which has a concept for editors with open files, integrated unit test support, an output pane that shows the output of running the code as well as an integrated terminal.\nThe user is working on a Darwin machine. Please respond with system specific commands if applicable.\nThe active document is the source code the user is looking at right now.\nYou can only give one reply for each conversation turn.\n", "role":"system" }, { "content":"How to center a div?", "role":"user" } ] } ```
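The request identifiers and payload shape described above can be sketched as follows. This is an illustration only: the identifier conventions are those observed in CopilotChat.nvim, the system prompt is abbreviated, and none of this is an official API.

```python
import json
import secrets
import time
import uuid

# Hypothetical sketches of the identifiers described in the notes above.
def machine_id() -> str:
    # 32 random hex characters (0-9, a-f); kept stable across requests
    return secrets.token_hex(16)

def session_id() -> str:
    # A UUID with a millisecond Unix timestamp appended; regenerated
    # whenever a new access token is fetched
    return f"{uuid.uuid4()}{int(time.time() * 1000)}"

def request_id() -> str:
    # A fresh UUID per request
    return str(uuid.uuid4())

SYSTEM_PROMPT = "You are an AI programming assistant."  # abbreviated

def build_payload(history: list[dict]) -> dict:
    # The full chat history is resent on every call, prefixed by the
    # system message, matching the JSON example above.
    return {
        "temperature": 0.1,
        "model": "gpt-4",
        "intent": True,
        "top_p": 1,
        "stream": True,
        "n": 1,
        "messages": [{"role": "system", "content": SYSTEM_PROMPT}] + history,
    }

history = [{"role": "user", "content": "How to center a div?"}]
print(json.dumps(build_payload(history))[:40])
```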