
Custom Suggestion Service for Copilot for Xcode

This extension offers a custom suggestion service for Copilot for Xcode, allowing you to leverage a chat model to enhance the suggestions provided as you write code.

Installation

  1. Install the application in the Applications folder.
  2. Launch the application.
  3. Open Copilot for Xcode and navigate to "Extensions".
  4. Click "Select Extensions" and enable this extension.
  5. You can now set this application as the suggestion provider in the suggestion settings.

Update

You can update the app directly from within the app itself. After updating, perform one of the following steps so that Copilot for Xcode recognizes the new version:

  1. Restart the CopilotForXcodeExtensionService.
  2. Alternatively, terminate the "Custom Suggestion Service (CopilotForXcodeExtensionService)" process, open the extension manager in Copilot for Xcode, and click "Restart Extensions".

We are exploring better ways to streamline the update process.

Settings

The app supports three types of suggestion services: OpenAI-compatible APIs, Tabby, and Ollama.

If you are new to running a model locally, you can try Ollama or LM Studio.

Recommended Settings

When using custom models to generate suggestions, it is recommended to set a lower suggestion limit for faster generation.

Others

In other situations, it is advisable to use a custom model with the completions API rather than the chat completions API, and to use the default request strategy.
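As a sketch of what such a completions request looks like, the snippet below builds a request body for an OpenAI-compatible `/v1/completions` endpoint. The model name, parameter values, and the helper function itself are illustrative assumptions, not part of this project; substitute whatever your local server (Ollama, LM Studio, etc.) actually serves.

```python
import json

# Hypothetical helper: builds a request body for an OpenAI-compatible
# /v1/completions endpoint. The model name and values are examples only.
def build_completion_request(prefix: str,
                             model: str = "codellama:7b-code",
                             max_tokens: int = 128) -> dict:
    return {
        "model": model,
        "prompt": prefix,          # code before the cursor
        "max_tokens": max_tokens,  # a lower limit speeds up generation
        "temperature": 0.2,        # low temperature for more stable code
        "stop": ["\n\n"],          # stop early to keep suggestions short
    }

body = build_completion_request("func fibonacci(_ n: Int) -> Int {")
print(json.dumps(body, indent=2))
```

A lower `max_tokens` value here mirrors the "lower suggestion limit" recommendation above: shorter completions return faster.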

Ensure that the prompt format remains as simple as the following:

{System}
{User}
{Assistant}

The exact template syntax varies between tools.
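For illustration, in an Ollama Modelfile an equivalently simple template might look like the following (Go-template syntax per Ollama's Modelfile format; whether this suits your model is an assumption, so check your model's own template):

```
TEMPLATE """{{ .System }}
{{ .Prompt }}
{{ .Response }}"""
```

Other tools (e.g. LM Studio) expose the same three roles through their own prompt-format settings.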

Strategies

Contribution

Prompt engineering is a challenging task, and your assistance is invaluable.

The most complex parts of the codebase are located in the Core package.