haseeb-heaven / code-interpreter

An innovative open-source Code Interpreter with (GPT,Gemini,Claude,LLaMa) models.
https://pypi.org/project/open-code-interpreter/
MIT License

Interpreter

Hosting and Spaces: Colab, Replit, and PyPI.
Welcome to Code-Interpreter 🎉, an innovative, open-source, and free alternative to traditional code interpreters. This powerful tool leverages GPT-3.5 Turbo, PaLM 2, Groq, Claude, and HuggingFace models such as Code Llama, Mistral 7B, Wizard Coder, and many more to transform your instructions into executable code, for free and in a safe-to-use environment. It even offers vision models for image processing.

Code-Interpreter is more than just a code generator. It's a versatile tool that can execute a wide range of tasks. Whether you need to find files on your system 📂, save images from a website and convert them into a different format 🖼️, create a GIF 🎞️, edit videos 🎥, or analyze files and create graphs 📊, Code-Interpreter can handle it all.

After processing your instructions, Code-Interpreter executes the generated code and provides you with the result. This makes it an invaluable tool for developers 💻, data scientists 🧪, and anyone who needs to quickly turn ideas into working code; with vision models, it can now also process images and videos.

Designed with versatility in mind, Code-Interpreter works seamlessly on every major operating system, including Windows, macOS, and Linux. So, no matter what platform you're on, you can take advantage of this powerful tool 💪.

Experience the future of code interpretation with Code-Interpreter today! 🚀

Why is this Interpreter Unique?

The distinguishing feature of this interpreter, compared to others, is its commitment to remaining free 🆓. It does not require you to download any model or follow tedious setup processes. It is designed to be simple and free for all users, and it works on all major operating systems: Windows, Linux, and macOS.

Future Plans:

Table of Contents

📥 Installation

Installation with the Python package manager.

To install Code-Interpreter, run the following command:

pip install open-code-interpreter

Installation with Git

To get started with Code-Interpreter, follow these steps:

  1. Clone the repository:
    git clone https://github.com/haseeb-heaven/code-interpreter.git
    cd code-interpreter
  2. Install the required packages:
    pip install -r requirements.txt
  3. Setup the Keys required.

API key setup for all models.

Follow the steps below to obtain and set up the API keys for each service:

  1. Obtain the API keys:

    • HuggingFace: Visit HuggingFace Tokens and get your Access Token.
    • Google Palm and Gemini: Visit Google AI Studio and click on the Create API Key button.
    • OpenAI: Visit OpenAI Dashboard, sign up or log in, navigate to the API section in your account dashboard, and click on the Create New Key button.
    • Groq AI: Obtain access here, then visit Groq AI Console, sign up or log in, navigate to the API section in your account, and click on the Create API Key button.
    • Anthropic AI: Obtain access here, then visit Anthropic AI Console, sign up or log in, navigate to the API Keys section in your account, and click on the Create Key button.
  2. Save the API keys:

    • Create a .env file in your project root directory.
    • Open the .env file and add the following lines, replacing Your API Key with the respective keys:
export HUGGINGFACE_API_KEY="Your HuggingFace API Key"
export PALM_API_KEY="Your Google Palm API Key"
export GEMINI_API_KEY="Your Google Gemini API Key"
export OPENAI_API_KEY="Your OpenAI API Key"
export GROQ_API_KEY="Your Groq AI API Key"
export ANTHROPIC_API_KEY="Your Anthropic AI API Key"
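Only the keys for the backends you actually use need to be set. A minimal sketch of a startup check for these environment variables (the key names come from the list above; the check itself is illustrative, not the interpreter's own code):

```python
import os

# The environment variable names the interpreter reads, per the list above.
REQUIRED_KEYS = [
    "HUGGINGFACE_API_KEY", "PALM_API_KEY", "GEMINI_API_KEY",
    "OPENAI_API_KEY", "GROQ_API_KEY", "ANTHROPIC_API_KEY",
]

# Collect any keys that are unset or empty, so you can warn early
# instead of failing mid-request.
missing = [key for key in REQUIRED_KEYS if not os.environ.get(key)]
if missing:
    print("Missing API keys:", ", ".join(missing))
```

Remember to load the .env file into your shell (e.g. `source .env`) before running the interpreter, since the file uses `export` statements.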

Offline models setup.

This interpreter supports offline models via LM Studio and Ollama. Download them from the LM Studio and Ollama websites, then follow the steps below.

  1. Run the interpreter with Python:

    Running with Python.

    python interpreter.py -md 'code' -m 'gpt-3.5-turbo' -dc 
  2. Run the interpreter directly:

Running the interpreter without Python (executable, macOS/Linux only).

    ./interpreter -md 'code' -m 'gpt-3.5-turbo' -dc 
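The flag layout in the commands above can be sketched with argparse; this is an illustrative re-creation, not the interpreter's actual parser, and the meaning of -dc is assumed here to be a simple boolean switch:

```python
import argparse

# Hypothetical sketch of the CLI flags shown above: -md selects the mode,
# -m selects the model (matching a .config file), -dc is a boolean switch.
parser = argparse.ArgumentParser(description="Code-Interpreter CLI (sketch)")
parser.add_argument("-md", "--mode", default="code", help="interpreter mode, e.g. 'code'")
parser.add_argument("-m", "--model", default="gpt-3.5-turbo", help="model name")
parser.add_argument("-dc", action="store_true", help="switch used in the examples above")

# Parse the example invocation from the docs.
args = parser.parse_args(["-md", "code", "-m", "gpt-3.5-turbo", "-dc"])
print(args.mode, args.model, args.dc)
```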

🌟 Features

πŸ› οΈ Usage

To use Code-Interpreter, use the following command options:

Interpreter Commands 🖥️

Here are the available commands:

βš™οΈ Settings

You can customize the settings of the current model from the .config file. It contains all the necessary parameters such as temperature, max_tokens, and more.
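For illustration, a model's .config file might contain parameter lines like these (temperature and max_tokens are the parameters named above; the values shown are placeholders, not defaults):

```
temperature = 0.1
max_tokens = 1024
```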

Steps to add your own custom API Server

To integrate your own API server for OpenAI instead of the default server, follow these steps:

  1. Navigate to the Configs directory.
  2. Open the configuration file for the model you want to modify. This could be either gpt-3.5-turbo.config or gpt-4.config.
  3. Add the following line at the end of the file:
    api_base = https://my-custom-base.com

    Replace https://my-custom-base.com with the URL of your custom API server.

  4. Save and close the file. Now, whenever you select the gpt-3.5-turbo or gpt-4 model, the system will automatically use your custom server.
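The api_base line above follows a simple key = value format. A minimal sketch of how such a line could be read back from a config file (an illustration of the format, not the project's actual loader):

```python
def read_config_value(text: str, key: str):
    """Return the value for `key` from key = value config text, or None."""
    for line in text.splitlines():
        if "=" in line:
            name, _, value = line.partition("=")
            if name.strip() == key:
                return value.strip()
    return None

# Example config content mirroring the steps above.
sample = "temperature = 0.1\napi_base = https://my-custom-base.com\n"
print(read_config_value(sample, "api_base"))  # -> https://my-custom-base.com
```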

Steps to add new Hugging Face model

Manual Method

  1. 📋 Copy the .config file and rename it to configs/hf-model-new.config.
  2. 🛠️ Modify the parameters of the model, such as start_sep, end_sep, and skip_first_line.
  3. 📝 Set the model name from Hugging Face to HF_MODEL = 'Model name here'.
  4. 🚀 Now you can use it like this: python interpreter.py -m 'hf-model-new' -md 'code' -e.
  5. 📝 Make sure the -m 'hf-model-new' matches the config file name inside the configs folder.

Automatic Method

  1. 🚀 Go to the scripts directory and run the config_builder script.
  2. 🔧 For Linux/macOS, run config_builder.sh; for Windows, run config_builder.bat.
  3. 📝 Follow the instructions and enter the model name and parameters.
  4. 📋 The script will automatically create the .config file for you.

Star History

Star History Chart

🤝 Contributing

If you're interested in contributing to Code-Interpreter, we'd love to have you! Please fork the repository and submit a pull request. We welcome all contributions and are always eager to hear your feedback and suggestions for improvements.

📌 Versioning

🚀 v1.0 - Initial release.
📊 v1.1 - Added graphs and charts support.
🔥 v1.2 - Added LiteLLM support.
🌟 v1.3 - Added GPT-3.5 support.
🌴 v1.4 - Added PaLM 2 support.
🎉 v1.5 - Added official support for the GPT-3.5/4 models.
📝 v1.6 - Updated Code Interpreter for document files (JSON, CSV, XML).
🌴 v1.7 - Added Gemini Pro Vision support for image processing.
🌟 v1.8 - Added interpreter commands support.
🗨️ v1.9 - Added new Chat mode 🗨️ for chatting with your files, data, and more.
🔥 v2.0 - Added Groq AI models, the fastest LLM backend at 500 tokens/sec, with Code Llama and Mixtral models.
🔥 v2.1 - Added Anthropic AI Claude 3 models: the powerful Opus, Sonnet, and Haiku models.

📜 License

This project is licensed under the MIT License. For more details, please refer to the LICENSE file.

Please note the following additional licensing details:

πŸ™ Acknowledgments

πŸ“ Author

This project is created and maintained by Haseeb-Heaven.