redhat-developer / vscode-granite

Use IBM Granite Code LLM as your Code Assistant in Visual Studio Code
Apache License 2.0

Improvements for Red Hat OpenShift Dev Spaces #96

Closed · ibuziuk closed this issue 1 week ago

ibuziuk commented 1 week ago

Currently, there are a few concerns, mainly related to 'ollama' availability in the CDE (Cloud Development Environment):

  1. If a user installs the extension in a CDE where ollama is not available, they are asked to install it with sudo permissions:
[Screenshot: prompt asking to install ollama with sudo permissions]

In general, running containers as root is a significant security risk. Containers are also supposed to be immutable, so installing anything inside the container is not recommended: after a restart, all the installed packages vanish. What is more, the containers run with arbitrary user IDs, which provides additional security against processes escaping the container through a container engine vulnerability and thereby gaining escalated permissions on the host node.

In Dev Spaces, if ollama is expected to be used, it should be defined as part of the devfile, as described in https://developers.redhat.com/articles/2024/08/12/integrate-private-ai-coding-assistant-ollama
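As a minimal sketch of that direction (not code from this repository; the helper name and URL are assumptions, with 11434 being ollama's default port), the extension could probe for a devfile-provided ollama sidecar instead of offering a sudo install inside the container:

```typescript
// Hypothetical helper: probe for an already-running ollama server (for example,
// a devfile-defined sidecar container) before ever proposing an installation.
// The URL is an assumption for the sketch; 11434 is ollama's default port.
const DEFAULT_OLLAMA_URL = "http://localhost:11434";

async function isOllamaReachable(baseUrl: string = DEFAULT_OLLAMA_URL): Promise<boolean> {
  try {
    // A running ollama server answers a plain GET / with "Ollama is running".
    const response = await fetch(baseUrl, { signal: AbortSignal.timeout(2000) });
    return response.ok;
  } catch {
    // Unreachable: report the missing devfile component instead of installing
    // anything inside the (immutable, non-root) container.
    return false;
  }
}
```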

  2. If a user installs the extension in a CDE where ollama is already installed, e.g. the 'Ollama' getting started sample on Developer Sandbox:

the extension detects ollama and seems to configure the Granite model, but in reality another model is used:

[Screenshot: the chat response claims to be an OpenAI model]
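A hedged sketch of how this could be verified (again hypothetical, not the extension's actual code): check the configured model name against what the detected ollama server really serves, using ollama's documented `GET /api/tags` endpoint:

```typescript
// Hypothetical check: list the models the detected ollama server actually
// serves, so Granite is only reported as configured when a granite-code tag
// is really present rather than some other preloaded model.
interface OllamaTagsResponse {
  models: { name: string }[];
}

async function hasModel(
  modelName: string,
  baseUrl: string = "http://localhost:11434"
): Promise<boolean> {
  // GET /api/tags returns the models available locally on the server.
  const response = await fetch(`${baseUrl}/api/tags`);
  if (!response.ok) {
    return false;
  }
  const data = (await response.json()) as OllamaTagsResponse;
  return data.models.some(
    (m) => m.name === modelName || m.name.startsWith(`${modelName}:`)
  );
}

// Example usage: warn instead of silently using whatever model is loaded.
void hasModel("granite-code").then((present) => {
  if (!present) {
    console.warn("granite-code is not available on the detected ollama server");
  }
});
```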

Steps to reproduce:

fbricon commented 1 week ago

Fixed with #97.

The "I'm an Open AI model" is simply Granite hallucinating, nothing we can do here. We're waiting for updated models