k8sgpt-ai / k8sgpt

Giving Kubernetes Superpowers to everyone
http://k8sgpt.ai
Apache License 2.0

the --language flag does not work with the azureopenai backend #907

Open herveleclerc opened 9 months ago

herveleclerc commented 9 months ago

Checklist

Affected Components

K8sGPT Version

k8sgpt: 0.3.26 (Homebrew), built at: 2024-01-14T20:28:11Z

Kubernetes Version

v1.27.7

Host OS and its Version

MacOS

Steps to reproduce

I have configured 2 backends:

ai:
    providers:
        - name: azureopenai
          model: gpt-4
          password: xxxxxx
          baseurl: https://xxxxx.openai.azure.com/
          engine: gpt-4
          temperature: 0.7
          topp: 0.5
          maxtokens: 2048
        - name: openai
          model: gpt-3.5-turbo
          password: sk-xxxxx
          temperature: 0.7
          topp: 0.5
          maxtokens: 2048
    defaultprovider: ""
kubeconfig: ""
kubecontext: ""

When I try this:

k8sgpt analyze -b azureopenai --language "French"  --explain --with-doc --no-cache
AI Provider: azureopenai

0 cluster-api-provider-vcluster-system/cluster-api-provider-vcluster-metrics-service(cluster-api-provider-vcluster-metrics-service)
- Error: Service has no endpoints, expected label control-plane=controller-manager
Error: The Service is looking for Pods with the label control-plane=controller-manager but none are found, so it has no endpoints to route traffic to.

Solution:
1. Verify Pod labels with `kubectl get pods --show-labels`.
2. If missing, label Pods using `kubectl label pods <pod-name> control-plane=controller-manager`.
3. Check Service selector matches Pod labels with `kubectl describe service <service-name>`.

But with openai I get this:

k8sgpt analyze -b openai --language "French"  --explain --with-doc
AI Provider: openai

0 cluster-api-provider-vcluster-system/cluster-api-provider-vcluster-metrics-service(cluster-api-provider-vcluster-metrics-service)
- Error: Service has no endpoints, expected label control-plane=controller-manager
Error: Le service n'a pas de points de terminaison, l'étiquette control-plane=controller-manager est attendue.
Solution:
1. Vérifiez si le label "control-plane=controller-manager" est correctement appliqué au contrôleur de gestion.
2. Assurez-vous que le contrôleur de gestion est en cours d'exécution et accessible.
3. Vérifiez si les points de terminaison du service sont correctement configurés et qu'ils correspondent au label attendu.
4. Redémarrez le service ou le contrôleur de gestion si nécessaire.

Is it related to the prompt formatting?

Expected behaviour

French output when using azureopenai

Actual behaviour

English output when using azureopenai, even though I specify the --language flag

Additional Information

No response

herveleclerc commented 9 months ago

The strange thing is that if I reuse azureopenai after openai, I get some messages in French and others in English.

I created a bad deployment and services on purpose:

k8sgpt analyze -b azureopenai --language "French"  --explain --with-doc
AI Provider: azureopenai

0 argocd/argocd-application-controller(argocd-application-controller)
- Error: StatefulSet uses the service argocd/argocd-application-controller which does not exist.
  Kubernetes Doc: serviceName is the name of the service that governs this StatefulSet. This service must exist before the StatefulSet, and is responsible for the network identity of the set. Pods get DNS/hostnames that follow the pattern: pod-specific-string.serviceName.default.svc.cluster.local where "pod-specific-string" is managed by the StatefulSet controller.
Error: The StatefulSet is configured to use a service named 'argocd/argocd-application-controller' which cannot be found in the cluster.
Solution: 1. Verify service name. 2. Create service 'argocd-application-controller' in 'argocd' namespace. 3. Apply changes with 'kubectl apply'.
1 cluster-api-provider-vcluster-system/cluster-api-provider-vcluster-metrics-service(cluster-api-provider-vcluster-metrics-service)
- Error: Service has no endpoints, expected label control-plane=controller-manager
Error: The Service is looking for Pods with the label control-plane=controller-manager but none are found, so it has no endpoints to route traffic to.

Solution:
1. Verify Pod labels with `kubectl get pods --show-labels`.
2. If missing, label Pods using `kubectl label pods <pod-name> control-plane=controller-manager`.
3. Check Service selector matches Pod labels with `kubectl describe service <service-name>`.
2 default/pi-lb(pi-lb)
- Error: Service has no endpoints, expected label app=pi-web-pod
Error: Le Service ne possède aucun endpoint car il attend des pods avec le label app=pi-web-pod, mais aucun ne correspond.

Solution: 1. Vérifiez les pods avec `kubectl get pods --show-labels`. 2. Ajoutez/Corrigez le label avec `kubectl label pods <pod-name> app=pi-web-pod`. 3. Confirmez avec `kubectl get endpoints`.
3 default/pi-np(pi-np)
- Error: Service has no endpoints, expected label app=pi-web-pod
Error: Le Service ne possède aucun endpoint car il attend des pods avec le label app=pi-web-pod, mais aucun ne correspond.

Solution: 1. Vérifiez les pods avec `kubectl get pods --show-labels`. 2. Ajoutez/Corrigez le label avec `kubectl label pods <pod-name> app=pi-web-pod`. 3. Confirmez avec `kubectl get endpoints`.
VaibhavMalik4187 commented 9 months ago

The following code is used to generate the response for the OpenAI backend:

    resp, err := c.client.CreateChatCompletion(ctx, openai.ChatCompletionRequest{
        Model: c.model,
        Messages: []openai.ChatCompletionMessage{
            {
                Role:    "user",
                Content: prompt,
            },
        },
        Temperature:      c.temperature,
        MaxTokens:        maxToken,
        PresencePenalty:  presencePenalty,
        FrequencyPenalty: frequencyPenalty,
        TopP:             topP,
    })

For AzureOpenAI, it is the same code but with fewer parameters:

    resp, err := c.client.CreateChatCompletion(ctx, openai.ChatCompletionRequest{
        Model: c.model,
        Messages: []openai.ChatCompletionMessage{
            {
                Role:    openai.ChatMessageRoleUser,
                Content: prompt,
            },
        },
        Temperature: c.temperature,
    })

I wonder if that's what's making the difference in the response.

VaibhavMalik4187 commented 9 months ago

After exploring this further, I think the TopP parameter set by the OpenAI client is making the difference. A low TopP restricts sampling to the highest-probability tokens at each step of text generation, which makes completions more deterministic and predictable; the Azure client leaves it unset.

@arbreezy any thoughts on this?
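For reference on the TopP semantics: nucleus (top-p) sampling keeps the smallest set of tokens whose cumulative probability reaches p, so p=1 keeps the full distribution while a small p restricts sampling to the most likely tokens. A self-contained sketch of the candidate-set selection (illustrative only, not the API's internals):

```go
package main

import (
	"fmt"
	"sort"
)

// nucleus returns the smallest set of tokens, taken in descending
// probability order, whose cumulative probability is >= p.
func nucleus(probs map[string]float64, p float64) []string {
	type tok struct {
		s string
		p float64
	}
	toks := make([]tok, 0, len(probs))
	for s, pr := range probs {
		toks = append(toks, tok{s, pr})
	}
	sort.Slice(toks, func(i, j int) bool { return toks[i].p > toks[j].p })

	var out []string
	cum := 0.0
	for _, t := range toks {
		out = append(out, t.s)
		cum += t.p
		if cum >= p {
			break
		}
	}
	return out
}

func main() {
	probs := map[string]float64{"the": 0.5, "a": 0.3, "an": 0.2}
	fmt.Println(nucleus(probs, 0.5)) // [the]        — only the top token survives
	fmt.Println(nucleus(probs, 1.0)) // [the a an]   — full distribution
}
```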

arbreezy commented 8 months ago

Caching will also retrieve the possible solution based on the analysis message, in whichever language it was logged first.
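That would explain the mixed-language output above: a cache keyed only on the analysis message returns the stored text regardless of the `--language` of the current run. A hypothetical sketch of the pitfall (illustrative, not k8sgpt's actual cache implementation):

```go
package main

import "fmt"

// cache maps an analysis error message to a stored explanation;
// note the key carries no language information.
type cache map[string]string

// explain returns a cached explanation if one exists, otherwise
// generates one in the requested language and stores it.
func explain(c cache, errMsg, lang string) string {
	if cached, ok := c[errMsg]; ok {
		return cached // returned in whichever language was stored first
	}
	result := fmt.Sprintf("[%s explanation for: %s]", lang, errMsg)
	c[errMsg] = result
	return result
}

func main() {
	c := cache{}
	fmt.Println(explain(c, "Service has no endpoints", "English"))
	// Cache hit: still the English text, despite asking for French.
	fmt.Println(explain(c, "Service has no endpoints", "French"))
}
```

This is consistent with the reporter only seeing consistent French output when passing `--no-cache`.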