atul86244 opened 7 months ago
Hey @atul86244, we support OpenAI's API spec; do you have a different use case in mind?
Hi @arbreezy, thanks for your response. I was going through https://docs.k8sgpt.ai/reference/providers/backend/ and trying to figure out how I can point k8sGPT to my company's AI backend. If I have my own custom AI that exposes an endpoint, can I point k8sGPT to it?
I am not sure if the spec below provides a way to do that:
```bash
kubectl apply -f - << EOF
apiVersion: core.k8sgpt.ai/v1alpha1
kind: K8sGPT
metadata:
  name: k8sgpt-sample
  namespace: k8sgpt-operator-system
spec:
  ai:
    enabled: true
    model: gpt-3.5-turbo
    backend: openai
    secret:
      name: k8sgpt-sample-secret
      key: openai-api-key
    # anonymized: false
    # language: english
  noCache: false
  repository: ghcr.io/k8sgpt-ai/k8sgpt
  version: v0.3.8
  # integrations:
  #   trivy:
  #     enabled: true
  #     namespace: trivy-system
  # filters:
  #   - Ingress
  # sink:
  #   type: slack
  #   webhook: <webhook-url> # use the sink secret if you want to keep your webhook url private
  #   secret:
  #     name: slack-webhook
  #     key: url
  # extraOptions:
  #   backstage:
  #     enabled: true
EOF
```
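For illustration, a custom endpoint could be expressed in the CR roughly as below. Note that the `baseUrl` field is an assumption here, not a confirmed part of the v0.3.8 spec; please verify against your operator version before relying on it.

```yaml
apiVersion: core.k8sgpt.ai/v1alpha1
kind: K8sGPT
metadata:
  name: k8sgpt-sample
  namespace: k8sgpt-operator-system
spec:
  ai:
    enabled: true
    backend: openai
    model: gpt-3.5-turbo
    # Hypothetical field: point the openai backend at an in-house,
    # OpenAI-compatible gateway instead of api.openai.com.
    baseUrl: https://llm.internal.example.com/v1
    secret:
      name: k8sgpt-sample-secret
      key: openai-api-key
```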
I am also interested in this. I have a custom API endpoint that supports the OpenAI API spec, but neither tinyllama nor localAI supports auth tokens, which my endpoint requires. Could we either add a custom baseURL field to the openai provider, or an auth token field to localAI or tinyllama? Please correct me if this already exists.
Thanks!
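To make the ask concrete: a call to an OpenAI-spec-compatible self-hosted endpoint differs from a stock OpenAI call only in the base URL and the bearer token. A minimal sketch in Python, where the gateway URL and token are made-up placeholders:

```python
import json


def build_chat_request(base_url: str, api_key: str, model: str, messages: list):
    """Build an OpenAI-spec chat completion request against a custom base URL.

    Any OpenAI-compatible gateway that checks a bearer token would be
    called the same way; only base_url and api_key change.
    """
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # the auth token this issue asks for
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body


url, headers, body = build_chat_request(
    "https://llm.internal.example.com/v1",  # made-up in-house gateway
    "my-company-token",
    "gpt-3.5-turbo",
    [{"role": "user", "content": "Why is my Pod in CrashLoopBackOff?"}],
)
print(url)  # https://llm.internal.example.com/v1/chat/completions
```

This is exactly the shape of request a configurable baseURL plus auth token on the openai backend would let k8sGPT emit.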
Hi team, can you please help with this?
+1. Many users and corporations host their various LLMs behind a self-hosted API gateway (e.g. AWS API Gateway, Kong) over REST, regardless of which model sits behind it. In that case, it is simply a request call to the backend API.
Checklist
Is this feature request related to a problem?
No
Problem Description
Please add support for custom AI backends in k8sGPT. This would let people use k8sGPT with in-house AI backends, increasing adoption of k8sGPT.
Solution Description
We need the ability to use k8sGPT with custom in-house AI backends. For example, I want to run k8sGPT in my company and use the company's AI solution as the AI backend for k8sGPT.
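On the CLI side, one possible shape would be a base-URL option on `auth add`. The `--baseurl` flag and endpoint below are assumptions for illustration and may not exist in older releases; check `k8sgpt auth add --help` on your version.

```sh
# Point the openai backend at an in-house, OpenAI-compatible endpoint.
# The --baseurl flag and the URL are illustrative assumptions.
k8sgpt auth add --backend openai \
  --model gpt-3.5-turbo \
  --baseurl https://llm.internal.example.com/v1

# Then analyze the cluster as usual.
k8sgpt analyze --explain
```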
Benefits
This would help people use k8sGPT with in-house AI backends, increasing adoption of k8sGPT.
Potential Drawbacks
No response
Additional Information
No response