danielmiessler / fabric

fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.
https://danielmiessler.com/p/fabric-origin-story
MIT License
25.77k stars · 2.74k forks

[Feature request]: Groq Integration Example/ Tutorial #361

Closed: scaruslooner closed this issue 7 months ago

scaruslooner commented 7 months ago

What do you need?

Can somebody show me groq example with fabric ?

I was trying to use fabric with Groq (NOT Grok) when I ran into "Error: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/models' For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401". I used my Groq key when fabric --setup asked for an OpenAI key.

I'm pretty sure you can use Groq through OpenAI-compatible endpoints, but I don't know how to do it. It has OpenAI compatibility: https://console.groq.com/docs/openai
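
For reference, the OpenAI compatibility means the same request shape works against Groq's endpoint. A minimal curl sketch, assuming a GROQ_API_KEY environment variable (not something fabric --setup creates):

curl -s https://api.groq.com/openai/v1/chat/completions \
  -H "Authorization: Bearer $GROQ_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3-70b-8192", "messages": [{"role": "user", "content": "Hello"}]}'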

Console transcript:

(fabric) C:\Users\camer\Documents\fabric>set OPENAI_MODEL_NAME=llama3-70b-8192

(fabric) C:\Users\camer\Documents\fabric>fabric --pattern summarize
Error: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401

(fabric) C:\Users\camer\Documents\fabric>export OPENAI_API_BASE=https://api.groq.com/openai/v1
'export' is not recognized as an internal or external command, operable program or batch file.

(fabric) C:\Users\camer\Documents\fabric>set OPENAI_API_BASE_URL=https://api.groq.com/openai/v1

(fabric) C:\Users\camer\Documents\fabric>fabric --pattern summarize
Error: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401

(fabric) C:\Users\camer\Documents\fabric>fabric --pattern summarize --model llama3-70b-8192
Error: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401
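
A side note on the transcript: export is a Unix shell builtin, so cmd rejects it; set is the cmd equivalent (session-scoped). A hedged sketch of the intended commands on Windows; whether this version of fabric actually reads OPENAI_API_BASE_URL is precisely the open question in this issue:

REM cmd uses "set" for the current session (setx would persist it)
set OPENAI_API_BASE_URL=https://api.groq.com/openai/v1
set OPENAI_API_KEY=gsk_your_groq_key_here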

xssdoctor commented 7 months ago

Ok, so I looked up the Groq thing you're talking about. It seems to just be a wrapper around other LLMs. Fabric is also a wrapper around other LLMs. You can't and shouldn't use Groq with fabric; just pick your LLM and use it with either tool.

DaiZack commented 6 months ago

@xssdoctor The Groq API is not just a wrapper; it is a cloud runtime that provides free/paid LPU compute to run models. The request syntax is the same as OpenAI's: you only need to swap the endpoint URL and API key. If fabric exposed a parameter to configure the OpenAI endpoint, this would be easy to support.
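
For what it's worth, tools built on the official OpenAI SDKs typically honor the OPENAI_BASE_URL and OPENAI_API_KEY environment variables, so redirecting such a tool at Groq is a two-line change. A sketch only, with no claim that the Python fabric of that era read these variables:

export OPENAI_BASE_URL=https://api.groq.com/openai/v1
export OPENAI_API_KEY="$GROQ_API_KEY"  # a Groq key, despite the variable name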

DaiZack commented 6 months ago

@scaruslooner I just found a solution:

  1. Install jq if you haven't already. You can usually do this using your package manager. For example, on Ubuntu, you can use:

    sudo apt-get install jq
  2. Create a groq function in your .bashrc or .bash_profile:


function groq() {
    local message=""
    local model="llama3-8b-8192"
    local OPTIND opt  # reset getopts state so repeated calls in one shell session work

    while getopts ":m:" opt; do
        case ${opt} in
            m )
                case "$OPTARG" in
                    llama3-8b)
                        model="llama3-8b-8192"
                        ;;
                    llama3-70b)
                        model="llama3-70b-8192"
                        ;;
                    mixtral)
                        model="mixtral-8x7b-32768"
                        ;;
                    gemma)
                        model="gemma-7b-it"
                        ;;
                    *)
                        echo "Invalid model: $OPTARG" >&2
                        return 1
                        ;;
                esac
                ;;
            \? )
                echo "Invalid option: $OPTARG" >&2
                return 1
                ;;
            : )
                echo "Invalid option: $OPTARG requires an argument" >&2
                return 1
                ;;
        esac
    done
    shift $((OPTIND -1))
    message="$*"

    if [ -z "$message" ]; then
        echo "Usage: groq [-m model] \"message\""
        return 1
    fi

    # Build the JSON payload with jq so quotes and newlines in the message are escaped safely
    curl -s -X POST "https://api.groq.com/openai/v1/chat/completions" \
         -H "Authorization: Bearer $GROQ_API_KEY" \
         -H "Content-Type: application/json" \
         -d "$(jq -n --arg content "$message" --arg model "$model" \
               '{messages: [{role: "user", content: $content}], model: $model}')" \
         | jq -r '.choices[0].message.content'
}
  3. Save the file and reload your shell configuration by running:
source ~/.bashrc

or

source ~/.bash_profile

Now, when you use the groq function, it will print only the response message content:

groq "Explain the importance of fast language models"
groq -m llama3-70b "Explain the importance of fast language models"
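
The function assumes a GROQ_API_KEY variable is already exported, e.g. in the same .bashrc; the key itself comes from the Groq console, and the value below is a placeholder:

export GROQ_API_KEY="gsk_xxxxxxxx"  # placeholder, not a real key
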
hobbytp commented 6 months ago

You can use LiteLLM, which helps adapt different LLM providers to the OpenAI API.

Configure LiteLLM

The LiteLLM configuration file is:

model_list:
  - model_name: mixtral-8x7b-32768
    litellm_params:
      model: groq/mixtral-8x7b-32768
      api_key: <YOUR groq KEY>
  - model_name: llama3-70b-8192
    litellm_params:
      model: groq/llama3-70b-8192
      api_key: <YOUR groq KEY>

Start LiteLLM with the above configuration file:

litellm --config litellm_proxy_config_groq.yaml
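
One assumption worth flagging: the OPENAI_BASE_URL below expects the proxy on port 8000, and LiteLLM's default port has varied across versions; pinning it explicitly with the CLI's --port flag avoids a mismatch:

litellm --config litellm_proxy_config_groq.yaml --port 8000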

Export the OpenAI environment variables so they point at the proxy:

export OPENAI_BASE_URL=http://localhost:8000
export DEFAULT_MODEL="groq/llama3-70b-8192"
export OPENAI_API_KEY="Your groq key"

Test it:

fabric --pattern ai --text 'Why is the sky blue?'
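
Before pointing fabric at the proxy, it can help to confirm which model aliases the proxy actually serves, since requests must use the model_name values from the config. A sketch, assuming the proxy mirrors the OpenAI models route:

curl -s http://localhost:8000/v1/models -H "Authorization: Bearer $OPENAI_API_KEY" | jq
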
xb8023 commented 4 months ago

> You can use LiteLLM, which helps adapt different LLM providers to the OpenAI API. […]

I still get an error with the setup above: "Error code: 404 - {'error': {'message': 'The model groq/llama3-8b-8192 does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'code': 'model_not_found'}}"

Then I tried to list the models with 'fabric --listmodels':

GPT Models:

Local Models:
gemma2:latest
deepseek-coder-v2:latest
qwen2:latest

Claude Models:

Google Models:

How can I proceed to use the Groq API? After I changed the model name to drop the 'groq/' prefix, leaving just the bare name (the proxy serves models under the model_name aliases from the LiteLLM config; the groq/... form belongs only in litellm_params.model), I got a response, though the output ended with extra log noise like 'grpc.server_uri=dns:///generativelanguage.googleapis.com:443}}: backoff delay elapsed, reporting IDLE':

I tried the prompt 'introduce yourself', and then I got:

It seems to work, but I have one more question: how can I get only the answer, or how can I use the model's API directly?
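
To get only the answer, one option is to skip fabric and hit the LiteLLM proxy directly, extracting the content with jq exactly as DaiZack's groq function does. A sketch, assuming the proxy from the comments above is running on localhost:8000:

curl -s http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3-70b-8192", "messages": [{"role": "user", "content": "introduce yourself"}]}' \
  | jq -r '.choices[0].message.content'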