caikit / caikit-nlp


bad request/validation errors and default value overwrite #371

Closed waleedqk closed 4 months ago

waleedqk commented 4 months ago

Describe the bug

When the user sets only generated_tokens to false without specifying other parameters, they can end up with an obscure 400/422 error. This likely happens because other fields silently default to true without the user knowing.

Additionally, the default value of include_stop_sequence should be null instead of true. There are models whose include_stop_sequence defaults to false, but caikit always overrides it to true.
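
A minimal sketch of the difference, assuming a hypothetical parameter container (the names below are illustrative, not the actual caikit schema): with a hard-coded true default the value is always forwarded and overrides the model, while a null/None default lets unset fields fall through to the model's own behavior.

from dataclasses import dataclass, asdict
from typing import Optional

# Hypothetical parameter container, for illustration only.
@dataclass
class TextGenParams:
    max_new_tokens: Optional[int] = None
    min_new_tokens: Optional[int] = None
    generated_tokens: Optional[bool] = None
    # A hard-coded `= True` here would always override the model's own
    # default; `None` means "not set, let the model decide".
    include_stop_sequence: Optional[bool] = None

def to_backend_kwargs(params: TextGenParams) -> dict:
    """Forward only the fields the user actually set, so model defaults survive."""
    return {k: v for k, v in asdict(params).items() if v is not None}

# The user only sets generated_tokens; include_stop_sequence stays unset,
# so a model whose default is false is not overridden to true.
print(to_backend_kwargs(TextGenParams(generated_tokens=False)))
# -> {'generated_tokens': False}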

To Reproduce

curl --request POST -k \
  --url https://<hostname>/api/v1/task/classification-with-text-generation \
  --header 'Content-Type: application/json' \
  --data '{
  "inputs": "Why is Jason such a jerk?",
  "model_id": "llm-model",
  "guardrail_config": {
    "input": {
      "models": { }
    },
    "output": {
      "models": {
              "detector-1": {"threshold": 0.0}, "detector-2": {"threshold": 0.2}
        }
    }
  },
  "text_gen_parameters": {
    "max_new_tokens": 30,
    "min_new_tokens": 30,
    "generated_tokens": false
  }
}'
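
The same request can also be reproduced from Python. This is just a sketch using the requests library, with the hostname and model_id kept as the placeholders from the report above:

import requests

# Placeholder endpoint from the report; replace with a real hostname.
url = "https://<hostname>/api/v1/task/classification-with-text-generation"

payload = {
    "inputs": "Why is Jason such a jerk?",
    "model_id": "llm-model",
    "guardrail_config": {
        "input": {"models": {}},
        "output": {
            "models": {
                "detector-1": {"threshold": 0.0},
                "detector-2": {"threshold": 0.2},
            }
        },
    },
    "text_gen_parameters": {
        "max_new_tokens": 30,
        "min_new_tokens": 30,
        # Only this flag is set explicitly; the other boolean fields fall back
        # to server-side defaults, which is where the 400/422 seems to originate.
        "generated_tokens": False,
    },
}

# verify=False mirrors curl's -k flag (skip TLS verification).
resp = requests.post(url, json=payload, verify=False, timeout=60)
print(resp.status_code, resp.text)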