canonical / kserve-operators

Charmed KServe

KServe Batcher isn't working #255

Open amouu opened 4 months ago

amouu commented 4 months ago

Bug Description

I am testing the KServe batcher, but I encountered an issue where the agent reports the following error: "error: unknown flag enable-batcher."

Can anyone help me understand and resolve this error?

[Screenshot: agent error output, 2024-07-17]

To Reproduce

I am using the YAML configuration below:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: "torchserve"
spec:
  predictor:
    minReplicas: 1
    timeout: 60
    batcher:
      maxBatchSize: 32
      maxLatency: 500
    model:
      modelFormat:
        name: pytorch
      storageUri: gs://kfserving-examples/models/torchserve/image_classifier/v1
```
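When the batcher is enabled, the KServe controller injects batcher-related flags into the agent sidecar of the predictor pod, so the pod spec shows exactly which flags the agent binary is being started with. A diagnostic sketch for comparing those flags against what the agent actually supports (the `default` namespace is an assumption; adjust to where the InferenceService is deployed):

```shell
# List the pods created for the "torchserve" InferenceService.
# (Namespace "default" is an assumption; change -n as needed.)
kubectl get pods -n default -l serving.kserve.io/inferenceservice=torchserve

# Print each container's name and args in the first matching pod, so the
# flags passed to the agent sidecar can be inspected directly.
kubectl get pod -n default \
  -l serving.kserve.io/inferenceservice=torchserve \
  -o jsonpath='{range .items[0].spec.containers[*]}{.name}{": "}{.args}{"\n"}{end}'
```

If `enable-batcher` appears in the injected args but the agent rejects it, the agent binary in the image is likely older than (or different from) the one the controller expects.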

Environment

microk8s 1.29, Charmed Kubeflow 1.9/beta (revision 418), kserve-controller latest/edge (revision 593)

Relevant Log Output

error: unknown flag enable-batcher.

Additional Context

No response

syncronize-issues-to-jira[bot] commented 4 months ago

Thank you for reporting your feedback!

The internal ticket has been created: https://warthogs.atlassian.net/browse/KF-6027.

This message was autogenerated

misohu commented 3 months ago

Because we are using a custom rock for the batcher, we think it may be the cause of the problem. As a next step we want to test with the upstream image.

We suspect that arguments may be passed to the rock incorrectly.
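One way to check that suspicion is to compare the flags advertised by the upstream agent image against those the custom rock accepts. A hedged sketch (the image name and tag are assumptions; pick the release that matches the deployed controller):

```shell
# Ask the upstream KServe agent binary for its usage text and look for
# batcher-related flags. Go's flag package prints usage on -h.
# (Image name/tag are assumptions, not taken from the issue.)
docker run --rm kserve/agent:v0.13.0 -h 2>&1 | grep -i batch
```

Running the same command against the rock image would show whether its agent binary lacks the `enable-batcher` flag, which would explain the error in the logs.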