roboflow / cog-vlm-client

Simple CogVLM client script

message: Internal error #2

Open palebluewanders opened 6 months ago

palebluewanders commented 6 months ago

Whenever I try to run script.py or follow the instructions here: https://blog.roboflow.com/how-to-deploy-cogvlm/

I always get this result: `{'message': 'Internal error.'}`

Using Gradio also returns an error. Unfortunately, there are no other clues.

YoungjaeDev commented 5 months ago

@SkalskiP

same issue

```
Traceback (most recent call last):
  File "test3.py", line 9, in <module>
    result = CLIENT.prompt_cogvlm(
  File "/home/aicads/miniconda3/envs/vlm/lib/python3.8/site-packages/inference_sdk/http/client.py", line 88, in decorate
    raise HTTPCallErrorError(
inference_sdk.http.errors.HTTPCallErrorError: HTTPCallErrorError(description='500 Server Error: Internal Server Error for url: http://localhost:9001/llm/cogvlm', api_message='Internal error.', status_code=500)
```

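
The server side of this failure can be reproduced offline with a small stub that mimics the failing endpoint, which is useful for checking client-side error handling without a GPU box. This is a sketch, not the real inference server: only the `/llm/cogvlm` path, the 500 status, and the `{'message': 'Internal error.'}` body are taken from the traceback above; everything else is hypothetical.

```python
import json
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubCogVLMHandler(BaseHTTPRequestHandler):
    """Answers every POST the way the thread describes:
    HTTP 500 with the JSON body {'message': 'Internal error.'}."""

    def do_POST(self):
        body = json.dumps({"message": "Internal error."}).encode()
        self.send_response(500)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep request logging quiet

# Bind to an ephemeral port so the sketch never collides with a real server.
server = HTTPServer(("127.0.0.1", 0), StubCogVLMHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/llm/cogvlm"

try:
    request = urllib.request.Request(
        url, data=b"{}", headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(request)
    status, payload = 200, None
except urllib.error.HTTPError as err:
    # This is the raw condition that inference_sdk wraps in HTTPCallErrorError.
    status, payload = err.code, json.loads(err.read())
finally:
    server.shutdown()

print(status, payload)  # 500 {'message': 'Internal error.'}
```

With a stub like this, you can confirm that the SDK's `HTTPCallErrorError` is just a re-raise of a plain HTTP 500 from the server, so the actual cause has to be found in the server-side logs rather than in the client script.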
SkalskiP commented 5 months ago

Hi, @palebluewanders and @YoungjaeDev any more details? Did you run it locally or in the cloud?

YoungjaeDev commented 5 months ago

> Hi, @palebluewanders and @YoungjaeDev any more details? Did you run it locally or in the cloud?

I've created and deployed a server locally and am using it.

palebluewanders commented 5 months ago

> Hi, @palebluewanders and @YoungjaeDev any more details? Did you run it locally or in the cloud?

Mine was cloud, g5.2xlarge on AWS.

YoungjaeDev commented 5 months ago

> Hi, @palebluewanders and @YoungjaeDev any more details? Did you run it locally or in the cloud?

> Mine was cloud, g5.2xlarge on AWS.

https://discuss.roboflow.com/t/sam-cogvlm-http-request-internal-error/5244

I am continuing the discussion there.

PawelPeczek-Roboflow commented 5 months ago

Answered here: https://discuss.roboflow.com/t/sam-cogvlm-http-request-internal-error/5244/7