sergix44 / gradio-client-php

Gradio API client for PHP
MIT License

[Bug]: Endpoint not found on huggingface space #2

Closed peacefulotter closed 1 year ago

peacefulotter commented 1 year ago

What happened?

Throws an InvalidArgumentException "endpoint not found" on the https://ysharma-explore-llamav2-with-tgi.hf.space space. What does not found mean? Why does it look like the package "loads" endpoints? What is it loading even?

Thanks for your help

How to reproduce the bug

$client = new Client('https://ysharma-explore-llamav2-with-tgi.hf.space/');
$result = $client->predict(['Hey, what is chatgpt?'], apiName: '/chat');

Package Version

0.0.1

PHP Version

8.2.0

Which operating systems does this happen with?

Linux

Notes

No response

sergix44 commented 1 year ago

> What happened?
>
> Throws an InvalidArgumentException "endpoint not found" on the https://ysharma-explore-llamav2-with-tgi.hf.space space. What does not found mean? Why does it look like the package "loads" endpoints? What is it loading even?

The library is written starting from the Hugging Face Python and JS clients, so it replicates their behaviour. When a new client is instantiated, it pulls the space configuration, including all the endpoints, their required parameters, and so on. The error means that, for some reason, no endpoint matching the given apiName was found in the endpoint configuration. I'll take a look soon.
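As a rough illustration of that lookup step, here is a minimal sketch of how a client might match an apiName against the endpoints pulled from a space configuration. The `named_endpoints` array shape and the `findEndpoint` helper are hypothetical simplifications for this example, not the actual structure returned by a Gradio space or the real internals of this package:

```php
<?php

// Hypothetical, simplified config as a client might receive it after
// pulling a space's configuration (shape assumed for illustration).
function findEndpoint(array $config, string $apiName): array
{
    foreach ($config['named_endpoints'] ?? [] as $name => $endpoint) {
        if ($name === $apiName) {
            return $endpoint;
        }
    }

    // When no endpoint matches the requested apiName, the lookup fails,
    // which is the situation the "endpoint not found" error describes.
    throw new InvalidArgumentException("endpoint not found: {$apiName}");
}

$config = [
    'named_endpoints' => [
        '/chat' => ['parameters' => ['message']],
    ],
];

// Succeeds: '/chat' exists in the pulled configuration.
$endpoint = findEndpoint($config, '/chat');

// Throws InvalidArgumentException: '/generate' is not in the configuration.
// findEndpoint($config, '/generate');
```

Under this model, the original bug would mean the space's configuration was fetched but the `/chat` endpoint did not appear in it as expected.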

sergix44 commented 1 year ago

The issue should now be solved, please check if it's working on your side.

peacefulotter commented 1 year ago

Hi, sorry to come back to you so late - it is indeed fixed.

Nevertheless, $result->getOutputs() yields an empty array. My guess is that the reply I get from the Hugging Face space is a stream coming from an LLM, and this is not yet well supported in your package...

I am not sure if I should open a new issue.

sergix44 commented 1 year ago

> Hi, sorry to come back to you so late - it is indeed fixed.
>
> Nevertheless, the $result->getOutputs() yields the empty array. My guess is that the reply I get from the huggingface space is a stream coming from an LLM and this is not yet well supported in your package...
>
> I am not sure if I should open a new issue.

Yes please, open a separate issue with steps on how to reproduce it.