Closed gregd33 closed 2 years ago
@gregd33 This is a good suggestion to enhance the experience. I think the second option is the better way to handle this case.
Can you tell me more about your current workflow? Are you using the docker-compose and nginx proxy setup in development? If you are using this setup in production, what role is swagger UI playing in that?
The Swagger component is not that important at all - real processing would be done via the REST endpoints (which work fine).
The Swagger UI is mostly nice for demoing how a service works, especially since the service operates on images, so it's easy to upload one and get some results.
So this is low priority.
Also, if you can point me in the direction of how/where the Swagger page is generated, I can try to tackle the issue myself.
> The Swagger UI is mostly nice for demoing how a service works, especially since the service operates on images, so it's easy to upload one and get some results.
Yeah, that makes sense.
> Also, if you can point me in the direction of how/where the Swagger page is generated, I can try to tackle the issue myself.
Yes, I am happy to show you.
To render the Swagger UI on the '/' page, the server goes through these steps:
1. The index route defaults to displaying the Swagger UI if the user hasn't provided any customized HTML: https://github.com/bentoml/BentoML/blob/439412fe394116c265a5199c5983ce60eb45e85b/bentoml/server/api_server.py#L250
2. Inside the render-Swagger-UI function, we send the client HTML based on a template, with a dynamically generated OpenAPI definition in JSON: https://github.com/bentoml/BentoML/blob/439412fe394116c265a5199c5983ce60eb45e85b/bentoml/server/api_server.py#L159
3. In the template, we use Swagger-UI to display the interactive elements based on the OpenAPI definition: https://github.com/bentoml/BentoML/blob/439412fe394116c265a5199c5983ce60eb45e85b/bentoml/server/api_server.py#L46-L60
4. The OpenAPI definition (docs.json) is generated here: https://github.com/bentoml/BentoML/blob/439412fe394116c265a5199c5983ce60eb45e85b/bentoml/server/open_api.py#L20
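As a rough sketch of what that last step produces (the function name and spec fields below are my own illustration, not BentoML's actual code), the OpenAPI definition is just a JSON document, and its `basePath` field is what Swagger UI prepends to every "Try it out" request:

```python
# Hypothetical sketch of assembling a Swagger 2.0 definition like docs.json.
# All names here are illustrative assumptions, not BentoML's implementation.

def make_open_api_spec(service_name, api_names, base_path="/"):
    """Build a minimal Swagger 2.0 spec for a list of endpoint names."""
    paths = {}
    for name in api_names:
        paths["/" + name] = {
            "post": {
                "tags": [service_name],
                "operationId": name,
                "consumes": ["application/json"],
                "produces": ["application/json"],
                "responses": {"200": {"description": "success"}},
            }
        }
    return {
        "swagger": "2.0",
        "info": {"title": service_name, "version": "1.0.0"},
        # Swagger UI prepends basePath to every request it sends; a
        # hard-coded "/" is what causes the proxy-prefix problem below.
        "basePath": base_path,
        "paths": paths,
    }
```

If something along these lines is what the generator does, making `base_path` configurable (or derived from the incoming request) is essentially what a fix would need.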
I was able to get a start on this; a PR is open at https://github.com/bentoml/BentoML/pull/1411
I still need to update the docs and add tests, if possible. I also need to make sure I don't have extra changes - I originally went down a path that changed the endpoint hosting the service rather than the one Swagger calls.
But I thought I'd share progress here. Hopefully I'll finish it up soon.
@gregd33 Awesome. Thank you for contributing! Let me know if I can help in any way.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Ultimately we decided not to support this use case for now. In BentoML 1.0, we have standardized how Kubernetes deployments are created via Yatai, which handles all deployment endpoint management and ingress resources, making sure all Swagger pages are available.
Is your feature request related to a problem? Please describe.
So I have a setup like this:
So, if I call myservices.com/bento_services/MyBentoService/predict via python/requests, it all works fine.
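For reference, the working call looks roughly like this (hostname and route are from this issue; the payload shape is a placeholder):

```python
# Direct REST call sketch; requires the `requests` package.
import requests

# The client spells out the full proxied path itself, so the nginx prefix
# is preserved and the request reaches the right backend service.
URL = "https://myservices.com/bento_services/MyBentoService/predict"

def call_predict(payload):
    resp = requests.post(URL, json=payload)
    resp.raise_for_status()
    return resp.json()
```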
If I go to myservices.com/bento_services/MyBentoService/, the Swagger UI pops up.
However, if I try to predict via the UI, it fails. The reason is that the Swagger page sends the request to myservices.com/predict.
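The failure is plain URL resolution: the Swagger page sends a root-relative path (`/predict`), which the browser resolves against the host root, discarding the proxy prefix. Python's `urljoin` follows the same resolution rules and illustrates it (just a demonstration, not BentoML code):

```python
from urllib.parse import urljoin

# The page the Swagger UI is served from, behind the nginx proxy:
page = "https://myservices.com/bento_services/MyBentoService/"

# A root-relative path drops the proxy prefix (what the UI sends; fails):
print(urljoin(page, "/predict"))
# -> https://myservices.com/predict

# A path-relative reference keeps the prefix (what actually works):
print(urljoin(page, "predict"))
# -> https://myservices.com/bento_services/MyBentoService/predict
```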
Describe the solution you'd like
I don't know enough about how the Swagger page is generated to know how it might be fixed, but I can think of two possibilities: