Closed · DeyangKong closed this issue 2 months ago
Hey @kdy0912, glad to see you here!
It's a bit difficult for me to fully grasp the ngrok + nginx config, but my instinct is that something is off there.
Can I suggest an alternative? You don't need to run Phoenix inside your application; you can run it as a sidecar container instead. In that setup your application uses OpenInference (rather than launching Phoenix itself) to export traces to the Phoenix container. Here's a working example with llama-index: https://github.com/Arize-ai/openinference/tree/main/python/examples/llama-index
The migration looks something like this.
Hope that helps
Describe the bug I'm running llama-index and Phoenix in a container, and I want to use Nginx as a reverse proxy, forwarding the container's exposed port 6006 to localhost:8080 inside the container. My Nginx configuration file is as follows:
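(The configuration itself didn't survive in this thread. A hypothetical reconstruction of the described setup — only the `/service1/` path and the two port numbers come from the report; everything else is an assumption — might look like:)

```nginx
server {
    # Listen on the externally mapped container port.
    listen 6006;

    # Forward /service1/ to the Phoenix UI on localhost:8080.
    # The trailing slash on proxy_pass strips the /service1 prefix
    # before the request reaches Phoenix.
    location /service1/ {
        proxy_pass http://localhost:8080/;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Note that a blank page under a sub-path like `/service1/` is often caused by the app requesting its assets from absolute paths (`/static/...`) that bypass the `location` block, so the exact config matters here.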
The external IP:port mapping for my container's port 6006 is https://u184955-****-********.westc.gpuhub.com:8443/. I used this code to start Phoenix.
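(The startup code also didn't survive in this thread. Assuming Phoenix was launched as a standalone server on port 8080 — the port is an inference from the proxy target above, not from the report — a typical invocation is:)

```shell
# Run the Phoenix server on port 8080 instead of the default 6006.
# PHOENIX_PORT is Phoenix's documented environment variable for this.
PHOENIX_PORT=8080 python -m phoenix.server.main serve
```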
When I opened http://localhost:8080/, Phoenix loaded successfully. However, when I opened https://u184955-****-********.westc.gpuhub.com:8443/service1/, the page was completely blank (white).
Screenshots
Environment (please complete the following information):