Bug Description
I am using Ollama as the AI backend with the Chatbox Android app as the frontend. The Android app cannot fetch the Ollama model list through an ngrok proxy, but the same setup works fine with the Chatbox Windows/macOS/Linux desktop edition. Do you have any idea how to fix this bug?
Steps to Reproduce
The steps to reproduce the bug:
Same problem here, but I didn't use ngrok, just a local IP address.
Another Android app can access the model list, but Chatbox can't.
Also, Chatbox on iOS can see the models perfectly.
Expected Results
The Ollama model list should appear in the drop-down.
Actual Results
The drop-down list was empty.
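One way to narrow this down is to query Ollama's model-listing endpoint (`GET /api/tags`) directly, bypassing the Chatbox app. Below is a minimal sketch: the base URL is a placeholder for your ngrok or LAN address, and the `ngrok-skip-browser-warning` header is included because ngrok's free tier can return an HTML interstitial page instead of JSON to non-browser clients, which would leave an API client with an empty model list.

```python
import json
import urllib.error
import urllib.request

def list_ollama_models(base_url: str, timeout: float = 5.0) -> list[str]:
    """Fetch model names from Ollama's GET /api/tags endpoint."""
    url = base_url.rstrip("/") + "/api/tags"
    req = urllib.request.Request(url, headers={
        "Accept": "application/json",
        # ngrok's free tier may serve an HTML warning page unless this
        # header (any value) is present on the request.
        "ngrok-skip-browser-warning": "1",
    })
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError, json.JSONDecodeError) as exc:
        # Connection failures or non-JSON bodies (e.g. an HTML page)
        # end up here -- the same symptom as an empty list in the app.
        print(f"Could not get models from {url}: {exc}")
        return []

# Placeholder URL -- replace with your ngrok forwarding address or LAN IP.
models = list_ollama_models("http://127.0.0.1:11434")
print(models)
```

If this script returns models but the Android app still shows an empty list, the problem is likely in how the app issues the request (headers, cleartext-HTTP restrictions on Android) rather than in Ollama or the proxy.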