ai-cfia / fertiscan-backend


Ollama support #36

Closed · snakedye closed this issue 1 week ago

snakedye commented 1 month ago

Add a class that serves as a client for our Ollama instance.
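A minimal sketch of what such a client could look like, assuming a Python backend and Ollama's standard `/api/generate` endpoint. The class name, constructor parameters, and timeout value are illustrative, not something specified in this issue:

```python
import requests


class OllamaClient:
    """Hypothetical client for a remote Ollama instance (names are illustrative)."""

    def __init__(self, base_url: str, timeout: int = 60):
        self.base_url = base_url.rstrip("/")
        self.timeout = timeout

    def generate(self, model: str, prompt: str) -> str:
        """Call /api/generate with streaming disabled and return the completion text."""
        response = requests.post(
            f"{self.base_url}/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=self.timeout,
        )
        response.raise_for_status()
        return response.json()["response"]


# Example usage against the instance discussed below:
# client = OllamaClient("https://ollama.inspection.alpha.canada.ca")
# print(client.generate("llama3", "Hello galaxy"))
```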

snakedye commented 1 month ago

@ThomasCardin I get this response when I run curl https://ollama.inspection.alpha.canada.ca/api/generate -d '{ "model": "llama3", "prompt": "Hello galaxy", "stream": false }' in a codespace:

<html>
<head><title>403 Forbidden</title></head>
<body>
<center><h1>403 Forbidden</h1></center>
<hr><center>nginx</center>
</body>
</html>
SonOfLope commented 1 month ago

From the nginx-ingress logs:

2024/06/03 14:54:59 [error] 31#31: *183113 access forbidden by rule, client: 172.172.208.244, server: ollama.inspection.alpha.canada.ca, request: "POST /api/chat HTTP/1.1", host: "ollama.inspection.alpha.canada.ca"

We'll have to figure out how we want to enable requests coming from Codespaces.

Endlessflow commented 1 week ago

Support for Ollama and local models is no longer being considered. It seems unnecessary to keep this issue open any longer.