Closed by snakedye 1 week ago
@ThomasCardin I get this response when I run curl https://ollama.inspection.alpha.canada.ca/api/generate -d '{ "model": "llama3", "prompt": "Hello galaxy", "stream": false }' in a codespace:
<html>
<head><title>403 Forbidden</title></head>
<body>
<center><h1>403 Forbidden</h1></center>
<hr><center>nginx</center>
</body>
</html>
From the nginx-ingress logs:

2024/06/03 14:54:59 [error] 31#31: *183113 access forbidden by rule, client: 172.172.208.244, server: ollama.inspection.alpha.canada.ca, request: "POST /api/chat HTTP/1.1", host: "ollama.inspection.alpha.canada.ca"

We'll have to see how we want to enable requests coming from Codespaces.
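Since the log says "access forbidden by rule", the 403 is most likely an IP allow-list on the ingress rather than an application error. One possible fix, sketched below under the assumption that the cluster uses ingress-nginx, would be to widen the `whitelist-source-range` annotation on the Ingress resource. The annotation name is real ingress-nginx configuration, but the CIDR, the Ingress name, and the backend service name/port are placeholders, not values from this cluster:

```yaml
# Hypothetical Ingress snippet; the Codespaces egress range below is a
# placeholder CIDR and would need to be replaced with the real range(s).
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: ollama  # placeholder name
  annotations:
    nginx.ingress.kubernetes.io/whitelist-source-range: "203.0.113.0/24"  # placeholder
spec:
  rules:
    - host: ollama.inspection.alpha.canada.ca
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: ollama      # placeholder service
                port:
                  number: 11434   # default Ollama port, assumed
```

Because Codespaces egress IPs are not fixed, an auth-based approach (e.g. requiring a token at the ingress) might be more practical than an IP allow-list.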
Support for Ollama and local models is no longer being considered. It seems unnecessary to keep this issue open any longer.
Add a class that serves as a client for our Ollama instance.
https://github.com/ollama/ollama/blob/main/docs/api.md
https://github.com/ollama/ollama-python
Tasks:
- ollama.py
- test_language_model.py

Acceptance Criteria:
- All the tests should pass.
- Test against the real Ollama endpoint.
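A minimal sketch of what ollama.py could look like, using only the standard library and the `/api/generate` route from the linked Ollama API docs. The class name `OllamaClient`, the default host, and the method names are assumptions for illustration, not the project's actual code:

```python
# Hypothetical ollama.py: a thin client for the Ollama REST API.
import json
import urllib.request


class OllamaClient:
    """Minimal client for a remote Ollama instance."""

    def __init__(self, host: str = "http://localhost:11434", timeout: int = 60):
        # Normalize the host so endpoint URLs are built consistently.
        self.host = host.rstrip("/")
        self.timeout = timeout

    def _endpoint(self, path: str) -> str:
        # e.g. _endpoint("generate") -> "<host>/api/generate"
        return f"{self.host}/api/{path}"

    def generate(self, model: str, prompt: str) -> str:
        # Note: "stream" must be the JSON boolean false, not the string
        # "false", for a single non-streamed response.
        payload = json.dumps(
            {"model": model, "prompt": prompt, "stream": False}
        ).encode("utf-8")
        req = urllib.request.Request(
            self._endpoint("generate"),
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req, timeout=self.timeout) as resp:
            return json.loads(resp.read())["response"]
```

A test_language_model.py test could then exercise URL construction and payload shape without a live endpoint, keeping the real-endpoint check as a separate integration test.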