I'm trying to run Llama 3 locally on Ubuntu 20.04.
I installed everything and it all seems to be working.
Running ollama run llama3:8b lets me chat with the model, and running ollama serve seems to work.
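As a sanity check, a minimal request like this (assuming the default endpoint http://localhost:11434, which I haven't changed) should confirm the server answers over plain HTTP:

import httpx

# A running Ollama server replies 200 "Ollama is running" at its root URL.
resp = httpx.get("http://localhost:11434")
print(resp.status_code, resp.text)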
I tried copying this code:
import ollama

response = ollama.chat(model='llama3', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])
But I get an error:
httpx.ConnectError: [SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1131)
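From what I've read, WRONG_VERSION_NUMBER usually means the client is speaking HTTPS to a server that only talks plain HTTP, e.g. because of a proxy environment variable or an OLLAMA_HOST value starting with https://. A sketch of what I could try instead, pointing the client explicitly at the plain-HTTP endpoint (assuming the default port 11434):

from ollama import Client

# Name the plain-HTTP endpoint directly so no proxy/HTTPS setting
# can redirect the connection.
client = Client(host='http://localhost:11434')
response = client.chat(model='llama3', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])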
If you need any more information let me know.
Thank you :)