volovikariel closed this issue 7 months ago.
Just loaded that model and I can't reproduce on Mac, at least. What happens when you run `scalene --viewer`?
That one seems to detect it just fine :open_mouth:
I get this in the browser console when opening the `.html` file directly, not when using `scalene --viewer`.
Got it. You need to use the viewer (which is triggered automatically when you run Scalene by default). Because of how browser security works, it's not possible to access localhost from a page opened that way (at least not without invoking the browser with special flags). If you want to use the viewer on a previously profiled file, run `scalene --viewer` and then load the `profile.json` file that Scalene produces.
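Concretely, the suggested workflow looks roughly like this (a sketch; `your_program.py` is just a placeholder script name):

```sh
# Default usage: profiling a script launches the web viewer automatically
scalene your_program.py

# To revisit an earlier run: start the viewer on its own, then use its
# load option to open the profile.json that Scalene wrote while profiling
scalene --viewer
```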
Describe the bug
The Scalene local language model option is not detecting my locally running Ollama model.
To Reproduce
Steps to reproduce the behavior:
1. Run the `llama3:8b` model locally, and have it be exposed on http://localhost:11434/ (roughly the commands sketched below)
2. Open the `.html` file generated by running Scalene on any Python code

Expected behavior
There should be no error message; the Ollama server should be detected.
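For reference, the reproduction roughly boils down to these commands (a sketch; `some_script.py` is a placeholder, and the exact Scalene output flags may differ by version, so check `scalene --help`):

```sh
# 1. Start the model locally; Ollama serves its API on http://localhost:11434 by default
ollama run llama3:8b

# 2. Profile any Python script, write an HTML report, and open it in a browser
scalene --html --outfile profile.html some_script.py
```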
Desktop (please complete the following information):
Additional context
I visited localhost:11434 and it says that Ollama is running. I have tried restarting the server and re-opening the HTML page, but the issue persists. I ran `ollama run` and was able to interact with the model, so it seems to be working fine.
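The server check above can also be done from the command line (a sketch; assumes `curl` is available):

```sh
# The Ollama root endpoint replies "Ollama is running" when the server is up
curl http://localhost:11434/
```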