Closed by chadbrewbaker 9 months ago
Related issue: my internet at home is poor, and I want to reuse llama.cpp models I have already downloaded. Can I just change this line to point at my model directory, and edit the metadata above it so I can choose from the models I already have? https://github.com/danielgross/localpilot/blob/a7b03985eef8b3db732853fe11d21e789975727e/config.py#L26
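A minimal sketch of what such a `config.py` change might look like. The variable names and metadata shape here are assumptions for illustration, not the actual names used by localpilot; check the linked line in the repo for the real ones:

```python
import os

# Hypothetical sketch (names assumed, not taken from the repo):
# point the host model folder at a directory of llama.cpp models
# that already exist on disk, instead of the default download location.
MODEL_FOLDER = os.path.expanduser("~/models/llama.cpp")  # assumed existing dir

# Hypothetical metadata mapping model labels to GGUF files already downloaded.
MODELS = {
    "7b": os.path.join(MODEL_FOLDER, "llama-2-7b.Q4_K_M.gguf"),
    "13b": os.path.join(MODEL_FOLDER, "llama-2-13b.Q4_K_M.gguf"),
}
```

With a layout like this, adding a model you already have is just another entry in the metadata dict pointing at the file on disk.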
Please pull the latest; you can now change the host model folder in config.py. Thank you!
I'm trying to host this on my Mac Studio, which I connect to from all my laptops with VS Code Remote over SSH.
Could you add a blurb to the README on how to set this up for remote VS Code? I'm assuming there are additional steps, or does it just work out of the box?
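One possible setup, purely as a sketch: if localpilot serves on a local port on the Mac Studio, that port could be forwarded to each laptop over the same SSH connection. The port number (8000) and hostname below are assumptions, not values from the project:

```shell
# Hypothetical: forward the localpilot server port (assumed 8000 here)
# from the Mac Studio to this laptop, so editor requests to
# localhost:8000 on the laptop reach the server over SSH.
# -N: no remote command, just forwarding; -L: local port forward.
ssh -N -L 8000:localhost:8000 user@mac-studio.local
```

VS Code Remote - SSH can also forward ports automatically for processes it detects on the remote host, which may make the manual command unnecessary.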