amaiya / onprem

A tool for running on-premises large language models with non-public data
https://amaiya.github.io/onprem
Apache License 2.0

feat: Add optional argument to specify custom path to download LLM #5

Closed rabilrbl closed 1 year ago

rabilrbl commented 1 year ago

Currently there is no way to specify a custom download location for a model.

By default, models are downloaded to ~/onprem_data

Example:

from onprem import LLM

llm = LLM(model_download_path="~/models")
# or
llm.download_model(model_download_path="~/models")
rabilrbl commented 1 year ago

@amaiya I am ready to make PR for this. Shall I proceed?

amaiya commented 1 year ago

Hi @rabilrbl I should probably create a CONTRIBUTING.md at some point, but I'll mention this information here. Are you familiar with the nbdev project?

This project was built using nbdev. As a result, all development is done within the notebooks in the nbs folder. The .py files are auto-generated from the notebooks (which is noted in an auto-generated message at the top of the .py files).

You can take a look at this tutorial to learn more about building and contributing to projects that use nbdev.
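For reference, the typical nbdev contribution loop looks roughly like the sketch below (command names are from the nbdev docs; the exact set your project pins may differ):

```shell
# Edit the notebooks under nbs/, then regenerate the .py modules from them.
nbdev_export

# Run the notebook-based tests before opening a PR.
nbdev_test
```

The key point is that the notebooks are the source of truth; the .py files should never be edited by hand.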

Adding a model_download_path parameter sounds good, as long as the default for the parameter is None, in which case LLM will work exactly as it does now (i.e., store the model in ~/onprem_data).

Thanks!

rabilrbl commented 1 year ago

@amaiya Got it. I'll try to modify .ipynb files for the PR.

The model_download_path parameter will be Optional[str] = None; when it is not provided, everything will work exactly the same as before.
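The default-handling logic described above could be sketched like this (resolve_download_path is a hypothetical helper for illustration, not the actual onprem implementation):

```python
import os

# Current default location used by onprem
DEFAULT_MODEL_DIR = os.path.join(os.path.expanduser("~"), "onprem_data")

def resolve_download_path(model_download_path=None):
    """Return the directory models should be downloaded into.

    When model_download_path is None, fall back to ~/onprem_data,
    preserving the existing behavior; otherwise expand a user-supplied
    path such as "~/models".
    """
    if model_download_path is None:
        return DEFAULT_MODEL_DIR
    return os.path.expanduser(model_download_path)
```

With this shape, both LLM(model_download_path=...) and llm.download_model(model_download_path=...) can funnel through the same resolution step.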