AlexChristensen opened 9 months ago
I was thinking of implementing something like ollama pull. For example, model_pull() would take either a model name or a local (absolute or relative) path as input.
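The name-or-path behavior could look something like the sketch below. This is a rough Python illustration only (the package's actual implementation language and helper names may differ); `model_pull` comes from the proposal above, and the download step is left as a stub since the cache layout and hub client are not decided yet.

```python
import os

def model_pull(model: str) -> str:
    """Resolve `model` to a local directory.

    If `model` is an existing absolute or relative path (including
    `~`-prefixed paths), treat it as an already-downloaded model and
    return its absolute path. Otherwise treat it as a model name to
    be fetched from a hub.
    """
    path = os.path.expanduser(model)
    if os.path.exists(path):
        # Looks like a local path: use it directly, no download needed.
        return os.path.abspath(path)
    # Otherwise treat it as a model name; a real implementation would
    # download it here into a package-managed cache directory.
    raise NotImplementedError(f"would download model {model!r} from a hub")
```

The key design point is that path resolution happens before any network call, so users who already have a model on disk never trigger a re-download.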
It would also be a good idea to add another function that lists all locally available models and the functions they can be used for. When listing local/offline models, we can repeat the message about local inference (#23).
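A minimal sketch of the listing function, again in Python for illustration: it assumes models live as subdirectories of a single cache directory (an assumption, since the cache layout is not decided), and it omits the mapping from model to supported functions, which would come from package metadata.

```python
import os

def list_local_models(cache_dir: str) -> list[str]:
    """List subdirectories of `cache_dir` as locally available models.

    Returns an empty list if the cache directory does not exist yet,
    so the function is safe to call before anything has been pulled.
    """
    if not os.path.isdir(cache_dir):
        return []
    return sorted(
        name for name in os.listdir(cache_dir)
        # Only directories count as models; stray files are ignored.
        if os.path.isdir(os.path.join(cache_dir, name))
    )
```

When printing the result, this would be the natural place to repeat the local-inference message from #23.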
Some folks might download models to locations other than the default paths that {transformers} or other LLM modules use.
And to avoid downloading a model multiple times, some users might prefer to provide a path to an existing local copy rather than pull it again.