SciSharp / LLamaSharp

A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.
https://scisharp.github.io/LLamaSharp
MIT License
2.17k stars 293 forks

feat: add experimental auto-download support. #692

Closed AsakusaRinne closed 3 weeks ago

AsakusaRinne commented 1 month ago

Description

This PR implements #670 and should only be merged after #688

It puts all the auto-download logic in a new package, LLama.Experimental. This way we don't need to rush the decision of whether it should be added to the main package or kept separate; we can hear from users first.

It works fine for me on Windows. I'd appreciate it if anyone could help test on Linux and macOS!

TODO

How to test it

Comment out all the content in LLama/LLamaSharp.Runtime.targets and run the examples with auto-download enabled. It's best to enable full logging (though the output is a bit messy right now).
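As a rough illustration, enabling the feature in an example might look like the sketch below. The `WithAutoDownload` call and its placement are assumptions made for illustration, not the confirmed API of the experimental package:

```csharp
// Hypothetical sketch — method names here are assumptions,
// not the documented surface of LLama.Experimental.
using LLama.Native;

NativeLibraryConfig.Instance
    .WithAutoDownload(true)   // assumed switch for the experimental auto-download
    .WithLogs(true);          // full logs help diagnose download problems
```

Enabling logging before the first native library load matters, since the download happens during native library resolution.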

SignalRT commented 1 month ago

It seems to work on macOS. I think it should be clearly documented at some point how the cache is managed and where it is located by default.
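For reference, a common convention for resolving a per-user cache directory looks like the sketch below. Both the `LLAMA_CACHE` environment variable and the `llama_sharp` folder name are hypothetical examples of the pattern, not the package's documented behavior:

```csharp
// Illustrative only: a typical per-user cache convention with an
// environment-variable override. The variable and folder names are
// assumptions, not what LLama.Experimental actually uses.
using System;
using System.IO;

string cacheRoot = Environment.GetEnvironmentVariable("LLAMA_CACHE")
    ?? Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
        "llama_sharp");

Console.WriteLine(cacheRoot);
```

On Windows this resolves under `%LOCALAPPDATA%`, and on Linux/macOS under the platform's local application data directory, which is why documenting the default location per OS is worthwhile.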

AsakusaRinne commented 3 weeks ago

Merging this PR now since the feature is only included in the experimental package. However, its introduction into the master branch will be delayed indefinitely: until we figure out a good way to deal with the security issue, this feature will remain in the experimental package only.

Documentation describing its usage and the cache will be added in a separate PR later.