dotnet / machinelearning

ML.NET is an open source and cross-platform machine learning framework for .NET.
https://dot.net/ml

Please add a simple way to use a local GGUF model #7003

Open · SyntaxEvg opened this issue 4 months ago

SyntaxEvg commented 4 months ago

Please add a simple feature that loads a model into memory in one line, the way Python does. What is the difficulty in providing a simple way to download models from Hugging Face, quantized to Q4 or Q5, for example mistralai/Mistral-7B-v0.1 in the GGUF format? Can you explain the obstacles, i.e. why this issue is not being addressed? I don't like Python and don't want to use it, but because ML.NET does not support loading a model and interacting with it, I am forced to. Please add a simple way to use a local GGUF model, thanks.

michaelgsharp commented 4 months ago

@luisquintanilla thoughts here?

luisquintanilla commented 4 months ago

Hi @SyntaxEvg,

This is a great idea. I think this is something we can add to our backlog.

@michaelgsharp let's add it to Future.

Currently, the easiest way I'm aware of to do this is with LlamaSharp.
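
For reference, here is a minimal sketch of loading and prompting a local GGUF model with LlamaSharp. The exact type and property names can vary between LlamaSharp versions, and the model path is just a placeholder:

```csharp
using System;
using System.Collections.Generic;
using LLama;
using LLama.Common;

// Path to a locally downloaded, quantized GGUF file (placeholder).
var modelPath = @"C:\models\mistral-7b-v0.1.Q4_K_M.gguf";

var parameters = new ModelParams(modelPath)
{
    ContextSize = 2048,   // prompt context window
    GpuLayerCount = 0     // CPU-only; raise if a GPU backend is installed
};

// Load the weights and create an inference context.
using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

var inferenceParams = new InferenceParams
{
    MaxTokens = 256,
    AntiPrompts = new List<string> { "User:" }
};

// Stream the generated tokens to the console.
await foreach (var token in executor.InferAsync("Question: What is ML.NET?\nAnswer:", inferenceParams))
{
    Console.Write(token);
}
```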

If you want to run inference as part of an ML.NET pipeline, you could wrap the code that calls the LlamaSharp model in a CustomMapping transform.
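
As a rough sketch (the row classes and the `generateCompletion` delegate below are illustrative, not part of ML.NET), wrapping such a call in a CustomMapping transform could look like this:

```csharp
using System;
using Microsoft.ML;

public class PromptRow
{
    public string Prompt { get; set; }
}

public class CompletionRow
{
    public string Completion { get; set; }
}

public static class LocalLlmPipeline
{
    // generateCompletion is whatever code calls the LlamaSharp executor
    // (e.g. a synchronous wrapper around the snippet above).
    public static void Run(Func<string, string> generateCompletion)
    {
        var mlContext = new MLContext();

        var data = mlContext.Data.LoadFromEnumerable(new[]
        {
            new PromptRow { Prompt = "Summarize what ML.NET is in one sentence." }
        });

        // CustomMapping runs arbitrary row-level code (here, the LLM call) inside the pipeline.
        // With contractName: null the resulting model cannot be saved to disk,
        // which is fine for in-memory scoring.
        var pipeline = mlContext.Transforms.CustomMapping<PromptRow, CompletionRow>(
            (input, output) => output.Completion = generateCompletion(input.Prompt),
            contractName: null);

        var transformed = pipeline.Fit(data).Transform(data);

        foreach (var row in mlContext.Data.CreateEnumerable<CompletionRow>(transformed, reuseRowObject: false))
        {
            Console.WriteLine(row.Completion);
        }
    }
}
```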

Not exactly the same thing, but here's a sample where I wrapped an inference call to Azure OpenAI.

https://github.com/luisquintanilla/OpenAIAutoMLNET/blob/main/Program.cs