StanfordSpezi / .github


Integrate my own fine-tuned model in SpeziLLM #55

Closed: Bourne-alt closed this issue 4 months ago

Bourne-alt commented 4 months ago

Use Case

I am currently developing a sports app and need to integrate a large model specific to a vertical domain.

Problem

I could not find any code supporting the integration of our own (fine-tuned) LLM.

Solution

Develop a module for custom models, similar to the existing OpenAI integration.

Alternatives considered

There is no solution available at the moment.

Additional context

No response

PSchmiedmayer commented 4 months ago

@Bourne-alt Thank you for looking into Spezi LLM!

If you want to execute a model locally, I would advise you to take a look at the SpeziLLMLocal target. It allows you to execute models locally on your device, and we demonstrate it as part of the UI Testing application in this repo using some open-weight models.

If the model is quite large (which it is for all the models we are using), I would suggest looking into the SpeziLLMLocalDownload target, which lets you download the model at app launch instead of bundling it with the app.

I encourage you to look through the README and documentation to explore these two targets in SpeziLLM; they might be helpful. In addition, we more than welcome PRs to this package and to other parts of our open-source work that extend the functionality of the frameworks.

In addition to that, Core ML and other Apple-provided tools might be helpful.