mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference
https://localai.io
MIT License

llama3 8B and 70B gguf support? Can it support function calling? #2093

Open hlzhangxt opened 7 months ago

hlzhangxt commented 7 months ago

Is your feature request related to a problem? Please describe.

I need a better LLM for chat completion.

Describe the solution you'd like

Add llama3 support in LocalAI, with function calling. Here are the gguf versions: https://huggingface.co/lmstudio-community/Meta-Llama-3-70B-Instruct-GGUF https://huggingface.co/lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF

Describe alternatives you've considered

Assistant file support.

Additional context
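For context on the function-calling part of the request: LocalAI exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so a function-calling request would use the same `tools` payload shape as the OpenAI API. Below is a minimal sketch of such a payload; the model alias `llama3-8b-instruct` and the `get_weather` tool are placeholder assumptions, not names taken from LocalAI itself.

```python
import json

# Assumed model alias -- use whatever name the model is installed under in LocalAI.
MODEL = "llama3-8b-instruct"

def build_chat_request(user_message):
    """Build an OpenAI-style chat completion request body with one tool definition.

    The `get_weather` tool is a hypothetical example; any JSON-Schema-described
    function can be declared the same way.
    """
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",
                    "description": "Get the current weather for a city",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "city": {"type": "string"},
                        },
                        "required": ["city"],
                    },
                },
            }
        ],
    }

# This dict would be POSTed as JSON to the LocalAI endpoint,
# e.g. http://localhost:8080/v1/chat/completions
payload = build_chat_request("What's the weather in Paris?")
print(json.dumps(payload, indent=2))
```

If the model supports function calling, the response's first choice would carry a `tool_calls` entry with the function name and JSON-encoded arguments instead of plain `content`, mirroring the OpenAI response format.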

localai-bot commented 7 months ago

Hello,

Thank you for your feature request. I'm a bot, an experiment of @mudler, auditing the tickets that get opened in this Github project.

We appreciate your interest in improving LocalAI and welcome all feature requests.

In this case, you are requesting integration of llama3 support in LocalAI with function-calling capabilities, using the gguf models from Hugging Face linked above.

As an alternative solution, you mentioned assistant file support.

Please provide any additional information or context to help us better understand your request. We'll hide this comment when you post the issue, but you can delete it if you wish.

Best regards, Github Bot (experiment of @mudler)

mudler commented 7 months ago

did you try https://github.com/mudler/LocalAI/discussions/2076#discussioncomment-9171359 ?