microsoft / BitNet

Official inference framework for 1-bit LLMs
MIT License
2.59k stars · 181 forks

Feature Request: Local Server to Integrate with AI Chat Interface #23

Open harshitlakhani opened 8 hours ago

harshitlakhani commented 8 hours ago

I am developing llmchat.co, an open-source, local-first chat interface. We already have integrations with Ollama and LM Studio, but one of the biggest hurdles our initial users report is not having a machine powerful enough to run these SLMs smoothly. We would like to give them BitNet access through a local server. This project shows great promise for local-first products.
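To make the request concrete, here is a minimal sketch of what such a local server could look like: an OpenAI-style `/v1/chat/completions` endpoint that shells out to a BitNet inference command. Everything here is an assumption for illustration — the `BITNET_CMD` invocation, model path, and response shape are hypothetical, not BitNet's actual API.

```python
# Hypothetical sketch of a local BitNet server with an OpenAI-compatible
# chat endpoint. BITNET_CMD is an assumed CLI invocation; adapt it to the
# real bitnet.cpp entry point and model file.
import json
import subprocess
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer

# Assumed inference command; "-p" takes the prompt as its argument.
BITNET_CMD = ["python", "run_inference.py", "-m", "model.gguf", "-p"]

def run_bitnet(prompt: str) -> str:
    """Shell out to the (assumed) BitNet CLI and return its stdout."""
    result = subprocess.run(BITNET_CMD + [prompt],
                            capture_output=True, text=True)
    return result.stdout.strip()

def build_completion_response(text: str,
                              model: str = "bitnet-b1.58") -> dict:
    """Wrap raw model output in an OpenAI-style chat completion payload."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
    }

class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Only handle the chat completions route.
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        # Use the last user message as the prompt.
        prompt = body["messages"][-1]["content"]
        payload = build_completion_response(run_bitnet(prompt))
        data = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(data)

if __name__ == "__main__":
    # Serve on localhost only, mirroring how Ollama/LM Studio expose
    # their local APIs to chat frontends.
    HTTPServer(("127.0.0.1", 8080), ChatHandler).serve_forever()
```

A frontend like llmchat.co could then point its existing OpenAI-compatible client at `http://127.0.0.1:8080/v1/chat/completions`, the same way it already talks to Ollama or LM Studio.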