promptengineers-ai / llm-server

🤖 Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG
https://promptengineersai.netlify.app

FROM feature/139-simple-static-frontend-for-standalone-deploy INTO development #140

Closed — ryaneggz closed this 2 weeks ago

ryaneggz commented 1 month ago

Closes #139

vercel[bot] commented 1 month ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

1 Skipped Deployment

| Name | Status | Preview | Comments | Updated (UTC) |
| :--- | :----- | :------ | :------- | :------------ |
| **llm-server** | ⬜️ Ignored ([Inspect](https://vercel.com/promptengineers-ai/llm-server/3xXrH8sn7D3mNnv27rBGdQ9Pe84o)) | [Visit Preview](https://llm-server-git-feature-139-simple-sta-e3cc50-promptengineers-ai.vercel.app) | | Nov 3, 2024 8:56pm |