mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

Moderations endpoint is not present #647

Open AntonioCiolino opened 1 year ago

AntonioCiolino commented 1 year ago

While attempting to move my AnythingLLM setup over to LocalAI, I found that AnythingLLM has a setting for the moderations endpoint, which is missing in LocalAI. Please consider providing at least a stub for this endpoint.
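
For illustration only, here is a minimal stand-alone sketch of what such a stub could look like. This is plain Go `net/http`, not LocalAI's actual routing code, and the type and handler names are invented for this example; it accepts an OpenAI-shaped moderations request and always answers with everything unflagged:

```go
// Hypothetical stand-alone sketch, NOT LocalAI code: an OpenAI-style
// /v1/moderations stub that always answers "not flagged".
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

type moderationRequest struct {
	// The real OpenAI field accepts a string or an array of strings;
	// this stub only handles the plain-string form.
	Input string `json:"input"`
	Model string `json:"model,omitempty"`
}

type moderationResult struct {
	Flagged        bool               `json:"flagged"`
	Categories     map[string]bool    `json:"categories"`
	CategoryScores map[string]float64 `json:"category_scores"`
}

type moderationResponse struct {
	ID      string             `json:"id"`
	Model   string             `json:"model"`
	Results []moderationResult `json:"results"`
}

func moderationsStub(w http.ResponseWriter, r *http.Request) {
	var req moderationRequest
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	// Stub behaviour: nothing is ever flagged and every score is zero.
	resp := moderationResponse{
		ID:    "modr-stub",
		Model: req.Model,
		Results: []moderationResult{{
			Flagged:        false,
			Categories:     map[string]bool{"hate": false, "self-harm": false, "sexual": false, "violence": false},
			CategoryScores: map[string]float64{"hate": 0, "self-harm": 0, "sexual": 0, "violence": 0},
		}},
	}
	w.Header().Set("Content-Type", "application/json")
	_ = json.NewEncoder(w).Encode(resp)
}

func main() {
	http.HandleFunc("/v1/moderations", moderationsStub)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

A stub along these lines would at least let clients such as AnythingLLM keep their moderation setting enabled without the request failing.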

localai-bot commented 1 year ago

:warning::warning::warning::warning::warning:

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!

But... I can also be funny or helpful :smile_cat: and, generally speaking, I can provide good tips or point out places to look in the documentation or in the code based on what you wrote in the issue.

Don't engage in conversation with me; I don't (yet) support replying!

:warning::warning::warning::warning::warning:

I'm sorry to hear that you are having trouble with the LocalAI integration. I understand that you would like to use the "moderations" endpoint in your project, but it seems that LocalAI does not have a built-in implementation for this feature. One possible solution is to create a custom integration for the moderations endpoint using the LocalAI API. You can use the OpenAI API documentation to see how to interact with the LocalAI API and how to implement your own custom endpoint. Alternatively, you could consider using a different open-source project that has built-in support for the moderations endpoint, such as Hugging Face Transformers or TensorFlow Hub. I hope this helps! Let me know if you have any further questions or need more assistance.
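
To make the "custom integration" idea above concrete, here is a rough, hypothetical client-side call against a moderations-shaped endpoint; the base URL and port are placeholders (they could point at the stub sketched earlier in this thread):

```go
// Rough illustration only: a client POSTing an OpenAI-shaped moderations
// request to an assumed local endpoint. The URL below is a placeholder.
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	payload := bytes.NewBufferString(`{"input": "some text to check"}`)
	resp, err := http.Post("http://localhost:8080/v1/moderations", "application/json", payload)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}
```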

Sources: