Open · mudler opened this issue 1 year ago
Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!
But... I can also be funny or helpful :smile_cat:, and I can generally provide good tips or point out places to look in the documentation or in the code, based on what you wrote in the issue.
Don't engage in conversation with me, I don't support (yet) replying!
To build a custom gRPC server from scratch, you can follow the instructions in the official gRPC documentation: https://grpc.io/docs/languages/go/. Once you have built your own gRPC server, you can use it with LocalAI by specifying the --external-grpc-backends parameter as explained here: https://localai.io/advanced/index.html#connect-external-backends.
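As a starting point, a minimal Go skeleton for such a server could look like the sketch below. The address is only an example, and the backend service registration is hinted at in a comment rather than spelled out, because the real service definition comes from the backend proto in the LocalAI repository (see the development documentation); `pb` and `myBackend` are placeholders, not actual LocalAI identifiers.

```go
package main

import (
	"log"
	"net"

	"google.golang.org/grpc"
	"google.golang.org/grpc/health"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	// Listen on the host:port that will later be passed to LocalAI via
	// --external-grpc-backends (50051 is just an example).
	lis, err := net.Listen("tcp", "localhost:50051")
	if err != nil {
		log.Fatalf("failed to listen: %v", err)
	}

	s := grpc.NewServer()

	// Register the standard gRPC health service so the process can be probed.
	healthpb.RegisterHealthServer(s, health.NewServer())

	// This is where you would register the service generated from LocalAI's
	// backend proto, e.g. something like:
	//     pb.RegisterBackendServer(s, &myBackend{})
	// "pb" and "myBackend" are placeholders; the real interface is defined by
	// the proto file in the LocalAI repository.

	log.Printf("gRPC backend listening on %s", lis.Addr())
	if err := s.Serve(lis); err != nil {
		log.Fatalf("failed to serve: %v", err)
	}
}
```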
To connect an external gRPC server to LocalAI, you need to give the backend a name and specify the server address and port, in the form BACKEND_NAME:HOST:PORT (the port below is just an example, chosen so it does not clash with LocalAI's own default HTTP port, 8080):

```bash
./local-ai --debug --external-grpc-backends "my-awesome-backend:localhost:50051"
```
This will make the new backend available for all API endpoints. You can also specify multiple backends in a comma-separated list.
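For instance, assuming two hypothetical backends listening locally on ports 50051 and 50052, the comma-separated form would look like this:

```bash
./local-ai --debug --external-grpc-backends "my-awesome-backend:localhost:50051,my-other-backend:localhost:50052"
```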
To test the connection, you can use curl to send a request to LocalAI and verify that it is handled by your custom gRPC server. For example, using the OpenAI-compatible completion endpoint on LocalAI's default port (8080):

```bash
curl http://localhost:8080/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "my_model", "prompt": "Hello"}'
```

LocalAI will route the request to your custom gRPC backend, provided the configuration for my_model selects it (for instance via the backend field in the model's YAML config, set to the name you registered, "my-awesome-backend"). You can then verify that the request was processed correctly by checking the response body and the debug log output.
Sources:
Initial documentation on how to write backends has been added here: https://localai.io/advanced/development/

LocalAI backends are just gRPC servers satisfying an interface. Indeed, it is also possible to start external gRPC servers and attach them to a LocalAI instance as explained here: https://localai.io/advanced/index.html#connect-external-backends. However, we are missing documentation on how to build such a backend from scratch and an end-to-end guide on how to connect it to LocalAI.
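For a rough idea of what "satisfying an interface" means in practice, here is a purely illustrative Go sketch. The type and method names below are placeholders standing in for the code protoc would generate from LocalAI's backend proto; they are not the real generated interface.

```go
package backend

// Purely illustrative: these types stand in for the code that protoc would
// generate from LocalAI's backend proto. The real method set and message
// types are defined in the LocalAI repository.

import "context"

// Hypothetical generated server interface a backend must satisfy.
type BackendServer interface {
	Health(ctx context.Context, in *HealthRequest) (*HealthResponse, error)
	Predict(ctx context.Context, in *PredictRequest) (*PredictResponse, error)
}

// Hypothetical message types.
type HealthRequest struct{}
type HealthResponse struct{ Status string }
type PredictRequest struct{ Prompt string }
type PredictResponse struct{ Text string }

// myBackend satisfies the interface by implementing every RPC method; the
// bodies below just echo the input to keep the sketch self-contained.
type myBackend struct{}

func (b *myBackend) Health(ctx context.Context, in *HealthRequest) (*HealthResponse, error) {
	return &HealthResponse{Status: "ok"}, nil
}

func (b *myBackend) Predict(ctx context.Context, in *PredictRequest) (*PredictResponse, error) {
	return &PredictResponse{Text: "echo: " + in.Prompt}, nil
}
```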