**Open** · Bruno-val-bus opened this issue 3 months ago
To be able to use local models, we have to launch the containers with the Ollama/OpenAI setup. Relevant file: `services/evaluators_factory.py`
@Bruno-val-bus, I will look into your last point and create a proposal, if you have not yet created that config file/environment.
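As a starting point for that proposal, here is a minimal sketch of how the factory could pick a backend from the environment. The variable names (`LLM_BACKEND`, `OLLAMA_BASE_URL`, `OPENAI_API_KEY`) and the function itself are illustrative assumptions, not code from the repository:

```python
import os

def resolve_backend_config() -> dict:
    """Hypothetical helper: choose the evaluator's LLM backend from an
    environment variable. Defaults to the hosted OpenAI API."""
    backend = os.environ.get("LLM_BACKEND", "openai").lower()
    if backend == "ollama":
        # Local model served by an Ollama container; the default URL is
        # Ollama's standard port, assumed here for illustration.
        return {
            "backend": "ollama",
            "base_url": os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434"),
        }
    # Hosted OpenAI API.
    return {
        "backend": "openai",
        "api_key": os.environ.get("OPENAI_API_KEY", ""),
    }
```

`services/evaluators_factory.py` could then read this config once at startup and instantiate the matching client, so switching between local and hosted models is a pure environment change.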
Containers are launched at application start (#19). Requires #4.