Bruno-val-bus / student-helper


Setup containers to run local models #2

Open Bruno-val-bus opened 3 months ago

Bruno-val-bus commented 3 months ago

To be able to use local models, we have to launch the containers with the Ollama/OpenAI setup.
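A minimal sketch of what such a container setup could look like, assuming a standard `docker-compose.yml` with the official `ollama/ollama` image (the service name, port mapping, and volume name are illustrative, not taken from this repo):

```yaml
# Hypothetical compose file: runs Ollama locally and exposes its
# OpenAI-compatible API on port 11434.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"   # Ollama serves an OpenAI-compatible API at /v1
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded model weights

volumes:
  ollama_data:
```

With something like this in place, the application container can point an OpenAI-style client at `http://ollama:11434/v1` (or `http://localhost:11434/v1` from the host) instead of the hosted OpenAI endpoint.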

ernOho commented 2 months ago

@Bruno-val-bus, I will look into the last point you mentioned and create a proposal, if you have not already created that config file/environment.