szaimen / aio-local-ai


Allow limiting THREADS via the host's environment #6

Closed: M0E-lnx closed this 1 year ago

M0E-lnx commented 1 year ago

Makes the THREADS variable respect a pre-existing value from the host's environment instead of defaulting to all available cores.
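For illustration only, a minimal sketch of what such a change could look like, assuming the image sets THREADS to the full core count in its startup script (the file layout and the original line are assumptions, not taken verbatim from this repo):

```bash
#!/bin/bash
# Respect a THREADS value passed in from the host's environment;
# only fall back to all available cores when nothing was provided.
THREADS="${THREADS:-$(nproc)}"
export THREADS
```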

szaimen commented 1 year ago

Hi, may I ask why you need this config option? Why not simply use the un-customized image from https://github.com/mudler/LocalAI?

M0E-lnx commented 1 year ago

> Hi, may I ask why you need this config option? Why not simply use the un-customized image from https://github.com/mudler/LocalAI?

I prefer to self-host what I can, but I must admit I have less-than-ideal hardware. I really didn't want to dig into how to tie the un-customized image from mudler/LocalAI into my Nextcloud.

szaimen commented 1 year ago

> I prefer to self-host what I can, but I must admit I have less-than-ideal hardware. I really didn't want to dig into how to tie the un-customized image from mudler/LocalAI into my Nextcloud.

I see, but even if we add this config here, you will not be able to configure it in AIO. There is currently no way to adjust this config permanently in AIO.

M0E-lnx commented 1 year ago

Would it be possible to create the stack from a static compose file instead? I had been running a standalone Nextcloud for a while but only just found this AIO setup, which helps with an issue I had with Talk. If there is another way to deploy the stack that allows managing resources from a compose file, that would be awesome.
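For context, a rough sketch of the kind of static compose service this request seems to have in mind; the service name, image tag, and limits below are placeholders and assumptions, not something AIO provides:

```yaml
services:
  local-ai:
    image: quay.io/go-skynet/local-ai:latest   # assumed upstream LocalAI image, not the AIO one
    environment:
      - THREADS=2            # cap the worker threads on modest hardware
    deploy:
      resources:
        limits:
          cpus: "2.0"        # keep the container from using every core
          memory: 4G
```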

szaimen commented 1 year ago

> Would it be possible to create the stack from a static compose file instead?

Currently it is not, and there are no plans to introduce this.

M0E-lnx commented 1 year ago

> Would it be possible to create the stack from a static compose file instead?
>
> Currently it is not, and there are no plans to introduce this.

Understood. Feel free to reject the PR, by the way. This is merely a wish to enhance my setup, but it looks like I have a bigger problem with it... I guess this would only be helpful if I could actually get the AI to fire up in the first place. See #5