M0E-lnx closed this 1 year ago
Hi, may I ask why you need this config option? Why not simply use the un-customized image from https://github.com/mudler/LocalAI?
I prefer to self-host what I can, but I must admit I have less than ideal hardware. I really didn't want to dig into how to tie the un-customized image from mudler/LocalAI into my Nextcloud.
I see, but even if we add this config option here, you will not be able to configure it in AIO. There is currently no way to adjust this setting permanently in AIO.
Would it be possible to create the stack from a static compose file instead? I had been running a standalone Nextcloud for a while, but I just found this AIO setup, which helps with an issue I had with Talk. If there is another way to deploy the stack that allows managing resources from a compose file, that would be awesome.
> Would it be possible to create the stack from a static compose file instead?
Currently it is not possible, and there are no plans to introduce this.
Understood. Feel free to reject the PR, by the way. This is merely a wish to enhance my setup, but it looks like I have a bigger problem with it ... I guess this would only be helpful if I could actually get the AI to fire up. See #5
Makes the THREADS variable use a pre-existing value instead of forcing all available horsepower by default.
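For reference, this kind of change is typically done with shell parameter expansion: the container entrypoint falls back to all available cores only when the variable is unset. A minimal sketch (the exact script and variable handling in the actual image may differ; only `THREADS` is taken from this PR):

```shell
#!/bin/sh
# Respect a pre-existing THREADS value (e.g. set via the container
# environment); fall back to the full core count only when it is
# unset or empty.
THREADS="${THREADS:-$(nproc)}"
echo "Using ${THREADS} threads"
```

With this pattern, running the container with `-e THREADS=2` limits the workload to two threads, while omitting the variable keeps the old behavior of using every available core.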