Closed foster999 closed 11 months ago
Did you try without setting any package in `include_modules`? I mean, without setting `include_modules` at all.
By default Lithops includes all the modules it detects that are missing in the container.
In any case, I will check if there is an issue with the `include_modules` config param.
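As a rough illustration of the detection described above, a serializer can walk a function's referenced globals to work out which top-level modules the function depends on. This is only a sketch of the idea; the names below (`find_referenced_modules`, `job`) are hypothetical, and the real logic in Lithops' serialize.py is more thorough.

```python
import types


def find_referenced_modules(func):
    """Collect the top-level modules a function references via its globals.

    A simplified sketch of dependency detection: look up every name the
    compiled function body references and note which resolve to modules.
    """
    mods = set()
    for name in func.__code__.co_names:
        obj = func.__globals__.get(name)
        if isinstance(obj, types.ModuleType):
            mods.add(obj.__name__.split('.')[0])
    return mods


import json
import math


def job(x):
    # References two stdlib modules through module-level globals
    return json.dumps({'root': math.sqrt(x)})


print(sorted(find_referenced_modules(job)))  # → ['json', 'math']
```

A locally developed package would show up here the same way, which is why it can be pickled and shipped alongside the job when it is not already present in the container.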
In my experience, it is always a good idea to include all the required packages in the runtime itself, even if Lithops is able to transfer them; doing so improves both invocation and execution times.
Yep, with the default I still see `2023-10-26 17:48:59,002 [DEBUG] serialize.py:101 -- Modules to transmit: None`
That makes sense. Are there any examples of extending the default runtime that I could follow?
Edit: Ignore me, found the docs here
I've got a custom runtime ready, but do you know how I can authorise IBM cloud functions to pull the docker container from a private container registry?
Edit: I've just spotted that IBM is deprecating Cloud Functions, so I'll look to switch to Code Engine
In #1199 I fixed the `include_modules` config parameter, so it should now always include all the modules set in it.
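For reference, a minimal sketch of how `include_modules` is set in the Lithops YAML config, assuming the layout where it lives under the top-level `lithops` section; the module name `engineering` is just an example of a locally developed package:

```yaml
lithops:
    include_modules:
        - engineering   # local package to pickle and ship with each job
```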
Sorry for raising yet another! This one might be down to my incorrect use.
Following on from #1179, the error I get back from the workers suggests that my local package/code isn't transferred to the worker:
`Engineering` is a locally developed package that is used in my sklearn pipeline.
I've tried configuring Lithops to include the dependencies using:
But in the debug logs I see:
Suggesting that they are not transferred or installed on the workers.
Is this the right approach? Or would I need to create a custom container with the dependencies installed to do this?
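If you do go the custom-container route, the usual pattern is to extend the default runtime image and bake the local package into it, so the workers don't depend on module transfer at invocation time. A rough sketch, where the base image tag and the package path are assumptions (check the runtime docs for your backend for the correct base image):

```dockerfile
# Assumed base image: use the default Lithops runtime image
# matching your backend and Python version
FROM lithopscloud/ibmcf-python-v310

# Bake the locally developed package into the image
COPY engineering/ /tmp/engineering/
RUN pip install /tmp/engineering
```

The image can then be built and registered with the Lithops CLI, along the lines of `lithops runtime build -f Dockerfile <docker_user>/<runtime_name>` (the exact command shape may vary by version and backend).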