Closed — JonHolman closed this issue 8 months ago
We'll add a Lambda OCI (Container) example as well :)
Thanks! I was playing around with copying bootstrap into the al2023 provided image, but got lost adding node_modules.
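For reference, what I was attempting looked roughly like this (an untested sketch, not a working example — the `llrt` binary name, file permissions, and handler layout here are my assumptions, not anything official):

```dockerfile
# Sketch: copy the LLRT binary in as the custom-runtime "bootstrap"
# of the AL2023 provided base image. Untested; paths are assumptions.
FROM public.ecr.aws/lambda/provided:al2023

# LLRT release binary, downloaded separately and made executable,
# placed where the custom runtime convention expects "bootstrap"
COPY llrt ${LAMBDA_RUNTIME_DIR}/bootstrap

# Application code plus its dependencies
COPY index.mjs ${LAMBDA_TASK_ROOT}/
COPY node_modules ${LAMBDA_TASK_ROOT}/node_modules

# Handler name passed to the runtime as CMD (custom-runtime convention)
CMD [ "index.handler" ]
```

This is where I got stuck: the bootstrap part seemed straightforward, but I wasn't sure how LLRT expects `node_modules` to be laid out inside the image.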
@JonHolman it would be helpful if you could give a bit of context on what you're trying to do. Why are you looking to package LLRT in a container?
@richarddavison sure. I think having a container option would make LLRT available for more use cases. Right now I'd like to evaluate whether LLRT is an option and whether it would improve performance: I'm trying to run llama.cpp with a small LLM inside a Lambda function. My simple WIP can be seen at https://github.com/JonHolman/wip-sam-node-llama-cpp
@JonHolman when we release a new version, you should be able to deploy a container-optimized version of LLRT. This should have significantly lower cold starts compared with the approach that exists today.
Will there be a container base image that we can use instead of public.ecr.aws/lambda/nodejs:20?