It would be a powerful addition to the Kotlin ecosystem if this feature allowed Kotlin apps to load and run inference on Meta's recently released LLaMA model.
These projects may provide inspiration for what a KotlinDL integration could look like:
vanilla-llama - a fork of Meta's own LLaMA inference implementation, using plain PyTorch.
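Until first-class support exists, here is a minimal sketch of what running LLaMA inference from Kotlin could look like today. It assumes the LLaMA weights have been exported to ONNX (Meta does not ship such an export) and uses ONNX Runtime's Java API, which KotlinDL's ONNX module already wraps; the model path, the "input_ids" input name, and the token IDs are placeholders, and a real integration would also need a SentencePiece tokenizer on the Kotlin side.

```kotlin
import ai.onnxruntime.OnnxTensor
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession

fun main() {
    // Hypothetical path to a LLaMA checkpoint exported to ONNX.
    val modelPath = "llama-7b.onnx"

    val env = OrtEnvironment.getEnvironment()
    env.createSession(modelPath, OrtSession.SessionOptions()).use { session ->
        // Placeholder token IDs for a prompt; a real integration would
        // tokenize the prompt with the LLaMA SentencePiece tokenizer.
        val inputIds = arrayOf(longArrayOf(1, 15043, 3186))

        OnnxTensor.createTensor(env, inputIds).use { tensor ->
            // "input_ids" is an assumed input name from the hypothetical export.
            session.run(mapOf("input_ids" to tensor)).use { result ->
                // The first output is assumed to be the next-token logits.
                val logits = result.get(0).value
                println("Received logits of type ${logits::class.simpleName}")
            }
        }
    }
}
```

A KotlinDL integration could hide this plumbing behind the library's existing inference-model abstractions and add the tokenizer and sampling loop needed for actual text generation.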