This repo contains a script for converting a LaMa (aka cute, fuzzy 🦙) model to Apple's Core ML model format. More specifically, it converts the implementation of LaMa from Lama Cleaner.
This repo also includes a simple example of how to use the Core ML model for prediction. See Sample.
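For orientation, here is a hedged sketch of driving the converted model from Python with coremltools. The input names ("image", "mask"), the 800x800 size, and the mask convention shown here are assumptions for illustration; check LaMa.mlpackage's actual interface (and the Sample code) before relying on them. Prediction itself only runs on Apple platforms, so that part is guarded.

```python
# Hypothetical usage sketch; input names, sizes, and the mask convention
# are assumptions, not the verified interface of LaMa.mlpackage.
import sys

import numpy as np
from PIL import Image

# LaMa takes the image to inpaint plus a binary mask; by the usual
# Lama Cleaner convention, white (255) marks the region to fill.
image = Image.new("RGB", (800, 800), "gray")
mask = np.zeros((800, 800), dtype=np.uint8)
mask[300:500, 300:500] = 255  # inpaint a square in the middle
mask_img = Image.fromarray(mask, mode="L")

if sys.platform == "darwin":  # Core ML prediction requires an Apple platform
    import coremltools as ct

    model = ct.models.MLModel("LaMa.mlpackage")
    result = model.predict({"image": image, "mask": mask_img})
```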
Create a Conda environment for CoreMLaMa:
conda create -n coremlama python=3.10 # works with mamba, too
conda activate coremlama
pip install -r requirements.txt
Run the conversion script:
python convert_lama.py
This script will download and convert Big LaMa to a Core ML package named LaMa.mlpackage.
The Core ML model this script produces was designed for macOS, where it runs well on the GPU. I have received several reports of unsuccessful attempts to run this model on iOS, especially with fp16 precision on the Neural Engine, and no reports of successful iOS deployments.
It may very well be possible to run this model on iOS with some tuning of the conversion process; I simply have not attempted it. I would very much welcome a PR, with credit, from anyone who can convert this model and run it with great results on iOS.
Thanks to the authors of LaMa:
[Project page] [arXiv] [Supplementary] [BibTeX] [Casual GAN Papers Summary]
CoreMLaMa uses the LaMa model and supporting code from Lama Cleaner, which makes this project much simpler.