mallman / CoreMLaMa

LaMa for Core ML
Apache License 2.0

Don't lose quality using the model on iOS #2

Closed: vittorionapoli closed this issue 10 months ago

vittorionapoli commented 11 months ago

Hi, I've started using the Core ML model inside a photo editor application. The main problem is the 800x800 limit (I tried to increase it but ran into memory problems). I tried splitting the image into 800x800 tiles and then doing the erasing, but the result isn't good.

Do you have thoughts?

mallman commented 10 months ago

Hi Vittorio,

I don't have experience using this model on iOS, and I don't have any advice for doing so.

With respect to the fixed crop size, I have not been able to configure the model conversion script to work with arbitrary crop sizes. While Core ML supports image inputs with arbitrary sizes, it's not sophisticated enough to support LaMa's particular use case, at least as of Core ML Tools version 6.3.

Core ML Tools 7.0 advertises new model compression and optimization capabilities that may make deployment to limited-resource devices like the iPhone more feasible. Since my particular use case for CoreMLaMa is macOS-only and I have many other projects to work on, I don't anticipate working on optimizing CoreMLaMa for iOS.

vittorionapoli commented 10 months ago

Hi mallman, thanks for your response! How do you use the model to process non-square images in your macOS app?

mallman commented 10 months ago

Hi Vittorio,

I crop an 800x800 image from the full image for processing. The crop surrounds the mask that the user draws. Once the image and mask are processed by LaMa, you just replace the original cropped area of your input image with the inpainted crop.

This is what the sample program (main.swift) does. Have you seen it?

If I have some time I'll put together a clearer illustration of the process.
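
In the meantime, here's a rough sketch of the crop-and-paste idea. This is not the actual code from main.swift; the helper names and the Core Graphics approach are just illustrative, and it assumes the source image is at least 800x800 pixels.

```swift
import CoreGraphics

/// Compute an 800x800 crop rectangle centered on the user's mask,
/// clamped so it stays inside the full image.
/// Assumes the image is at least `side` pixels in each dimension.
func cropRect(around maskBounds: CGRect, imageSize: CGSize, side: CGFloat = 800) -> CGRect {
    var x = maskBounds.midX - side / 2
    var y = maskBounds.midY - side / 2
    x = max(0, min(x, imageSize.width - side))
    y = max(0, min(y, imageSize.height - side))
    return CGRect(x: x, y: y, width: side, height: side)
}

/// Paste the inpainted crop back over the region it was taken from.
func composite(inpaintedCrop: CGImage, into fullImage: CGImage, at rect: CGRect) -> CGImage? {
    guard let context = CGContext(
        data: nil,
        width: fullImage.width,
        height: fullImage.height,
        bitsPerComponent: 8,
        bytesPerRow: 0,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
    ) else { return nil }

    // Draw the untouched original first.
    context.draw(fullImage, in: CGRect(x: 0, y: 0, width: fullImage.width, height: fullImage.height))

    // Core Graphics uses a bottom-left origin; flip y if `rect` was computed top-left.
    let flipped = CGRect(x: rect.minX,
                         y: CGFloat(fullImage.height) - rect.maxY,
                         width: rect.width,
                         height: rect.height)

    // Overwrite the cropped region with LaMa's inpainted output.
    context.draw(inpaintedCrop, in: flipped)

    return context.makeImage()
}
```

The crop rectangle you compute before running LaMa is the same one you pass back to the compositing step, so the inpainted pixels land exactly where they came from.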