joeyballentine / Material-Map-Generator

Easily create AI-generated Normal maps, Displacement maps, and Roughness maps.
Apache License 2.0
290 stars 28 forks

CoreML Conversion #9

Open Treata11 opened 18 hours ago

Treata11 commented 18 hours ago

Greetings,

I've tested the models on an M1 Mac and they work fine on the CPU (though they get extremely slow for images with resolution above 512×512, which I think is expected). I plan to use the models in a free iOS app, and to do that I have to convert the .pth files to Core ML so they are compatible with Xcode. Using coremltools, I followed the conversion steps explained in the docs:

I added the following code to the end of the generate.py file:


import coremltools as ct

def convert_to_coreml(torch_model, model_name):
    imgShape = (1, 3, 384, 384)
    example_input = torch.rand(*imgShape)  # Example input, needed by the jit tracer
    traced_model = torch.jit.trace(torch_model, example_input)

    # Convert the model:
    # https://apple.github.io/coremltools/source/coremltools.converters.convert.html
    coreml_model = ct.convert(
        traced_model,  # Traced TorchScript model
        # inputs=[ct.ImageType(name="Image Texture", shape=example_input.shape)],
        inputs=[ct.ImageType(name="Image_Texture", shape=imgShape, color_layout=ct.colorlayout.RGB)],
        outputs=[ct.ImageType(name="Texture_Map", color_layout=ct.colorlayout.RGB)],
        # skip_model_load=True,
        source='pytorch',  # Specify the source framework
        convert_to='mlprogram',
        minimum_deployment_target=ct.target.macOS13
    )

    # Save the model in a Core ML `mlpackage` file
    coreml_model.save(f"{model_name}.mlpackage")
    print(f"Model saved as {model_name}.mlpackage")

# Convert each model to an mlpackage
names = ['1x_NormalMapGenerator-CX-Lite_200000_G', '1x_FrankenMapGenerator-CX-Lite_215000_G']
for idx, model in enumerate(models):
    convert_to_coreml(model, names[idx])


And it was successful... **almost**.
The textures generated by the converted Core ML models differ from those of the original PyTorch models... It looks as if a red layer is overlaid on top of the image:

![example_normal](https://github.com/user-attachments/assets/0e255ae2-dad2-4add-80a4-5423edb09f30)
> Generated Normal map

![example](https://github.com/user-attachments/assets/dbee3c58-9d46-4a20-9e9c-0e1ef23aff08)
> Either the Displacement or the Roughness map (not sure)

That aside, the converted `1x_FrankenMapGenerator-CX-Lite_215000_G` model generates only a single map, not two as the original pipeline does.
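One hypothesis (purely a guess on my part, I haven't verified how generate.py builds the second map): if the network emits a single tensor and the two maps are split out of its channels in Python post-processing, that split is not part of the traced graph, so the converted model would naturally expose only one image. If so, the split is trivial to reproduce on the app side; a sketch with numpy, where the channel layout is my assumption:

```python
import numpy as np

# Hypothetical packed output (H x W x 3); the channel assignment below is
# an assumption for illustration, not taken from generate.py:
# channel 0 = displacement, channel 1 = roughness.
packed = np.random.rand(384, 384, 3).astype(np.float32)

displacement = packed[:, :, 0]  # single-channel map
roughness = packed[:, :, 1]     # single-channel map
```

If the maps really come from two separate output tensors instead, the traced model should return a tuple and the conversion would need one `outputs` entry per tensor.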

I'm hoping the maintainers might be able to figure out what's wrong here. My guess is that some step is missing from the converted models; it could have something to do with preprocessing the image, model parameters, or numerical precision.
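On the preprocessing guess: a uniform red cast is the classic symptom of swapped red/blue channels. If generate.py loads images with OpenCV, the tensors the network saw during inference are BGR, so declaring the Core ML input/output as `ct.colorlayout.RGB` (as in my snippet) would reverse the channels. This is only an assumption, but the effect is easy to illustrate:

```python
import numpy as np

# A pure-blue RGB pixel...
rgb_pixel = np.array([0, 0, 255], dtype=np.uint8)

# ...read with the channel order reversed (RGB <-> BGR) becomes pure red.
swapped = rgb_pixel[::-1]

print(swapped.tolist())  # [255, 0, 0] -- blue turns red, hence the red cast
```

If that is the cause, declaring `color_layout=ct.colorlayout.BGR` on the input and output `ImageType` (a documented coremltools option) should fix it without touching the weights.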

I would be glad to draw on your experience with this issue, and I thank you in advance.
Treata11 commented 18 hours ago

By the way, this is the terminal log when I ran the generate.py script:

$ python generate.py --cpu
/Users/Treata/Developer/Git/Contribution/Material-Map-Generator/generate.py:70: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  state_dict = torch.load(model_path)
1 ssInput
Converting PyTorch Frontend ==> MIL Ops: 100%|▉| 1502/1503 [00:00<00:00, 8498.18
Running MIL frontend_pytorch pipeline: 100%|█| 5/5 [00:00<00:00, 79.62 passes/s]
Running MIL default pipeline: 100%|████████| 89/89 [00:01<00:00, 47.04 passes/s]
Running MIL backend_mlprogram pipeline: 100%|█| 12/12 [00:00<00:00, 83.94 passes
Model saved as 1x_NormalMapGenerator-CX-Lite_200000_G.mlpackage
Converting PyTorch Frontend ==> MIL Ops: 100%|▉| 1502/1503 [00:00<00:00, 6292.29
Running MIL frontend_pytorch pipeline: 100%|█| 5/5 [00:00<00:00, 78.49 passes/s]
Running MIL default pipeline: 100%|████████| 89/89 [00:01<00:00, 48.21 passes/s]
Running MIL backend_mlprogram pipeline: 100%|█| 12/12 [00:00<00:00, 83.43 passes
Model saved as 1x_FrankenMapGenerator-CX-Lite_215000_G.mlpackage

It's worth noting that the converted Core ML models take up half the disk space of the originals. That might be an indication that a layer of the original models was not translated...
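Then again, if I'm reading the coremltools docs right, `convert_to='mlprogram'` stores weights as float16 by default, which by itself would account for exactly half the size of a float32 checkpoint without any layer being dropped. The storage math:

```python
import numpy as np

# One million float32 weights vs. the same weights cast to float16
w32 = np.ones(1_000_000, dtype=np.float32)
w16 = w32.astype(np.float16)

print(w32.nbytes, w16.nbytes)  # float16 takes exactly half the bytes
```

If precision turns out to affect the outputs as well, passing `compute_precision=ct.precision.FLOAT32` to `ct.convert` should rule that out (at the cost of the size savings).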