xuebinqin / DIS

This is the repo for our new project Highly Accurate Dichotomous Image Segmentation
Apache License 2.0
2.23k stars · 259 forks

DIS general-use model is pretty good! #40

Open daisymind opened 2 years ago

daisymind commented 2 years ago

Hi, I have adapted the DIS general-use model for my iPhone app, ClipEdge.

I imagined that without V2, the training would not yet be optimized for humans and animals, but contrary to my expectations, it recognizes humans and animals well too. Does this mean that as long as they are recognized as the central object, there is no problem?

xuebinqin commented 2 years ago

Hello, Daisy,

Glad to know that our project is helpful to you. It is hard to quantify the reason, but deep neural networks usually fit very comprehensive features, in which "centered" cues may also be encoded. The result also depends on the overall shape, structure, texture, intensities, etc. Many researchers are trying to figure this out, but it is currently still obscure, and understanding it is also one of our goals.

BR, Xuebin


-- Xuebin Qin, PhD, Department of Computing Science, University of Alberta, Edmonton, AB, Canada. Homepage: https://xuebinqin.github.io/

daisymind commented 2 years ago

Thanks for the reply. I know that the dataset has a big impact; still, I was wondering because this IS-Net general-use model works so well even without human or animal training data. As you say, maybe the network has learned to detect whatever the central object is.

I look forward to your future research, and the V2 dataset.

Best Regards,

sgebr01 commented 2 years ago

@daisymind Did you make this an API, or implement a CoreML Model?

daisymind commented 2 years ago

Hi, sgebr01.

I am using the original model 'isnet-general-use.pth' converted to a CoreML model '.mlmodel'.

You can get the converted CoreML model and the conversion script from john-rocky at the following URL; look for IS-Net (= DIS) on the page. https://github.com/john-rocky/CoreML-Models

For your reference, the mlmodel is 8-bit quantized in the iPhone app for compactness.

# Quantization sample code: Python with Apple coremltools
import coremltools as ct
from coremltools.models.neural_network import quantization_utils

# load the full-precision (32-bit) model
model = ct.models.MLModel('ISNet_general_use.mlmodel')

# quantize the weights to 8 bits
quantized_model = quantization_utils.quantize_weights(model, 8)
quantized_model.save('quantized08_ISNet_general_use.mlmodel')

see https://developer.apple.com/documentation/coreml/model_customization/reducing_the_size_of_your_core_ml_app and https://coremltools.readme.io/docs/quantization
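For intuition, here is a minimal, self-contained sketch of what linear 8-bit weight quantization does conceptually: each float weight is mapped to an integer code in [0, 255] plus a shared scale and offset, cutting storage roughly 4x versus 32-bit floats. This is only an illustration, not the coremltools implementation; the function names are hypothetical.

```python
# Illustrative sketch of linear 8-bit quantization (hypothetical names,
# not the coremltools internals behind quantize_weights).

def quantize_8bit(weights):
    """Map float weights to uint8 codes plus (scale, offset)."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255 if hi != lo else 1.0
    codes = [round((w - lo) / scale) for w in weights]  # each in [0, 255]
    return codes, scale, lo

def dequantize_8bit(codes, scale, lo):
    """Reconstruct approximate float weights from the codes."""
    return [c * scale + lo for c in codes]

weights = [-0.8, -0.1, 0.0, 0.33, 1.2]
codes, scale, lo = quantize_8bit(weights)
restored = dequantize_8bit(codes, scale, lo)
max_err = max(abs(w - r) for w, r in zip(weights, restored))
print(codes)    # integer codes, 1 byte each instead of 4
print(max_err)  # reconstruction error bounded by about scale / 2
```

The reconstruction error per weight is at most about half the scale step, which is why 8-bit quantization usually costs little segmentation quality while shrinking the model roughly fourfold.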

sgebr01 commented 2 years ago

Great thank you, I appreciate it.

roimulia2 commented 2 years ago

Hey @daisymind! Thank you for the info :) Any chance you can send the quantized MLModel here? I'm having an issue setting up the environment on my Mac.

daisymind commented 2 years ago

Hi roimulia2. You can get the one from here. https://drive.google.com/file/d/1-8FIZGVIXOHrAd8VljkwF0Sv8lIkBpdR/view?usp=sharing

Appendix: On Google Colab, the mlmodel can be converted to an 8-bit quantized file with the .ipynb code below.

from google.colab import drive
drive.mount('/content/drive')

!pip install coremltools
basePath = '/content/drive/MyDrive/CoreML'   # edit this for your working path

import coremltools as ct
from coremltools.models.neural_network import quantization_utils

# load the full-precision (32-bit) model
model = ct.models.MLModel(basePath+'/ISNet_general_use.mlmodel') # original mlmodel file name

# quantize the weights to 8 bits
nbits = 8
quantized_model = quantization_utils.quantize_weights(model, nbits)
quantized_model.save(basePath+'/quantized08_ISNet_general_use.mlmodel') # quantized output file name
print('...Finished: quantized 8bit')

roimulia2 commented 1 year ago

Perfect, thank you so much :)