plemeri / transparent-background

This is a background removal tool powered by InSPyReNet (ACCV 2022)
MIT License
731 stars 80 forks

Too much heavy load #4

Closed me-devms closed 1 year ago

me-devms commented 1 year ago

I have tried this with 4 vCPU and even with 2 dedicated-CPU servers, but it can barely handle 2 requests at a time.

There is a lack of optimization.

plemeri commented 1 year ago

Our method is designed around GPU usage; CPU performance simply hasn't been considered yet.

me-devms commented 1 year ago

So, any suggestions for a cloud provider for it, ideally one that is cost-effective as well?

plemeri commented 1 year ago

We tried using optimization tools like ONNX or TensorRT for CPU usage, but I'm not familiar with those frameworks, so I'm not sure I can make it available soon. Meanwhile, you could use a remote GPU-powered server to offload the workload, which is quite common these days.

me-devms commented 1 year ago

I think you should do it quickly, because in actual use-case scenarios GPU-based servers are far too costly.

plemeri commented 1 year ago

I just realized that our model can be converted to an ONNX model with the latest PyTorch, and I tested it with the sample image. On CPU (i9-9900K), it previously took 16 seconds for a single image, while the ONNX runtime takes 9 seconds, but the result shows some defects compared to the original, presumably because of quantization.

Original vs. ONNX result images: aeroplane_map, test

If you want this method anyway, please let me know.

me-devms commented 1 year ago

Yes, I would like to use it. Please release an update for this, or whatever else can be done.

plemeri commented 1 year ago

I'm currently having issues converting the fast model to ONNX, so I uploaded a pre-release version as transparent-background-dev, which only supports the base model with onnxruntime.

pip install transparent-background-dev

Use the --onnx argument with the command-line tool, or onnx=True for Python API usage. Everything else is the same as the other usages.

This feature is experimental and will not be maintained as an official feature, since the conversion to ONNX is not stable at this point. Also, GPU servers are not that expensive these days, especially cloud services such as Google Colab Pro. You may want to try those resources, since other background removal tools already rely on them for optimal results.

If this result is still not satisfactory, then I'm afraid I cannot help you further with additional optimization, since it's not my research goal.

Thank you.

me-devms commented 1 year ago

```
Traceback (most recent call last):
  File "C:\Users\ilikewebsite\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\ilikewebsite\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "C:\Users\ilikewebsite\AppData\Local\Programs\Python\Python39\Scripts\transparent-background.exe\__main__.py", line 7, in <module>
  File "C:\Users\ilikewebsite\AppData\Local\Programs\Python\Python39\lib\site-packages\transparent_background\Remover.py", line 226, in console
    out = remover.process(img, type=args.type)
  File "C:\Users\ilikewebsite\AppData\Local\Programs\Python\Python39\lib\site-packages\transparent_background\Remover.py", line 151, in process
    r, g, b = cv2.split(img)
cv2.error: OpenCV(4.6.0) :-1: error: (-5:Bad argument) in function 'split'
> Overload resolution failed:
>  - m is not a numpy array, neither a scalar
>  - Expected Ptr<cv::UMat> for argument 'm'
```
plemeri commented 1 year ago

As we mentioned in usage.py, the input to remover.process should be of type Image.Image, not np.array. Also, briefly speaking, ONNX fixes the input image shape in order to speed up inference. If you need dynamic image sizes, use the PyTorch backend instead.
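To illustrate the type issue behind the error above: `cv2.imread` returns a BGR `numpy` array, while `remover.process` expects a PIL `Image.Image`. A minimal sketch of the conversion, with the `Remover` calls commented out since they follow the API as described in this thread and the dev package may have changed:

```python
import numpy as np
from PIL import Image

# Simulate what cv2.imread returns: a BGR numpy array
bgr = np.zeros((32, 32, 3), dtype=np.uint8)

# Reverse the channel order (BGR -> RGB) and wrap it in a PIL Image;
# .copy() makes the reversed view contiguous for Image.fromarray
img = Image.fromarray(bgr[:, :, ::-1].copy())
assert isinstance(img, Image.Image)  # the type remover.process expects

# from transparent_background import Remover
# remover = Remover(onnx=True)             # onnx=True per this pre-release
# out = remover.process(img, type='rgba')  # would fail with a raw numpy array
```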

me-devms commented 1 year ago

Yes, I am doing exactly that. I even tried the CLI and got the same error.

plemeri commented 1 year ago

I'm sorry for the misunderstanding. It's fixed now, so please reinstall with the command below.

pip install transparent-background-dev --upgrade

Thanks

me-devms commented 1 year ago

Thanks! It works pretty well. I think you should add this to your main branch as well, because it will help a lot of devs. I haven't seen any other library with a dataset like yours that works this well.

plemeri commented 1 year ago

Thank you so much for your support. I'm currently working on converting the fast model to ONNX and will then merge it into the main branch.

Always happy to help the community 😄

oylz commented 1 year ago

@plemeri