ucbrise / clipper

A low-latency prediction-serving system
http://clipper.ai
Apache License 2.0
1.4k stars 280 forks

Why not use torch.save(model, PATH) to serialize the entire model? #566

Open wnagchenghku opened 6 years ago

wnagchenghku commented 6 years ago

Hi, in https://github.com/ucbrise/clipper/blob/develop/clipper_admin/clipper_admin/deployers/pytorch.py#L201, I found that Clipper serializes the model and its weights separately, using cloudpickle for the model and torch.save(the_model.state_dict(), PATH) for the weights. I'm curious why Clipper does this serialization in two steps. According to https://pytorch.org/docs/stable/notes/serialization.html, PyTorch can serialize the entire model in one step. In my tests, loading the entire model with torch.load is much faster than loading it with cloudpickle.
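For context, the tradeoff between the two approaches can be illustrated with the standard-library pickle module, which torch.save uses under the hood. This is a hedged, stdlib-only sketch (the `Model` class here is a stand-in for a real `nn.Module`, not Clipper's code): pickling the whole object stores only a reference to the class by module path, so loading requires the same class definition to be importable; pickling just the parameters needs no class code at load time.

```python
import pickle

class Model:
    """Stand-in for an nn.Module: holds parameters as plain data."""
    def __init__(self):
        self.weights = [1.0, 2.0, 3.0]

m = Model()

# Approach 1: pickle the whole object (analogous to torch.save(model, PATH)).
# The pickle stores a reference to Model by module path; unpickling requires
# that exact class to be importable in the loading environment.
whole = pickle.dumps(m)

# Approach 2: pickle only the parameters (analogous to
# torch.save(model.state_dict(), PATH)); loading needs no class code at all.
params = pickle.dumps(m.__dict__)

restored = pickle.loads(params)
print(restored["weights"])  # [1.0, 2.0, 3.0]
```

This is one plausible reason for the split: cloudpickle serializes the model's class code by value, so the serving container can reconstruct the model without having the original source importable, while the state_dict carries the weights separately.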

rkooo567 commented 5 years ago

@simon-mo @withsmilo Any thought?

simon-mo commented 5 years ago

We should. Can you submit a PR?

rkooo567 commented 5 years ago

@simon-mo Yep, sounds good. I will wait a couple of days for the author and start if they don't respond.

@wnagchenghku Would you want to create a PR for this?