Open Ishihara-Masabumi opened 2 years ago
modify `build_model()` from `networks.models` like this:

```python
def build_model(weights):
    import io
    weights = weights.weights
    net_encoder = fba_encoder()
    net_decoder = fba_decoder()
    model = MattingModule(net_encoder, net_decoder)
    if weights != 'default':
        # Read the checkpoint into an in-memory buffer first,
        # then load from the buffer instead of the path.
        with open(weights, 'rb') as f:
            buffer = io.BytesIO(f.read())
        # sd = torch.load(weights)  # original call that failed
        sd = torch.load(buffer)
        model.load_state_dict(sd, strict=True)
    return model
```
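For context on why this workaround can help: `torch.load` sometimes fails on checkpoint files that were downloaded incompletely or handed over as non-seekable streams, and reading the whole file into an in-memory `io.BytesIO` buffer (which is always seekable) sidesteps that. Here is a minimal sketch of the same pattern, using `pickle` as a stand-in for `torch.load` so it runs without PyTorch installed:

```python
import io
import os
import pickle
import tempfile

# Serialize a small "state dict" to disk, as torch.save would.
state = {"weight": [1.0, 2.0], "bias": [0.5]}
path = os.path.join(tempfile.mkdtemp(), "weights.pkl")
with open(path, "wb") as f:
    pickle.dump(state, f)

# Read the entire file into an in-memory, seekable buffer first,
# then deserialize from the buffer instead of the raw file handle.
with open(path, "rb") as f:
    buffer = io.BytesIO(f.read())
loaded = pickle.load(buffer)  # torch.load(buffer) in the real code

print(loaded == state)  # True
```

An added benefit of reading the file yourself is that an `open()` failure gives a clear `FileNotFoundError` for the weights path, rather than an opaque deserialization error.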
I tried the same change, but it was useless for me.
Did you find a solution?
Has anyone found a solution??
It still works in Colab. If you want to modify and apply a module (e.g. a `.py` file) in Colab, first modify the module and then restart the runtime.
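The restart is needed because Python caches imported modules, so editing a `.py` file has no effect on a module that is already loaded. As a hypothetical alternative to restarting the whole runtime, `importlib.reload` re-executes a module's source in place; this small self-contained sketch (the module name `mymod` is made up for illustration) shows the behavior:

```python
import importlib
import os
import sys
import tempfile

# Create a tiny module on disk to stand in for an edited networks/models.py.
mod_dir = tempfile.mkdtemp()
sys.path.insert(0, mod_dir)
with open(os.path.join(mod_dir, "mymod.py"), "w") as f:
    f.write("VALUE = 1\n")

import mymod
print(mymod.VALUE)  # 1

# Edit the file on disk, like editing a .py in the Colab file browser...
with open(os.path.join(mod_dir, "mymod.py"), "w") as f:
    f.write("VALUE = 42\n")

# ...a plain re-import would return the cached module, but reload
# re-executes the source, so the change takes effect without a restart.
importlib.reload(mymod)
print(mymod.VALUE)  # 42
```

Note that objects created from the old module version are not updated by `reload`, so a full runtime restart remains the safest option.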
While running FBA Matting.ipynb in Google Colab, the following error occurred at the source code below.
Please tell me the cause of the error and how to fix it.