Closed: huohuohuohuohuohuohuohuo closed this issue 8 months ago
Hi @huohuohuohuohuohuohuohuo. It seems that SparseTensor._caches is not passed on to the new SparseTensor you constructed, so the new SparseTensor has no information about the kmap. From the information you provided, one thing I suggest you try is to modify your func as:
def func(self, data, feats):
    data_Q = SparseTensor(
        feats=feats,
        coords=data.C
    )
    data_Q._caches = data._caches
    return data_Q
Another way to do this is to simply make a copy of data (data_Q) and modify data_Q.F, without explicitly constructing a new SparseTensor.
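For reference, the copy-then-replace approach might look like the following sketch. The FakeSparseTensor class below is a hypothetical stand-in for torchsparse's SparseTensor (it only mimics the .F, .C, and ._caches attributes mentioned in this thread); it is not the real API:

```python
import copy

class FakeSparseTensor:
    """Hypothetical stand-in for torchsparse.SparseTensor, for illustration only."""
    def __init__(self, feats, coords):
        self.F = feats          # features
        self.C = coords         # coordinates
        self._caches = {}       # kernel-map caches keyed by conv config

def replace_feats(data, feats):
    # copy.copy gives a shallow copy: data_Q shares C and _caches with data,
    # which is exactly what keeps the cached kernel maps available for the
    # later transposed convolution.
    data_Q = copy.copy(data)
    data_Q.F = feats            # rebind only the features on the copy
    return data_Q

data = FakeSparseTensor(feats=[1.0, 2.0], coords=[(0, 0, 0), (1, 0, 0)])
data_Q = replace_feats(data, [9.0, 8.0])
# Rebinding data_Q.F does not touch data.F, while coords and caches stay shared.
```

Note that rebinding the attribute (data_Q.F = feats) is safe even on a shallow copy; the pitfall discussed below only arises when you mutate the shared tensor in place.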
Thanks for your reply. Can I just assign the variable feats to the SparseTensor and return data?
data.F = feats
return data
Sure. It's ok to do that.
It seems that the copy of data (data_Q) is a shallow copy: changing the values of data_Q also affects the values of data. How do I make a deep copy?
Also, data_Q._caches = data._caches will not copy the stride and spatial range. Do I need to copy these variables separately?
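The shallow-copy pitfall can be illustrated with a toy stand-in (the class and attribute names below are hypothetical, not the real torchsparse API): in-place edits through a shallow copy are visible from the original, while copy.deepcopy recursively copies the features as well as stride and spatial_range, giving fully independent objects:

```python
import copy

class ToySparseTensor:
    """Hypothetical stand-in for SparseTensor; real torchsparse attrs may differ."""
    def __init__(self, feats, stride, spatial_range):
        self.F = feats
        self.stride = stride
        self.spatial_range = spatial_range

data = ToySparseTensor(feats=[1.0, 2.0], stride=(2, 2, 2), spatial_range=(8, 8, 8))

shallow = copy.copy(data)
shallow.F[0] = 99.0          # in-place edit: visible through data.F as well

deep = copy.deepcopy(data)   # copies F, stride, and spatial_range recursively
deep.F[1] = -1.0             # independent storage: data.F is unaffected
```

With real tensors the same idea applies: either deep-copy the whole object, or clone the feature tensor before assigning so the original storage is never mutated in place.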
Is there an existing issue for this?
Current Behavior
I am using features generated from a model to replace the original features of SparseTensors. I would like to ask how to make the replacement correctly so that the subsequent transposed convolution succeeds. (I applied several sparse convs with stride 2 beforehand.) I suspect the stride may also be getting replaced. Here is my code:
And I got errors when using data_Q in the transposed convolution:

File "/home/hx/PycharmProjects/pcgc/PCGCv2_Resnet_ts/pcc_model.py", line 119, in forward
    out_cls_list, out = self.decoder(y_q, training)
File "/home/hx/anaconda3/envs/torchsparse/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
File "/home/hx/PycharmProjects/pcgc/PCGCv2_Resnet_ts/autoencoder.py", line 289, in forward
    out = self.relu(self.conv0(self.up0(x)))
File "/home/hx/anaconda3/envs/torchsparse/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
File "/home/hx/anaconda3/envs/torchsparse/lib/python3.9/site-packages/torchsparse-2.1.0-py3.9-linux-x86_64.egg/torchsparse/nn/modules/conv.py", line 98, in forward
    return F.conv3d(
File "/home/hx/anaconda3/envs/torchsparse/lib/python3.9/site-packages/torchsparse-2.1.0-py3.9-linux-x86_64.egg/torchsparse/nn/functional/conv/conv.py", line 138, in conv3d
    kmap = F.transpose_kernel_map(
File "/home/hx/anaconda3/envs/torchsparse/lib/python3.9/site-packages/torchsparse-2.1.0-py3.9-linux-x86_64.egg/torchsparse/nn/functional/conv/kmap/build_kmap.py", line 233, in transpose_kernel_map
    kmap["out_in_map"], make_divisible(kmap["sizes"][0], cta_M)
TypeError: 'NoneType' object is not subscriptable
A similar operation in Minkowski Engine, shown below, worked.
Expected Behavior
The transposed convolution can be performed successfully.
Environment
Anything else?
No response