facebookresearch / SparseConvNet

Submanifold sparse convolutional networks
https://github.com/facebookresearch/SparseConvNet

RuntimeError: expected scalar type Long but found Float #236

Closed LeopoldACC closed 1 year ago

LeopoldACC commented 1 year ago

my environment

ubuntu 18.04
python 3.7.15
cuda 10.2
cudnn 7.6.5
torch 1.7.1

I found that the error is raised inside `sparseconvnet.SCN.Convolution_updateOutput`, called from `lib/sparseconvnet/convolution.py`:

> /data/projects/JS3C-Net-main/models/SubSparseConv.py(42)forward()
-> x = self.sparseModel(batch_x)
(Pdb) c
> /data/projects/JS3C-Net-main/lib/sparseconvnet/convolution.py(40)forward()
-> output.features = ConvolutionFunction.apply(
(Pdb) c
> /data/projects/JS3C-Net-main/lib/sparseconvnet/convolution.py(96)forward()
-> sparseconvnet.forward_pass_multiplyAdd_count +=\
(Pdb) sparseconvnet.forward_pass_multiplyAdd_count +=\
            sparseconvnet.SCN.Convolution_updateOutput(
                input_spatial_size,
                output_spatial_size,
                filter_size,
                filter_stride,
                input_metadata,
                input_features,
                output_features,
                weight,
                bias)
*** RuntimeError: expected scalar type Long but found Float

Printing the tensors just before the forward call shows the features and weights are all float32:

(Pdb) sparseconvnet.forward_pass_multiplyAdd_count +=\
            sparseconvnet.SCN.Convolution_updateOutput(
                input_spatial_size,
                output_spatial_size,
                filter_size,
                filter_stride,
                input_metadata,
                input_features,
                output_features,
                weight,
                bias)
*** RuntimeError: expected scalar type Long but found Float
(Pdb) pp input_metadata.shape
*** AttributeError: 'sparseconvnet.SCN.Metadata_3' object has no attribute 'shape'
(Pdb) pp input_metadata.dtype
*** AttributeError: 'sparseconvnet.SCN.Metadata_3' object has no attribute 'dtype'
(Pdb) pp input_features.dtype
torch.float32
(Pdb) pp output_features.dtype
torch.float32
(Pdb) pp input_features.shape
torch.Size([131815, 16])
(Pdb) pp input_features.dtype
torch.float32
(Pdb) pp weight.shape
torch.Size([8, 1, 16, 32])
(Pdb) pp weight.dtype
torch.float32

Running the demo hits the same issue:

Input SparseConvNetTensor (feature/location dumps truncated; the features are 132 rows of `[1.]`):

Input SparseConvNetTensor: SparseConvNetTensor<<features=tensor([[1.],
        [1.],
        ...
        [1.]], device='cuda:0'),features.shape=torch.Size([132, 1]),batch_locations=tensor([[ 0,  1,  0],
        [ 0,  5,  0],
        [ 0,  8,  0],
        ...
        [ 4, 20,  1],
        [ 4, 21,  1]]),batch_locations.shape=torch.Size([132, 3]),spatial size=tensor([87, 87])>>
Traceback (most recent call last):
  File "test_sparseconvnet.py", line 51, in <module>
    output = model(input)
  File "/data/anaconda3/envs/occld/lib/python3.7/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/data/anaconda3/envs/occld/lib/python3.7/site-packages/torch/nn/modules/container.py", line 117, in forward
    input = module(input)
  File "/data/anaconda3/envs/occld/lib/python3.7/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/data/anaconda3/envs/occld/lib/python3.7/site-packages/torch/nn/modules/container.py", line 117, in forward
    input = module(input)
  File "/data/anaconda3/envs/occld/lib/python3.7/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/data/projects/JS3C-Net-main/lib/sparseconvnet/maxPooling.py", line 94, in forward       
    self.nFeaturesToDrop)         
  File "/data/projects/JS3C-Net-main/lib/sparseconvnet/maxPooling.py", line 38, in forward
    nFeaturesToDrop)     
RuntimeError: expected scalar type Long but found Float                                                            
jinglinglingling commented 1 year ago

I have the same problem. Have you solved it yet?

LeopoldACC commented 1 year ago

> I have the same problem. Have you solved it yet?

I solved it by downgrading torch to 1.3.1 and CUDA to 10.1.
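For anyone wanting to reproduce that environment, a sketch of the matching conda install, using the command from PyTorch's previous-versions page for 1.3.1 + CUDA 10.1 (environment name is arbitrary):

```shell
# Recreate the reportedly working combination: torch 1.3.1 + CUDA 10.1.
conda create -n scn python=3.7 -y
conda activate scn
conda install pytorch==1.3.1 torchvision==0.4.2 cudatoolkit=10.1 -c pytorch -y
```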

jinglinglingling commented 1 year ago

What is your version of gcc? I've tried installing torch 1.3.1 and it won't even compile

lyhsieh commented 1 year ago

I have the same problem, too.

mattmdjaga commented 1 year ago

I've had the same issue. My guess is that it's caused by CUDA 11: I tried different PyTorch versions, and it previously worked with CUDA 10, but I've switched GPUs from a 2070 to a 3080, so I need CUDA 11.
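If CUDA-version mismatch is the suspect, a quick way to see what your PyTorch build was compiled against (note: an RTX 3080 is compute capability sm_86, which CUDA 10.x builds cannot target, so CUDA 11.1+ is required for that card):

```python
import torch

# Report the installed PyTorch build and the CUDA toolkit it was compiled
# against; the exact strings depend on your installation.
print(torch.__version__)    # PyTorch build version
print(torch.version.cuda)   # CUDA version of the build, or None for CPU-only

if torch.cuda.is_available():
    # e.g. (8, 6) for an RTX 3080, (7, 5) for a 2070
    print(torch.cuda.get_device_capability(0))
```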