I am trying to use Federated Learning with DNNs.
1) Split the data and sent it to Alice and Bob (2 virtual workers)
2) Made a loop which selects the dataset for each worker and sends the model to them (roughly sketched below).
3) On passing the datasets to the model(), I receive a
RuntimeError: dimension specified as 0, but tensor has no dimensions
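For reference, the setup in steps 1) and 2) looks roughly like the sketch below. The exact TorchHook/VirtualWorker constructor signatures vary between PySyft versions, and the data and model here are dummy placeholders, so this is only an approximation of my code, not the exact script:

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.autograd import Variable
import syft as sy

# NOTE: TorchHook / VirtualWorker signatures differ between PySyft
# versions, so treat this as a sketch of my setup rather than exact code.
hook = sy.TorchHook()
alice = sy.VirtualWorker(id="alice", hook=hook)
bob = sy.VirtualWorker(id="bob", hook=hook)

# Dummy data standing in for my real dataset.
X = Variable(torch.rand(100, 10))
y = Variable(torch.rand(100, 1))

# Step 1: split the data and send one half to each worker.
datasets = [
    (alice, X[:50].send(alice), y[:50].send(alice)),
    (bob,   X[50:].send(bob),   y[50:].send(bob)),
]

# Placeholder model; my real model is the DNN with embeddings shown below.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Step 2: loop over the workers, send the model to whichever worker holds
# the current shard, train on it, then pull the model back.
for worker, X_d, y_d in datasets:
    model.send(worker)
    optimizer.zero_grad()
    output = model(X_d)          # with my real DNN this call raises the RuntimeError
    loss = F.mse_loss(output, y_d)
    loss.backward()
    optimizer.step()
    model.get()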
This error occurs in the forward function of my DNN, where the embedding lookups happen.
I do not know whether PySyft currently supports DNNs with embedding layers; any insight would help.
Line of code causing the error (inside the forward function of my DNN class):
emb = [getattr(self, 'emb_layer_' + col)(X_d[:, self.deep_column_idx[col]].long())
       for col, _, _ in self.embeddings_input]
This basically builds a list of embedded columns: each required column is indexed out of X_d and passed through an embedding layer created with the proper dimensions, for example
emb_layer_userId = nn.Embedding(a, b) (created in the __init__ function of the same class).
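For context, a stripped-down version of the relevant parts of the class looks roughly like this (the class name DeepModel and the column/dimension values are made up for illustration; embeddings_input, deep_column_idx and the emb_layer_ attributes follow the snippet above):

import torch
import torch.nn as nn
from torch.autograd import Variable

class DeepModel(nn.Module):
    def __init__(self, embeddings_input, deep_column_idx):
        super(DeepModel, self).__init__()
        # embeddings_input: list of (column_name, n_unique_values, emb_dim)
        # deep_column_idx:  maps column_name -> column position in X_d
        self.embeddings_input = embeddings_input
        self.deep_column_idx = deep_column_idx
        # One embedding layer per categorical column, stored as an attribute
        # named 'emb_layer_<col>' so forward() can look it up with getattr.
        for col, n_unique, emb_dim in embeddings_input:
            setattr(self, 'emb_layer_' + col, nn.Embedding(n_unique, emb_dim))

    def forward(self, X_d):
        # Index each categorical column out of X_d, cast to long and embed it.
        emb = [getattr(self, 'emb_layer_' + col)(X_d[:, self.deep_column_idx[col]].long())
               for col, _, _ in self.embeddings_input]
        return torch.cat(emb, 1)

# Made-up example: one categorical column 'userId' with 15 unique values.
model = DeepModel(embeddings_input=[('userId', 15, 60)],
                  deep_column_idx={'userId': 0})
X_d = Variable(torch.rand(667, 1) * 15)
out = model(X_d)   # works locally; fails with the RuntimeError once X_d is a PySyft pointer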
Also, this code works without PySyft.
PyTorch version 0.3.1
Edit: Tested with test_emb = nn.Embedding(15, 60)
and a Variable containing a FloatTensor of dimension 667 (the column has 15 unique values, hence 15 as the embedding's input size).
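Roughly, that standalone test (no PySyft involved) looked like this:

import torch
import torch.nn as nn
from torch.autograd import Variable

# The column has 15 unique values, so num_embeddings = 15; 60-dim vectors.
test_emb = nn.Embedding(15, 60)

# A Variable wrapping a FloatTensor of dimension 667, cast to long for the
# lookup, mirroring the X_d[:, idx].long() call in forward().
col = Variable(torch.rand(667) * 15)
out = test_emb(col.long())
print(out.size())   # expecting (667, 60)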