OpenMined / KotlinSyft

The official Syft worker for secure on-device machine learning
https://www.openmined.org
Apache License 2.0

Save ReLU activation values in a file #299

Closed mustansarsaeed closed 3 years ago

mustansarsaeed commented 3 years ago

Hi, thank you for the great library. I want to save the activations the same way we do for the weight parameters. My current model is:

import torch as th
import torch.nn as nn

INPUT_DIM = 10
CLASSES = 5
# th.set_default_dtype(th.float64)

class SyntheticNet(nn.Module):
    def __init__(self):
        super(SyntheticNet, self).__init__()
        self.fc1 = nn.Linear(INPUT_DIM, 15)
        self.fc2 = nn.Linear(15, CLASSES)

    def forward(self, x):
        l1_output = self.fc1(x)
        relu_activations = nn.functional.relu(l1_output)
        output = self.fc2(relu_activations)
        # Return the activations alongside the output so they can be captured.
        return output, relu_activations

model = SyntheticNet()

I want to save relu_activations for later processing. Is there a way to do this as we do for the weights? I have seen the code where the weights are loaded from a file:

fun loadModelState(modelFile: String) {
    modelSyftState = SyftState.loadSyftState(modelFile)
    Log.d(TAG, "Model loaded from $modelFile")
}

This modelSyftState includes the model parameters. I want something similar to save and load the activations. Any help would be appreciated.

vkkhare commented 3 years ago

You can easily save it. When you take the activations as output, you can create a SyftState object and then serialize it into a protobuf file for sending over to PyGrid.
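A minimal Python-side sketch of the idea, based on the model from the question. Since the forward pass already returns the activations, they can be captured directly; as a stand-in for the SyftState/protobuf serialization step (whose exact API is not shown in this thread), plain `torch.save`/`torch.load` illustrates the save-then-reload round trip:

```python
import torch as th
import torch.nn as nn

INPUT_DIM = 10
CLASSES = 5

class SyntheticNet(nn.Module):
    def __init__(self):
        super(SyntheticNet, self).__init__()
        self.fc1 = nn.Linear(INPUT_DIM, 15)
        self.fc2 = nn.Linear(15, CLASSES)

    def forward(self, x):
        l1_output = self.fc1(x)
        relu_activations = nn.functional.relu(l1_output)
        output = self.fc2(relu_activations)
        return output, relu_activations

model = SyntheticNet()
x = th.randn(4, INPUT_DIM)

# No gradients are needed just to persist activations,
# so run the forward pass under no_grad.
with th.no_grad():
    output, relu_activations = model(x)

# Stand-in for the SyftState/protobuf serialization step:
# persist the activation tensor, then reload it for processing.
th.save(relu_activations, "relu_activations.pt")
loaded = th.load("relu_activations.pt")
```

In the actual KotlinSyft flow, the serialized file would then be shipped to PyGrid instead of read back locally.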

mustansarsaeed commented 3 years ago

Thanks @vkkhare for the reply. Let me implement the approach you provided; I will get back in case anything is missing.

mustansarsaeed commented 3 years ago

Thanks @vkkhare . It worked as you suggested.