Closed: EnergeticChubby closed this issue 3 weeks ago
Hi!
Thanks for showing interest in our project and also for the thorough report of your issue with our codebase!
My guess is that when iterating in the `_global_mask` function (i.e. when doing `for mask, param in self.masked_parameters:`) the LSTM parameters and masks are skipped, since those are not included by default in the associated generator (`self.masked_parameters`).
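The skipping behaviour described above can be illustrated with a small self-contained sketch. Note that all names here are simplified stand-ins for the repo's actual torch-based layer classes and its `masked_parameters` generator, not the real implementation:

```python
# Toy stand-ins for the framework's layer classes (illustrative only).
class Linear: pass
class Conv2d: pass
class LSTM_: pass

def prunable(module):
    # Mirrors the default check: only Linear and Conv2d count as
    # prunable, so an LSTM_ module falls through the check.
    return isinstance(module, (Linear, Conv2d))

def masked_parameters(modules):
    # Simplified generator: yields one (mask, module) pair per prunable
    # module; non-prunable modules are silently skipped.
    for m in modules:
        if prunable(m):
            yield ("mask", m)

model = [Linear(), LSTM_(), Conv2d()]
pairs = list(masked_parameters(model))
print(len(pairs))  # the LSTM_ module is skipped, so only 2 pairs
```

Because the skip is silent, the pruner never sees the LSTM parameters at all, which matches the missing scores reported below.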
A fix could be to go into the `lib/generator.py` file and also add the `layers.LSTM_` module inside the `prunable` function. Something like this:
```python
def prunable(module, batchnorm, residual):
    r"""Returns boolean whether a module is prunable."""
    isprunable = isinstance(module, (layers.Linear, layers.Conv2d, layers.LSTM_))  # <<< Here I added your custom LSTM module
    if batchnorm:
        isprunable |= isinstance(module, (layers.BatchNorm1d, layers.BatchNorm2d))
    if residual:
        isprunable |= isinstance(module, (layers.Identity1d, layers.Identity2d))
    return isprunable
```
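As a sanity check, the patched predicate should now report LSTM modules as prunable. Here is a runnable sketch using dummy stand-ins for the `layers` classes (the real ones live in the repo's `layers.py`; the names below are illustrative only):

```python
# Dummy stand-ins for the repo's layer classes (illustrative only).
class Linear: pass
class Conv2d: pass
class LSTM_: pass
class BatchNorm1d: pass
class Identity1d: pass

def prunable(module, batchnorm, residual):
    """Patched check: LSTM_ is now in the prunable tuple."""
    isprunable = isinstance(module, (Linear, Conv2d, LSTM_))
    if batchnorm:
        isprunable |= isinstance(module, (BatchNorm1d,))
    if residual:
        isprunable |= isinstance(module, (Identity1d,))
    return isprunable

print(prunable(LSTM_(), batchnorm=False, residual=False))      # True
print(prunable(BatchNorm1d(), batchnorm=False, residual=False))  # False
```

With this change, the `masked_parameters` generator should yield (mask, param) pairs for the LSTM layers as well, so `_global_mask` can score them.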
Hope this helps. If you need further assistance, feel free to reach out at any time!
Issue Description

I encountered an issue when trying to add the `LSTM_` class to `layers.py` while using the PX pruner. Specifically, the pruner fails to obtain the scores.

Implementation Details
Below is the implementation of the `LSTM_` class:

The issue occurs when the `_global_mask` function in the PX pruner tries to get the scores, but fails.

PX Pruner `_global_mask` Function

Model Structure
Here is the structure of the model:
Debugging Information
I attempted to print the `masked_parameters` and their shapes, which indicated that the parameters were not being handled correctly. Here are some debugging outputs:

`masked_parameters`: (output count)
`masked_parameters`: (output shapes)

Model parameters are shown in the following images:
Request for Help
Could someone provide insights on why the PX pruner is failing to obtain the scores and suggest potential fixes? Any guidance or suggestions would be greatly appreciated. Thank you!