mindspore-ai / mindspore

MindSpore is a new open source deep learning training/inference framework that could be used for mobile, edge and cloud scenarios.
https://gitee.com/mindspore/mindspore
Apache License 2.0

Request for output_padding Parameter Support in Conv1dTranspose and Conv2dTranspose Layers #292

Open PhyllisJi opened 5 months ago

PhyllisJi commented 5 months ago

Software Environment:

Describe the expected behavior

I am writing to request the addition of the output_padding parameter to the Conv1dTranspose and Conv2dTranspose layers in MindSpore. This feature is crucial for the following reasons:

Precise Output Shape Control: The output_padding parameter allows for precise control over the output shape of transposed convolution layers. This is particularly important when the desired output size cannot be achieved directly through the current parameters (such as stride, padding, and kernel size).
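The shape ambiguity that output_padding resolves can be sketched in a few lines. The helper functions below are illustrative (not part of any framework API) and use the standard output-length formulas, matching the one PyTorch documents for ConvTranspose2d:

```python
def conv_out(n, k, s, p):
    # Forward convolution output length (floor division).
    return (n + 2 * p - k) // s + 1

def conv_transpose_out(n, k, s, p, output_padding=0):
    # Transposed convolution output length, as documented by PyTorch.
    return (n - 1) * s - 2 * p + k + output_padding

# With stride 2, two different input lengths collapse to the same conv output:
assert conv_out(9, 3, 2, 1) == conv_out(10, 3, 2, 1) == 5

# Without output_padding, the transpose can only recover one of them:
assert conv_transpose_out(5, 3, 2, 1) == 9

# output_padding=1 selects the other valid output shape:
assert conv_transpose_out(5, 3, 2, 1, output_padding=1) == 10
```

Because the floor division in the forward pass discards information, the transpose is one-to-many, and output_padding is the standard way to pick the intended shape.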

Consistency with Other Frameworks: Many deep learning frameworks, such as TensorFlow and PyTorch, include output_padding in their transposed convolution layers. Adding this feature to MindSpore would enhance compatibility and ease the transition for users migrating from these platforms.

Flexibility in Model Design: The ability to fine-tune the output shape using output_padding offers greater flexibility in model architecture design. It simplifies the process of aligning tensor dimensions, which is essential for complex neural network architectures.

Reduced Post-Processing Overhead: Without output_padding, users may need to apply additional operations (such as manual padding) to achieve the desired output shape. This adds unnecessary complexity and computational overhead. Integrating output_padding directly into the layer would streamline model implementation and improve performance.
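The post-processing workaround mentioned above might look like the following shape-level sketch (NumPy stands in for framework tensors; the helper name is hypothetical). Note that zero-padding after the layer only makes the shapes line up: it is not numerically identical to a built-in output_padding, where the extra outputs can receive kernel and bias contributions:

```python
import numpy as np

def add_output_padding(y, output_padding):
    # Hypothetical workaround: zero-pad the bottom/right of the spatial
    # dimensions of a Conv2dTranspose output (NCHW layout assumed) to
    # mimic the shape effect of output_padding.
    op_h, op_w = output_padding
    return np.pad(y, ((0, 0), (0, 0), (0, op_h), (0, op_w)))

y = np.zeros((1, 4, 9, 9))          # output of a transposed conv without output_padding
z = add_output_padding(y, (1, 1))
assert z.shape == (1, 4, 10, 10)    # the shape output_padding=1 would have produced
```

Supporting the parameter natively would remove this extra op from the graph and avoid the numerical mismatch entirely.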