Closed EmmaRenauld closed 5 months ago
Hello @EmmaRenauld, thank you for submitting the Pull Request!
dwi_ml/models/projects/transformer_sublayers.py:
Line 81:80: E501 line too long (106 > 79 characters)
Line 87:80: E501 line too long (85 > 79 characters)
Line 94:80: E501 line too long (126 > 79 characters)
Line 116:80: E501 line too long (91 > 79 characters)
Line 119:80: E501 line too long (97 > 79 characters)
Line 121:80: E501 line too long (86 > 79 characters)
Line 135:80: E501 line too long (89 > 79 characters)
dwi_ml/models/utils/transformers_from_torch.py:
Line 68:80: E501 line too long (82 > 79 characters)
Line 70:80: E501 line too long (92 > 79 characters)
Line 72:80: E501 line too long (82 > 79 characters)
Line 74:80: E501 line too long (106 > 79 characters)
Line 78:80: E501 line too long (110 > 79 characters)
Line 79:80: E501 line too long (116 > 79 characters)
Line 83:80: E501 line too long (91 > 79 characters)
Line 103:80: E501 line too long (112 > 79 characters)
Line 105:80: E501 line too long (86 > 79 characters)
Line 107:80: E501 line too long (101 > 79 characters)
Line 108:80: E501 line too long (87 > 79 characters)
Line 109:80: E501 line too long (97 > 79 characters)
Line 110:80: E501 line too long (104 > 79 characters)
Line 112:80: E501 line too long (87 > 79 characters)
Line 114:80: E501 line too long (117 > 79 characters)
Line 163:80: E501 line too long (80 > 79 characters)
Please see PEP 8 -- Style Guide for Python Code.
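As a minimal sketch of how the flagged E501 errors are usually fixed: PEP 8's preferred way to shorten a long line is implicit continuation inside parentheses, e.g. splitting a long string into adjacent literals. The function name and values below are hypothetical, chosen only to mirror the lint output above; they are not from the PR itself.

```python
# Hypothetical example: wrapping a line that would exceed 79 characters
# by using implicit continuation inside parentheses (PEP 8's preference
# over backslash continuation).

def format_e501(filename, line_no, col, length, limit=79):
    # Adjacent string literals inside parentheses are concatenated at
    # compile time, so no single source line exceeds the limit.
    return (
        f"{filename}:{line_no}:{col}: E501 line too long "
        f"({length} > {limit} characters)"
    )

msg = format_e501("transformer_sublayers.py", 81, 80, 106)
print(msg)
```

The same pattern applies to long function calls and conditions: break inside an open bracket and indent the continuation, rather than letting the line run past the limit.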
Updated the Transformer models to use the new torch code.
Tested: outputs are 100% the same values as before.