PaddlePaddle / Paddle

PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (the PaddlePaddle "飞桨" core framework: high-performance single-machine and distributed training, and cross-platform deployment, for deep learning & machine learning)
http://www.paddlepaddle.org/
Apache License 2.0

[Type Hints] Add type hints to public APIs #65008

Open megemini opened 2 weeks ago

megemini commented 2 weeks ago

commit https://github.com/PaddlePaddle/Paddle/commit/ed8168db34c1c520533b14c11869e920c64118bb

#### 🔛 Batch 1

| No. | File | API count | Assignee GitHub id | PR link |
| ------ | ------------------------------------------------------ | ------ | --------------------------------------- | ------------------------- |
| ✅A-1 | paddle/tensor/array.py | 4 | ✅@zrr1999 | PaddlePaddle/Paddle#65009 |
| ✅A-2 | paddle/tensor/attribute.py | 7 | ✅@zrr1999 | PaddlePaddle/Paddle#65255 |
| ✅A-3 | paddle/tensor/creation.py | 28 | ✅@zrr1999 | PaddlePaddle/Paddle#65082 |
| ✅A-4 | paddle/tensor/einsum.py | 7 | ✅@zrr1999 | PaddlePaddle/Paddle#65255 |
| ✅A-5 | paddle/tensor/linalg.py | 40 | ✅@gouzil | PaddlePaddle/Paddle#65274 |
| ✅A-6 | paddle/tensor/logic.py | 35 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65300 |
| ✅A-7 | paddle/tensor/manipulation.py | 77 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65351 |
| ✅A-8 | paddle/tensor/math.py | 192 | ✅@SigureMo | PaddlePaddle/Paddle#65073 |
| ✅A-9 | paddle/tensor/ops.py | 43 | ✅@gouzil | PaddlePaddle/Paddle#65249 |
| ✅A-10 | paddle/tensor/random.py | 15 | ✅@ooooo-create | PaddlePaddle/Paddle#65272 |
| ✅A-11 | paddle/tensor/search.py | 16 | ✅@ooooo-create | PaddlePaddle/Paddle#65354 |
| ✅A-12 | paddle/tensor/stat.py | 9 | ✅@ooooo-create | PaddlePaddle/Paddle#65337 |
| ✅A-13 | paddle/tensor/to_string.py | 2 | ✅@gouzil | PaddlePaddle/Paddle#65042 |
| ✅A-14 | paddle/nn/layer/activation.py | 11 | ✅@ooooo-create | PaddlePaddle/Paddle#65372 |
| ✅A-15 | paddle/nn/layer/common.py | 20 | ✅@megemini | PaddlePaddle/Paddle#65197 |
| ✅A-16 | paddle/nn/layer/container.py | 2 | ✅@SigureMo 🙋@liyongchao911 | PaddlePaddle/Paddle#65190 |
| ✅A-17 | paddle/nn/layer/conv.py | 7 | ✅@liyongchao911 | PaddlePaddle/Paddle#65183 |
| ✅A-18 | paddle/nn/layer/distance.py | 2 | ✅@liyongchao911 | PaddlePaddle/Paddle#65127 |
| ✅A-19 | paddle/nn/layer/layers.py | 1 | ✅@SigureMo 🙋@liyongchao911 | PaddlePaddle/Paddle#65190 |
| ✅A-20 | paddle/nn/layer/loss.py | 21 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65376 |
| ✅A-21 | paddle/nn/layer/norm.py | 9 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65454 |
| ✅A-22 | paddle/nn/layer/pooling.py | 18 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65460 |
| ✅A-23 | paddle/nn/layer/rnn.py | 2 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65375 |
| ✅A-24 | paddle/nn/layer/transformer.py | 4 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65457 |
| ✅A-25 | paddle/nn/layer/vision.py | 4 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65455 |
| ✅A-26 | paddle/vision/transforms/transforms.py | 22 | ✅@ooooo-create | PaddlePaddle/Paddle#65378 |
| ✅A-27 | paddle/nn/initializer/assign.py | 3 | ✅@gouzil | PaddlePaddle/Paddle#65206 |
| ✅A-28 | paddle/nn/initializer/bilinear.py | 2 | ✅@gouzil | PaddlePaddle/Paddle#65206 |
| ✅A-29 | paddle/nn/initializer/constant.py | 3 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65095 |
| ✅A-30 | paddle/nn/initializer/dirac.py | 2 | ✅@gouzil | PaddlePaddle/Paddle#65087 |
| ✅A-31 | paddle/nn/initializer/initializer.py | 2 | ✅@gouzil | PaddlePaddle/Paddle#65087 |
| ✅A-32 | paddle/nn/initializer/kaiming.py | 5 | ✅@gouzil | PaddlePaddle/Paddle#65206 |
| ✅A-33 | paddle/nn/initializer/normal.py | 5 | ✅@gouzil | PaddlePaddle/Paddle#65206 |
| ✅A-34 | paddle/nn/initializer/orthogonal.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65125 |
| ✅A-35 | paddle/nn/initializer/uniform.py | 3 | ✅@gouzil | PaddlePaddle/Paddle#65206 |
| ✅A-36 | paddle/nn/initializer/xavier.py | 4 | ✅@gouzil | PaddlePaddle/Paddle#65206 |
| ✅A-37 | paddle/optimizer/adadelta.py | 2 | ✅@ooooo-create | PaddlePaddle/Paddle#65464 |
| ✅A-38 | paddle/optimizer/adagrad.py | 2 | ✅@ooooo-create | PaddlePaddle/Paddle#65464 |
| ✅A-39 | paddle/optimizer/adam.py | 2 | ✅@ooooo-create | PaddlePaddle/Paddle#65076 |
| ✅A-40 | paddle/optimizer/adamax.py | 2 | ✅@ooooo-create | PaddlePaddle/Paddle#65236 |
| ✅A-41 | paddle/optimizer/adamw.py | 2 | ✅@ooooo-create | PaddlePaddle/Paddle#65236 |
| ✅A-42 | paddle/optimizer/asgd.py | 2 | ✅@ooooo-create | PaddlePaddle/Paddle#65236 |
| ✅A-43 | paddle/optimizer/lamb.py | 2 | ✅@gouzil | PaddlePaddle/Paddle#65247 |
| ✅A-44 | paddle/optimizer/lbfgs.py | 2 | ✅@enkilee | PaddlePaddle/Paddle#65308 |
| ✅A-45 | paddle/optimizer/momentum.py | 2 | ✅@enkilee | PaddlePaddle/Paddle#65284 |
| ✅A-46 | paddle/optimizer/nadam.py | 2 | ✅@enkilee | PaddlePaddle/Paddle#65273 |
| ✅A-47 | paddle/optimizer/optimizer.py | 1 | ✅@ooooo-create | PaddlePaddle/Paddle#65076 |
| ✅A-48 | paddle/optimizer/radam.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65085 |
| ✅A-49 | paddle/optimizer/rmsprop.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65085 |
| ✅A-50 | paddle/optimizer/rprop.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65085 |
| ✅A-51 | paddle/optimizer/sgd.py | 2 | ✅@ooooo-create 🙋@DrRyanHuang | PaddlePaddle/Paddle#65076 |
| 🔵A-52 | paddle/hapi/model.py | 2 | | |
| ✅A-53 | paddle/hapi/model_summary.py | 1 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65086 |
| ✅A-54 | paddle/nn/functional/activation.py | 36 | ✅@gsq7474741 | PaddlePaddle/Paddle#65191 |
| ✅A-55 | paddle/nn/functional/common.py | 15 | ✅@gsq7474741 | PaddlePaddle/Paddle#65191 |
| ✅A-56 | paddle/nn/functional/conv.py | 6 | ✅@gsq7474741 | PaddlePaddle/Paddle#65191 |
| ✅A-57 | paddle/nn/functional/distance.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65071 |
| ✅A-58 | paddle/nn/functional/extension.py | 4 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65380 |
| ✅A-59 | paddle/nn/functional/flash_attention.py | 6 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65380 |
| ✅A-60 | paddle/nn/functional/input.py | 2 | ✅@sunzhongkai588 | PaddlePaddle/Paddle#65317 |
| ✅A-61 | paddle/nn/functional/loss.py | 29 | 🙋@gsq7474741 ✅@Asthestarsfalll | PaddlePaddle/Paddle#65376 |
| ✅A-62 | paddle/nn/functional/norm.py | 6 | 🙋@gsq7474741 ✅@Asthestarsfalll | PaddlePaddle/Paddle#65454 |
| ✅A-63 | paddle/nn/functional/pooling.py | 17 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65460 |
| ✅A-64 | paddle/nn/functional/sparse_attention.py | 1 | ✅@Liyulingyue | PaddlePaddle/Paddle#65064 |
| ✅A-65 | paddle/nn/functional/vision.py | 5 | 🙋@gsq7474741 ✅@Asthestarsfalll | PaddlePaddle/Paddle#65455 |
| ✅A-66 | paddle/base/dygraph/math_op_patch.py | 12 | ✅@SigureMo | PaddlePaddle/Paddle#65201 |
| ✅A-67 | paddle/base/dygraph/tensor_patch_methods.py | 20 | ✅@SigureMo | PaddlePaddle/Paddle#65201 |
| ✅A-68 | paddle/regularizer.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65226 |
| ✅A-69 | python/paddle/optimizer/lr.py | 18 | ✅@SigureMo | PaddlePaddle/Paddle#65209 |
| ✅A-70 | python/paddle/hub.py | 3 | ✅@SigureMo | PaddlePaddle/Paddle#65238 |
| ✅A-71 | python/paddle/sysconfig.py | 2 | ✅@SigureMo | PaddlePaddle/Paddle#65238 |
| ✅A-72 | setup.py & python/setup.pyi | 8 | ✅@SigureMo | PaddlePaddle/Paddle#65244 |
| ✅A-73 | python/paddle/vision/models/alexnet.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65283 |
| ✅A-74 | python/paddle/vision/models/densenet.py | 6 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65486 |
| ✅A-75 | python/paddle/vision/models/googlenet.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65290 |
| ✅A-76 | python/paddle/vision/models/inceptionv3.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65292 |
| ✅A-77 | python/paddle/vision/models/lenet.py | 1 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65283 |
| ✅A-78 | python/paddle/vision/models/mobilenetv1.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65323 |
| ✅A-79 | python/paddle/vision/models/mobilenetv2.py | 2 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65326 |
| ✅A-80 | python/paddle/vision/models/mobilenetv3.py | 4 | ✅@enkilee | PaddlePaddle/Paddle#65366 |
| ✅A-81 | python/paddle/vision/models/resnet.py | 14 | ✅@Asthestarsfalll | PaddlePaddle/Paddle#65487 |
| 🟢A-82 | python/paddle/vision/models/shufflenetv2.py | 8 | 🟢@Asthestarsfalll | PaddlePaddle/Paddle#65559 |
| 🚧A-83 | python/paddle/vision/transforms/functional.py | 16 | 🚧@Asthestarsfalll | PaddlePaddle/Paddle#65560 |
| ✅A-84 | python/paddle/vision/models/squeezenet.py | 3 | ✅@DrRyanHuang | PaddlePaddle/Paddle#65332 |
| ✅A-85 | python/paddle/vision/models/vgg.py | 5 | ✅@86kkd | PaddlePaddle/Paddle#65381 |
| ✅A-86 | python/paddle/vision/datasets/cifar.py | 2 | ✅@86kkd | PaddlePaddle/Paddle#65386 |
| ✅A-87 | python/paddle/vision/datasets/flowers.py | 1 | ✅@enkilee | PaddlePaddle/Paddle#65504 |
| 🚧A-88 | python/paddle/vision/datasets/folder.py | 2 | 🚧@enkilee | PaddlePaddle/Paddle#65532 |
| 🚧A-89 | python/paddle/vision/datasets/mnist.py | 2 | 🚧@enkilee | PaddlePaddle/Paddle#65553 |
| 🔵A-90 | python/paddle/vision/datasets/voc2012.py | 1 | | |
| 🔵A-91 | python/paddle/metric/metrics.py | 6 | | |
| ✅A-92 | python/paddle/vision/image.py | 3 | ✅@86kkd | PaddlePaddle/Paddle#65386 |
| 🔵A-93 | python/paddle/vision/ops.py | 18 | | |
| 🔵A-94 | python/paddle/signal.py | 2 | | |
| 🔵A-95 | python/paddle/fft.py | 22 | | |
| 🔵A-96 | python/paddle/hapi/callbacks.py | 8 | | |
| 🔵A-97 | python/paddle/io/reader.py | 1 | | |
| 🔵A-98 | python/paddle/io/dataloader/batch_sampler.py | 2 | | |
| 🔵A-99 | python/paddle/io/dataloader/dataset.py | 8 | | |
| 🔵A-100 | python/paddle/io/dataloader/sampler.py | 5 | | |
| 🔵A-101 | python/paddle/io/dataloader/worker.py | 1 | | |
| 🔵A-102 | python/paddle/distribution/bernoulli.py | 2 | | |
| 🔵A-103 | python/paddle/distribution/beta.py | 3 | | |
| 🔵A-104 | python/paddle/distribution/binomial.py | 2 | | |
| 🔵A-105 | python/paddle/distribution/categorical.py | 3 | | |
| 🔵A-106 | python/paddle/distribution/cauchy.py | 2 | | |
| 🔵A-107 | python/paddle/distribution/chi2.py | 1 | | |
| 🔵A-108 | python/paddle/distribution/continuous_bernoulli.py | 2 | | |
| 🔵A-109 | python/paddle/distribution/dirichlet.py | 2 | | |
| 🔵A-110 | python/paddle/distribution/distribution.py | 1 | | |
| 🔵A-111 | python/paddle/distribution/exponential.py | 3 | | |
| 🔵A-112 | python/paddle/distribution/exponential_family.py | 2 | | |
| 🔵A-113 | python/paddle/distribution/gamma.py | 3 | | |
| 🔵A-114 | python/paddle/distribution/geometric.py | 2 | | |
| 🔵A-115 | python/paddle/distribution/gumbel.py | 2 | | |
| 🔵A-116 | python/paddle/distribution/independent.py | 2 | | |
| 🔵A-117 | python/paddle/distribution/kl.py | 2 | | |
| 🔵A-118 | python/paddle/distribution/laplace.py | 2 | | |
| 🔵A-119 | python/paddle/distribution/lkj_cholesky.py | 1 | | |
| 🔵A-120 | python/paddle/distribution/lognormal.py | 4 | | |
| 🔵A-121 | python/paddle/distribution/multinomial.py | 3 | | |
| 🔵A-122 | python/paddle/distribution/multivariate_normal.py | 2 | | |
| 🔵A-123 | python/paddle/distribution/normal.py | 2 | | |
| 🔵A-124 | python/paddle/distribution/poisson.py | 2 | | |
| 🔵A-125 | python/paddle/distribution/student_t.py | 3 | | |
| 🔵A-126 | python/paddle/distribution/transform.py | 13 | | |
| 🔵A-127 | python/paddle/distribution/transformed_distribution.py | 4 | | |
| 🔵A-128 | python/paddle/distribution/uniform.py | 2 | | |
| 🔵A-129 | python/paddle/distribution/variable.py | 2 | | |
| 🔵A-130 | python/paddle/device/`__init__.py` | 22 | | |
| 🔵A-131 | python/paddle/device/cuda/`__init__.py` | 14 | | |
| 🔵A-132 | python/paddle/device/xpu/`__init__.py` | 1 | | |
| 🔵A-133 | python/paddle/amp/amp_lists.py | 2 | | |
| 🔵A-134 | python/paddle/amp/auto_cast.py | 7 | | |
| 🔵A-135 | python/paddle/amp/debugging.py | 10 | | |
| 🔵A-136 | python/paddle/amp/grad_scaler.py | 4 | | |
| 🔵A-137 | python/paddle/amp/`__init__.py` | 2 | | |
| 🔵A-138 | python/paddle/autograd/autograd.py | 2 | | |
| 🙋A-139 | python/paddle/autograd/saved_tensors_hooks.py | 1 | 🙋@DrRyanHuang | |
| 🔵A-140 | python/paddle/autograd/backward_mode.py | 1 | | |
| 🔵A-141 | python/paddle/autograd/ir_backward.py | 3 | | |
| 🔵A-142 | python/paddle/autograd/py_layer.py | 2 | | |
#### 🔜 Batch 2

Note: the API counts above are approximate; cross-references between modules can inflate the counts.


⭐️ PR title template ⭐️:

Or, for multiple tasks:

[Typing][A-1,A-2,A-3] Add type annotations for `paddle/tensor/*`

⭐️ How to claim ⭐️: claim tasks by leaving a comment, e.g.:

【报名】:A-1、A-3

Status legend:

- ✅: fully migrated; all unit tests pass!
- 🟢: review finished, awaiting merge; fully migrated once merged!
- 🔵: available to claim!
- 🟡: no further effort needed at this stage; to be pushed forward in the next phase
- 🚧: migration in progress; unit tests not passing yet, review not finished.

The normal flow is roughly: 🔵 -> 🚧 -> 🟢 -> ✅

The exception flow is: 🔵 -> 🚧 -> 🟡

Board status

| Tasks | 🔵 Claimable | 🚧 In progress | 🟢 Awaiting merge | ✅ Done | 🟡 Next phase | 🏁 Completion rate |
| --- | --- | --- | --- | --- | --- | --- |
| 144 | 52 | 3 | 1 | 85 | 0 | 59.9% |

In no particular order: @zrr1999(4) @gouzil(12) @Asthestarsfalll(16) @SigureMo(9) @ooooo-create(13) @megemini(1) @liyongchao911(2) @DrRyanHuang(15) @enkilee(5) @gsq7474741(3) @sunzhongkai588(1) @Liyulingyue(1) @86kkd(3)

megemini commented 2 weeks ago

✨️ Hello everyone! ✨️

This task is a subtask of adding type hints to the public APIs of the Paddle framework.

That is, turning a function like:

```python
def log(x, name=None):
    ...
```

into:

```python
def log(x: paddle.Tensor, name: str | None = None) -> paddle.Tensor:
    ...
```

Python formally standardized type hints in version 3.5 via PEP 484 – Type Hints; they improve the developer experience and overall code quality. Python 3.8, the lowest version Paddle currently supports, already supports type hints well, so we are launching this task to annotate all of Paddle's current public APIs!

Everyone is welcome to participate! Thank you very much! :) 🎉🎉🎉

The overall participation workflow is roughly:

✨ Click each heading below for details! ✨

#### ✨ Claiming a task

Reply in this ISSUE with the ID(s) of the task(s) you want to claim, e.g.:

```text
【报名】:A-25
```
#### ✨ Modifying the APIs

Python type annotation is a fairly extensive system. This task focuses on:

- adding type annotations to each API
- keeping the annotations consistent with the type descriptions in the docstrings
- running the type checker over each API's example code

##### ➡️ Reference example

Take the `paddle.log` API as an example. The original code is:

```python
def log(x, name=None):
    r"""
    Calculates the natural log of the given input Tensor, element-wise.

    .. math::

        Out = \ln(x)

    Args:
        x (Tensor): Input Tensor. Must be one of the following types: int32,
            int64, float16, bfloat16, float32, float64, complex64, complex128.
        name (str|None): The default value is None. Normally there is no need
            for user to set this property. For more information, please refer
            to :ref:`api_guide_Name`

    Returns:
        Tensor: The natural log of the input Tensor computed element-wise.

    Examples:
        ...
    """
    ...
```

It should be changed to:

```python
from __future__ import annotations

...

def log(x: paddle.Tensor, name: str | None = None) -> paddle.Tensor:
    r"""
    Calculates the natural log of the given input Tensor, element-wise.

    .. math::

        Out = \ln(x)

    Args:
        x (Tensor): Input Tensor. Must be one of the following types: int32,
            int64, float16, bfloat16, float32, float64, complex64, complex128.
        name (str|None, optional): The default value is None. Normally there
            is no need for user to set this property. For more information,
            please refer to :ref:`api_guide_Name`

    Returns:
        Tensor: The natural log of the input Tensor computed element-wise.

    Examples:
        ...
    """
    ...
```

Note the following points:

- Add `from __future__ import annotations`. Paddle's lowest supported Python version is `3.8`, while this task wants to use annotation features from newer Python versions wherever possible, so this import is necessary.
- Annotate `def log(x, name=None)`: both the input parameters and the return value need annotations.
- Align the types in the docstring with the actual parameter types. Here `name=None` is annotated as `name: str | None = None`, so the corresponding docstring entry should be `name (str|None, optional)`. Note that Python's `Optional` type and the docstring's `optional` mean different things:
  - the former means the input may be `None`
  - the latter means the parameter has a default value, which may be `None` or any other value.
- Docstring conventions:
  - keep `Args` concise; e.g. `paddle.Tensor` can be written as `Tensor`
  - write `Returns` in the form `return type, description`, e.g. `Tensor, The natural log of the input Tensor computed element-wise.`

For more details, see the "Annotation Q&A" and "Common annotation reference" sections further down in this ISSUE.

##### ➡️ Related tools

- https://github.com/megemini/ArgsTyping

  This tool can produce a first pass of annotations, e.g.:

  ```shell
  > python args_typing.py -i /home/shun/Documents/Projects/paddle/megemini/Paddle/python/paddle/tensor/math.py
  ```

  > Note: the tool works by parsing `Args/Parameters` and `Returns/Yields` in the docstring, so its output may be wrong or incomplete.

- tools/type_checking.py

  This tool lets you check modified APIs locally, e.g.:

  ```shell
  > python type_checking.py paddle.abs
  ```

  > Note: this usage depends on PR https://github.com/PaddlePaddle/Paddle/pull/64991 being merged.
#### ✨ Submitting a PR

Each task needs at least one PR.

PR title:

- **[Typing][A-1] Add type annotations for `paddle/tensor/array.py`**

For multiple tasks in one PR, use:

- **[Typing][A-1,A-2,A-3] Add type annotations for `paddle/tensor/*`**

The `xxx` parts may carry extra information; e.g., when submitting several PRs, mention the main files covered by each PR.

> Note: be sure to include `[Typing]` in the title to trigger the CI pipeline checks. Also keep the title format consistent to make later bookkeeping easier.

You can copy the following template for the PR description, replacing the `xxx` parts and adding details as needed:

```markdown
### PR Category
User Experience

### PR Types
Improvements

### Description

Type annotations:

- xxx.py
- xxx.py

### Related links

- https://github.com/PaddlePaddle/Paddle/issues/65008

@SigureMo @megemini
```

Also, since annotations may depend on other APIs, if a PR is blocked by another API, let the reviewer know so it can be handled.
#### ✨ Wrapping up

After the PR is submitted, confirm in the CI pipeline:

- `PR-CI-Static-Check`
  - whether **all** APIs in the modified files went through the `type checking` tests
  - whether **all** of those tests passed.

Finally, to wrap up the task:

- `Merge/Close` the Paddle PR
- and remind the reviewer to check off the task status in this ISSUE ~

At that point, the task is complete! 🎉🎉🎉

Finally, thanks again to everyone for participating and contributing ~ 🏆️🏆️🏆️

Reference projects:

Exemplary PRs:

Related links:

Python documentation:

@SigureMo @zrr1999 @Asthestarsfalll @gouzil @gsq7474741 @sunzhongkai588 @luotao1

megemini commented 2 weeks ago

Annotation Q&A

##### **Q:** How do I get started?

**A:** Python's typing features have kept evolving and are by now a fairly large system. Start by reading the official documentation, [Static Typing with Python](https://typing.readthedocs.io/en/latest/), and get familiar with the relevant PEPs. Take `passing the CI checks` as the baseline goal.

Also, Paddle now ships a `_typing` module that consolidates some commonly used shared types, e.g.:

```python
# python/paddle/_typing/layout.py
DataLayout2D: TypeAlias = Literal["NCHW", "NHCW"]
DataLayout3D: TypeAlias = Literal["NCDHW", "NDHWC"]
```

Prefer the types in `_typing` when annotating, to make later maintenance easier.
##### **Q:** What is the difference between the types in the docstring's Args and the type annotation?

**A:** Earlier versions of Paddle had no unified type annotations and described parameter types in the docstring instead. The Args types are there to help users understand the API; as long as they do not conflict with the type annotation, they can stay concise. E.g.:

```python
def test(a: int | list[int] | tuple[int, ...]) -> None:
    """
    ...
    Args:
        a (int|list|tuple): xxx

    Returns:
        None, xxx
    ...
    """
```
##### **Q:** What if the docstring's Args disagree with the type annotation?

**A:** The correctness of the type annotation comes first. If the existing Args types in the docstring are wrong, fix them, and also check whether the API's Chinese documentation (i.e. `docs`) is correct; if it has the same error, submit a separate PR against `docs` to fix it.
##### **Q:** Should I use `Union` or `|`?

**A:** Prefer `|` wherever possible. Since the lowest Python version Paddle supports is `3.8`, `|` can only be used in type annotations (via `from __future__ import annotations`), not in expressions:

```python
from __future__ import annotations

def test(a: int | str): ...
```

In expressions, keep using `Union`:

```python
from typing import Union

t = Union[int, str]
```
##### **Q:** What if the checks cannot pass?

**A:** You can work around them with `# type: ignore`. This task runs a type checker (such as `mypy`) over each API's example code to validate the annotations. Annotating inevitably runs into dependencies between APIs; if the dependency is a `private API` or an `external API`, use `# type: ignore` to skip the corresponding check, e.g.:

```python
>>> import abcde  # type: ignore
>>> print('ok')
```

Or skip checking for the entire snippet:

```python
>>> # type: ignore
>>> import abcde
>>> print('ok')
```
##### **Q:** Can I use the `Any` type?

**A:** Yes, but avoid it whenever possible.
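As a generic illustration of why (plain Python, not Paddle code; `scale_any` and `scale` are made-up names): with `Any`, a checker has nothing to verify, while a precise annotation lets it flag misuse.

```python
from typing import Any


def scale_any(x: Any, factor: Any) -> Any:
    # With Any, mypy accepts a call like scale_any("oops", "twice")
    # without complaint, even though it fails at runtime.
    return x * factor


def scale(x: float, factor: float) -> float:
    # With precise types, mypy rejects scale("oops", "twice") statically.
    return x * factor


print(scale(2.0, 3.0))
```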
##### **Q:** What if a `circular import` error occurs?

**A:** Possible remedies:

- add `from __future__ import annotations`
- import the type only under `typing.TYPE_CHECKING`, e.g.:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import paddle.xxx as xxx

def tmp() -> xxx: ...
```

Also, when a type is used only for type hints, prefer importing it under `TYPE_CHECKING` anyway, to avoid unnecessary module imports.
##### **Q:** `Tensor` or `Variable`?

**A:** Prefer `Tensor`; do not expose the static-graph `Variable/Value` concepts to users. For a more detailed discussion, see https://github.com/PaddlePaddle/community/pull/858#discussion_r1564552690
##### **Q:** What about functions whose output type depends on the input type?

**A:** Possible remedy:

- add `from typing import overload`
- declare several functions with the same name, each decorated with `@overload`, e.g.:

```python
from typing import overload

@overload
def array_length(array: list[Any]) -> int: ...

@overload
def array_length(array: paddle.Tensor) -> paddle.Tensor: ...

def array_length(array):
    ...  # the actual implementation, left unannotated
```
##### **Q:** When should I use `Sequence`, and when `list` or `tuple`?

**A:** The Python PEPs give a hint:

> Note: Dict, DefaultDict, List, Set and FrozenSet are mainly useful for annotating return values. For arguments, prefer the abstract collection types defined below, e.g. Mapping, Sequence or AbstractSet.

In other words, use `Sequence` for inputs and `list` for return values. However, if the code calls `list` methods such as `append`, or the input really must be a `list`, then do not use `Sequence`.
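The rule above can be sketched as follows (a generic example, not from the Paddle codebase; the function names are made up):

```python
from __future__ import annotations

from collections.abc import Sequence


def double_all(xs: Sequence[int]) -> list[int]:
    # Input: Sequence, because we only iterate, so a list, a tuple,
    # or any other sequence is acceptable. Output: concretely a list.
    return [x * 2 for x in xs]


def append_one(xs: list[int]) -> list[int]:
    # This function calls list.append, so the input must really be a
    # list here; annotating it as Sequence would be wrong.
    xs.append(1)
    return xs


print(double_all((1, 2, 3)))  # a tuple is fine here → [2, 4, 6]
```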
##### **Q:** Annotate with `Tensor` or `paddle.Tensor`?

**A:** Either works. If `paddle.Tensor` appears many times in a file, you can use `Tensor` for brevity, but take care with the import:

```python
if TYPE_CHECKING:
    from paddle import Tensor
```

See the discussion: https://github.com/PaddlePaddle/Paddle/pull/65073#discussion_r1636116450
##### **Q:** `paddle.framework.Block` or `paddle.pir.Block`?

**A:** Use `paddle.pir.Block` uniformly. See the discussion: https://github.com/PaddlePaddle/Paddle/pull/65095#discussion_r1637570850
megemini commented 2 weeks ago

Common annotation reference

| Variable | Annotation |
| --- | --- |
| `name=None` | `name: str \| None = None` |
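Applied to a plain function, the `name=None` row above looks like this (a hypothetical function, for illustration only):

```python
from __future__ import annotations


def make_label(value: float, name: str | None = None) -> str:
    # name has a default value (so the docstring says `optional`) AND
    # may be None (so the annotation is `str | None`).
    prefix = name if name is not None else "label"
    return f"{prefix}: {value}"


print(make_label(3.14))        # → label: 3.14
print(make_label(3.14, "pi"))  # → pi: 3.14
```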
liyongchao911 commented 2 weeks ago

【报名】:A16,A17,A18,A19

sunzhongkai588 commented 2 weeks ago

【报名】:A-60

zrr1999 commented 2 weeks ago

【报名】:A-2,A-4

Liyulingyue commented 2 weeks ago

【报名】:A-64

ooooo-create commented 2 weeks ago

【报名】:A-47

DrRyanHuang commented 2 weeks ago

【报名】:A-57

SigureMo commented 2 weeks ago

【报名】:A-8

DrRyanHuang commented 2 weeks ago

【报名】:A-48、A-49、A-50、A-51

zrr1999 commented 2 weeks ago

【报名】:A-3


megemini commented 2 weeks ago

@Liyulingyue @liyongchao911 @DrRyanHuang @ooooo-create @zrr1999 @gouzil @sunzhongkai588

@SigureMo previously maintained an open-source project of standalone stub annotation files; the parts already annotated there are worth a look ~

https://github.com/cattidea/paddlepaddle-stubs

DrRyanHuang commented 2 weeks ago

【报名】:A-29、A-53、A-34

ooooo-create commented 2 weeks ago

【报名】:A-39、A-51

SigureMo commented 2 weeks ago

【报名】:A-16、A-19

megemini commented 2 weeks ago

【报名】:A-15

gsq7474741 commented 2 weeks ago

【报名】:A-54、A-55、A-56、A-61、A-62、A-65

DrRyanHuang commented 1 week ago

【报名】:A-68

ooooo-create commented 1 week ago

【报名】:A-40、A-41、A-42

ooooo-create commented 1 week ago

【报名】:A10、A11、A12

gouzil commented 1 week ago

【报名】:A-5

Asthestarsfalll commented 1 week ago

【报名】:A-6、A-7

DrRyanHuang commented 1 week ago

【报名】:A-77、A-73

DrRyanHuang commented 1 week ago

【报名】:A-75、A-76、A-78、A-79、A-84

Asthestarsfalll commented 1 week ago

【报名】:A-20、A-21、A-22、A-23、A-24、A-25

ooooo-create commented 1 week ago

【报名】:A-14、A-26、A-37、A-38

Asthestarsfalll commented 1 week ago

Folks, I wrote a script that parses docstrings and auto-generates type hints. The code is rough, but in practice it basically works. The output formatting gets mangled, though, so you need to generate into a separate file and then copy the function signatures over.

The code is as follows:

```python
from __future__ import annotations

import ast
import inspect
import re
import typing
from collections import defaultdict
from types import ModuleType
from typing import Any, Callable, Dict, List, Optional, Tuple, Type

import astor
import paddle
from astpretty import pprint

NoneType = Type[None]


class ReduceMode: ...


class SizeD: ...


# Argument names whose docstring type should be overridden by ARGS_NAME_MAPPING
OVERWRITE = {"kernel_size"}

# Mapping from argument names to types
ARGS_NAME_MAPPING: dict[str, list] = {
    "input": [paddle.Tensor],
    "label": [paddle.Tensor],
    "logit": [paddle.Tensor],
    "reduction": [ReduceMode],
    "x": [paddle.Tensor],
    "y": [paddle.Tensor],
    "kernel_size": [SizeD],
}

# Mapping from docstring type strings to types
TYPE_MAPPING = {
    "Tensor": paddle.Tensor,
    "float": float,
    "int": int,
    "list": list,
    "tuple": tuple,
    "bool": bool,
    "str": str,
    "string": str,
    "None": NoneType,
    "function": Callable,
    "ParamAttr": paddle.base.param_attr.ParamAttr,
}

# Mapping from function names to return types
RETURN_MAPPING = {
    "__init__": "None",
    "__str__": "str",
    "__repr__": "str",
    "forward": "Tensor",
    "extra_repr": "str",
}

for name in typing.__all__:
    if not name[0].isupper():
        continue
    TYPE_MAPPING[name] = getattr(typing, name)

IMPORT_TEMPLETE = """
from typing import Any, Dict, List, Optional, Tuple, Union, Callable, TYPE_CHECKING
from collections.abc import Iterable

if TYPE_CHECKING:
    from paddle import Tensor
"""

DEFAULT_RETURN = "Tensor"
HAS_IMPORTED = defaultdict(bool)
MODUELS = [paddle.nn.layer.loss]
GLOBAL_FILE_PATH = None
SOURCE_FILE = None

args_pattern = re.compile(
    r"(Args|Parameters)\s*:\s*(.*?)(?=\s*(Returns|Examples|$))", re.DOTALL
)
pattern = re.compile(r"(.*\(.*\)):")


def load_ast(file_path: str):
    with open(file_path, "r") as file:
        code_str = file.read()
    tree = ast.parse(code_str)
    return tree


def find_function_by_name(tree, func_name):
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and node.name == func_name:
            return node


def find_method_in_class(tree, class_name, method_name):
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef) and node.name == class_name:
            for subnode in node.body:
                if isinstance(subnode, ast.FunctionDef) and subnode.name == method_name:
                    return subnode


def write_code(file_path, code):
    with open(file_path, "w") as f:
        f.write("")
    if not HAS_IMPORTED[file_path]:
        with open(file_path, "w") as f:
            f.write(IMPORT_TEMPLETE)
        HAS_IMPORTED[file_path] = True
    with open(file_path, "a") as f:
        f.write(code)
    global SOURCE_FILE
    SOURCE_FILE = file_path


def gname(name):
    return ast.Name(name, ast.Load())


def gsub(name, type_nodes):
    return ast.Subscript(
        value=gname(name),
        slice=ast.Index(
            value=(
                ast.Tuple(elts=type_nodes)
                if not isinstance(type_nodes, ast.AST)
                else type_nodes
            )
        ),
        ctx=ast.Load(),
    )


def gimport():
    pass


def gbitor(node):
    left = node[0]
    for i in range(len(node) - 1):
        right = node[i + 1]
        res = ast.BinOp(op=ast.BitOr(), left=left, right=right)
        left = res
    return res


def gnone():
    return ast.Constant(value=None, kind=None)


def gen_annotation(type_info: List[Any], is_optional):
    try:
        node = [gname(getattr(i, "__name__", str(i).split(".")[-1])) for i in type_info]
    except:
        breakpoint()
    if len(node) > 1:
        node = gbitor(node)
    elif len(node) == 0:
        print("No Valid Type")
        return None
    else:
        node = node[0]
    if is_optional:
        node = gbitor([node, gnone()])
    return node


def get_type_by_name(name: str):
    t = TYPE_MAPPING.get(name, None)
    idx = 0
    while t is None:
        t = getattr(MODUELS[idx], name, None)
        idx += 1
        if idx >= len(MODUELS):
            break
    if t is None:
        print(f"CANNOT FIND TYPE OF {name}")
    return t


def parse_args_from_docstring(docstring):
    if docstring is None:
        print("NO docstring")
        return {}, {}
    matches = args_pattern.findall(docstring)
    if len(matches) == 0:
        print("No Args")
        return {}, {}
    matches: List[str] = pattern.findall(matches[0][1])
    matches = [m.strip() for m in matches]
    args = {}
    is_optional = defaultdict(bool)
    for m in matches:
        k, v = m.split("(", 1)
        k = k.strip()
        v = v[:-1].split(",")[0].split("|")
        v = [get_type_by_name(i) for i in v]
        v = [i for i in v if i is not None]
        if NoneType in v:
            is_optional[k] = True
            v.remove(NoneType)
        args[k] = v
    return args, is_optional


def determine_by_default_value(): ...


def convert_func(
    m,
    func_name,
    class_name=None,
    args_and_type: Optional[Dict[str, List[Any]]] = None,
):
    if args_and_type is None:
        args_and_type = parse_args_from_docstring(m.__doc__)
    args_and_type, args_optional = args_and_type
    parameters = inspect.signature(m).parameters
    try:
        # file_path = inspect.getsourcefile(m)
        assert SOURCE_FILE is not None
        tree = load_ast(SOURCE_FILE)
    except:
        print(f"can not load ast of {func_name}, {class_name}")
        return
    if class_name is None:
        node = find_function_by_name(tree, func_name)
    else:
        node = find_method_in_class(tree, class_name, func_name)
    if node is None:
        print("Cannot find code definitions")
        return
    node_args = {i.arg: i for i in node.args.args}
    for param_name, param_info in parameters.items():
        if param_name == "self":
            continue
        if param_name not in args_and_type and param_name not in ARGS_NAME_MAPPING:
            print(f"CANNOT FIND param in docstring {param_name}")
            continue
        try:
            node_arg = node_args[param_name]
        except KeyError:
            return
        default_value = param_info.default
        type_info = []
        if other_info := ARGS_NAME_MAPPING.get(param_name, False):
            type_info.extend(other_info)
        if param_name not in OVERWRITE:
            type_info.extend(args_and_type.get(param_name, []))
        node_arg.annotation = gen_annotation(
            set(type_info),
            args_optional.get(param_name, False) or default_value is None,
        )
    if func_name in RETURN_MAPPING:
        node.returns = gname(RETURN_MAPPING[func_name])
    elif DEFAULT_RETURN is not None:
        node.returns = gname(DEFAULT_RETURN)
    # code_str = astor.to_source(tree)
    code_str = ast.unparse(tree)
    write_code(GLOBAL_FILE_PATH or file_path, code_str)


def convert_class(m, class_name):
    args_and_type = parse_args_from_docstring(m.__doc__)
    convert_func(m.__init__, "__init__", class_name, args_and_type)
    # __str__ ......
    methods = inspect.getmembers(m, predicate=inspect.isfunction)
    parent_class = m.__mro__[1]
    non_inheritted_methods = [
        (name, func)
        for name, func in methods
        if func != getattr(parent_class, name, None)
    ]
    for name, method in non_inheritted_methods:
        if name == "__init__":
            continue
        if filter_by_name(name):
            continue
        print(f"CONVERT {class_name}.{name}")
        convert_func(method, name, class_name)
        print()


def convert_var(m): ...


def filter_by_name(name: str) -> bool:
    if name.startswith("_") and not name.startswith("__"):
        return True
    if name in [
        "Tensor",
        "Variable",
        "LayerHelper",
        "default_main_program",
        "check_type",
        "_C_ops",
        "TYPE_CHECKING",
        "check_variable_and_dtype",
    ]:
        return True
    if name in typing.__all__:
        return True
    if name.startswith("in"):  # for in_dynamic_or_pir_mode ....
        return True
    return False


def is_function_defined_in_file(m):
    try:
        source_file = inspect.getsourcefile(m)
    except TypeError:
        return False
    return source_file != SOURCE_FILE


def filter(name: str, m) -> bool:
    if filter_by_name(name):
        return True
    # if is_function_defined_in_file(m):
    #     return True
    return False


def convert_module(module: ModuleType, target_file: str):
    assert isinstance(module, ModuleType)
    global SOURCE_FILE, GLOBAL_FILE_PATH
    SOURCE_FILE = module.__file__
    GLOBAL_FILE_PATH = target_file
    members = inspect.getmembers(module)
    for name, m in members:
        if filter(name, m):
            # print(f"SKIP {name}")
            continue
        if inspect.isfunction(m):
            print(f"CONVERT {name}")
            convert_func(m, name)
        elif inspect.isclass(m):
            print(f"CONVERT {name}")
            convert_class(m, name)
        elif inspect.isdatadescriptor(m):
            ...
        else:
            convert_var(m)


if __name__ == "__main__":
    convert_module(paddle.nn.functional.pooling, "./pooling.py")
```
86kkd commented 6 days ago

【报名】:A-85

DrRyanHuang commented 6 days ago

【报名】:A-139

86kkd commented 6 days ago

【报名】:A-86