apache / tvm

Open deep learning compiler stack for CPU, GPU, and specialized accelerators
https://tvm.apache.org/
Apache License 2.0

[Tracking Issue] pylint the tests! #11414

Closed: areusch closed this 1 week ago

areusch commented 2 years ago

This is a tracking issue for our effort to enable pylint on all Python testing code underneath tests/python. Currently we don't lint scripts in that directory. We're going to take an incremental approach, and anyone is welcome to help. Here's how:

  1. Pick a directory to start with. I suggest choosing the directory based on the number of Python files it contains.
  2. Add a line to tests/lint/pylint.sh containing that directory similar to https://github.com/apache/tvm/pull/11625.
  3. Run docker/lint.sh pylint locally and fix the lint issues.
  4. Submit the PR and reference this issue in the PR so we know to update it.
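The line added in step 2 is a one-liner per directory; a hedged sketch of what it might look like (the exact flags may differ, check the PR linked above for the canonical form):

```shell
# Fragment of tests/lint/pylint.sh: one pylint invocation per linted
# directory, using the shared rcfile. Flags are illustrative.
python3 -m pylint tests/python/unittest --rcfile="$(dirname "$0")"/pylintrc
```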

Here are the tests that need linting 😸 (we will try to tag the PR which fixes them when we merge them):

areusch commented 2 years ago

@gigiblender is going to look at this one

gigiblender commented 2 years ago

I enabled pylint in #11600. However, there are a lot of tests that contain formatting errors (see the file attached to the PR). I am not sure whether I should start fixing those as part of this issue?

@areusch

Mousius commented 2 years ago

@gigiblender can you put together a list of tests which need linting so we can put a checklist on this issue? then people can pick them off when they get time 😸

gigiblender commented 2 years ago

> @gigiblender can you put together a list of tests which need linting so we can put a checklist on this issue? then people can pick them off when they get time 😸

Checklist attached: checklist.txt

Mousius commented 2 years ago

Thanks @gigiblender, I've put them up here; if we reference this issue in PRs, we'll get through this!

alanmacd commented 2 years ago

I'll start with these:

tests/python/unittest/test_crt.py
tests/python/unittest/test_micro_model_library_format.py
tests/python/unittest/test_micro_project_api.py
tests/python/unittest/test_micro_transport.py
tests/python/unittest/test_node_reflection.py
tests/python/unittest/test_runtime_container.py
tests/python/unittest/test_runtime_error.py
tests/python/unittest/test_runtime_extension.py
tests/python/unittest/test_runtime_graph_cuda_graph.py
tests/python/unittest/test_runtime_graph_debug.py
tests/python/unittest/test_runtime_graph.py
tests/python/unittest/test_runtime_measure.py
tests/python/unittest/test_runtime_module_based_interface.py
tests/python/unittest/test_runtime_module_export.py
tests/python/unittest/test_runtime_module_load.py
tests/python/unittest/test_runtime_profiling.py
tests/python/unittest/test_runtime_rpc.py
tests/python/unittest/test_runtime_trace.py
tests/python/unittest/test_runtime_vm_profiler.py
blackkker commented 2 years ago

I want to try these.

 tests/python/frontend/caffe/test_forward.py
 tests/python/frontend/caffe2/model_zoo/init.py
 tests/python/frontend/caffe2/test_forward.py
 tests/python/frontend/coreml/model_zoo/init.py
 tests/python/frontend/coreml/test_forward.py
 tests/python/frontend/darknet/test_forward.py
 tests/python/frontend/keras/test_forward.py
 tests/python/frontend/mxnet/model_zoo/init.py
 tests/python/frontend/mxnet/model_zoo/dqn.py
 tests/python/frontend/mxnet/model_zoo/inception_v3.py
 tests/python/frontend/mxnet/model_zoo/mlp.py
 tests/python/frontend/mxnet/model_zoo/resnet.py
 tests/python/frontend/mxnet/model_zoo/squeezenet.py
 tests/python/frontend/mxnet/model_zoo/vgg.py
 tests/python/frontend/mxnet/test_forward.py
 tests/python/frontend/mxnet/test_graph.py
 tests/python/frontend/mxnet/test_qnn_ops_utils.py
 tests/python/frontend/oneflow/test_forward.py
 tests/python/frontend/oneflow/test_vision_models.py
 tests/python/frontend/onnx/test_forward.py
 tests/python/frontend/paddlepaddle/test_forward.py
 tests/python/frontend/pytorch/qnn_test.py
 tests/python/frontend/pytorch/test_forward.py
 tests/python/frontend/pytorch/test_fx_quant.py
 tests/python/frontend/pytorch/test_lstm.py
 tests/python/frontend/pytorch/test_object_detection.py
 tests/python/frontend/pytorch/test_rnns.py
 tests/python/frontend/tensorflow/test_bn_dynamic.py
 tests/python/frontend/tensorflow/test_control_flow.py
 tests/python/frontend/tensorflow/test_debugging.py
 tests/python/frontend/tensorflow/test_forward.py
 tests/python/frontend/tensorflow/test_no_op.py
 tests/python/frontend/tensorflow2/common.py
 tests/python/frontend/tensorflow2/test_functional_models.py
 tests/python/frontend/tensorflow2/test_sequential_models.py
 tests/python/frontend/test_common.py
 tests/python/frontend/tflite/test_forward.py
AndrewZhaoLuo commented 2 years ago

Hey folks, a lot of the files I see have one-character variable names, mostly in scheduling code. I think this is reasonable in many situations, so I think we should upgrade pylint and use its newer features to allow more variable-name patterns. See https://github.com/apache/tvm/pull/11712. Please let me know what you think.

alanmacd commented 2 years ago

@AndrewZhaoLuo agreed; additionally, pylint seems to be inconsistent about which one- (or two-) character variable names it allows and which it doesn't.
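If the upgrade lands, one hedged option is a regex-based allowance in the shared pylintrc instead of an ever-growing good-names list (`good-names-rgxs` requires pylint >= 2.5; the regex here is illustrative):

```ini
# pylintrc fragment: treat any one- or two-character lowercase name as
# acceptable, covering the scheduling-code style discussed above.
[BASIC]
good-names-rgxs=^[a-z_][a-z0-9_]?$
```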

alanmacd commented 2 years ago

I also hit an issue in tests/python/unittest/test_micro_project_api.py related to pytest fixtures and the redefined-outer-name warning, further complicated by tvm.testing.fixture() not supporting the name argument. For now I just disabled the warning. https://stackoverflow.com/questions/46089480/pytest-fixtures-redefining-name-from-outer-scope-pylint/54045715
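For plain pytest fixtures, the workaround from that Stack Overflow answer is to register the fixture under a different name than the function; a minimal sketch with illustrative names (it does not apply to tvm.testing.fixture, which lacks the name argument):

```python
import pytest


# The fixture is registered as "input_data", but the function is named
# differently, so the test argument below no longer shadows a module-level
# name and redefined-outer-name is never triggered.
@pytest.fixture(name="input_data")
def fixture_input_data():
    return [1, 2, 3]


def test_sum(input_data):
    # pytest injects the fixture value under its registered name.
    assert sum(input_data) == 6
```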

Mousius commented 2 years ago

@alanmacd I removed the offending code rather than ignoring the warning, since ignoring it can hide other errors?

See: https://github.com/apache/tvm/pull/11657

quic-sanirudh commented 2 years ago

I can try to fix all the test_hexagon tests if no one else has started working on it. The list is

 tests/python/contrib/test_hexagon/benchmark_elemwise_add.py
 tests/python/contrib/test_hexagon/benchmark_util.py
 tests/python/contrib/test_hexagon/conftest.py
 tests/python/contrib/test_hexagon/conv2d/test_conv2d_blocked.py
 tests/python/contrib/test_hexagon/conv2d/test_conv2d_conv2d.py
 tests/python/contrib/test_hexagon/infrastructure.py
 tests/python/contrib/test_hexagon/test_2d_physical_buffers.py
 tests/python/contrib/test_hexagon/test_autotvm.py
 tests/python/contrib/test_hexagon/test_cache_read_write.py
 tests/python/contrib/test_hexagon/test_launcher.py
 tests/python/contrib/test_hexagon/test_maxpool2d_blocked.py
 tests/python/contrib/test_hexagon/test_models.py
 tests/python/contrib/test_hexagon/test_run_unit_tests.py
 tests/python/contrib/test_hexagon/test_thread_pool.py
 tests/python/contrib/test_hexagon/test_usmp.py
 tests/python/contrib/test_hexagon/topi/test_batch_matmul.py
 tests/python/contrib/test_hexagon/topi/test_conv2d_nchw.py
 tests/python/contrib/test_hexagon/topi/test_conv2d_nhwc.py
 tests/python/contrib/test_hexagon/topi/test_conv2d_transpose.py
 tests/python/contrib/test_hexagon/topi/test_dense.py
 tests/python/contrib/test_hexagon/topi/test_depthwise_conv2d.py
 tests/python/contrib/test_hexagon/topi/test_pooling.py
 tests/python/contrib/test_hexagon/topi/test_reduce.py
 tests/python/contrib/test_hexagon/topi/test_softmax.py
quic-sanirudh commented 2 years ago

> @alanmacd I removed the offending code rather than ignoring the warning, since ignoring it can hide other errors?
>
> See: #11657

I ran into the redefined-outer-name error as well with pytest fixtures. While searching around, I stumbled upon this pylint issue, and their recommendation seems to be a separately maintained pylint plugin. Should this be considered? It seems to handle multiple such quirks of pylint-pytest compatibility.
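For reference, wiring that plugin in would be a one-line config change (a sketch, assuming the separately maintained pylint-pytest package is installed):

```ini
# pylintrc fragment: load the pylint-pytest plugin so pylint understands
# fixture injection in test signatures instead of flagging it.
[MASTER]
load-plugins=pylint_pytest
```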

alanmacd commented 2 years ago

> tvm.testing.fixture()

Unfortunately, in my case it's tvm.testing.fixture(), not pytest.fixture(), so I can't use the name workaround. I also don't see how removing the code completely is viable. You can also disable/enable the warning in small blocks to limit its scope.

Mousius commented 2 years ago

> tvm.testing.fixture()
>
> Unfortunately, in my case it's tvm.testing.fixture(), not pytest.fixture(), so I can't use the name workaround. I also don't see how removing the code completely is viable. You can also disable/enable the warning in small blocks to limit its scope.

I removed the tvm.testing.fixture entirely and replaced it with pytest.mark.parametrize; if it's not compliant with the linting rules, we likely need to rethink it or trial something like @quic-sanirudh suggested. I think that can be resolved separately?

I'm cautious of disabling the linter to enable us to run pylint given that still leaves us with un-linted tests?

alanmacd commented 2 years ago

One recommendation, based on my past experience with large linting or security-scan efforts: open a work item for areas that aren't easily resolved and leave a comment with the work-item number, then address them later. That way we get 90+% coverage on existing work and can address corner cases individually.

> I'm cautious of disabling the linter to enable us to run pylint given that still leaves us with un-linted tests?

In most cases, we're only disabling one type of lint warning for contained blocks or lines.
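As a concrete illustration of block-scoped disabling (a hypothetical sketch; the names are not from any TVM test):

```python
# Silence a single pylint check for a contained region rather than the
# whole file; the enable comment restores the check afterwards.

# pylint: disable=invalid-name
x, y = 1, 2  # short module-level names tolerated only in this region
# pylint: enable=invalid-name


def add_pair():
    """Return the sum of the two module-level values."""
    return x + y
```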

Mousius commented 2 years ago

> One recommendation, based on my past experience with large linting or security-scan efforts: open a work item for areas that aren't easily resolved and leave a comment with the work-item number, then address them later. That way we get 90+% coverage on existing work and can address corner cases individually.

Yip, I didn't mean to give the wrong impression here; I agree with disabling some bits so we can get the linter running 😸 I think all the linting PRs will go in with some pylint disable blocks we can use to find them later, which seemed close enough to tracking the work items to me. Do you think we need to raise issues for it?

As far as I know, replacing tvm.testing.fixture with the native pytest.mark.parametrize is in the easily resolved category, as one is sugar on top of the other? That way we can save our linter-disabling cards for trickier areas?

blackkker commented 2 years ago

Update my issue from #11703 to #12028 @areusch @AndrewZhaoLuo

> I want to try these. (The file list is the same as in my earlier comment above.)
alanmacd commented 2 years ago

Has anyone found a way to address the redefined-outer-name warning when tvm.testing.parameters is used? For example, see test_crt.py::test_aot_executor_usmp_const_pool:

[W0621(redefined-outer-name), test_aot_executor_usmp_const_pool] Redefining name 'enable_usmp' from outer scope 
[W0621(redefined-outer-name), test_aot_executor_usmp_const_pool] Redefining name 'expect_exception' from outer scope

Also, there's a related issue with the actual variable names as pylint wants the globals to be constants:

[C0103(invalid-name), ] Constant name "enable_usmp" doesn't conform to '(([A-Z_][A-Z0-9_]*)|(__.*__))$'
[C0103(invalid-name), ] Constant name "expect_exception" doesn't conform to '(([A-Z_][A-Z0-9_]*)|(__.*__))$' 

This is probably the same situation as to what we discussed previously above, but just checking before I disable these two warnings.

Mousius commented 2 years ago

@alanmacd I don't think there's any good solution found yet 😞 I went with replacing them in the tests I fixed up (actually the same enable_usmp parameter 😸) to save my pylint exclusions for other things:

https://github.com/apache/tvm/pull/11657/files#diff-02f266539dcadf3d83ef783e83c089ffb87f87061bb68d21b5795f36f92361b4R117-R118

alanmacd commented 2 years ago

@Mousius

Thanks, that works! It was slightly more complicated, but I think I figured it out. 😄

`@pytest.mark.parametrize("enable_usmp,expect_exception", [(True, True), (False, False)])`
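A hedged sketch of the resulting shape (the body here is a stand-in; the real test drives the AOT executor):

```python
import pytest


# pytest.mark.parametrize pairs each combination directly, so no
# module-level names exist for redefined-outer-name or invalid-name
# to complain about.
@pytest.mark.parametrize(
    "enable_usmp,expect_exception", [(True, True), (False, False)]
)
def test_usmp_pairing(enable_usmp, expect_exception):
    # Stand-in assertion: each parameter pair is internally consistent.
    assert enable_usmp == expect_exception
```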

quic-sanirudh commented 2 years ago

Would it make sense to create multiple PRs with a few files fixed in each PR?

I'm working on the hexagon tests and a number of them are being regularly updated, so I think it might be easier to do them in parts.

areusch commented 2 years ago

@quic-sanirudh yeah, that's totally fine. You'll just need to add those directories to the lint script per PR as you go.

quic-sanirudh commented 2 years ago

> @quic-sanirudh yeah, that's totally fine. You'll just need to add those directories to the lint script per PR as you go.

Thanks for the reply @areusch. I've raised the patch for the first set of files in #12082

DongqiShen commented 2 years ago

Hi, I have a naive question about this issue and hope you can tell me whether my understanding is right. I followed your steps and ran the commands. In my local VS Code environment there are still some warnings in a fixed .py file, yet the output reports a score of 10.00/10. I also added some un-fixed .py files, which produced a lot of warnings and a score below 10. So my question is: do I just need to fix all the warnings in the output so that the score becomes 10.00/10? If my understanding is correct, I think I can start contributing. Thanks!

areusch commented 2 years ago

ccing @Lunderberg for visibility on the pylint/parametrize issue