openvinotoolkit / nncf

Neural Network Compression Framework for enhanced OpenVINO™ inference
Apache License 2.0

[Good First Issue] [NNCF] Cover NNCF code files with the lowest coverage percentage with unit tests, to at least 80% #2496

Open vshampor opened 7 months ago

vshampor commented 7 months ago

Context

NNCF uses coverage to run its precommit checks and currently has an admirable coverage rate of 90% across the repository. However, there are still some files with <80% coverage, and some with no coverage at all. Some of these are located in the experimental folder and do not have a strict coverage requirement, and some are special files such as __init__.py or setup.py which are not easily unit-tested, but most contain actual production code, and many of those have exactly 0.00% coverage. We need such files covered by precommit-scope unit tests so that potential bugs do not slip through.
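
To check coverage locally before opening a PR, the sketch below uses the coverage.py API; it assumes coverage and pytest are installed (pip install coverage pytest), and the include pattern and test path are illustrative only:

# Local per-file coverage check -- a sketch, assuming `pip install coverage pytest`.
# The include pattern and test directory below are illustrative, not prescriptive.
import coverage
import pytest

cov = coverage.Coverage(include=["nncf/common/*"])  # limit the report to the files under test
cov.start()
pytest.main(["tests/common", "-q"])  # run the relevant precommit-scope tests under coverage
cov.stop()
cov.save()
cov.report(show_missing=True)  # per-file percentages plus the uncovered line numbers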

Task

  1. Review the current per-file coverage percentage at Codecov (sort by coverage) https://app.codecov.io/gh/openvinotoolkit/nncf?search=&displayType=list

  2. For the following files in the list (click each to see its per-line coverage on Codecov), add unit tests (in pytest style) under the tests directory, in a subdirectory corresponding to the tested backend; e.g. a test for nncf/torch/foo.py would go to tests/torch/test_foo.py, and a test for nncf/common/bar.py to tests/common/test_bar.py (see the pytest sketch after this list).

  3. Open a PR into the NNCF repo, wait for the automatic pre-commit checks to pass and for Codecov to update its automatic status comment with the coverage difference introduced by the PR. Make sure that the Codecov status comment shows an increase of coverage in the given code file (or files) to at least 80%.
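
To illustrate the expected layout and style, here is a minimal pytest sketch; the module nncf/common/utils/foo.py and its clamp() function are invented purely for this example:

# tests/common/test_foo.py -- hypothetical example illustrating placement and style;
# nncf/common/utils/foo.py and its clamp() function are invented, not real repository code
import pytest

from nncf.common.utils.foo import clamp


@pytest.mark.parametrize(
    "value, low, high, expected",
    [(5, 0, 10, 5), (-1, 0, 10, 0), (99, 0, 10, 10)],
)
def test_clamp(value, low, high, expected):
    assert clamp(value, low, high) == expected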

Tips

MaximProshin commented 7 months ago

Feel free to split this task into multiple tasks, e.g. by implementing it per file. We can help create sub-tasks in that case.

Saharsh365 commented 7 months ago

@vshampor @MaximProshin I'd like to be assigned this issue. Also, since this will be my first time contributing, do you have any advice for me? And is there a Discord server or another communication channel for the community that I can join?

Saharsh365 commented 7 months ago

.take

github-actions[bot] commented 7 months ago

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

mlukasze commented 7 months ago

And is there a Discord server or another communication channel for the community that I can join?

yes sir! Discord

MaximProshin commented 7 months ago

@Saharsh365, thanks for your interest! The description is quite detailed and there are also tips. My recommendation would be to start with an analysis of the current Codecov results, select one of the weakly covered files, and implement a first simple test. If that works, the rest can be done by analogy. @vshampor will support you, so don't hesitate to bring your questions.

Saharsh365 commented 7 months ago

@vshampor The code coverage analysis reports are not working for me anymore. Could you please help me solve this issue?

vshampor commented 7 months ago

@Saharsh365 we won't be able to help you unless you describe the problem you are encountering in a more specific fashion. I checked that the reports are visible for non-authenticated viewers.

Saharsh365 commented 7 months ago

@Saharsh365 we won't be able to help you unless you describe the problem you are encountering in a more specific fashion. I checked that the reports are visible for non-authenticated viewers.

Oh, I'm not sure what happened but I believe that from Friday evening to yesterday the reports were not being displayed. I can however see them now and I believe that may have been due to the weekend? Apologies for taking up your time.

Saharsh365 commented 7 months ago

I would like to drop this issue, since I am not sure how to proceed with this task. I used multiple resources to research how to write unit tests for coverage, but I am still unsure how to apply that knowledge to this codebase.

p-wysocki commented 7 months ago

Hello @Saharsh365, I'm unassigning you, but feel free to repick the task if you wish. We're here to answer questions; perhaps asking them on the Intel DevHub channel would be easier.

Candyzorua commented 7 months ago

.take

github-actions[bot] commented 7 months ago

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

Candyzorua commented 7 months ago

Hello! I am a first-time contributor and I'm excited to help!

I have written unit tests for nncf/common/utils/decorators.py and nncf/common/utils/os.py.

May I know if there are ways other than coverage.py to check the code coverage before making the PR? Would you advise me to open a PR for just these 2 files, to check whether I am on the right track? :)

I'm also partway through writing tests for nncf/common/utils/patcher.py and I'd like to ask some questions about it. In particular, I'm not clear on the patch method.

For the patch method, there seem to be 3 separate cases: module-level, class-level, and instance-level functions. However, I'm not sure when the instance-level case can ever be triggered: when an instance method is passed as the 2nd argument to patch, the class-level case is still taken, because line 44 (obj_cls, fn_name = self.import_obj(obj_cls)) turns obj_cls into a class object, and line 55 then always evaluates to True.

Could this method possibly be bugged or am I understanding it incorrectly? Thank you for reading, really appreciate any help.

vshampor commented 7 months ago

Greetings!

The implementation may well have subtle bugs, in particular because the coverage is missing. I suggest that you add a test case for it. Here's how the "instance method" patching is supposed to work:

from nncf.common.utils.patcher import Patcher  # assumed import path for the Patcher under discussion

class Model:
    def forward(self, x):
        return x

model = Model()
assert model.forward(7) == 7  # `forward` is looked up on the Model.__dict__, i.e. the class object

# now someone (not us) patches the forward with its own implementation:
def new_forward(x):
    return 0

model.forward = new_forward  # creates a new *instance* attribute `forward` which masks the original function definition from the class

assert model.forward(7) == 0

# now `model.forward` is what we call here an "instance method", and now we want to patch it

def our_forward(x):
    return 42

patcher = Patcher()
patcher.patch(model.forward, our_forward)

assert model.forward(7) == 42 

Try implementing that test case, maybe debugging the flow of the patcher for this case to see if it works as expected (or not). If you have further questions, feel free to post these. I think @nikita-savelyevv might also pitch in with some help on these.
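
Wrapped as a pytest-style test, the sketch above might look like this; it follows the snippet verbatim and assumes Patcher is importable from nncf/common/utils/patcher.py and that patch() accepts the instance attribute plus its replacement:

# tests/common/test_patcher_instance_method.py -- a sketch based on the snippet above;
# the import path and the exact patch() semantics are assumptions to be verified
from nncf.common.utils.patcher import Patcher


def test_patch_instance_method():
    class Model:
        def forward(self, x):
            return x

    model = Model()
    model.forward = lambda x: 0  # instance attribute masks the class-level forward

    def our_forward(x):
        return 42

    patcher = Patcher()
    patcher.patch(model.forward, our_forward)  # patch the "instance method"

    assert model.forward(7) == 42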

nikita-savelyevv commented 6 months ago

Hi @Candyzorua. Regarding your question: if you call patch() on a method of an instantiated class object, it should fall into the branch you mention. This will happen, for example, if you call

PATCHER.patch(test_obj.assert_wrapper_stack_method, wrapper1)

for the test_obj that is created at https://github.com/openvinotoolkit/nncf/blob/develop/tests/common/test_monkey_patching.py#L51.

Candyzorua commented 6 months ago

@vshampor @nikita-savelyevv Thank you both for your input. Duly noted and I will implement the ideas you have mentioned at my earliest availability.

Candyzorua commented 6 months ago

Hello, I wrote unit tests for the first 5 files and have made the following PR, am continuing with the rest of the tests :)

MaximProshin commented 6 months ago

Hello, I wrote unit tests for the first 5 files and have made the following PR, am continuing with the rest of the tests :)

https://github.com/openvinotoolkit/nncf/pull/2468 has been merged

Candyzorua commented 6 months ago

Hello again,

I have been writing tests for nncf/quantization recently and encountered an issue while making tests for nncf/quantization/algorithms/accuracy_control/algorithm.py.

The function I am testing is _collect_original_biases_and_weights. The test code is as follows:

[screenshot of the test code; not reproduced here]

I get a synthetic linear model from nncf/tests/openvino/native/models.py and quantize it. However, when I pass it to _collect_original_biases_and_weights, I get an error because the quantized graph's nodes have None for their layer attributes.

I am not sure whether the quantized graph's nodes having None for layer attributes is a bug, or whether I have done something incorrect in the test setup. Any help would be appreciated!

Candyzorua commented 6 months ago

In addition, I would like to clarify whether obtaining synthetic models and quantizing them this way is good practice, or whether I should hardcode quantized models.

vshampor commented 6 months ago

@andrey-churkin to advise on the accuracy control algorithm requirements

Candyzorua commented 6 months ago

Hello, I would like to follow-up on this, please :)

andrey-churkin commented 6 months ago

Hi @Candyzorua, sorry for the delay in response. An instance of the QuantizationAccuracyRestorer class works with the graph of a quantized model to which weight compression was not applied. The nncf.quantize() method compresses weights by default for the OpenVINO backend. Therefore, to resolve the issue, we should turn off weight compression. To do this, please use the following code snippet:

import nncf
from nncf.openvino.quantization.backend_parameters import BackendParameters

quantized_model = nncf.quantize(
    initial_model,
    dataset,
    advanced_parameters=nncf.AdvancedQuantizationParameters(backend_params={BackendParameters.COMPRESS_WEIGHTS: False})
)

Please let me know if it works for you.

Candyzorua commented 6 months ago

Hi @andrey-churkin, thank you very much for your help. With your suggestion, I was able to test _collect_original_biases_and_weights. 😄 Most functions in algorithm.py are now covered; however, I am finding it quite difficult to test the longer and more complex apply function, so I am leaving it out for now.

I have opened a PR with tests for 5 more files regarding quantization and accuracy control functions.
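
As a general aside (not specific advice from this thread): one common way to unit-test a long orchestration method such as apply is to stub out its collaborators with unittest.mock and assert on the call flow; every name in this sketch is hypothetical:

# A generic sketch, not NNCF code: testing an "apply"-style orchestration method
# by mocking its (hypothetical) collaborators instead of running them for real
from unittest import mock


class Restorer:
    def apply(self, model):
        graph = self._build_graph(model)  # expensive collaborator
        return self._restore(graph)  # expensive collaborator

    def _build_graph(self, model):
        raise NotImplementedError

    def _restore(self, graph):
        raise NotImplementedError


def test_apply_orchestrates_collaborators():
    restorer = Restorer()
    with mock.patch.object(restorer, "_build_graph", return_value="graph") as build:
        with mock.patch.object(restorer, "_restore", return_value="restored") as restore:
            assert restorer.apply("model") == "restored"
    build.assert_called_once_with("model")
    restore.assert_called_once_with("graph")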

RitikaxShakya commented 5 months ago

Hello @Candyzorua! Are all tests/tasks covered? If not, can I pick some? @vshampor, I would like to work on this issue :) As @MaximProshin said, we can split it into sub-tasks. Thank you.

Candyzorua commented 5 months ago

Hi @RitikaxShakya, apologies for the late reply. I haven't covered all the files. Here is a list that shows which files have already been covered; if a checkbox is ticked, the file has been covered.

I think it's a great idea to split up the work. Perhaps you can work on the remaining quantization-related files or the TensorFlow ones? I am continuing work on the PyTorch ones for now. Please let me know, thank you!

RitikaxShakya commented 5 months ago

@Candyzorua Hello! Thank you for replying :) That's great! I was already looking to pick up the TensorFlow and quantization-related files, so I will start working on them! Thank you!

DaniAffCH commented 5 months ago

Hi, so if @RitikaxShakya takes the TensorFlow and quantization files, I'll cover the remaining ones.

Candyzorua commented 5 months ago

Hi @DaniAffCH, since I have started on nncf/torch/quantization, I think it would be awesome if you could work on nncf/torch/sparsity and nncf/torch/pruning, as well as nncf/telemetry. I think nncf/torch/binarization has been removed. Please let me know what you think, thanks!

DaniAffCH commented 5 months ago

@Candyzorua thanks for letting me know! Yes, that's fine, I'll cover the unassigned files :)

anzr299 commented 4 months ago

@vshampor @Candyzorua Hi, what's the status of the covered files? I would like to contribute as well.

RitikaxShakya commented 4 months ago

Hello! I've almost finished covering the TensorFlow-related files and am finishing up the quantization files.

Candyzorua commented 4 months ago

Hi @anzr299 , sorry for the late reply. Please feel free to take over nncf/torch/quantization! I will be stopping work on this issue right now due to work responsibilities. Thank you!

anzr299 commented 4 months ago

Hi @anzr299 , sorry for the late reply. Please feel free to take over nncf/torch/quantization! I will be stopping work on this issue right now due to work responsibilities. Thank you!

Sure, no problem. I will work on it.