neuralmagic / deepsparse

Sparsity-aware deep learning inference runtime for CPUs
https://neuralmagic.com/deepsparse/

[BugFixes] Fix Broken `deepsparse.eval` #1511

Closed rahul-tuli closed 6 months ago

rahul-tuli commented 6 months ago

Bugfixes for `deepsparse.eval`

This PR fixes a few minor bugs found in `deepsparse.eval`.

Bugs

Bug 1

`deepsparse.eval` was pointing to the wrong script for its CLI, so the console script tried to import a module that does not exist (a sketch of the corrected entry point follows the traceback below):

rahul at office-desktop in ~/projects/deepsparse (fix-bugs) (.base_venv) 
$ deepsparse.eval --help                                                                                                                                                (fix-bugs|✔)
Traceback (most recent call last):
  File "/home/rahul/venvs/.base_venv/bin/deepsparse.eval", line 5, in <module>
    from deepsparse.evaluate.evaluator.cli import main
ModuleNotFoundError: No module named 'deepsparse.evaluate'
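
The fix is to repoint the console-script entry point at a module that actually exists. A minimal sketch of that change in `setup.py`, assuming the CLI lives at `deepsparse.evaluation.cli:main` (the exact module path in the repository may differ):

```python
from setuptools import setup

setup(
    name="deepsparse",
    # ... other arguments elided ...
    entry_points={
        "console_scripts": [
            # before: "deepsparse.eval=deepsparse.evaluate.evaluator.cli:main"
            # (deepsparse.evaluate does not exist, hence the ModuleNotFoundError above)
            "deepsparse.eval=deepsparse.evaluation.cli:main",  # assumed corrected path
        ],
    },
)
```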

Bug 2

The `engine_type` argument was passing `None` to `click.Choice`, which is invalid. The fix is to pass a list of valid choices to `click.Choice` (see the sketch after the traceback below).

$ deepsparse.eval --help                                                                                                                                                (fix-bugs|✚1)
Traceback (most recent call last):
  File "/home/rahul/venvs/.base_venv/bin/deepsparse.eval", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 1077, in main
    with self.make_context(prog_name, args, **extra) as ctx:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 943, in make_context
    self.parse_args(ctx, args)
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 1408, in parse_args
    value, args = param.handle_parse_result(ctx, opts, args)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 2400, in handle_parse_result
    value = self.process_value(ctx, value)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 2362, in process_value
    value = self.callback(ctx, self, value)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 1300, in show_help
    echo(ctx.get_help(), color=ctx.color)
         ^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 704, in get_help
    return self.command.get_help(self)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 1325, in get_help
    self.format_help(ctx, formatter)
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 1358, in format_help
    self.format_options(ctx, formatter)
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 1382, in format_options
    rv = param.get_help_record(ctx)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 2753, in get_help_record
    rv = [_write_opts(self.opts)]
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 2749, in _write_opts
    rv += f" {self.make_metavar()}"
              ^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/core.py", line 2217, in make_metavar
    metavar = self.type.get_metavar(self)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/venvs/.base_venv/lib/python3.11/site-packages/click/types.py", line 256, in get_metavar
    choices_str = "|".join(self.choices)
                  ^^^^^^^^^^^^^^^^^^^^^^
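
For reference, a standalone sketch of the `click.Choice` behavior: passing `None` fails as soon as click renders the metavar for `--help` (the `"|".join(self.choices)` call in the traceback above), while a concrete list of choices works. The engine names below are taken from the `--help` output in the test plan; the actual constant used in deepsparse may be named differently.

```python
import click

# Valid engine names, copied from the --help output below (assumed constant name).
ENGINE_TYPES = ["deepsparse", "onnxruntime", "torchscript"]

@click.command()
@click.option(
    "-e",
    "--engine_type",
    # click.Choice(None) breaks --help because click cannot join None into a
    # metavar; a list of choices renders as [deepsparse|onnxruntime|torchscript].
    type=click.Choice(ENGINE_TYPES),
    default="deepsparse",
    help="The engine to use for the evaluation",
)
def main(engine_type):
    click.echo(f"engine_type: {engine_type}")

if __name__ == "__main__":
    main()
```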

This PR fixes both of these bugs.

Test Plan

`deepsparse.eval --help` should work without any errors:

$ deepsparse.eval --help                                                                                                                                                (fix-bugs|✚1)
Usage: deepsparse.eval [OPTIONS] [INTEGRATION_ARGS]...

Options:
  --target PATH                   A path to a remote or local directory
                                  containing ONNX/torch model (including all
                                  the auxiliary files) or a SparseZoo stub
                                  [required]
  -d, --dataset TEXT              The name of dataset to evaluate on. The user
                                  may pass multiple datasets names by passing
                                  the option multiple times.
  -i, --integration TEXT          Optional name of the evaluation integration
                                  to use. Must be a valid integration name
                                  that is registered in the evaluation
                                  registry
  -e, --engine_type [deepsparse|onnxruntime|torchscript]
                                  The engine to use for the evaluation. The
                                  default is the DeepSparse engine. If the
                                  evaluation should be run without
                                  initializing a pipeline (e.g. for the
                                  evaluation of a torch model), the engine
                                  type should be set to None
  -s, --save_path TEXT            The path to save the evaluation results. The
                                  results will be saved under the name
                                  'result.yaml`/'result.json' depending on the
                                  serialization type. If argument is not
                                  provided, the results will be saved in the
                                  current directory
  -t, --type_serialization [yaml|json]
                                  The serialization type to use save the
                                  evaluation results. The default is json
  -b, --batch_size INTEGER        The batch size to use for the evaluation.
                                  Must be greater than 0
  -splits, --splits UNION         The name of the splits to evaluate on. Can
                                  be a string for a single split or a list of
                                  strings for multiple splits.
  -metrics, --metrics UNION       The name of the metrics to evaluate on. Can
                                  be a string for a single metric or a list of
                                  strings for multiple metrics.
  --help                          Show this message and exit.