bloomberg / pytest-memray

pytest plugin for easy integration of memray memory profiler
https://pytest-memray.readthedocs.io/en/latest/
Apache License 2.0

Generate single output for all test cases #98

Closed jay0129 closed 11 months ago

jay0129 commented 11 months ago

Feature Request

Is your feature request related to a problem?

I have an issue inspecting test results, because too many .bin and HTML files are generated after a single run of pytest --memray --memray-bin-path memray. Note that there are usually many test cases in a CI environment.

Describe the solution you'd like

Generate a single artifact file for all tests.

Describe alternatives you've considered

Support merging the artifacts into a single HTML or .bin file.

Teachability, Documentation, Adoption, Migration Strategy

pytest --memray --memray-bin-path memray --unify-memray-reports

This would generate memray/65fbe6dfc0d44a75831ebf7b5b6ea2a5-unified.bin.
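For illustration only (the --unify-memray-reports flag is just the proposal above, not an existing option), the single unified capture file could then be fed to memray's existing reporters in one step:

    # Hypothetical follow-up, assuming the unified capture proposed above exists
    memray stats memray/65fbe6dfc0d44a75831ebf7b5b6ea2a5-unified.bin
    memray flamegraph memray/65fbe6dfc0d44a75831ebf7b5b6ea2a5-unified.bin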

godlygeek commented 11 months ago

I'm a bit confused - if you run pytest --memray without any other arguments, you shouldn't get any .bin files or HTML files. I don't think pytest-memray ever generates an HTML file, and any .bin files it creates would be created in a temporary directory that's removed when the test suite finishes, unless you're providing the --memray-bin-path argument.
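A quick illustration of the difference (the memray directory name is just the one used in this issue):

    # Default: per-test captures go to a temporary directory that is
    # removed when the test suite finishes
    pytest --memray

    # With --memray-bin-path: one .bin capture per test is kept under ./memray
    pytest --memray --memray-bin-path memray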

godlygeek commented 11 months ago

Whoops, that closing was a mis-click

jay0129 commented 11 months ago

Thanks, you are correct. I had specified --memray-bin-path, and I have fixed the issue description.

However, I still need summarized information for all tests in order to compare memory usage across PRs, regardless of --memray-bin-path.

godlygeek commented 11 months ago

What would you do with the unified .bin file, or the unified HTML file, if it could be created?

> However, I still need summarized information for all tests in order to compare memory usage across PRs

That makes it sound like what you're really looking for is #7 - "compare memory usage between pytest runs".

jay0129 commented 11 months ago

By the way, I found the --json option for memray stats. I would like to print the total sum of memory usage, as shown below:

      - name: Summarize Stats
        run: |
          total_allocations=0
          total_memory_allocated=0

          for bin_file in ${MEMRAY_BIN_PATH}/*.bin; do
            memray stats --json "$bin_file"
            bin_file_name=$(basename "$bin_file")
            json_file="${MEMRAY_BIN_PATH}/memray-stats-${bin_file_name}.json"

            # Extract total allocations and memory allocated from the JSON file
            allocations=$(jq '.total_num_allocations' < "$json_file")
            memory_allocated=$(jq '.total_bytes_allocated' < "$json_file")

            total_allocations=$((total_allocations + allocations))
            total_memory_allocated=$((total_memory_allocated + memory_allocated))
          done

          # Print the TOTAL SUM
          echo "TOTAL SUM of allocations: $total_allocations"
          echo "TOTAL SUM of memory allocated: $total_memory_allocated B"

This way, I can compare and check for any regressions between PRs.
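A rough sketch of that comparison step, assuming the summed totals are also written to a summary.json for the current run and a baseline summary from the target branch is available as baseline.json (both file names are hypothetical):

    # Hypothetical files: summary.json (current run), baseline.json (target branch),
    # each containing at least {"total_bytes_allocated": <int>}
    current=$(jq '.total_bytes_allocated' summary.json)
    baseline=$(jq '.total_bytes_allocated' baseline.json)

    # Flag a possible regression when the current run allocates more than the baseline
    if [ "$current" -gt "$baseline" ]; then
      echo "Possible memory regression: ${current} B allocated vs baseline ${baseline} B"
    fi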

I think https://github.com/bloomberg/pytest-memray/issues/7 ("compare memory usage between pytest runs") is also a good approach.

If merging .bin files is not a common practice, I think it is fine to leave things as they are.

pablogsal commented 11 months ago

Yeah, I think I am going to close this issue, since merging .bin files is quite tricky: they normally correspond to completely different runs with very different conditions, and most of the time they can't be easily integrated into a coherent unit. When we release #7 you can give it a go and see if that solves your problem :)