Open ReubenFrankel opened 1 month ago
### Feature scope

Taps (catalog, state, tests, etc.)

### Description
How do you feel about using workflow commands to create annotations for things like the `UserWarning` messages that are output? This could be useful to highlight issues at a glance, rather than having to trawl through the pytest output log.
Here's a POC I put together for `tap-spotify`:
```yaml
- name: Test with pytest
  run: |
    __generate_annotations () {
      level=$1
      pattern=$2
      title=${3:-'\1'}
      message=${4:-'\0'}

      test -z "$level" \
        && echo 'Error: level is required' \
        && exit 1

      test "$level" != debug \
        && test "$level" != notice \
        && test "$level" != warning \
        && test "$level" != error \
        && echo 'Error: level must be one of [ debug | notice | warning | error ]' \
        && exit 1

      test -z "$pattern" \
        && echo 'Error: pattern is required' \
        && exit 1

      sed -E "s/.*($pattern)/::$level title=${{ github.job }} (${{ matrix.python-version }}): $title::$message/"
    }

    debug () { __generate_annotations $FUNCNAME "$@"; }
    notice () { __generate_annotations $FUNCNAME "$@"; }
    warning () { __generate_annotations $FUNCNAME "$@"; }
    error () { __generate_annotations $FUNCNAME "$@"; }

    poetry run pytest | warning UserWarning
```
https://github.com/Matatika/tap-spotify/actions/runs/11362287222
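For context, the `sed` substitution in the POC rewrites matching pytest output lines into GitHub Actions workflow commands of the form `::level title=<title>::<message>`, which the runner surfaces as annotations. A minimal sketch of that format (the title and message below are made-up examples, not from the POC):

```python
# Build a GitHub Actions workflow-command string. Printing a line of this
# shape to stdout during a workflow step makes the runner render it as an
# annotation on the run summary.
def format_annotation(level: str, title: str, message: str) -> str:
    """Return a `::level title=...::message` workflow command."""
    if level not in ("debug", "notice", "warning", "error"):
        raise ValueError("level must be one of debug | notice | warning | error")
    return f"::{level} title={title}::{message}"

print(format_annotation("warning", "pytest (3.12)", "UserWarning: example message"))
# prints: ::warning title=pytest (3.12)::UserWarning: example message
```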
Something like that would indeed be nice.
I'll try to revive https://github.com/pytest-dev/pytest-github-actions-annotate-failures/pull/68.