numirias / pytest-json-report

🗒️ A pytest plugin to report test results as JSON

Is there any easy way to add metadata in test stage? #40

Closed · hdw868 closed this issue 5 years ago

hdw868 commented 5 years ago

Hi team,

This plugin is awesome; I love the well-organized structure and the very detailed information it collects. However, it seems we don't have an easy way (such as the stage_metadata hook in pytest-json) to add extra information at the test stage? I'm currently trying to add the datetime of the test call to the metadata, so that when things go wrong, I can search for the timestamp in our application logs.

Please correct me if I'm wrong.

Thanks, Wayne

hdw868 commented 5 years ago

Looking through the code, one possible solution would be:

from datetime import datetime

import pytest

@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    outcome = yield
    report = outcome.get_result()
    if report.when == 'call':
        # Record the UTC start/stop times of the call phase in the test's metadata
        metadata = item._json_report_extra.setdefault('metadata', {})
        metadata['start_utc'] = str(datetime.utcfromtimestamp(call.start))
        metadata['end_utc'] = str(datetime.utcfromtimestamp(call.stop))

But I think it would be nicer to implement this as a hook.

numirias commented 5 years ago

Sorry for getting back to you late; I was on vacation. And thanks for your kind feedback.

How about leveraging the existing json_metadata fixture, like so:

import time
from datetime import datetime

import pytest

@pytest.fixture(autouse=True)
def metadata_timestamp(json_metadata):
    # Record wall-clock start/stop times for every test
    json_metadata['start'] = str(datetime.now())
    yield
    json_metadata['stop'] = str(datetime.now())

def test_foo():
    time.sleep(2)

Running the file would give a report like this:

{
    ...
    "tests": [
        {
            "nodeid": "foo.py::test_foo",
            ...
            "metadata": {
                "start": "2019-06-29 19:09:36.989592",
                "stop": "2019-06-29 19:09:38.992025"
            },
            "setup": {
                "duration": 0.00027298927307128906,
                "outcome": "passed"
            },
            "call": {
                "duration": 2.0011777877807617,
                "outcome": "passed"
            },
            "teardown": {
                "duration": 0.0005156993865966797,
                "outcome": "passed"
            }
        }
    ]
}

This seems more straightforward to me than implementing yet another hook. Does that help?

hdw868 commented 5 years ago

Thanks for your reply. Apparently there is more than one way to do this; I just hadn't thought of that approach. I still think providing a hook would be the more common way, as most plugins do. Anyway, feel free to close this issue or document this somewhere :)

numirias commented 5 years ago

However, it seems we don't have an easy way (such as the stage_metadata hook in pytest-json) to add extra information at the test stage?

pytest-json doesn't actually provide a special hook for it. stage_metadata is just an attribute of the report object, just as this plugin uses _json_report_extra to store information on the report.

Currently I can't think of a good hook implementation that would actually make adding stage metadata easier without adding unnecessary complexity. But I am happy to discuss a specific suggestion.

I might also add solutions we discussed here as recipes to the readme.

numirias commented 5 years ago

@hdw868 I revisited your suggestion and included new hooks in the latest release.

You now have an even easier way to add metadata from the current test call. For your own use case, it may look something like this:

def pytest_json_runtest_metadata(item, call):
    if call.when != 'call':
        return {}
    return {'start': call.start, 'stop': call.stop}

(The dict you return will be merged with already existing metadata.)
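
For illustration, metadata returned from this hook is merged with entries added through the json_metadata fixture as well. A minimal sketch, assuming the hook above lives in conftest.py; the request_id key and its value are just made-up examples:

# test_foo.py
def test_foo(json_metadata):
    # Merged with the 'start'/'stop' keys contributed by the hook
    json_metadata['request_id'] = 'abc-123'  # made-up example value

Both the hook's dict and the fixture's entries should end up in the same "metadata" section of the test's entry in the report.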

hdw868 commented 5 years ago

That's just awesome! Thanks for adding this feature so quickly!

dnltz commented 4 years ago

@numirias - I added the runtest_metadata hook in my conftest.py to collect metadata from my tests. If I don't pass the --json-report parameter to pytest, pluggy fails because the hook is unknown:

INTERNALERROR> % (name, hookimpl.plugin),
INTERNALERROR> pluggy.manager.PluginValidationError: unknown hook 'pytest_json_runtest_metadata' in plugin <module 'src.basic.hooks' from '/home/schultz/work/ptf-ci-dev/src/basic/hooks.py'>

Any idea how to fix this? Maybe empty hooks should be added here: https://github.com/numirias/pytest-json-report/blob/11a66d5289a4aa9928029c5d72fd465b0eec323e/pytest_jsonreport/plugin.py#L360

numirias commented 4 years ago

@dnltz You should declare the hook implementation optional with @pytest.hookimpl(optionalhook=True). I agree this can be confusing, so I added a note in the readme.
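
A minimal sketch of such an optional hook implementation in conftest.py might look like this, reusing the timing example from above:

import pytest

# optionalhook=True keeps pluggy from rejecting the hook as unknown
# when pytest-json-report isn't active (e.g. --json-report not passed)
@pytest.hookimpl(optionalhook=True)
def pytest_json_runtest_metadata(item, call):
    if call.when != 'call':
        return {}
    return {'start': call.start, 'stop': call.stop}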

Maybe empty hooks should be added here:

I prefer not to register hooks unconditionally. I feel that if the plugin isn't active, it also shouldn't interact with pluggy's hook registry. Also, once you uninstall the plugin, you'd run into the same problem again. So, declaring hook implementations optional seems like the most robust approach here.

dnltz commented 4 years ago

@numirias Ah yes, thanks! I just copied the function from the docs and didn't realize I needed to add that decorator... sorry for hijacking this issue, and thanks for updating the README so quickly.

Great plugin btw :)