theY4Kman / pycharm-pytest-imp

PyCharm pytest improvements plugin
https://plugins.jetbrains.com/plugin/14202-pytest-imp/

PyCharm 2021.2 and pytest-imp 0.5.1 causing Fatal Error on PyCharm startup #8

Closed thetreythomas closed 3 years ago

thetreythomas commented 3 years ago

Just updated PyCharm from 2021.1.3 to 2021.2. After the update, I had to update several plugins, including pytest-imp. After reloading the IDE, I get two fatal errors caused by pytest-imp.

I didn't catch what version of pytest-imp I was on before the update, but it was definitely earlier than 0.5.0.

[image] https://user-images.githubusercontent.com/15376624/128230122-2e4f5e5a-b399-44ea-9739-2b6c21ab82a6.png

[image] https://user-images.githubusercontent.com/15376624/128230323-b8f71562-f3ae-4e7c-8c8c-96a114642343.png

[image] https://user-images.githubusercontent.com/15376624/128230367-8ddd7bb2-327f-49d7-9390-1266c3525dbb.png

theY4Kman commented 3 years ago

Ouch, that's not great. I just changed the INI parser to a Python ConfigParser-compatible one. I reckon its behaviour is different wrt missing keys.
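
For illustration, Python's own ConfigParser raises when a key is missing unless you explicitly pass a fallback; I suspect the parser I swapped in behaves the same way and the plugin wasn't accounting for it. A rough sketch of that behaviour (not the plugin's actual code):

import configparser

cfg = configparser.ConfigParser()
cfg.read_string("[pytest]\npython_files = test_*.py\n")

try:
    # Missing option -> configparser.NoOptionError
    cfg.get("pytest", "python_functions")
except configparser.NoOptionError:
    print("python_functions is missing -> NoOptionError")

# With an explicit fallback, no exception is raised
print(cfg.get("pytest", "python_functions", fallback="test_*"))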

As a workaround, before I fix this (and finally add config tests), you could probably add python_functions and python_classes entries to your pytest.ini; or, if you'd rather not, the 0.5.0 version can be downloaded from the releases page and manually installed.
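
Something like this in pytest.ini ought to do it (the values below are just pytest's defaults; swap in whatever naming patterns you actually use):

[pytest]
python_classes = Test*
python_functions = test_*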

Sorry for the trouble!

thetreythomas commented 3 years ago

Did some more digging: it doesn't like one specific file. It's a fixture with an inner function, so I can pass arguments from my test into the fixture.

I was in the middle of making some changes to this file when I did the PyCharm update, and rule_name in the inner function hadn't been defined yet in the code. That was causing the initial error.

Then when I tried setting x = rule_name, it threw another error.

#######################################
#   Fixture File for json Comparison
#######################################
"""compare_json.py contains the fixture to perform the expected vs actual comparison
"""

import pytest
import json
from jsoncomparison import Compare, NO_DIFF
from loguru import logger
from pathlib import Path

@pytest.fixture()
def compare_response(host_info, request):
    """

    Args:
        host_info:
        request:

    Returns:

    """
    logger.info("Starting the expected vs actual comparison")

    # Default to an empty rule set so the lookup further down doesn't hit an
    # unbound `rules` when host_info has no rules loaded
    rules = {}
    if not host_info.rules:
        logger.warning('There are no rules loaded for this host to use when comparing.')
    else:
        logger.info('Rules for compare found in host_info')
        rules = host_info.rules

    def _expected_vs_actual(response, expected, rule_name):
        """

        Args:
            response (Request object): The Requests object from the API call
            expected (str): Name of the JSON file to be used as expected value in comparison
            rule_name (str): Name of the rule to use for the JSON comparison

        Returns:
            bool: Returns either True or False for the diff check

        """

        # This is to configure JSONcomparison to print the comparison result to the console
        # https://github.com/rugleb/JsonCompare
        config = {
            "output": {
                "console": True,
                "file": {}
            }
        }

        x = rule_name

        logger.info(f'The response being used to compare is a status code: {response.status_code}')
        actual = response.json()
        logger.info(f'Actual is loaded, and is of type {type(actual)}')

        logger.info(f'Will be using the JSON file {expected} to compare against, if found in /json folder')

        logger.info(f'Current working directory is {Path.cwd()}')
        logger.info(f'pytest node path is {request.node.fspath}')
        full_path_to_json_dir = Path(request.node.fspath).parent
        logger.info(f'Current JSON directory is {full_path_to_json_dir}')

        json_file = f'{expected}.json'
        logger.info(f'Attempting to use expected file {json_file}')

        _expected = full_path_to_json_dir / 'json' / json_file
        logger.info(f'File to be used located at {_expected}')

        logger.info(f'Does the file exist? : {_expected.exists()}')

        if not _expected.exists():
            logger.error(f'Expected file at location cannot be found --> {_expected}')
            raise Exception(f'Expected file at location cannot be found --> {_expected}')
        else:
            with _expected.open(mode='r') as file:
                exp = json.load(file)
            logger.info(f'Expected file loaded successfully. File is of type {type(exp)}')

        name_of_rule = f'rules_{expected}'
        logger.info(f'Searching if rule {name_of_rule} exists for this API')

        # Get the specific rule from the host_info.rules dict
        rule = None
        if name_of_rule in rules:
            rule = rules[name_of_rule]
            logger.info(f'Using rule --> {rule}')
        else:
            logger.warning(f'Rule {name_of_rule} not found in rules.')

        # Run the compare, checking if a rule file needs to be used
        if rule:
            logger.info(f'Running Compare() with rule file.')
            difference = Compare(config, rules=rule).check(exp, actual)
        else:
            logger.info(f'Running Compare() with NO rule file.')
            difference = Compare(config).check(exp, actual)

        # NO_DIFF means expected and actual matched
        return difference == NO_DIFF

    return _expected_vs_actual
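
For reference, a test calls this fixture along these lines (the test name and the api_response fixture below are just placeholders, not my real code):

def test_compare_example(compare_response, api_response):
    # api_response would be a requests.Response from the API under test
    assert compare_response(api_response, expected='example', rule_name='rules_example')
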
theY4Kman commented 3 years ago

Ah, man, I left you hanging for so long. I went to update the plugin for the 2021.3 EAP, and discovered my uncommitted changes to fix this. Sorry to leave you high and dry like that.

For some good news: I've published the fix in the 0.5.2 release. It's awaiting JetBrains' approval for release in the plugins repository, but in the meantime you can snag the .zip from the GH release and install the plugin manually.