PaulMcInnis / JobFunnel

Scrape job websites into a single spreadsheet with no duplicates.

Increase test coverage #55

Closed: markkvdb closed this issue 4 years ago

markkvdb commented 4 years ago

Increase test coverage

Description

Testing was introduced to JobFunnel a few weeks ago, but it only covers a fraction of the code base. I think raising the quality of the test suite with unit and integration tests can make it easier for reviewers to assess new pull requests and gives new contributors quicker feedback on whether their changes break existing functionality.

I came up with two ideas to provide a clear and productive environment for increasing test coverage:

  1. Adding test coverage reporting (next to the build status) could be a good way to keep track of the current state of the test suite (see the sketch after the note below).
  2. Have a checklist of all functions in the code base and whether each function is covered by tests.

Note that checking whether a function is 'covered' by tests can be a bit tricky because some functions are rather long, e.g. parse_config. True unit tests are therefore very difficult to write for these long functions, but they can be implemented easily for shorter ones. Realising that unit tests are complicated for certain functions might also be a sign that those functions should be broken into smaller parts.
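For the first idea, one option (an assumption on my part, not something already set up in the repo) would be the pytest-cov plugin. Below is a minimal sketch of running the suite with a coverage report, assuming the package is importable as jobfunnel and the tests live under tests/:

```python
# run_coverage.py -- hypothetical helper; requires pytest and the pytest-cov plugin.
import sys

import pytest

if __name__ == "__main__":
    # --cov=jobfunnel measures coverage of the jobfunnel package;
    # --cov-report=term-missing lists the line numbers that are not yet covered.
    sys.exit(pytest.main(["--cov=jobfunnel", "--cov-report=term-missing", "tests/"]))
```

The same flags can be passed to pytest directly on the command line, and a CI service such as Codecov or Coveralls could turn the resulting report into a badge next to the build status.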

Test coverage

- glassdoor.py
- indeed.py
- monster.py
- jobfunnel.py
- tools
  - delay.py
  - filters.py
  - tools.py
- config
  - parser.py
  - validate.py

Want to contribute?

Do you like this project and want to make it even better? Feel free to discuss below if you want to contribute. All help is welcome 👍.

Want to start with something (relatively) easy? The functions in validate.py, tools.py and delay.py look easy to test (at first inspection)!
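As an illustration only (the helper below is made up, not part of the JobFunnel API), a unit test for a small pure function in one of these modules tends to be just a few lines with pytest:

```python
# test_tools_example.py -- illustrative sketch; convert_radius is a hypothetical
# stand-in for the kind of small, pure helper found in tools.py.
import pytest


def convert_radius(km: float) -> float:
    """Convert a search radius from kilometres to miles."""
    if km < 0:
        raise ValueError("radius must be non-negative")
    return km * 0.621371


def test_convert_radius_happy_path():
    assert convert_radius(10) == pytest.approx(6.21371)


def test_convert_radius_rejects_negative_input():
    with pytest.raises(ValueError):
        convert_radius(-1)
```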

thebigG commented 4 years ago

Hi @markkvdb, hope you're doing well! Thanks so much for this testing template! I'm not super familiar with the pytest API, but I had a look at it this afternoon and it looks pretty straightforward, probably because I have used other testing frameworks. Anyway, I'll be working on test_validate.py for the next couple of days. Please let me know of anything I should be aware of in terms of testing best practices.

markkvdb commented 4 years ago

@thebigG it's great that you want to contribute to the testing framework. Since you mention that you have experience with (unit) testing, I think the test suite is quite self-explanatory. If there's one thing to look out for, it would be to test cases that should definitely pass, but also to think hard about edge cases that should definitely fail. In the case of the functions described in test_validate.py, it is important to remember that these functions validate config files provided by the users. Try to think of creative but realistic user inputs that should be tested, for example what happens when the user forgets to provide certain settings, or when they provide nonsense values.
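A sketch of what such edge-case tests could look like (the validator below is a stand-in for illustration, not the real validate.py code):

```python
# test_validate_example.py -- illustrative sketch only; validate_region is a
# hypothetical validator, not JobFunnel's actual implementation.
import pytest


def validate_region(region: dict) -> None:
    """Raise if the region block of the user config is missing or malformed."""
    if "province" not in region:
        raise KeyError("config is missing 'province'")
    if not isinstance(region.get("radius"), int) or region["radius"] < 0:
        raise ValueError("'radius' must be a non-negative integer")


def test_validate_region_accepts_sane_config():
    validate_region({"province": "ON", "radius": 25})


def test_validate_region_missing_setting():
    # The user forgot to provide a required setting.
    with pytest.raises(KeyError):
        validate_region({"radius": 25})


def test_validate_region_nonsense_value():
    # The user provided a nonsense value.
    with pytest.raises(ValueError):
        validate_region({"province": "ON", "radius": -5})
```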

markkvdb commented 4 years ago

Update: @thebigG added the first tests for the validate.py functions in #65! 👍

thebigG commented 4 years ago

Update: as of now I am working on test_tools.py. I am also working on putting together a conftest.py file so that all of our fixture/mocking code lives in one place, which should avoid code duplication in future tests. Please let me know any thoughts/feedback you may have on the conftest.py file; I'll be working on it in the coming days.
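For reference, a minimal sketch of what such a conftest.py could look like (the fixture contents are hypothetical; the real shared config and mocks will differ):

```python
# conftest.py -- sketch of a shared fixtures file; pytest discovers it
# automatically, so tests can request these fixtures by name without imports.
import pytest


@pytest.fixture
def config_dict():
    """A minimal configuration dictionary shared across tests."""
    return {
        "output_path": "jobs.csv",
        "providers": ["indeed", "monster", "glassdoor"],
        "search_terms": {"region": {"province": "ON", "radius": 25}},
    }


@pytest.fixture
def tmp_output_file(tmp_path):
    """A throwaway output path, built on pytest's built-in tmp_path fixture."""
    return tmp_path / "jobs.csv"
```

Because conftest.py is picked up by pytest automatically, any test in the suite can simply declare config_dict or tmp_output_file as an argument and reuse them, which is what keeps the fixture code in one place.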

PaulMcInnis commented 4 years ago

Excellent! conftest.py is the best way to do this :)

💯

We can discuss this further of course, but we should probably start a new thread.