neuroinformatics-unit / python-cookiecutter

Utility to create a basic Python project structure with tests, CI etc.
BSD 3-Clause "New" or "Revised" License

Add docs #30

Closed niksirbi closed 1 year ago

niksirbi commented 1 year ago

Adds a docs subfolder for holding the Sphinx documentation for the package. Also adds the necessary GitHub workflows for checking that the docs build (upon PRs) and for publishing them on a separate gh-pages branch (upon pushes to main). The contents of gh-pages (the built docs) then get rendered at the default GitHub Pages URL: https://&lt;github_username&gt;.github.io/&lt;package_name&gt;

niksirbi commented 1 year ago

So the linting issue is hard to resolve. The problem is that black also tries to read its config from the {{cookiecutter.package_name}}/pyproject.toml file (and not just from the higher-level pyproject.toml). That lower-level pyproject.toml contains the cookiecutter templating syntax for conditionals, which black is unable to parse, so it throws an error. I don't know why this hadn't come up before (or how it was resolved, if it had come up).
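For illustration, the kind of templated conditional that trips up black's TOML parser looks roughly like this (a hypothetical fragment, not the template's actual contents):

```toml
# {{cookiecutter.package_name}}/pyproject.toml (hypothetical fragment)
[project]
name = "{{cookiecutter.package_name}}"

# Jinja conditionals like the one below are not valid TOML, so any tool
# that parses this file *before* the template is rendered will fail:
{% if cookiecutter.create_docs == "Yes" %}
# docs-related settings would go here
{% endif %}
```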

adamltyson commented 1 year ago

I don't think this has come up before, indeed the problem doesn't seem to occur on main. To my understanding, black should only use the first configuration file it finds, which is the one without the cookiecutter syntax.

To make this more complicated, the error doesn't occur for me when running either black ./ or pre-commit run --all-files, only when actually running as a pre-commit hook.

I don't know why changing some unrelated files causes this problem. It's possible there's some kind of bug in the black/pre-commit interaction.

niksirbi commented 1 year ago

I can only confirm what you said. The problem only arises in the hook, but I can't for the life of me understand why.

niksirbi commented 1 year ago

I am still not sure what's causing this error, but I managed to fix it by explicitly passing the config file to black in .pre-commit-config.yaml:

    - repo: https://github.com/psf/black
      rev: 22.10.0
      hooks:
          - id: black
            args: [--config=pyproject.toml]

NOTE: this concerns the pre-commit hooks of this repo, and should not affect the repo produced when using this template.

Now I can finally continue working on this PR.

niksirbi commented 1 year ago

To demonstrate the API docs autogeneration facilities of sphinx, I added some example modules to the package and configured sphinx accordingly. Some wrinkles still need to be ironed out, but @JoeZiminski the current version of this branch should implement the basic API autogeneration functionality. The critical files to edit are:

The auto-generated API docs go to docs/source/api_generated.
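For reference, API autogeneration with autosummary typically needs settings along these lines in conf.py (a sketch of a common setup, not necessarily the exact configuration in this PR):

```python
# Typical Sphinx settings for auto-generating API docs (a sketch of a
# common setup; the exact configuration in this PR may differ).
extensions = [
    "sphinx.ext.autodoc",      # pull API docs from docstrings
    "sphinx.ext.autosummary",  # generate one stub page per module/class
]

# Tell autosummary to actually write the generated stub pages
# (they end up under docs/source/api_generated in this setup).
autosummary_generate = True
```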

NOTE: The building of the documentation website is now only triggered when a tag is pushed to main (i.e. when a new version is released). This behaviour can be modified by editing .github/workflows/publish_docs.yml
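That tag-triggered behaviour corresponds to a workflow trigger along these lines (a sketch of what .github/workflows/publish_docs.yml might contain; the actual file may differ):

```yaml
# Build and publish the docs only when a version tag is pushed
# (the 'v*' pattern is an assumption about the tag naming scheme).
on:
  push:
    tags:
      - 'v*'
```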

JoeZiminski commented 1 year ago

Thanks a lot for this Niko, it has been really useful. Some things I've found when using it for datashuttle (happy to make a commit with these, I just wanted to run them by you before making any changes):

1) I think we can also exclude docs from the installed package: in pyproject.toml, add docs to the excludes and also to the check-manifest ignore list:

    [tool.setuptools.packages.find]
    include = ["{{cookiecutter.module_name}}"]
    exclude = ["tests", "docs*"]

    [tool.check-manifest]
    ignore = [
        "*.yaml",
        ".bumpversion.cfg",
        "tox.ini",
        "tests/*",
        "tests/test_unit/*",
        "tests/test_integration/*",
        "docs/*",
        "docs/source/*",
        ".flake8",
    ]

3) we can add sphinx_autodoc_typehints to the extensions in conf.py and to the docs requirements

4) add exclude: 'conf.py' to the top of pre-commit config to avoid formatting checks on this file

5) in conf.py I changed html_baseurl from http to https, as SonarCloud was throwing a warning

6) for API docs, it seems importing the documented class is required during the docs build. This means all dependencies of the code are also required. For CI, this means the build can fail with a dependency error unless all the package dependencies are added to the requirements.txt file. I tried to make the CI first install the package dependencies and then run the NIU action but, if I understood correctly what was going on, the NIU action checks out into a new environment, so this didn't seem to help.
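Point 4 above would look something like this at the top of .pre-commit-config.yaml (a sketch; the top-level exclude takes a regex matched against file paths, and the black hook shown is the one already used in this repo):

```yaml
# Skip all hooks for conf.py (Sphinx config files often break
# formatting/linting rules on purpose).
exclude: 'conf.py'
repos:
  - repo: https://github.com/psf/black
    rev: 22.10.0
    hooks:
      - id: black
```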

niksirbi commented 1 year ago

Hey Joe, thanks for your feedback on this. The first 5 points seem completely reasonable to me, so feel free to commit these here. The 6th point is a bit more serious. I vaguely recall that there is a way to build the API docs without including your own modules as dependencies. Let me have a look at that one.

niksirbi commented 1 year ago

> I am still not sure what's causing this error, but I managed to fix it by explicitly passing the config file to black in .pre-commit-config.yaml

So this problem came back, and my quick fix above did not solve anything. I did some more digging, and here is what I discovered:

So far so good, now the interesting part:

I managed to solve the problem locally by pre-commit uninstall, followed by a fresh pre-commit install. Now git commit correctly runs only the "outer" hooks. I still don't know where I messed up. And I am still unsure whether the problem will reappear on GitHub (I will push a commit to find out).

I don't know if any of this rambling makes sense to you @adamltyson.

Summary

Having two .pre-commit-config.yaml files is a pretty unique situation for this project and causes some weird behaviours.

Update

After pushing some commits, the problem appears to be solved 😃

niksirbi commented 1 year ago

> 6. for API docs, it seems importing the documented class is required during the docs build. This means all dependencies of the code are also required. For CI, this means the build can fail with a dependency error unless all the package dependencies are added to the requirements.txt file. I tried to make the CI first install the package dependencies and then run the NIU action but, if I understood correctly what was going on, the NIU action checks out into a new environment, so this didn't seem to help.

@JoeZiminski this problem did not appear in my testing, because the mock functions I wrote are pure Python, without any dependencies. The problem you are referring to is also described in this stack overflow thread.

A possible solution would be to mock the imports during the build. Sphinx appears to have that ability. As I understand it, you have to add the following to conf.py:

    autodoc_mock_imports = ["numpy", "scipy"]

Obviously, replace numpy and scipy with the actual dependencies that are imported by your module. The module itself (i.e. datashuttle) doesn't need to be mocked, because its path is added to sys.path at the top of the conf.py file.
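That path setup typically looks like the sketch below (the relative path is an assumption; it depends on where conf.py lives, e.g. docs/source/ relative to the repository root):

```python
# Sketch of the path setup commonly found at the top of a Sphinx conf.py.
import os
import sys

# Make the package importable without installing it,
# so autodoc can find the modules it documents.
sys.path.insert(0, os.path.abspath("../.."))
```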

Let me know if that works for datashuttle. If yes, then it's only a matter of documentation. We could also include numpy as a dependency of the created package (which would also solve this issue) and write an additional mock function that imports numpy, to demonstrate these functionalities to the user.
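Such an extra mock function could look like the sketch below (the module docstring and function name are hypothetical, and it assumes numpy would be added as a dependency):

```python
"""A hypothetical extra example module for the template.

Assumes numpy would be included as a dependency of the generated
package; the function name is made up for illustration.
"""
import numpy as np


def array_mean(values):
    """Return the arithmetic mean of a sequence of numbers.

    A docs build without numpy installed would fail on the import
    above, unless "numpy" is listed in autodoc_mock_imports.
    """
    return float(np.mean(np.asarray(values)))
```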

JoeZiminski commented 1 year ago

I added an option to choose whether to use docs in the repo when running the cookiecutter; if the answer is no, this will delete:

- workflows/check_docs.yml
- workflows/publish_docs.yml
- conf.py from the pre-commit exclude
- docs/
- module_name/greetings.py
- module_name/math.py

Did I miss anything? I also saw the listed versions were changed in test_and_deploy.yml; I didn't think this change was related to docs, but I can revert to the old versions if that is better when there are no docs.

JoeZiminski commented 1 year ago

Started adding tests, will finish Monday @JoeZiminski (I wonder if this will work)

niksirbi commented 1 year ago

> I added an option to not use docs in the repo when installing the cookiecutter, if no this will delete: workflows/check_docs.yml, workflows/publish_docs.yml, conf.py from the pre-commit exclude, docs/, module_name/greetings.py, module_name/math.py

Great! I wonder whether we should keep the example modules, regardless of docs, just to demonstrate how it's done. Users can easily replace these with their own modules.

adamltyson commented 1 year ago

> Great! I wonder whether we should keep the example modules, regardless of docs, just to demonstrate how it's done. Users can easily replace these with their own modules.

Although this would help total beginners, it would be a bit annoying to have to delete these modules every time we wanted to use the cookiecutter. I would vote for more docs instead (it might be worth making a proper docs site with a tutorial, rather than just the README).

niksirbi commented 1 year ago

I checked the changes you've made @JoeZiminski , looking good so far. I also made some additional changes of my own:

After you are done with the tests and the linting issues are resolved, I will expand the README to "document the documentation".

JoeZiminski commented 1 year ago

Thanks Niko, I've updated the tests now. These create two cookiecutter projects, one with docs and one without, and check that the expected directories / settings are present. cc @dstansby, as I made some adjustments to the structure of the fixtures to add the new test, in case this can be optimised. Otherwise I think we're nearly there!

niksirbi commented 1 year ago

I'm holding off on merging this until I address this actions issue. We cannot rely on the currently used check_sphinx_docs action.

adamltyson commented 1 year ago

Now https://github.com/neuroinformatics-unit/actions/issues/8 is closed, can this PR be merged?

niksirbi commented 1 year ago

This is good to go as far as I am concerned. The last thing I did was add some information about the docs (and how to use them) to the README. @adamltyson, re-requesting review from you (hopefully for the last time in this PR).