So the linting issue is hard to resolve. The problem is that `black` also tries to read its config from the `{{cookiecutter.package_name}}/pyproject.toml` file (and not just from the higher-level `pyproject.toml`). That lower-level `pyproject.toml` contains the weird cookiecutter syntax for conditionals, which `black` is unable to parse, so it throws an error.
I don't know why this hadn't come up before (or how it was resolved, if it had come up).
I don't think this has come up before; indeed, the problem doesn't seem to occur on `main`. To my understanding, `black` should only use the first configuration file it finds, which is the one without the cookiecutter syntax.
To make this more complicated, the error doesn't occur for me when running either `black ./` or `pre-commit run --all-files`, only when `black` actually runs as a pre-commit hook. I don't know why changing some unrelated files causes this problem. It's possible there's some kind of bug in the `black`/`pre-commit` interaction.
I can only confirm what you said. The problem only arises in the hook, but I can't for the life of me understand why.
I am still not sure what's causing this error, but I managed to fix it by explicitly passing the config file to `black` in `.pre-commit-config.yaml`:

```yaml
- repo: https://github.com/psf/black
  rev: 22.10.0
  hooks:
    - id: black
      args: [--config=pyproject.toml]
```
NOTE: this concerns the pre-commit hooks of this repo, and should not affect the repo produced when using this template.
Now I can finally continue working on this PR.
To demonstrate the API docs autogeneration facilities of `sphinx`, I added some example modules to the package and configured `sphinx` accordingly. Some wrinkles still need to be ironed out, but @JoeZiminski the current version of this branch should implement the basic API autogeneration functionality.

The critical files to edit are:

- `docs/source/index.rst`
- `docs/source/api_index.rst`

The auto-generated API docs go to `docs/source/api_generated`.
NOTE: The building of the documentation website is now only triggered when a tag is pushed to `main` (i.e. when a new version is released). This behaviour can be modified by editing `.github/workflows/publish_docs.yml`.
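For context, a minimal sketch of the `conf.py` settings that drive this autogeneration, assuming the standard `autodoc`/`autosummary` setup (the exact extension list in this branch may differ):

```python
# docs/source/conf.py (sketch)
extensions = [
    "sphinx.ext.autodoc",      # pull in docstrings from the modules
    "sphinx.ext.autosummary",  # generate per-module stub pages
]

# Build the stub pages for everything listed in api_index.rst;
# they land in the directory named by the :toctree: option there
# (docs/source/api_generated in this case).
autosummary_generate = True
```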
Thanks a lot for this Niko, it has been really useful. Some things I've found when using it for datashuttle (happy to make a commit with these, just wanted to run them by you before making any changes):
1) I think we can also exclude docs from the installed package: in `pyproject.toml`, add `docs` to the setuptools `exclude` list, and also add it to check-manifest:

```toml
[tool.setuptools.packages.find]
include = ["{{cookiecutter.module_name}}"]
exclude = ["tests", "docs*"]

[tool.check-manifest]
ignore = [
  "*.yaml",
  ".bumpversion.cfg",
  "tox.ini",
  "tests/*",
  "tests/test_unit/*",
  "tests/test_integration/*",
  "docs/*",
  "docs/source/*",
  ".flake8",
]
```
3) can add `sphinx_autodoc_typehints` to `extensions` in `conf.py` and to the requirements
4) add `exclude: 'conf.py'` to the top of the pre-commit config to avoid formatting checks on this file
5) in `conf.py`, I changed `html_baseurl` from http to https, as SonarCloud was throwing a warning
6) for API docs, it seems importing the documented class is required during the doc build. This means all dependencies of the code are also required, so CI can fail with a dependency error unless all the package dependencies are added to the `requirements.txt` file. I tried to make the CI first install the package dependencies and then run the NIU action, but, if I understood correctly what was going on, the NIU action checks out into a new environment, so this didn't seem to help.
Hey Joe, thanks for your feedback on this. The first 5 points seem completely reasonable to me, so feel free to commit these here. The 6th point is a bit more serious. I vaguely recall that there is a way to build the API docs without having to install your module's dependencies. Let me have a look at that one.
> I am still not sure what's causing this error, but I managed to fix it by explicitly passing the config file to `black` in `.pre-commit-config.yaml`
So this problem came back and my quick fix above did not solve anything. Did some more digging, and here is what I discovered:

- When I ran `pre-commit run --all` in the root directory of this repo, it passed without errors, because it read its configuration from the "outer" `.pre-commit-config.yaml` file in the root.
- When I ran it in the `{{cookiecutter.package_name}}` folder, it failed, because it read the `.pre-commit-config.yaml` from that folder. Multiple hooks failed then: e.g. `check toml` correctly identified `pyproject.toml` as invalid, while `mypy` justifiably complained that `{{cookiecutter.package_name}}` is not a valid package name.

So far so good, now the interesting part:
Upon `git commit`, I expected only the "outer" pre-commit hooks to run, but this was not the case. The hooks that ran included `check toml`, which is only specified by the "inner" `.pre-commit-config.yaml` file. This was causing those failures, since the "inner" `black` tried to read the "inner" `pyproject.toml`.

I managed to solve the problem locally with `pre-commit uninstall`, followed by a fresh `pre-commit install`. Now `git commit` correctly runs only the "outer" hooks. I still don't know where I messed up, and I am still unsure whether the problem will reappear on GitHub (I will push a commit to find out).
I don't know if any of this rambling makes sense to you, @adamltyson.
Having two `.pre-commit-config.yaml` files is a pretty unique situation for this project and causes some weird behaviours.
After pushing some commits, the problem appears to be solved 😃
> 6. for API docs, it seems importing the documented class is required during the doc build. This means all dependencies of the code are also required, so CI can fail with a dependency error unless all the package dependencies are added to the `requirements.txt` file. I tried to make the CI first install the package dependencies and then run the NIU action, but, if I understood correctly what was going on, the NIU action checks out into a new environment, so this didn't seem to help.
@JoeZiminski this problem did not appear in my testing, because the mock functions I wrote are pure Python, without any dependencies. The problem you are referring to is also described in this Stack Overflow thread.
A possible solution would be to mock the imports during the build. `sphinx` appears to have that ability. As I understand it, you have to add the following to `conf.py`:

```python
autodoc_mock_imports = ["numpy", "scipy"]
```

Obviously, replace `numpy` and `scipy` with the actual dependencies that are imported by your module. The module itself (i.e. `datashuttle`) doesn't need to be mocked, because its path is added to `sys.path` at the top of the `conf.py` file.
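For reference, the relevant top-of-file lines would look roughly like this, assuming the common layout where `conf.py` sits two levels below the repository root:

```python
# Top of docs/source/conf.py (sketch)
import os
import sys

# The package itself is found via sys.path, so it never needs to be
# pip-installed in the docs CI environment.
sys.path.insert(0, os.path.abspath("../.."))

# Third-party imports are replaced with mock objects at build time,
# so these dependencies don't have to be installed either.
autodoc_mock_imports = ["numpy", "scipy"]
```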
Let me know if that works for `datashuttle`. If yes, then it's only a matter of documentation. We could also include `numpy` as a dependency of the created package (which would also solve this issue) and write an additional mock function that imports `numpy`, to demonstrate these functionalities to the user.
I added an option to choose whether to use docs in the repo when installing the cookiecutter; if "no", this will delete:

- `workflows/check_docs.yml`
- `workflows/publish_docs.yml`
- `conf.py` from the pre-commit exclude
- `docs/`
- `module_name/greetings.py`
- `module_name/math.py`

Did I miss anything? I also saw the listed versions were changed in `test_and_deploy.yml`; I didn't think this change was related to docs, but I can add back the old version if that is better when there are no docs.
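For reference, this kind of option is typically wired up with a cookiecutter post-generation hook along these lines; the option name `create_docs` and the hook details are assumptions, not necessarily what this commit uses:

```python
# hooks/post_gen_project.py (sketch; "create_docs" is a hypothetical
# cookiecutter.json variable)
import shutil
from pathlib import Path

# Paths are relative to the freshly generated project.
DOCS_PATHS = [
    ".github/workflows/check_docs.yml",
    ".github/workflows/publish_docs.yml",
    "docs",
    "{{cookiecutter.module_name}}/greetings.py",
    "{{cookiecutter.module_name}}/math.py",
]

if "{{cookiecutter.create_docs}}".lower() == "no":
    for relpath in DOCS_PATHS:
        path = Path(relpath)
        if path.is_dir():
            shutil.rmtree(path)
        elif path.exists():
            path.unlink()
```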
Started adding tests, will finish Monday @JoeZiminski (I wonder if this will work)
> I added an option to choose whether to use docs in the repo when installing the cookiecutter; if "no", this will delete: `workflows/check_docs.yml`, `workflows/publish_docs.yml`, `conf.py` from the pre-commit exclude, `docs/`, `module_name/greetings.py`, `module_name/math.py`
Great! I wonder whether we should keep the example modules, regardless of docs, just to demonstrate how it's done. Users can easily replace these with their own modules.
> Great! I wonder whether we should keep the example modules, regardless of docs, just to demonstrate how it's done. Users can easily replace these with their own modules.
Although this would help total beginners, it would be a bit annoying to have to delete these modules every time we wanted to use the cookiecutter. I would vote for more docs instead (it might be worth making a proper docs site with a tutorial, rather than just the README).
I checked the changes you've made @JoeZiminski, looking good so far. I also made some additional changes of my own, concerning the `napoleon` extension. Previously we had both `napoleon` and `numpydoc` installed, which do similar things, causing some confusing issues in the API docs. I removed `numpydoc` and moved `napoleon` to be the first extension that is loaded.

After you are done with the tests and resolve the linting issues, I will expand the README to "document the documentation".
Thanks Niko, I've updated the tests now. These create two cookiecutter projects, one with docs and one without, and check that the expected directories / settings are present. cc @dstansby, as I made some adjustments to the structure of the fixtures to add the new test, in case these can be optimised. Otherwise I think we're nearly there!
I'm holding off on merging this until I address this actions issue. We cannot rely on the currently used `check_sphinx_docs` action.
Now that https://github.com/neuroinformatics-unit/actions/issues/8 is closed, can this PR be merged?
This is good to go as far as I am concerned. The last thing I did was add some information about the docs (and how to use them) to the README. @adamltyson, re-requesting review from you (hopefully for the last time in this PR).
Adds a `docs` subfolder for holding sphinx documentation for the package. Also adds the necessary GitHub workflows for checking that the docs build (upon PR) and for publishing them on a separate `gh-pages` branch (upon pushes to main). The content of `gh-pages` (the built docs) then gets rendered on the default GitHub pages URL: `http(s)://<github_account>.github.io/{{cookiecutter.package_name}}`.