lukasturcani closed this pull request 3 months ago.
Attention: Patch coverage is 70.48346% with 116 lines in your changes missing coverage. Please review. Project coverage is 90.36%. Comparing base (e605638) to head (ecaac06). Report is 4 commits behind head on main.
This all looks great to me. Converting to draft until @jezsadler merges some ruff and mypy fixes in.
This PR does a couple of things to clean up the boilerplate related to packaging OMLT; see the sections below for detailed explanations of the changes.
- Removed `setup.cfg`, `setup.py`, `docs/requirements.txt` and `tox.ini` in favour of `pyproject.toml`.
- Moved the `conda` requirements into `environment.yml`.
- Updated the GitHub workflows: `tests.yml` and `publish_release.yml` (the latter is new, see below).
- Added quality checks with `ruff`, `mypy` and `doctest`.
- Added `just` for developer experience.
- Updated the `Development` section of the `README` to talk about `just`.
- Cleaned up `docs/conf.py`.
- Moved `pull_request_template.md`.
Other comments

- I suggest merging this PR with a squash merge, to keep the `main` branch nice and clean.

Using `pyproject.toml`
`pyproject.toml` is the simplest way to provide package metadata for a Python package. It is easy to read and also provides sections for configuring tools such as `pytest`, `ruff` and `mypy` all in one place. It works seamlessly with the modern Python ecosystem.

I set up `pyproject.toml` to automatically detect the version of the code from git tags. There is no need to duplicate version numbers across the repo: just add a new tag and everything will be updated. In addition, when a new git tag is pushed to the GitHub repo, the new `publish_release` workflow will be triggered and a new PyPI version released. (See more on this below.)

I also set it up so that the version is automatically written to a file called `src/omlt/_version.py`, which holds the `__version__` variable. This file is autogenerated and therefore added to `.gitignore`. The `__version__` variable is then re-exported in `src/omlt/__init__.py` so that our users have access to it.
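For reference, a minimal sketch of what this git-tag-driven versioning can look like in `pyproject.toml`, assuming a `setuptools-scm`-style setup (the exact build backend and option names used in this PR may differ):

```toml
[build-system]
requires = ["setuptools>=64", "setuptools-scm>=8"]
build-backend = "setuptools.build_meta"

[project]
name = "omlt"
dynamic = ["version"]  # the version is derived from git tags, not hard-coded

[tool.setuptools_scm]
# write the resolved version into an autogenerated, gitignored module
version_file = "src/omlt/_version.py"
```

With a setup like this, publishing a new version is just a matter of pushing a new git tag; both the wheel metadata and `omlt.__version__` pick it up automatically.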
I tried to preserve all the information stored in `setup.cfg` and the other deleted files -- let me know if there is something I missed!

Optional dependencies
The `pyproject.toml` file allows the creation of optional dependencies. For example, our users can install OMLT together with only the extras they need, and any combination of optional dependencies is valid too. This allows our users to install the dependencies specific to their use case. Note that:

- `onnx` and `onnxruntime` are now required dependencies because, from my understanding, they are almost always used.
- There is a `dev` extra which developers can use to install all developer tools and all dependencies -- you need this to run all the tests, for example.
- There is a `dev-gpu` extra which installs the GPU version of tensorflow in case the developer has a GPU.

The available optional dependencies are:
- `linear-tree`, installs the linear tree dependency.
- `keras`, installs tensorflow and keras.
- `keras-gpu`, installs tensorflow for the gpu and keras.
- `torch`, installs torch and torch geometric.
- `dev-tools` - this is not to be used directly, but allows easy re-use of dev tools in other optional dependencies, namely `dev` and `dev-gpu`.
- `docs` - installs dependencies required to compile the docs.
- `dev` - dependencies needed for developing the project, such as tooling.
- `dev-gpu` - same as `dev` but installed with gpu support.

Our documentation probably needs to be updated to tell users they want to install omlt with some combination of the `linear-tree`, `keras`, `keras-gpu` and `torch` optional dependencies, depending on what features of the package they are using.
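Purely as an illustration (the extras in this PR's actual `pyproject.toml` may be grouped and pinned differently, and `keras-gpu`/`dev-gpu` are omitted here for brevity), the shape of the configuration is roughly:

```toml
[project]
# fragment of the [project] table: onnx and onnxruntime are required,
# everything else lives in an extra
dependencies = ["onnx", "onnxruntime", "numpy", "pyomo"]

[project.optional-dependencies]
linear-tree = ["linear-tree"]
keras = ["tensorflow", "keras"]
torch = ["torch", "torch-geometric"]
docs = ["sphinx"]
dev-tools = ["ruff", "mypy", "pytest"]
# an extra may reference other extras of the same package,
# so "dev" can pull in every tool and dependency a developer needs
dev = ["omlt[dev-tools,docs,linear-tree,keras,torch]"]
```

Users then pick what they need, e.g. `pip install "omlt[keras,linear-tree]"`, while `pip install -e ".[dev]"` gives a developer everything required to run the full test suite.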
Quality checks with `ruff`, `mypy` and `doctest`
I've enabled `ruff`, `mypy` and `doctest`. Currently there are no doctests, but it's good to have it set up so that it runs in case any are added in the future.

Both `ruff` and `mypy` are failing because there are a number of things which need to be fixed. For both `ruff` and `mypy` I have disabled some checks which it would be good to enable eventually, but which are probably a fair amount of work to fix -- these have comments in `pyproject.toml`. The remaining failing checks are ones which I would recommend fixing ASAP. There are two approaches: merge now and fix these errors later, or keep a separate branch where these are incrementally fixed. Up to you to decide what you prefer.

I told ruff to check for `google` style docstrings. I think these are the best because they have good readability and work the best with type hints, in my opinion.
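For concreteness, the relevant `pyproject.toml` sections look roughly like this (a sketch only: the disabled rule code below is a placeholder, and the real file lists the actual ones with explanatory comments):

```toml
[tool.ruff.lint]
select = ["ALL"]
ignore = [
    # checks worth enabling eventually, but a fair amount of work to fix
    "ANN401",  # e.g. disallowing typing.Any in annotations
]

[tool.ruff.lint.pydocstyle]
convention = "google"  # enforce google-style docstrings

[tool.mypy]
strict = true

[tool.pytest.ini_options]
addopts = "--doctest-modules"  # one way to run doctests with the test suite
```

An approach like `select = ["ALL"]` plus an explicit `ignore` list keeps the temporarily disabled checks visible in one place.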
Using `just` instead of `tox`
https://github.com/casey/just is a simple command runner. It allows developers to define and re-use common operations. For example, I can define a `check` recipe and then run `just check` in my command line and it will run all the tests.

The beauty of this is that `just` is extremely simple. If you read the file, it is basically a sequence of bash instructions for each recipe. This makes the recipes really transparent and easy to understand, and they work as code-as-documentation. Users can just read a recipe and run the commands one by one to get the same effect without having `just` installed. There is no magic, which helps with debugging issues. It's also language agnostic. `just` comes as a small stand-alone binary, which makes it a very non-intrusive tool to have on your computer that does not need any dependencies.

The downside is that it does not provide automatic management for Python environments, which I believe tox does provide. The other side of this is that we allow developers to use their favorite tools for managing venvs, rather than prescribing certain tools for this repo (the difference with `just` being that it is an essentially optional tool that also serves as documentation). I may be overly opinionated on this one, so feel free to push back.
Cleaning up `docs/conf.py`
I removed a bunch of the commented out code. This makes it easier to see what the configuration is and also prevents the commented out options from becoming out of date when a new release of sphinx is made.
Moving `pull_request_template.md`

I moved this into the `.github` folder because it is GitHub configuration. Very optional, but makes more sense to me.

readthedocs automated action

This guide shows how to set it up: https://docs.readthedocs.io/en/stable/guides/pull-requests.html. It requires admin permissions on readthedocs -- I can jump on a call to help with this.
publishing to PyPI with a git tag

For this, an API key for PyPI needs to be created and added to the repo's secrets -- I can jump on a call to help with this.
consider an `_internal` package structure

One way to make it easier to manage private vs public code in a repository is to create an `_internal` folder where all the code goes. This way all code can be shared easily and moved between modules, and it is private by default, so changes to internal code do not break users. Public modules then just re-export code from the `_internal` submodules. You can see an example of this structure here: https://github.com/lukasturcani/stk. Not a huge issue, but I find it very helpful for managing what is actually exposed to users as the code-base grows.

Legal Acknowledgement: By contributing to this software project, I agree my contributions are submitted under the BSD license. I represent I am authorized to make the contributions and grant the license. If my employer has rights to intellectual property that includes these contributions, I represent that I have received permission to make contributions and grant the required license on behalf of that employer.