Documentation: https://ahida-development.github.io/ols-py
Source Code: https://github.com/ahida-development/ols-py
PyPI: https://pypi.org/project/ols-py/
Python client for the Ontology Lookup Service
Current status:
Features:
from ols_py.client import Ols4Client
client = Ols4Client()
resp = client.search("MC1R", params={"ontology": "go"})
term = resp.response.docs[0]
print(term)
# SearchResultItem(
# id=None, annotations=None, annotations_trimmed=None,
# description=['A rhodopsin-like G-protein ... gamma-melanocyte-stimulating hormone.'],
# iri='http://purl.obolibrary.org/obo/PR_000001146', label='melanocortin receptor',
# obo_id='PR:000001146', ontology_name='go', ontology_prefix='GO', subset=None,
# short_form='PR_000001146', synonyms=None, type='class'
#)
print(term.iri)
# http://purl.obolibrary.org/obo/PR_000001146
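The response docs can also be iterated directly. For example, to print just the OBO ID and label of every hit from the same search response (a minimal sketch that reuses the resp object and the SearchResultItem fields shown above):

# Print the OBO ID and label for every search hit in the response above.
for doc in resp.response.docs:
    print(doc.obo_id, doc.label)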
Install from PyPI:
pip install ols-py
For development, install the dependencies and activate the virtual environment with Poetry, then run the test suite:
poetry install
poetry shell
pytest
The documentation is automatically generated from the content of the docs directory and from the docstrings of the public signatures of the source code. The documentation is updated and published as a GitHub project page automatically as part of each release.
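If you want to preview the documentation locally, the wolt-python-package-cookiecutter template this project is generated from uses mkdocs, so something along these lines should work (this is an assumption about the docs tooling, not a documented command for this repository):

# Build and serve the docs locally; assumes mkdocs is configured as in the cookiecutter template.
poetry run mkdocs serve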
To make a release, trigger the Draft release workflow (press Run workflow). This updates the changelog and version and creates a GitHub release in Draft state.
Find the draft release under GitHub releases and publish it. Publishing the release triggers the release workflow, which creates the PyPI release and deploys the updated documentation.
Pre-commit hooks run all the auto-formatters (e.g. black, isort), linters (e.g. mypy, flake8), and other quality checks to make sure the changeset is in good shape before a commit/push happens.
You can install the hooks so that they run for each commit:
pre-commit install
Or if you want them to run only for each push:
pre-commit install -t pre-push
Or if you want to, e.g., run all checks manually for all files:
pre-commit run --all-files
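pre-commit can also run a single hook by its id, which is handy when iterating on one tool. The hook ids depend on this project's .pre-commit-config.yaml, so "black" below is an assumption used for illustration:

# Run only one hook across all files; replace "black" with any hook id defined in .pre-commit-config.yaml.
pre-commit run black --all-files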
This project was generated using the wolt-python-package-cookiecutter template.