
aind-codeocean-pipeline-monitor


Package for starting a pipeline, waiting for it to finish, and optionally capturing the results as a data asset.

Usage

import os

from codeocean import CodeOcean
from codeocean.capsule import Capsules
from codeocean.computation import (
    Computations,
    DataAssetsRunParam,
    RunParams,
)
from codeocean.data_asset import DataAssets
from requests.adapters import HTTPAdapter
from urllib3.util import Retry

from aind_codeocean_pipeline_monitor.job import PipelineMonitorJob
from aind_codeocean_pipeline_monitor.models import (
    CaptureSettings,
    PipelineMonitorSettings,
)

domain = os.getenv("CODEOCEAN_DOMAIN")
token = os.getenv("CODEOCEAN_TOKEN")
client = CodeOcean(domain=domain, token=token)
# We recommend adding a retry strategy to the requests session
retry = Retry(
    total=5,
    backoff_jitter=0.5,
    backoff_factor=1,
    status_forcelist=[429, 500, 502, 503, 504],
)
adapter = HTTPAdapter(max_retries=retry)
client.session.mount(domain, adapter)
client.capsules = Capsules(client.session)
client.computations = Computations(client.session)
client.data_assets = DataAssets(client.session)

# Please consult the Code Ocean docs for info about RunParams and DataAssetsRunParam
settings = PipelineMonitorSettings(
    run_params=RunParams(
        capsule_id="<your capsule id>",
        data_assets=[
            DataAssetsRunParam(
                id="<your input data asset id>",
                mount="<your input data mount>",
            )
        ],
    ),
    capture_settings=CaptureSettings(
        tags=["derived"]
    ),  # 'tags' is the only required field
)

job = PipelineMonitorJob(job_settings=settings, client=client)
job.run_job()
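
If you construct clients in more than one place, the retry-enabled setup above can be wrapped in a small helper. The sketch below only reuses the calls shown in the example; build_retrying_client is a hypothetical name, not part of this package.

import os

from codeocean import CodeOcean
from codeocean.capsule import Capsules
from codeocean.computation import Computations
from codeocean.data_asset import DataAssets
from requests.adapters import HTTPAdapter
from urllib3.util import Retry


def build_retrying_client(domain: str, token: str) -> CodeOcean:
    """Return a CodeOcean client whose session retries transient failures."""
    client = CodeOcean(domain=domain, token=token)
    retry = Retry(
        total=5,
        backoff_jitter=0.5,
        backoff_factor=1,
        status_forcelist=[429, 500, 502, 503, 504],
    )
    client.session.mount(domain, HTTPAdapter(max_retries=retry))
    # Recreate the sub-clients against the retry-enabled session,
    # mirroring the setup in the example above
    client.capsules = Capsules(client.session)
    client.computations = Computations(client.session)
    client.data_assets = DataAssets(client.session)
    return client


client = build_retrying_client(
    domain=os.getenv("CODEOCEAN_DOMAIN"), token=os.getenv("CODEOCEAN_TOKEN")
)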

Installation

To use the software, run the following from the root directory

pip install -e .

To develop the code, run

pip install -e .[dev]
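
On shells that expand square brackets as glob patterns (zsh, for example), quote the extras specifier

pip install -e ".[dev]"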

Contributing

Linters and testing

Several libraries are used to run linters, check docstring coverage, and run tests.

coverage run -m unittest discover && coverage report
interrogate .

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use Angular style for commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of the Angular commit types, such as build, ci, docs, feat, fix, perf, refactor, or test.
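
For example, a hypothetical commit that adds a new option to the capture settings model might be described as

feat(models): add an optional results mount to CaptureSettings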

Semantic Release

The table below, from semantic release, shows which commit message gets you which release type when semantic-release runs (using the default configuration):

| Commit message | Release type |
| --- | --- |
| fix(pencil): stop graphite breaking when too much pressure applied | Patch / Fix Release (default release) |
| feat(pencil): add 'graphiteWidth' option | Minor / Feature Release |
| perf(pencil): remove graphiteWidth option, with footer BREAKING CHANGE: The graphiteWidth option has been removed. The default graphite width of 10mm is always used for performance reasons. | Major / Breaking Release |

(Note that the BREAKING CHANGE: token must be in the footer of the commit.)

Documentation

To generate the rst source files for documentation, run

sphinx-apidoc -o docs/source/ src

Then to create the documentation HTML files, run

sphinx-build -b html docs/source/ docs/build/html

More info on Sphinx installation can be found in the Sphinx documentation.