PipelineAI / pipeline

PipelineAI
https://generativeaionaws.com
Apache License 2.0

Implement Canary Analysis of newly-deployed ML models to run alongside existing models - allow rollback if possible #59

Closed: cfregly closed this issue 7 years ago

cfregly commented 8 years ago

Provide metrics on both system performance and prediction performance to allow multiple levels of canary analysis.

For ML model canary analysis, we'll want to compare:

(userId, itemId, currentPrediction)  <-->  (userId, itemId, newPrediction)

We want the difference between currentPrediction and newPrediction to be close to 0, or within some tolerance.
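
As a rough sketch of this comparison, the following compares predictions keyed by (userId, itemId) and flags pairs whose difference exceeds a tolerance. The function name, data shapes, and the 0.05 tolerance are illustrative assumptions, not part of the pipeline codebase:

```python
TOLERANCE = 0.05  # assumed maximum acceptable |currentPrediction - newPrediction|

def canary_prediction_diffs(current_predictions, new_predictions, tolerance=TOLERANCE):
    """Compare predictions keyed by (userId, itemId).

    Returns the (key, currentPrediction, newPrediction) triples whose
    difference falls outside the tolerance.
    """
    out_of_tolerance = []
    for key, current in current_predictions.items():
        new = new_predictions.get(key)
        if new is None:
            continue  # canary did not score this (userId, itemId) pair
        if abs(current - new) > tolerance:
            out_of_tolerance.append((key, current, new))
    return out_of_tolerance

current = {("user1", "item1"): 0.90, ("user1", "item2"): 0.40}
new = {("user1", "item1"): 0.91, ("user1", "item2"): 0.10}
print(canary_prediction_diffs(current, new))
# -> [(('user1', 'item2'), 0.4, 0.1)]
```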

cfregly commented 8 years ago

http://cloud.spring.io/spring-cloud-static/spring-cloud.html#netflix-metrics

cfregly commented 8 years ago

Related to https://github.com/fluxcapacitor/pipeline/issues/61 for alerts/notifications of bad canaries.

cfregly commented 8 years ago

Notes:

Online evaluation pits one model (or a multi-armed bandit) against another, as opposed to offline evaluation.

Online evaluation is best; offline evaluation is a preliminary smoke test. Online evaluation requires an online serving system, which is not common. We hope to make it common.

This is a form of canary analysis of the new model against the existing cluster.

Some traffic goes to the new model canary.

Make sure the new model performs within an acceptable tolerance of the existing model; otherwise, remove it and try again.

Simple monitoring: a golden set of data is scored live against the newly-deployed model.
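
A minimal sketch of that golden-set check, assuming the model exposes a predict callable and each golden example carries a known-good reference score (all names here are hypothetical):

```python
def score_golden_set(model_predict, golden_set, tolerance=0.05):
    """golden_set: list of (features, reference_score) pairs.

    Scores each example with the newly-deployed model and returns the
    (features, reference_score, new_score) triples that drift beyond tolerance.
    """
    failures = []
    for features, reference in golden_set:
        score = model_predict(features)
        if abs(score - reference) > tolerance:
            failures.append((features, reference, score))
    return failures

# Stand-in model and golden set for illustration only
model = lambda x: 0.5 * x
golden = [(1.0, 0.5), (2.0, 1.0), (3.0, 1.6)]
print(score_golden_set(model, golden))
# -> [(3.0, 1.6, 1.5)]
```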

Complex: deploy a canary of the new model alongside the cluster running the old model.

Compare the two as part of canary analysis. Keep an eye out for Netflix ACA (Automated Canary Analysis).
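
The traffic split for the complex setup could look something like the sketch below, which sends an assumed 5% of requests to the canary and tags each result so predictions can later be compared. This is an illustrative routing sketch, not the project's actual router:

```python
import random

CANARY_FRACTION = 0.05  # assumed fraction of traffic sent to the canary

def route_request(request_features, old_model, new_model, fraction=CANARY_FRACTION):
    """Route one request to the canary or the stable model.

    Tags the result with the serving model so downstream canary analysis
    can pair up and compare predictions.
    """
    if random.random() < fraction:
        return {"model": "canary", "prediction": new_model(request_features)}
    return {"model": "stable", "prediction": old_model(request_features)}
```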

cfregly commented 7 years ago

Zuul screenshots: [images omitted]

cfregly commented 7 years ago

Moving to Advanced Edition http://pipeline.ai/products