roboflow / inference

A fast, easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models.
https://inference.roboflow.com

InferencePipeline: Gracefully handle connection failure when fetching Active Learning config #491

Closed sberan closed 3 months ago

sberan commented 3 months ago

Description

Currently, when you start an InferencePipeline while offline and `active_learning_enabled` is not explicitly set to `False`, the attempt to fetch the Active Learning config fails with an ugly, unhandled error and the application doesn't start:

[Screenshot: unhandled connection error traceback when starting InferencePipeline offline]

With this change, you get a nice warning letting you know what's going on, and the application still starts:

[Screenshot: warning message logged and the pipeline starts with Active Learning disabled]
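For context, a minimal sketch of the pattern this change introduces (the function and parameter names below are illustrative, not the actual diff): wrap the config fetch in a try/except so that a connection failure degrades into a logged warning and disabled Active Learning instead of crashing the pipeline.

```python
import logging
from typing import Callable, Optional

logger = logging.getLogger(__name__)


def safe_fetch_active_learning_config(
    fetch_config: Callable[[], dict],
) -> Optional[dict]:
    """Try to fetch the Active Learning config; on connection failure,
    log a warning and return None so the pipeline can still start."""
    try:
        return fetch_config()
    except ConnectionError as error:
        logger.warning(
            "Could not fetch Active Learning configuration: %s. "
            "Active Learning will be disabled for this pipeline.",
            error,
        )
        return None
```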

Type of change

Bug fix (non-breaking change which fixes an issue)

How has this change been tested? Please provide a test case or example of how you tested the change.

Tested locally by starting an InferencePipeline without a network connection and verifying that a warning is logged and the pipeline still starts.

Any specific deployment considerations

n/a

Docs