Closed aevo98765 closed 4 months ago
It would be really useful as a dev contributor to be able to quickly spin up a model inference server.
Two approaches could be taken here:
1) `ilab` CLI setup and `ilab serve` instructions - link to the existing documentation here.
2) A containerized (Docker) service that exposes ports 8000 and 8001, in case users already have another server running on 8000.
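For approach 2, a minimal sketch of what the containerized service could look like. The image name and internal port are placeholders, not part of this issue; they would need to match whatever inference server is actually used:

```yaml
# docker-compose.yaml (sketch only; "inference-server:latest" is a placeholder image)
services:
  inference:
    image: inference-server:latest  # hypothetical image name
    ports:
      - "8000:8000"  # default API port
      - "8001:8000"  # host fallback if another service already occupies 8000
```

With this layout, users who already have something bound to host port 8000 can still reach the same container endpoint via `localhost:8001`.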
Acceptance Criteria:
Resolved in #45 🎉