ai4os / DEEPaaS

A REST API to serve machine learning and deep learning models
https://deepaas.readthedocs.io
Apache License 2.0

Allow to load non installable models #135

Open BorjaEst opened 6 months ago

BorjaEst commented 6 months ago

Description

After using the package for a year and helping a few users, I find that one of the most confusing concepts for users is that the application project must be installable.

Basic Python users do not understand most of the consequences of installing with "pip install .", e.g. how imports resolve after installation (relative paths no longer point at the source tree), or that model files are not copied to the installation folder (unless include_package_data is configured). Even harder to understand is "pip install -e ." (editable mode). For example, tox does not support editable-mode (-e) installation, and some of the complaints boil down to tox failing because it cannot find the package metadata (e.g. when using skipsdist without realizing that an installation is still required).
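For reference, the data-files pitfall mentioned above is usually worked around with setuptools configuration along these lines (a minimal sketch; the package name "my_model_app" and the "models/*.h5" pattern are placeholders, not part of DEEPaaS):

```toml
[tool.setuptools]
# Without this, non-Python files (e.g. model weights) are left behind
# when the project is installed.
include-package-data = true

[tool.setuptools.package-data]
# Copy model weight files into the installed package as well.
"my_model_app" = ["models/*.h5"]
```

This is exactly the kind of packaging detail that basic Python users should not have to learn just to serve a model.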

In addition, other applications use git submodules or dependencies that are not on PyPI, which complicates things even further.

Expected behavior:

I think DEEPaaS needs to rethink how the project interface works. I understand that installation was previously required to support multiple models at the same time. If that is no longer the case (and even if it is, there may be better approaches), perhaps the interface should follow the way other web frameworks work. For example, FastAPI and Flask do not require installing your project; instead, you configure a "server" object instance. See https://fastapi.tiangolo.com/#example

Unless projects are intended to be distributed via PyPI, distribution and installation make things unnecessarily complicated for users.
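To make the suggestion concrete, a Flask/FastAPI-style interface for DEEPaaS could look roughly like the following. This is a hypothetical sketch, not the current DEEPaaS API: the DeepaasApp class, register_model method, and MyModel are all invented for illustration.

```python
class DeepaasApp:
    """Hypothetical application object holding user-registered models."""

    def __init__(self):
        self.models = {}

    def register_model(self, name, model):
        # The model is any plain Python object: no entry points,
        # no package metadata, no installation required.
        self.models[name] = model
        return model


class MyModel:
    """A user's model, defined in a plain script that is never installed."""

    def predict(self, data):
        return {"input": data, "prediction": "demo"}


app = DeepaasApp()
app.register_model("demo", MyModel())
```

The server would then be started against the object itself (analogous to "uvicorn main:app"), so the user's project is just a directory of Python files rather than an installable package.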

Actual behavior

Installation with "pip install -e ." is required.

Versions

Version: 2.1.0

Other comments

One constraint of not installing the project is that metadata from the "package" is not accessible via importlib.metadata. However, it is still under discussion among users which metadata should be delivered; each case is special. This information should probably just be computed in the "api.get_metadata" method.
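A possible fallback is to catch the lookup failure and compute the metadata at runtime instead. A minimal sketch, assuming a placeholder package name "my_model_app" (the fallback values are illustrative, not what DEEPaaS returns):

```python
import importlib.metadata


def get_metadata(package="my_model_app"):
    """Return package metadata, installed or not."""
    try:
        # Works only when the project has been pip-installed.
        meta = importlib.metadata.metadata(package)
        return {"name": meta["Name"], "version": meta["Version"]}
    except importlib.metadata.PackageNotFoundError:
        # Not installed: fall back to values computed at runtime
        # instead of failing at startup.
        return {"name": package, "version": "unknown (not installed)"}
```

With this shape, "api.get_metadata" keeps working whether the user ran "pip install -e ." or just pointed the server at a plain directory of code.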

alvarolopez commented 6 months ago

Yes, this is on the roadmap and part of v3.0.0, where we plan to support several ways to load a model, not only from an installable package.