eth-sri / lmql

A language for constraint-guided and efficient LLM programming.
https://lmql.ai
Apache License 2.0

Additional backends aren't supported with serve-model #153

Open · jplorandi opened this issue 1 year ago

jplorandi commented 1 year ago

I implemented a PeftLLM backend (which I pasted into #152), but I cannot load it, since there is no way to insert it into the registry via serve-model.
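For context, a minimal sketch of the kind of registration a custom backend needs, assuming the built-in LMTP backends register themselves on a class-level registry of `LMTPModel` (the import path, registry attribute, and the `PeftLLM` body below are illustrative assumptions, not the implementation from #152; the lmql source is authoritative for the actual interface):

```python
# Hypothetical sketch: how a custom backend might hook into the LMTP registry.
# The import path and registry attribute are assumptions based on how the
# built-in backends appear to register themselves.
from lmql.models.lmtp.backends.lmtp_model import LMTPModel

class PeftLLM(LMTPModel):
    def __init__(self, model_identifier, **kwargs):
        self.model_identifier = model_identifier
        # ... load the base model and apply the PEFT adapters here ...

# Registration happens as an import side effect, which is exactly why the
# module has to be importable by serve-model in the first place.
LMTPModel.registry["peft"] = PeftLLM
```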

I think this could be fixed with an additional parameter such as --import_extra <some_package>, which would import the given package so that it is evaluated just like lmql.models.lmtp.backends is via its __init__.py script; that way, additional custom backends would be supported.
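A rough sketch of what the proposed flag could look like on the serving side, assuming backends register themselves as an import side effect (the flag name comes from the proposal above; the argparse wiring is illustrative and not lmql's actual serve-model code):

```python
# Illustrative sketch of the proposed --import_extra option. The idea:
# importing the named package runs its __init__.py, which registers any
# custom backends it defines, just like lmql.models.lmtp.backends does
# for the built-in ones.
import argparse
import importlib

parser = argparse.ArgumentParser()
parser.add_argument(
    "--import_extra",
    action="append",
    default=[],
    help="Extra package(s) to import before serving, so their backends "
         "can register themselves.",
)
args, remaining = parser.parse_known_args()

for package in args.import_extra:
    importlib.import_module(package)  # registration happens on import

# ... hand `remaining` over to the normal serve-model argument handling ...
```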

I think this feature would be very beneficial, since the lmql implementation then wouldn't have to carry the burden of maintaining a bazillion backends for this week's new hot stuff (those could live in an lmql-extras package, for instance).

lbeurerkellner commented 1 year ago

I like the suggestion. Did you see https://github.com/eth-sri/lmql/blob/main/src/lmql/models/lmtp/lmtp_programmatic_serve_example.py? It allows you to run serve-model from a custom launch script, which also lets you import custom modules beforehand.
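For anyone landing here, such a custom launch script might look roughly like the sketch below. The `my_lmql_extras.peft_backend` module is a hypothetical package containing the custom backend, and the `lmql.serve` call and its arguments are assumptions; the linked example file should be treated as the authoritative version of this API.

```python
# Sketch of a custom launch script; see lmtp_programmatic_serve_example.py in
# the lmql repository for the authoritative version of this API.

# Importing the custom backend module first lets it register itself before
# the server resolves any model names (hypothetical package name).
import my_lmql_extras.peft_backend

import lmql

if __name__ == "__main__":
    # Assumed entry point and arguments, mirroring the linked example.
    lmql.serve("gpt2", port=8080)
```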