saulpw / aipl

Array-Inspired Pipeline Language
MIT License

multiple LLM clients #52

Closed dovinmu closed 1 year ago

dovinmu commented 1 year ago

This generalizes the inference layer to allow multiple clients that can do LLM inference.
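A minimal sketch of what such a generalization typically looks like (the names `CompletionClient`, `EchoClient`, and `run_inference` are illustrative assumptions, not aipl's actual API): the pipeline depends on one small interface, and each backend is a drop-in implementation of it.

```python
from abc import ABC, abstractmethod


class CompletionClient(ABC):
    """Common interface so the pipeline can swap inference backends.
    (Hypothetical name -- not aipl's actual class.)"""

    @abstractmethod
    def complete(self, prompt: str, **kwargs) -> str:
        """Return the model's completion for the given prompt."""


class EchoClient(CompletionClient):
    """Trivial offline stand-in, useful for tests without network access."""

    def complete(self, prompt: str, **kwargs) -> str:
        # No real model here: just echo the prompt back upper-cased.
        return prompt.upper()


def run_inference(client: CompletionClient, prompt: str) -> str:
    # Callers depend only on the interface, not on a specific backend,
    # so OpenAI, a local model, etc. can be swapped freely.
    return client.complete(prompt)
```

An OpenAI-backed or local-model-backed client would implement the same `complete` method, so call sites never change when the backend does.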

anjakefala commented 1 year ago

This is so much better and clearer, Rowan! Tysm for refactoring!

Did you test on a local model? I tested with OpenAI, and once I removed the requirement for the environment variable it worked great!

dovinmu commented 1 year ago

@anjakefala yep, I was using the testing code at the bottom of clients.py to make sure each of those worked (though I didn't test permutations of env vars, thanks for the catch!).
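The "testing code at the bottom of clients.py" pattern mentioned here is commonly a small smoke-test harness under `if __name__ == "__main__":` that runs every registered client against the same prompt. A hedged sketch (the `CLIENTS` registry and client callables are illustrative, not aipl's actual code):

```python
# Hypothetical registry of client callables, keyed by backend name.
CLIENTS = {
    "echo": lambda prompt: prompt.upper(),
    "reverse": lambda prompt: prompt[::-1],
}


def smoke_test(prompt: str = "hello") -> dict:
    """Run every registered client on the same prompt and collect outputs."""
    results = {}
    for name, client in CLIENTS.items():
        out = client(prompt)
        # A smoke test only checks the call succeeds and returns text.
        assert isinstance(out, str) and out, f"{name} returned no output"
        results[name] = out
    return results


if __name__ == "__main__":
    for name, out in smoke_test().items():
        print(f"{name}: ok ({out!r})")
```

Note this only exercises each client in isolation; as the thread points out, combinations of environment-variable settings need separate coverage.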