thmsmlr / instructor_ex

Structured outputs for LLMs in Elixir
https://hexdocs.pm/instructor

[WIP] Ollama support #15

Closed lilfaf closed 4 months ago

lilfaf commented 4 months ago

Hi @thmsmlr

As you said in the issue, it was mostly a matter of supporting a new JSON mode, and reusing the OpenAI adapter works just fine.

On the other hand, the results are noticeably less consistent when streaming is enabled.
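For context, the adapter-reuse approach amounts to pointing the existing OpenAI adapter at Ollama's OpenAI-compatible endpoint. The sketch below is hypothetical and not code from this PR: the `api_url`/`api_key` keys and their nesting are assumptions about the adapter's configuration, though Ollama's default address (`http://localhost:11434`) is real.

```elixir
# config/config.exs — hypothetical sketch, NOT code from this PR.
# The config keys below are assumptions about the OpenAI adapter's
# options; check the library docs for the actual names.
import Config

config :instructor,
  adapter: Instructor.Adapters.OpenAI,
  openai: [
    # Ollama serves an OpenAI-compatible API on this port by default
    api_url: "http://localhost:11434",
    # Ollama ignores the API key, but the adapter may still expect one
    api_key: "ollama"
  ]
```

With something like this in place, the rest of the library can stay model-agnostic, which is presumably why no dedicated Ollama adapter was needed.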

TODO

TwistingTwists commented 4 months ago

Nice work @lilfaf !

https://github.com/thmsmlr/instructor_ex/pull/16/files#diff-610567b841a1bedbba476a9d2214f9c617d7d3f861a1b3275345a23211e2d8a4R516-R523

Try this change in your PR. It might improve the JSON output from the Ollama model.

lilfaf commented 4 months ago

@TwistingTwists Thanks a lot for the example!

I've implemented it, and it indeed helps a lot with nested structs.

I'll add missing tests tomorrow or a bit later this week 😉

Does anyone think it's worth having a dedicated module to build the prompt at this point?

thmsmlr commented 4 months ago

Looking good so far!

I'm not a fan of all these small functions. They take something that is pretty easy to trace and add three extra layers to the call stack. None of the functions introduced is called in more than one place; they're just being used for function overloading, which can be achieved by other means. Introducing another module to manage this would only add to the complexity of the code. (FWIW, I gave no guidance on how to write the code, nor did I put together a contributing doc, so mismatched style preferences are not your fault, they're mine. The code otherwise looks great.)

I'm going to push a small refactor that inlines these functions.
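To illustrate the style point with a toy example (hypothetical names, not the actual PR code): a multi-clause helper that exists only to dispatch on a value can usually be inlined as a multi-clause anonymous function or `case`, removing a call-stack hop without losing pattern matching.

```elixir
defmodule PromptStyle do
  # Hypothetical example, not the PR's code.

  # Before: a single-use private helper whose clauses exist purely
  # for dispatch, adding an extra layer when tracing the code.
  defp describe_field({name, :string}), do: "#{name} (string)"
  defp describe_field({name, :integer}), do: "#{name} (integer)"

  def render_before(fields), do: Enum.map_join(fields, "\n", &describe_field/1)

  # After: the same dispatch inlined as a multi-clause anonymous
  # function, one fewer function to jump through.
  def render_after(fields) do
    Enum.map_join(fields, "\n", fn
      {name, :string} -> "#{name} (string)"
      {name, :integer} -> "#{name} (integer)"
    end)
  end
end
```

Both versions behave identically; the trade-off is purely between named indirection and locality.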

TwistingTwists commented 4 months ago

Dope work so far! @lilfaf @thmsmlr :) Really like this. 💛💛

Also, we should add a Contributing.md

thmsmlr commented 4 months ago

Fixes #11