I see somebody else was running into this same issue on the llm-chain-template repo (https://github.com/sobelio/llm-chain-template/issues/1). Thought it might be useful to post this issue here since it seems to be an issue with the llm-chain-openai crate itself.
```
    Updating crates.io index
   Compiling llm-chain-openai v0.12.2
error[E0308]: mismatched types
   --> /Users/jessiewilkins/.cargo/registry/src/index.crates.io-6f17d22bba15001f/llm-chain-openai-0.12.2/src/chatgpt/executor.rs:113:60
    |
113 |         let tokens_used = num_tokens_from_messages(&model, &messages)
    |                           ------------------------         ^^^^^^^^^ expected `&[ChatCompletionRequestMessage]`, found `&Vec<ChatCompletionRequestMessage>`
    |                           |
    |                           arguments to this function are incorrect
    |
    = note: expected reference `&[async_openai::types::ChatCompletionRequestMessage]`
               found reference `&Vec<async_openai::types::ChatCompletionRequestMessage>`
note: function defined here
   --> /Users/jessiewilkins/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tiktoken-rs-0.4.5/src/api.rs:358:12
    |
358 | pub fn num_tokens_from_messages(
    |        ^^^^^^^^^^^^^^^^^^^^^^^^
For more information about this error, try `rustc --explain E0308`.
```
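For what it's worth, `&Vec<T>` normally deref-coerces to `&[T]` at a call site, so the container mismatch alone usually wouldn't fail to compile. The differing paths in the note (`async_openai::types::...` on both sides) make me suspect the two crates are resolving different versions of `async-openai`, so the element types are actually distinct. Below is a minimal sketch, with hypothetical stand-in names (`ChatMessage`, `count_tokens`), showing that the coercion succeeds when the element type is the same:

```rust
// Hypothetical stand-in for the real message type.
#[derive(Clone)]
struct ChatMessage;

// Like tiktoken-rs's num_tokens_from_messages, this takes a slice, not a Vec.
fn count_tokens(messages: &[ChatMessage]) -> usize {
    messages.len()
}

fn main() {
    let messages: Vec<ChatMessage> = vec![ChatMessage, ChatMessage];

    // `&messages` is `&Vec<ChatMessage>`; deref coercion converts it to
    // `&[ChatMessage]` automatically, so this compiles and returns 2.
    assert_eq!(count_tokens(&messages), 2);

    // An explicit conversion works too; but neither helps if the slice's
    // element type comes from a *different version* of the defining crate
    // than the Vec's element type.
    assert_eq!(count_tokens(messages.as_slice()), 2);
}
```

If that's the cause, the fix would have to land in the crate's own `Cargo.toml` (aligning the `async-openai` version with the one `tiktoken-rs` expects) rather than in user code.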
For context, I hit the error above when trying to build the basic example from the documentation (https://docs.llm-chain.xyz/docs/getting-started-tutorial/generating-your-first-llm-output).