sobelio / llm-chain

`llm-chain` is a powerful Rust crate for building chains in large language models, allowing you to summarize text and complete complex tasks.
https://llm-chain.xyz
MIT License
1.3k stars · 128 forks

Cannot Compile #164

Closed · Jessie-Wilkins closed this 1 year ago

Jessie-Wilkins commented 1 year ago

I see somebody else was running into this same issue on the llm-chain-template repo (https://github.com/sobelio/llm-chain-template/issues/1). Thought it might be useful to post this issue here since it seems to be an issue with the llm-chain-openai crate itself.

I'm getting the following error when trying to build the basic example from the documentation (https://docs.llm-chain.xyz/docs/getting-started-tutorial/generating-your-first-llm-output):

```
    Updating crates.io index
   Compiling llm-chain-openai v0.12.2
error[E0308]: mismatched types
   --> /Users/jessiewilkins/.cargo/registry/src/index.crates.io-6f17d22bba15001f/llm-chain-openai-0.12.2/src/chatgpt/executor.rs:113:60
    |
113 |         let tokens_used = num_tokens_from_messages(&model, &messages)
    |                           ------------------------         ^^^^^^^^^ expected `&[ChatCompletionRequestMessage]`, found `&Vec<ChatCompletionRequestMessage>`
    |                           |
    |                           arguments to this function are incorrect
    |
    = note: expected reference `&[async_openai::types::types::ChatCompletionRequestMessage]`
               found reference `&Vec<async_openai::types::ChatCompletionRequestMessage>`
note: function defined here
   --> /Users/jessiewilkins/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tiktoken-rs-0.4.5/src/api.rs:358:12
    |
358 |     pub fn num_tokens_from_messages(
    |            ^^^^^^^^^^^^^^^^^^^^^^^^

For more information about this error, try `rustc --explain E0308`.
```
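For context (this is an illustration, not the maintainers' actual fix): the two module paths in the error note (`async_openai::types::types::...` vs `async_openai::types::...`) suggest that `tiktoken-rs` and `llm-chain-openai` were compiled against different versions of `async_openai`, which no call-site change can reconcile; aligning the dependency versions is the usual remedy for that part. The `&Vec<T>` vs `&[T]` part of the message, taken on its own, is normally resolved by deref coercion, as this minimal sketch shows (all names here are hypothetical stand-ins, not the crate's real API):

```rust
// Hypothetical stand-in for a function like tiktoken-rs's
// `num_tokens_from_messages`, which takes a slice, not a &Vec.
fn num_tokens_from_messages(messages: &[String]) -> usize {
    // Toy "token" count: whitespace-separated words per message.
    messages.iter().map(|m| m.split_whitespace().count()).sum()
}

fn main() {
    let messages: Vec<String> = vec!["hello world".into(), "foo".into()];

    // `&Vec<String>` deref-coerces to `&[String]`, so this compiles:
    let tokens = num_tokens_from_messages(&messages);

    // Equivalent explicit forms:
    let tokens2 = num_tokens_from_messages(messages.as_slice());
    let tokens3 = num_tokens_from_messages(&messages[..]);

    assert_eq!(tokens, tokens2);
    assert_eq!(tokens, tokens3);
    println!("token count: {}", tokens); // prints "token count: 3"
}
```

Coercion only works when both sides name the *same* `ChatCompletionRequestMessage` type; with two copies of `async_openai` in the dependency graph, the types are distinct even though they look identical, which is why this surfaced as a crate-level bug rather than user error.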
Juzov commented 1 year ago

Looking into this.

Jessie-Wilkins commented 1 year ago

Thanks, @Juzov and @williamhogman for the help! Works fine on my side now. Good work!