brainlid / langchain_demo

Demo web project using the Elixir LangChain library

[Windows 10] mix setup fail #5

Closed — 18a93a664c closed this issue 9 months ago

18a93a664c commented 9 months ago

At the very end of the mix setup output, I receive the following error:

```
==> exqlite
could not compile dependency :exqlite, "mix compile" failed. Errors may have been
logged above. You can recompile this dependency with "mix deps.compile exqlite --force",
update it with "mix deps.update exqlite" or clean it with "mix deps.clean exqlite"
==> langchain_demo
** (Mix) "nmake" not found in the path. If you have set the MAKE environment variable,
please make sure it is correct.
```

My system is Windows 10, running:

```
Erlang/OTP 25 [erts-13.2.2.4] [source] [64-bit] [smp:8:8] [ds:8:8:10] [async-threads:1] [jit:ns]
Elixir 1.15.7 (compiled with Erlang/OTP 25)
```

brainlid commented 9 months ago

I don't have access to a Windows machine. The problem appears to be around building SQLite on Windows. I'd suggest trying the Windows instructions on the Exqlite GitHub page (Elixir SQLite project).

It's probably some missing build tools. The Exqlite repo links to a dedicated document on compiling it under Windows.
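Once the Windows build tools are in place, a possible retry sequence, using the commands suggested in the error output above (run from the project root):

```shell
# Clean the failed build artifacts, then force a recompile of exqlite.
mix deps.clean exqlite
mix deps.compile exqlite --force

# Then re-run the full project setup.
mix setup
```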

I hope that resolves it for you!

18a93a664c commented 9 months ago

Those links solved my compilation issues. I was finally able to compile the repo.

My experiment with this repo is interfacing with local LLMs rather than OpenAI. I'm using an app called LMStudio, which lets a user download Huggingface models. It also includes a local server that can be used as a drop-in replacement for the OpenAI API.

Their instructions say: "If you're using an OpenAI client, set `openai.api_base` (Python) or the `baseURL` (Node.js) property in your client configuration to `http://localhost:1234/v1`."

I'm assuming I need to find the configuration object for OpenAI...where is that located in your project?

In node, I would configure it as follows:

```js
const openai = new OpenAI({
  baseURL: 'http://localhost:1234/v1',
  apiKey: '',
});
```

UPDATE: In the dependencies, I found that `langchain/lib/chat_models/chat_open_ai.ex` references the OpenAI URL, so I pasted the baseURL noted above into that field.

The issue I'm encountering now is that the OpenAI key verification is throwing an error. With the local LLM setup, an API key isn't required. I did create a `.env` file, but I left the `OPENAI_API_KEY=` assignment blank. That should have worked; however, the way the app is coded, it still won't continue without an API key.
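For anyone else hitting the same check: since the local server ignores the key's value, one workaround (an assumption on my part, not something the app documents) is to supply a non-empty placeholder instead of leaving the assignment blank:

```shell
# .env — the value is a dummy; LMStudio's local server ignores it,
# but a non-empty string may satisfy client-side presence checks.
OPENAI_API_KEY=placeholder-not-a-real-key
```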

brainlid commented 9 months ago

I'm glad the resources resolved the build issue for you!

I just released v0.1.4 of the Elixir langchain library. It includes a change for overriding the endpoint of the ChatOpenAI struct. There is a brief README section on using it.

Note that it differs from your example in that it expects the full URL to the chat completions endpoint, not just the base URL.
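For reference, a minimal sketch of what that override might look like (the exact field name and URL path are assumptions based on the note above; check the library's README section for the definitive usage):

```elixir
# Point ChatOpenAI at LMStudio's local server.
# Assumed: the `:endpoint` field takes the FULL chat completions URL,
# not just the base URL as in the Node.js client.
alias LangChain.ChatModels.ChatOpenAI

chat_model =
  ChatOpenAI.new!(%{
    endpoint: "http://localhost:1234/v1/chat/completions",
    model: "gpt-3.5-turbo"
  })
```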

Hopefully that works for you!