SKaplanOfficial / Raycast-PromptLab

A Raycast extension for creating powerful, contextually-aware AI commands using placeholders, action scripts, selected files, and more.
https://www.raycast.com/HelloImSteven/promptlab

Using Mistral through Jan.ai with this extension #32

Open saleh-mir opened 10 months ago

saleh-mir commented 10 months ago

I recently came across a tool called Jan.ai that lets you run models such as Mistral Instruct locally; it's really powerful. Apparently, it exposes the same API as OpenAI.

Is it possible to use it with this Raycast extension? If so, are there any resources I could read about this? And if you need my help, please do tell.

Here is an example of how I can access the model through their API, which is the same as OpenAI's:

curl http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer EMPTY" \
  -d '{
     "model": "mistral-ins-7b-q4",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'

Reply:

{"choices":[{"finish_reason":null,"index":0,"message":{"content":" I understand that you have asked me to say that \"this is a test.\" Here is that statement for you: \"This is a test.\" Is there anything specific you would like me to do with this test, or is it simply for my understanding that we are conducting a test? Let me know if there's anything else I can assist you with.","role":"assistant"}}],"created":1705970517,"id":"tWF3q25T3yaeF96K2bVF","model":"_","object":"chat.completion","system_fingerprint":"_","usage":{"completion_tokens":72,"prompt_tokens":14,"total_tokens":86}}% 
SKaplanOfficial commented 10 months ago

If the API is exactly the same, you can use the OpenAI API Example in the readme, just replacing the endpoint with your localhost one (i.e. http://localhost:1337/v1/chat/completions). You can leave the API key field blank.
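
If you want to sanity-check the endpoint from code before pointing the extension at it, something like this minimal sketch should work (TypeScript, Node 18+ for the global fetch; the port, model name, and "EMPTY" bearer token are just the placeholders from your curl example above):

// Quick sanity check against a local Jan.ai server: same endpoint, model,
// and payload as the curl example above. Requires Node 18+ for global fetch.
async function testLocalJan(): Promise<void> {
  const response = await fetch("http://localhost:1337/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Jan.ai doesn't check the key, but the header keeps the request OpenAI-shaped.
      Authorization: "Bearer EMPTY",
    },
    body: JSON.stringify({
      model: "mistral-ins-7b-q4",
      messages: [{ role: "user", content: "Say this is a test!" }],
      temperature: 0.7,
    }),
  });

  const data = await response.json();
  console.log(data.choices[0].message.content);
}

testLocalJan().catch(console.error);

If that prints a completion, pointing the extension's model endpoint at the same URL should behave the same way.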