eth-sri / lmql

A language for constraint-guided and efficient LLM programming.
https://lmql.ai
Apache License 2.0
3.48k stars 191 forks source link

Is it possible to run LMQL with LMStudio locally ? #316

Closed zacaikido closed 4 months ago

zacaikido commented 5 months ago

I can't figure out how to use this with LMStudio. Both GPT-4 and I read the documentation, but we failed. The very same script runs successfully without the LMQL parts.

Anyone would have an idea ?

```python
import openai
import lmql
import asyncio
import re

a = 12

# Configuration for OpenAI API
class OpenAIConfig:
    def __init__(self):
        self.base_url = "http://localhost:1234/v1"
        self.api_type = "open_ai"
        self.api_key = "not-needed"

model = lmql.model("gpt2", endpoint="localhost:1234/v1")

# Function to read file content. This setup allows the script to dynamically
# read the system message from the system_message.txt file, making it easy to
# update the system message without changing the script's code.
def read_file_content(file_path):
    try:
        with open(file_path, "r") as file:
            return file.read().strip()
    except FileNotFoundError:
        print(f"File not found: {file_path}")
        return None

# Function to initiate a conversation with the local model; establishes roles
# and where the instructions come from.
async def initiate_conversation(input_text, system_message):
    @lmql.query
    async def query():
        '''lmql
        # access 'a' from the global namespace
        "Tell me a fun fact about {a}: [FACT]"
        # use imported 're' module
        return re.sub(r'\d+', '[NUMBER]', FACT)
        '''

    # print(query())
    user_message = await query()
    print(user_message)

    response = openai.ChatCompletion.create(
        model="local-model",
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": user_message}
        ],
        temperature=0.7,
    )
    return response.choices[0].message.content.strip()

async def main():
    # Instantiate configuration
    config = OpenAIConfig()
    openai.api_base = config.base_url
    openai.api_key = config.api_key

    # Read system message from file
    system_message = read_file_content("system_message.txt")
    if system_message is None:
        return

    # Conversation loop
    while True:
        user_input = input("User: ")
        if user_input.lower() in ['exit', 'bye', 'end']:
            print("Exiting the conversation.")
            break

        model_response = await initiate_conversation(user_input, system_message)
        print("Model Response: ", model_response)

if __name__ == "__main__":
    asyncio.run(main())
```
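Side note: the post-processing step in the query body can be sanity-checked on its own, without any model or server, since it is plain `re.sub`. A minimal standalone check (the sample fact string is made up for illustration):

```python
import re

# Mask every run of digits in a model answer, exactly as the query's
# return statement does with re.sub(r'\d+', '[NUMBER]', FACT).
def mask_numbers(fact: str) -> str:
    return re.sub(r"\d+", "[NUMBER]", fact)

print(mask_numbers("12 is the smallest number with 6 divisors."))
# → [NUMBER] is the smallest number with [NUMBER] divisors.
```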

Output:

```
:3: SyntaxWarning: invalid escape sequence '\d'
    '''lmql
User: sdkfjsfsd
/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/lmql/runtime/bopenai/batched_openai.py:507: OpenAIAPIWarning: OpenAI API: Underlying stream of OpenAI complete() call failed with error

Cannot connect to host api.openai.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')] ()

Retrying... (attempt: 0)
/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/lmql/runtime/bopenai/batched_openai.py:739: OpenAIAPIWarning: OpenAI request with ID 0 failed (timeout or other error) and will be retried

[... the same OpenAIAPIWarning repeats for attempts 1 through 12 ...]
```
lbeurerkellner commented 4 months ago

I have not tested this myself, but I am open to input from experiments here. As with all mock implementations of the OpenAI API format, LMQL needs a very faithful re-implementation of the original API (including full support for logit biasing and prompt echoing, among other things) to work this way.

Unfortunately, my experience is that most simple wrappers do not implement the OpenAI API to that level of fidelity. It would be great to see the outcome of some experiments on the degree to which LMStudio replicates the API.
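One way to run such an experiment is to send the local server a raw completions request that exercises the features LMQL relies on and inspect the response. Below is a hedged sketch using only the standard library; `localhost:1234` is LMStudio's default server address and `local-model` is a placeholder model name, so adjust both to your setup:

```python
import json
import urllib.request

# Probe request exercising the OpenAI API features LMQL depends on.
payload = {
    "model": "local-model",
    "prompt": "Hello",
    "max_tokens": 5,
    "logprobs": 5,              # LMQL needs per-token logprobs
    "echo": True,               # ...and prompt echoing
    "logit_bias": {"0": -100},  # ...and logit biasing
}
req = urllib.request.Request(
    "http://localhost:1234/v1/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# Uncomment with a running LMStudio server, then check whether the
# response actually contains logprobs and the echoed prompt:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

If the server silently drops `logit_bias`, `echo`, or `logprobs`, LMQL's decoding will not behave correctly against it.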

Once this is clear, you should be able to just use the endpoint via `lmql.model("openai/text-davinci-003", endpoint="localhost:3000")`.
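For what it's worth, a minimal sketch of that setup (untested; assumes LMStudio serves a sufficiently faithful OpenAI-compatible completions API, here on its default port 1234 rather than the 3000 used above):

```python
import lmql

# Point LMQL's OpenAI backend at the local server instead of api.openai.com.
# The model name tells LMQL which API dialect/tokenizer to assume; the
# endpoint argument overrides where requests are sent.
m = lmql.model("openai/text-davinci-003", endpoint="localhost:1234")

@lmql.query(model=m)
def fact_query():
    '''lmql
    "Tell me a fun fact about 12: [FACT]"
    return FACT
    '''
```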

lbeurerkellner commented 4 months ago

Closing this due to inactivity. Please feel free to follow up if things remain unclear.