oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

Please add an extension for LMQL Playground: natural language prompting with Python. #3376

Closed angrysky56 closed 11 months ago

angrysky56 commented 1 year ago

Description

https://lmql.ai/
https://docs.lmql.ai/en/stable/index.html

LMQL (Language Model Query Language) is a programming language for large language model (LLM) interaction. It facilitates LLM interaction by combining the benefits of natural language prompting with the expressiveness of Python. It focuses on multi-part prompting and enables novel forms of LLM interaction via scripting, constraint-guided decoding, tool augmentation, and efficiency.

LMQL is a research project by the Secure, Reliable, and Intelligent Systems Lab at ETH Zürich.
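To make the description above concrete, here is a minimal sketch of what an LMQL query looks like, based on the linked documentation; the query function, prompt text, and constraint bounds are illustrative examples, not values from this issue.

```python
import lmql

# Minimal LMQL query sketch: plain strings are prompt text, [ANSWER] is a hole
# the model fills in, and the `where` clause constrains decoding.
@lmql.query
def capital(country):
    '''lmql
    "Q: What is the capital of {country}?\n"
    "A: [ANSWER]" where STOPS_AT(ANSWER, ".") and len(TOKENS(ANSWER)) < 30
    return ANSWER
    '''

# Illustrative call; without further configuration LMQL falls back to its
# default model backend.
print(capital("France"))
```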

Additional Context

Should be easy to integrate; it uses Python 3.10: https://docs.lmql.ai/en/stable/language/llama.cpp.html (a hedged setup sketch follows the feature list below).

Feature Overview

LMQL is designed to make working with language models like OpenAI and 🤗 Transformers more efficient and powerful through its advanced functionality, including multi-variable templates, conditional distributions, constraints, datatypes and control flow.

- Python Syntax: Write your queries using familiar Python syntax, fully integrated with your Python environment (classes, variable captures, etc.)
- Rich Control-Flow: LMQL offers full Python support, enabling powerful control flow and logic in your prompting logic.
- Advanced Decoding: Take advantage of advanced decoding techniques like beam search, best_k, and more.
- Powerful Constraints Via Logit Masking: Apply constraints to model output, e.g. to specify token length, character-level constraints, datatype and stopping phrases to get more control of model behavior.
- Optimizing Runtime: LMQL leverages speculative execution to enable faster inference, constraint short-circuiting, more efficient token use and tree-based caching.
- Sync and Async API: Execute hundreds of queries in parallel with LMQL's asynchronous API, which enables cross-query batching.
- Multi-Model Support: Seamlessly use LMQL with OpenAI API, Azure OpenAI, and 🤗 Transformers models.
- Extensive Applications: Use LMQL to implement advanced applications like schema-safe JSON decoding, algorithmic prompting, interactive chat interfaces, and inline tool use.
- Library Integration: Easily employ LMQL in your existing stack leveraging LangChain or LlamaIndex.
- Flexible Tooling: Enjoy an interactive development experience with LMQL's Interactive Playground IDE, and Visual Studio Code Extension.
- Output Streaming: Stream model output easily via WebSocket, REST endpoint, or Server-Sent Event streaming.
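As a rough sketch of how the llama.cpp backend linked under Additional Context and the constraint feature above fit together (the weights path, tokenizer name, and query below are placeholders, not values from this issue):

```python
import lmql

# Hypothetical local llama.cpp setup, following the linked docs: the weights
# path and tokenizer identifier are placeholders.
local_llm = lmql.model(
    "llama.cpp:/path/to/model.gguf",         # local llama.cpp weights
    tokenizer="huggingface/tokenizer-name",  # matching HF tokenizer
)

# Constraint-guided decoding sketch: the `where` clause masks logits so the
# model can only emit one of the listed sentiments and an integer rating.
@lmql.query(model=local_llm)
def review(text):
    '''lmql
    "Review: {text}\n"
    "Sentiment: [SENTIMENT]" where SENTIMENT in ["positive", "neutral", "negative"]
    "\nRating (1-5): [RATING]" where INT(RATING)
    return SENTIMENT, int(RATING)
    '''
```

Under a setup like this, the same query code can be pointed at OpenAI, Azure OpenAI, 🤗 Transformers, or a local llama.cpp model by swapping the model argument, which is the multi-model support described above.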

angrysky56 commented 1 year ago

My AI with Bing search says yes please, lol: https://docs.lmql.ai/en/stable/python/python.html

github-actions[bot] commented 11 months ago

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.

freirefx commented 11 months ago

up