-
Hello,
In the iot_agent_usage.py example, the ToolAgent uses OpenAI by default, so I tried using pne.LLMFactory to build an Azure OpenAI LLM and passing it to the ToolAgent. It crashes; the log is below:
```
…
```
-
LM Studio can expose local LLMs through an OpenAI-compatible API. It would be best for us to create a notebook showing developers how to use a local LLM.
---
@ruanrongman Can you push a notebook to…
-
This error makes no sense. My key is fine and I have a proxy configured, but nothing I try fixes it.
-
I want to take part in GSoC 2023, but first I'd like to get some practice by joining a few large open-source projects, ideally in ML, DL, ML deployment, distributed systems, or Python, or by contributing to framework-style projects. If you're interested, join our WeChat group to chat.
Welcome:
- Friends who are interested in contributing to open source but still new to it
- People building their own open-source projects who want a place to exchange ideas with like-minded friends
- Anyone who wants to team up and build a really cool open-source project together
…
-
## Introduction
Sometimes we need custom prompts to drive our LLM. Currently, chatGPTBox has no convenient mechanism for this.
**We can also see other relevant demands:**
#261…
-
https://undertone0809.github.io/promptulate/#/modules/llm/llm?id=llm
Description
-
## 🚀 Feature Request
Currently, we cannot enable streaming when an output schema is set. E.g.:
```python
from typing import List
import promptulate as pne
from pydantic import BaseModel, Field
cl…
```
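For context, the kind of schema the truncated snippet begins to define might look like the sketch below; the `LLMResponse` name and its field are assumptions, not the original code. The tension behind this request is that a schema is validated against the complete response, while streaming yields partial text:

```python
from typing import List

from pydantic import BaseModel, Field


# Hypothetical schema: the model name and field are illustrative,
# not taken from the original (truncated) issue code.
class LLMResponse(BaseModel):
    provinces: List[str] = Field(description="All provinces in China")


# A structured response can only be validated once the full JSON has
# arrived, which is why streaming and output schemas conflict.
resp = LLMResponse(provinces=["Guangdong", "Zhejiang"])
print(resp.provinces)  # → ['Guangdong', 'Zhejiang']
```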
-
## 🚀 Feature Request
Allow pne.chat() to use the OpenAI provider to proxy a specified model, e.g.:
```python
import promptulate as pne
pne.chat(messages="hello", model="gpt-4-turbo")
```
If deve…
-
### System Info
langchain v0.0.324
python 3.10
Windows 10 amd64
### Who can help?
@hwchase17 @agola11
### Information
- [ ] The official example notebooks/scripts
- [X] My own modifie…
-
## 🚀 Feature Request
Build a simple chatbot with Streamlit and pne.chat.
Functions:
- only chat
- select different models
- input env and key
- stream output
## Output
- an example …