qZhang88 opened 1 year ago
Hello @qZhang88, yes, every field you see in the API reference documentation should be available.
Could anybody give an example?
I think you are right; we have to implement this in order to make it work: https://stackoverflow.com/questions/31238626/curl-structuring-request-to-validate-server-sent-events
I will look into this unless someone is willing to take it
Hi! Just curious about the status of this one?
I've written a demo using libcurl which may help.
#include &lt;chrono&gt;
#include &lt;curl/curl.h&gt;
#include &lt;iostream&gt;
#include &lt;string&gt;
#include &lt;thread&gt;

class A
{
public:
    virtual ~A() = default;
    virtual void print(std::string& s) { std::cout << "Class A: i: " << i << " s: " << s; i++; }
private:
    int i = 1;
};

class B : public A
{
public:
    void print(std::string& s) override { std::cout << "Class B print: " << s << std::endl; }
};

// libcurl write callback: forwards each received chunk to the handler object.
size_t WriteCallback(void* contents, size_t size, size_t nmemb, A* output) {
    size_t totalSize = size * nmemb;
    std::string s(static_cast<char*>(contents), totalSize);
    output->print(s);
    return totalSize;
}
void GetToOpenAIUsingCURL()
{
    CURL* curl;
    CURLcode res;
    B* a = new B();
    // Initialize libcurl
    curl_global_init(CURL_GLOBAL_DEFAULT);
    // Create a CURL handle
    curl = curl_easy_init();
    if (curl) {
        // Set the request URL
        curl_easy_setopt(curl, CURLOPT_URL, "http://.../ai/proxy/v1/chat/completions");
        // Use a POST request
        curl_easy_setopt(curl, CURLOPT_POST, 1L);
        // Request body
        std::string postData = R"(
        {
            "model": "gpt-3.5-turbo",
            "messages":[{"role":"user", "content":"How do I make a sandwich?"}],
            "max_tokens": 1000,
            "temperature": 0,
            "stream": true
        }
        )";
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, postData.c_str());
        // Request headers
        struct curl_slist* headers = NULL;
        headers = curl_slist_append(headers, "Content-Type: application/json");
        headers = curl_slist_append(headers, "Authorization: Bearer sk-...");
        headers = curl_slist_append(headers, "Accept: text/event-stream");
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
        // Each received chunk is handed to a->print() by WriteCallback
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, WriteCallback);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, a);
        // Perform the request (blocks until the stream ends)
        std::cout << "curl_easy_perform start" << std::endl;
        res = curl_easy_perform(curl);
        std::cout << "curl_easy_perform end" << std::endl;
        std::this_thread::sleep_for(std::chrono::milliseconds(1000));
        // Check whether the request succeeded
        if (res != CURLE_OK) {
            std::cerr << "curl_easy_perform() failed: " << curl_easy_strerror(res) << std::endl;
        }
        // Free the header list
        curl_slist_free_all(headers);
        // Clean up the CURL handle
        curl_easy_cleanup(curl);
    }
    delete a;
    // Clean up libcurl
    curl_global_cleanup();
}
Sorry guys, I will definitely take a look at this feature ASAP. Thanks for the example :)
Thanks for the hint from @WenchiLe.
I create two data types here, so I can poll stream chunk data outside the libcurl request. Note: the stream data is not parsed yet, so get_chunk and get_final will not work.
struct StreamChunk {
    std::string text;
    bool is_end;
};

class StreamResponse {
public:
    StreamResponse() : is_end_(false) {}
    // Called from the curl write callback with each raw chunk.
    void set(const std::string& chunk) {
        std::unique_lock<std::mutex> lk(mutex_);
        chunk_ += chunk;
        text_ += chunk;
        cv_.notify_all();
    }
    // Blocks until at least one unread chunk is available.
    StreamChunk get_chunk() {
        std::unique_lock<std::mutex> lk(mutex_);
        cv_.wait(lk, [this]() { return chunk_ != ""; });
        std::string ret = chunk_;
        chunk_ = "";
        return {ret, is_end_};
    }
    // Blocks until the stream ends (note: nothing sets is_end_ yet).
    std::string get_final() {
        std::unique_lock<std::mutex> lk(mutex_);
        cv_.wait(lk, [this]() { return is_end_ == true; });
        return text_;
    }
private:
    std::string chunk_;
    std::string text_;
    bool is_end_;
    std::mutex mutex_;
    std::condition_variable cv_;
};

struct CurlData {
    char* data;
    int64_t size;
};
Right after `static size_t writeFunction(void* ptr, size_t size, size_t nmemb, std::string* data)`, add a static variable and a new write function:
static StreamResponse* stream_response_;

static size_t writeStreamFunction(void* ptr, size_t size, size_t nmemb, void* data) {
    size_t realsize = size * nmemb;
    std::string text(static_cast<char*>(ptr), realsize);
    stream_response_->set(text);
    return realsize;
}
Then pass a StreamResponse object into the request:
inline Response Session::makeRequest(StreamResponse& response, const std::string& contentType) {
    // ...
    stream_response_ = &response;
    curl_easy_setopt(curl_, CURLOPT_WRITEFUNCTION, writeStreamFunction);
    curl_easy_setopt(curl_, CURLOPT_WRITEDATA, nullptr);
    // ...
}
Hey, just wondering if this feature will be added?
Is streaming ready to go?
It looks like @qZhang88 and @WenchiLe figured it out.
I don't mind contributing; what still needs to be done to implement streaming support?
Does this support streaming responses, so I can keep accepting partial messages and give the user a more responsive experience?
https://platform.openai.com/docs/api-reference/chat/create#chat/create-stream