Open buff-m opened 1 month ago
Are we talking about the browser's built-in fetch or the fetch function from tauri's http plugin?
@FabianLars The fetch function from tauri's http plugin.
@buff-m can you give an example or a minimal repro I can use?
@amrbashir Here is an example server, written with uWebSockets:
```cpp
#include "App.h"
#include <iostream>

int main()
{
    auto *loop = uWS::Loop::get();
    uWS::App app = uWS::App();
    app.get("/", [](auto* res, auto* req) {
        res->writeHeader("Content-Type", "text/plain");
        res->writeHeader("Transfer-Encoding", "chunked");
        // Send "Hello" one byte at a time, so each write is its own chunk.
        res->write("H");
        res->write("e");
        res->write("l");
        res->write("l");
        res->write("o");
        res->end();
    });
    app.listen(3808, [](auto* token) { std::cout << "Listening on port 3808\n"; });
    loop->run();
    return 0;
}
```
When I request the data with other tools, such as the browser's built-in fetch function or Postman, I receive "Hello" as expected. When I request it with the fetch function from tauri's http plugin, it fails with "error sending request for url (http://localhost:3808/)".
When I comment out res->writeHeader("Transfer-Encoding", "chunked");, both the browser's fetch and the http plugin's fetch receive "Hello" normally.
See the workaround mentioned in https://github.com/tauri-apps/plugins-workspace/issues/1002#issuecomment-2382206860, which makes it possible to call LLM APIs with fetchEventSource inside tauri.
@buff-m Unfortunately I can't use a C++ example; it took me a long time to set up and I still couldn't get it to work. If you have a server in JS or Rust, I'd be happy to take a look.
JS version:
```javascript
const express = require('express')
const app = express()
const port = 3000

app.get('/', (req, res) => {
  let counter = 0;
  const interval = setInterval(() => {
    if (counter > 10) {
      clearInterval(interval);
      res.end();
      return; // stop writing once the response has ended
    }
    const chunk = JSON.stringify({chunk: counter++});
    res.write(`data: ${chunk}\n\n`);
  }, 1000);
})

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`)
})
```
@lloydzhou thanks for the server code, I tested the http plugin with that and it works well
Describe the bug
After my HTTP server sets the Transfer-Encoding: chunked response header, the fetch request fails. My payload is fairly large, so it has to be sent to the client in chunks.
Reproduction
No response
Expected behavior
No response
Full `tauri info` output
No response
Stack trace
No response
Additional context
No response