lune-org / lune

A standalone Luau runtime
https://lune-org.github.io/docs
Mozilla Public License 2.0

HTTP server dies when given rapid requests #167

Open PhantomShift opened 3 months ago

PhantomShift commented 3 months ago

I haven't looked into Lune's source code much at all so I don't have an inkling of what the underlying issue is, but if lots of requests are sent in succession, the request handler stops responding for whatever reason. This issue seems to be similar to https://github.com/lune-org/lune/issues/104 though that was fixed recently.

Given this simple script,

local net = require "@lune/net"
local task = require "@lune/task"

local handle = net.serve(13337, function(request)
    print(request.body)
    return {
        status = 200,
        body = "roger that\n"
    }
end)

sending a couple of curl requests in rapid succession causes it to silently fail.

https://github.com/lune-org/lune/assets/83253309/f264a69f-fb0c-4e29-9566-cedeed3d7834

This does not happen if the requests are sufficiently spaced apart.
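
For reference, a minimal way to fire rapid requests from a second Lune script (a sketch, assuming `net.request` against the port used above behaves as documented; the original repro simply used curl in quick succession):

```luau
-- repro.luau: run in a separate terminal while the server script above is running
local net = require("@lune/net")

for i = 1, 20 do
    -- fire requests back-to-back with no delay between them
    local response = net.request({
        url = "http://localhost:13337/",
        method = "POST",
        body = "request " .. tostring(i),
    })
    print(i, response.ok, response.statusCode)
end
```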

Additionally, this behavior does not seem to occur if you prevent the main thread from ending with say an endless while loop; appending the following code will make it behave as expected.

while true do
    task.wait(1)
end

https://github.com/lune-org/lune/assets/83253309/fab2f940-4116-4e5f-8bf8-3db38ed00036

Currently running this on Linux in case that's relevant, version 0.8.2. I have not tried testing this on Windows yet.

Log from failing script

```
RUST_LOG=trace lune run networktest.lua
0.004187678s TRACE mlua_luau_scheduler::queue: pushing item to queue with 0 args
0.004213564s DEBUG Scheduler::run: mlua_luau_scheduler::scheduler: status status=Running
0.004252127s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=0 lua_threads_spawned=0 lua_threads_deferred=1
0.004729979s TRACE Scheduler::run:Scheduler::tick:Scheduler::run_until_yield: mlua_luau_scheduler::traits: spawning local task on executor
0.004768846s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
0.004778837s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.092326079s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
2.092364209s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.092457484s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
2.092483342s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
2.092589542s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
2.092742179s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.273877659s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
2.273907023s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.273971105s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
2.273990338s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
2.274042298s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
2.274190627s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.443909197s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
2.443948480s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.444020889s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
2.444047251s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
2.444131524s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
2.444311057s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.609723269s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
2.609749401s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.609793248s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
2.609809038s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
2.609852762s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
2.609959303s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.754982077s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
2.755017474s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.755084273s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
2.755117617s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
2.755182970s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
2.755353372s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.906067438s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
2.906105660s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
2.906159541s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
2.906179297s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
2.906238125s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
2.906355541s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.068191692s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
3.068236001s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.068322009s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
3.068351124s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
3.068423394s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
3.068617045s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.227190251s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
3.227226245s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.227295227s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
3.227320179s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
3.227384020s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
3.227511720s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.387715787s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
3.387754676s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.387831437s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
3.387858764s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
3.387929264s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
3.388082587s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.553896367s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
3.553929209s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.553993981s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
3.554013756s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
3.554065014s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
3.554207004s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.697222916s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
3.697247327s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.697292903s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
3.697307780s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=1
test
3.697347747s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=2 lua_threads_spawned=0 lua_threads_deferred=0
3.697444079s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=0 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.870483415s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::traits: spawning local task on executor
3.870550092s TRACE Scheduler::run: mlua_luau_scheduler::scheduler: loop futures_spawned=1 futures_processed=1 lua_threads_spawned=0 lua_threads_deferred=0
3.870666972s TRACE Scheduler::run:Scheduler::tick: mlua_luau_scheduler::queue: pushing item to queue with 1 args
```
filiptibell commented 3 months ago

Additionally, this behavior does not seem to occur if you prevent the main thread from ending with say an endless while loop; appending the following code will make it behave as expected.

This makes me think it's related to the serve handle getting garbage collected, same as in issue #104 that you mentioned ... Will need some further investigation and/or a more consistent repro, definitely more spooky this time around.
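
For context, the pattern that should keep the server alive, if the garbage-collection theory holds, is simply holding on to the returned handle in a live local and shutting it down explicitly when done. A minimal sketch, assuming the handle exposes a `stop` function as in the net.serve docs; whether keeping the reference is actually enough is exactly what is in question here:

```luau
local net = require("@lune/net")
local task = require("@lune/task")

-- Keep the handle in a live local so it is not garbage collected
local handle = net.serve(13337, function(request)
    return { status = 200, body = "roger that\n" }
end)

-- ... keep the process alive for a while, then shut down explicitly
task.wait(60)
handle.stop()
```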

RuizuKun-Dev commented 1 month ago

having this issue too

after using this code from the wiki and refreshing just a few times, the server seems to die

local net = require("@lune/net")

local counter = 0
net.serve(8080, function()
    counter += 1
    return {
        status = 200,
        body = "Hello! This is response #" .. tostring(counter),
    }
end)

print("Listening on port 8080 🚀")
filiptibell commented 1 month ago

having this issue too

after using this code from the wiki and refreshing just a few times, the server seems to die

@RuizuKun-Dev for now you can use the workaround mentioned in the original post:

Additionally, this behavior does not seem to occur if you prevent the main thread from ending with say an endless while loop; appending the following code will make it behave as expected.

while true do
    task.wait(1)
end

I think I have a good idea of how this could be fixed but it will have to wait for #188 to be merged

Reinitialized commented 1 month ago

Editing to note that the first example code I wrote appears to be stable. I have built an entire backend and absolutely spammed it with requests without a single issue so far.

While building an ExpressJS-like framework on the latest release of Lune, I noticed that as long as you declare the local variable first, then assign the handler to it later, the ServeHandle operates as expected:

local net = require("@lune/net") -- needed for net.serve

local ServerHandler
local ModuleRoot = {}

function ModuleRoot.StartServer()
  assert(typeof(ServerHandler) == "nil", "HttpServer is already running!")

  -- The handle is assigned to the pre-declared local, which stays in scope
  ServerHandler = net.serve(8080, function(Request)
    return { status = 200, body = "hi" }
  end)
end

ModuleRoot.StartServer() -- works as expected!

However, if you attempt to immediately initialize the local variable, the garbage collection issue occurs:

local net = require("@lune/net") -- needed for net.serve

local ModuleRoot = {}

function ModuleRoot.StartServer()
  -- The handle is returned directly instead of being stored in a pre-declared local
  return net.serve(8080, function(Request)
    return { status = 200, body = "hi" }
  end)
end

local ServerHandler = ModuleRoot.StartServer() -- gives up after about 6 ~ 12 requests

This suggests the issue has to do with assigning the handle directly at declaration versus assigning it to a variable that was declared beforehand.

EDIT: On second thought, variable initialization is unlikely to be the issue since the while true do loop also keeps the server alive. I unfortunately don't know much outside of Lua, but I do suspect this has something to do with how the Lua scope is handled on the Rust side.
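
Putting the two observations together, the belt-and-suspenders pattern until this is fixed would be to both keep the handle in a long-lived local and keep the main thread alive. A sketch based on the workarounds discussed above, not a fix:

```luau
local net = require("@lune/net")
local task = require("@lune/task")

-- Declared up front, assigned below, and never goes out of scope
local ServerHandler

ServerHandler = net.serve(8080, function(Request)
    return { status = 200, body = "hi" }
end)

-- Keep the main thread alive so the scheduler never runs out of work
while true do
    task.wait(1)
end
```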

RuizuKun-Dev commented 3 weeks ago

having this issue too after using this code from the wiki and refreshing just a few times, the server seems to die

@RuizuKun-Dev for now you can use the workaround mentioned in the original post:

Additionally, this behavior does not seem to occur if you prevent the main thread from ending with say an endless while loop; appending the following code will make it behave as expected.

while true do
    task.wait(1)
end

I think I have a good idea of how this could be fixed but it will have to wait for #188 to be merged

The workaround you mentioned didn't work; I noticed it broke when the output was "Hello! This is response #13".

Also, why does it increment the counter by 2 each time I refresh?

Using version 0.8.5.
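
(One way to see where the second hit per refresh comes from is to log what the browser actually requests; a refresh commonly issues an extra request such as /favicon.ico, which would bump the counter twice. A small diagnostic sketch, assuming the handler's request table exposes `method` and `path` as in the net.serve docs:)

```luau
local net = require("@lune/net")

local counter = 0
net.serve(8080, function(request)
    counter += 1
    -- Log every request so extra hits (e.g. /favicon.ico) are visible
    print(string.format("#%d %s %s", counter, request.method, request.path))
    return {
        status = 200,
        body = "Hello! This is response #" .. tostring(counter),
    }
end)
```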

Reinitialized commented 3 weeks ago

having this issue too after using this code from the wiki and refreshing just a few times, the server seems to die

@RuizuKun-Dev for now you can use the workaround mentioned in the original post:

Additionally, this behavior does not seem to occur if you prevent the main thread from ending with say an endless while loop; appending the following code will make it behave as expected.

while true do
    task.wait(1)
end

I think I have a good idea of how this could be fixed but it will have to wait for #188 to be merged

The workaround you mentioned didn't work; I noticed it broke when the output was "Hello! This is response #13".

Also, why does it increment the counter by 2 each time I refresh?

Using version 0.8.5.

Use the workaround I listed right above your comment. I've thoroughly tested it and haven't had any garbage collection issues since. No while true do loop required.

Nicell commented 2 weeks ago

I was testing some http benchmarking on lune's net package and ran into this. While saving the net.serve result to a global variable helps somewhat, it doesn't fix the issue entirely. With the global variable, I can run wrk with one concurrent request (-c1), but as soon as I bump it up to two concurrent requests, the web server silently crashes like before. Would be curious what's causing this.
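
If you want to approximate the concurrency effect without wrk, a rough Lune-side equivalent is to fire several requests at once from another script. A sketch, assuming `net.request` and `task.spawn` behave as documented; this is not an exact stand-in for wrk's persistent connections:

```luau
-- load.luau: run in a separate terminal against the server on port 8080
local net = require("@lune/net")
local task = require("@lune/task")

local completed = 0
for i = 1, 50 do
    -- spawn requests without waiting for the previous one to finish
    task.spawn(function()
        local response = net.request("http://localhost:8080/")
        completed += 1
        print("completed", completed, response.statusCode)
    end)
end
```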

Reinitialized commented 2 weeks ago

I was testing some http benchmarking on lune's net package and ran into this. While saving the net.serve result to a global variable helps somewhat, it doesn't fix the issue entirely. With the global variable, I can run wrk with one concurrent request (-c1), but as soon as I bump it up to two concurrent requests, the web server silently crashes like before. Would be curious what's causing this.

To clarify, did you mean a local variable? If you actually used a global variable, I haven't tested my workaround against it and thus cannot validate it. As for the local variable method, I'll try to hammer it some more but it's been working perfectly for me.

Nicell commented 1 week ago

@Reinitialized Yes! I meant a local at top-level scope, apologies. It works fine when you don't have more than one concurrent connection. Give the wrk benchmarking tool a try with higher concurrent request values. https://github.com/wg/wrk

Reinitialized commented 1 week ago

@Reinitialized Yes! I meant a local at top-level scope, apologies. It works fine when you don't have more than one concurrent connection. Give the wrk benchmarking tool a try with higher concurrent request values. https://github.com/wg/wrk

Just gave this a test using wrk and some tests in Studio, and it handled each and every request in order without a problem. I double-checked my implementation and the example I provided and it's basically the same, but just in case here is the exact implementation I'm using for this.

Nicell commented 1 week ago

Just gave this a test using wrk and some tests in Studio, and it handled each and every request in order without a problem. I double-checked my implementation and the example I provided and it's basically the same, but just in case here is the exact implementation I'm using for this.

Using your implementation results in the same crash for me. I'm on a MacBook Pro M2 Max machine FWIW.

Even with only 5 concurrent requests it crashes in about a second:

Request Details

```
> wrk -t1 -c5 -d5s --latency "http://localhost:8080/bench/world?id=10"
Running 5s test @ http://localhost:8080/bench/world?id=10
  1 threads and 5 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   114.38us  286.18us  10.06ms   99.06%
    Req/Sec    43.73k    13.70k    52.31k    86.67%
  Latency Distribution
     50%   91.00us
     75%  104.00us
     90%  122.00us
     99%  379.00us
  65341 requests in 5.03s, 6.36MB read
Requests/sec:  12978.30
Transfer/sec:      1.26MB
```

Average requests/sec of 43k on the thread, end result of 13k, crashed in a little over a second. Subsequent runs result in 0 completed requests.
Code

```luau
local http = require("HTTPServer")
local net = require("@lune/net")

http.AddEndpoint("GET", "/bench/:name", function(req, p)
    return {
        status = 200,
        body = net.jsonEncode({
            name = p[1],
            id = req.query.id,
        }),
    }
end)

http.StartServer()
```
Reinitialized commented 1 week ago

That is so odd. The only two differences I can think of are that I'm using AlmaLinux 9 on x86 and you're on macOS on ARM. In the shell, can you set the environment variable RUST_LOG=trace and see if it catches anything? I tried it on my end, but it doesn't crash :/

Reinitialized commented 1 week ago

Alrighty, after getting wrk compiled and running the exact same test you did, I do see that requests seem to stop after about 64~128 total requests (screenshot attached).

Interestingly, however, it seems to stop ALL threads: I had a loop printing the ServeHandler variable to see if it went nil or anything like that, and that loop stopped as well. This makes me think this issue is related to #195.
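
For reference, that diagnostic loop was roughly along these lines (a sketch; it both keeps the main thread alive and makes a stalled scheduler visible, since the prints stop together with the request handling):

```luau
local net = require("@lune/net")
local task = require("@lune/task")

local ServerHandler = net.serve(8080, function(Request)
    return { status = 200, body = "hi" }
end)

-- If the scheduler stalls, this print stops at the same time as request handling
while true do
    print("ServerHandler:", ServerHandler)
    task.wait(1)
end
```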