adafruit / Adafruit_CircuitPython_HTTPServer

Simple HTTP Server for CircuitPython
MIT License

add support for "multipart/x-mixed-replace" (for streaming video) #95

Closed aguaviva closed 2 weeks ago

aguaviva commented 5 months ago

I presume this has already been considered. If so, it would be great to document why it has not been implemented yet (technical limitations, memory, ...), so that if others decide to implement it, they can use this information to make better decisions.

Thanks for your awesome work.

michalpokusa commented 5 months ago

Hi, could you share some of your code so I can see how you want to use it?

Maybe using a ChunkedResponse class would be sufficient. I am not sure how well (or whether) it will work, as streaming video will likely require lots of memory, and considering that many boards can barely load the library itself, this use case might require using e.g. a Feather ESP32-S3.

I may be wrong, but I believe that even if your goal is possible at all, it will likely require specific hardware with enough memory.
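
For reference, a ChunkedResponse is used roughly like this (a minimal sketch based on the library's chunked-response example; the exact signature may differ between versions, but the idea is a generator-style body that is sent in pieces instead of one big buffer):

    from adafruit_httpserver import Request, ChunkedResponse

    # assumes an existing adafruit_httpserver Server instance named "server"
    @server.route("/chunked")
    def chunked(request: Request):
        def body():
            # Each yielded piece is sent as a separate HTTP chunk, so the full
            # response never has to sit in RAM at once.
            yield "Some "
            yield "chunked "
            yield "response."
        return ChunkedResponse(request, body)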

aguaviva commented 5 months ago

Each frame is only a few KB, and I was streaming video with MicroPython before (note: I am migrating to CircuitPython because the MicroPython platform is a total mess). So it should be possible; all we need is the functionality in the server.

I don't have any decent code to share yet because I am still learning, so I am probably making lots of shameful mistakes :)

michalpokusa commented 5 months ago

I will try to make a simple video streaming example using on-disk images, but I would really appreciate some code, even work-in-progress code. I currently do not have any camera that I can connect to a microcontroller, so I want to understand the workflow of capturing frames from a camera that could later be adapted to an "x-mixed-replace" response.

aguaviva commented 5 months ago

In CircuitPython I tried using the following code:

BOUNDARY = "FRAME"
@server.route("/")
def base(request):
    response =  Response(request)
    response._send_headers(content_type='multipart/x-mixed-replace; boundary=%s' % BOUNDARY)
    for i in range(10):
        jpeg = b"cacacacacacacacacaca" #cam.take()
        response._send_bytes(request.connection, b'--%s\r\n' % BOUNDARY)
        response._send_bytes(request.connection, b'Content-Type: plain/text\r\nContent-Length: %d\r\n\r\n' % len(jpeg))
        response._send_bytes(request.connection, jpeg)
        response._send_bytes(request.connection, b'\r\n')
    return response

As you can see, I am only streaming some text here.
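
For context, the bytes on the wire for a multipart/x-mixed-replace stream should look roughly like this (boundary name and lengths are just placeholders):

    HTTP/1.1 200 OK
    Content-Type: multipart/x-mixed-replace; boundary=FRAME

    --FRAME
    Content-Type: image/jpeg
    Content-Length: 4321

    <jpeg bytes of frame 1>
    --FRAME
    Content-Type: image/jpeg
    Content-Length: 4387

    <jpeg bytes of frame 2>
    ...

The browser keeps the connection open and replaces the displayed image each time a new part arrives.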

And with MicroPython I used https://github.com/wybiral/micropython-aioweb with this code:

@app.route('/vid')
async def vid_handler(r, w):
    PART_BOUNDARY = "123456789000000000000987654321"
    STREAM_CONTENT_TYPE = "Content-Type: multipart/x-mixed-replace;boundary=" + PART_BOUNDARY + "\r\n"
    STREAM_PART = "Content-Type: image/jpeg\r\nContent-Length: %u\r\n\r\n"
    STREAM_BOUNDARY = "\r\n--" + PART_BOUNDARY + "\r\n"

    w.write(b'HTTP/1.0 200 OK\r\n')
    w.write(STREAM_CONTENT_TYPE)
    while True:
        w.write(STREAM_BOUNDARY)
        f = camera.capture()
        w.write(STREAM_PART % len(f))
        w.write(f)
        await w.drain()

michalpokusa commented 5 months ago

Thanks, I will try making it work and will come back to you with the results.

michalpokusa commented 5 months ago

I managed to make a working example: https://github.com/michalpokusa/Adafruit_CircuitPython_HTTPServer/blob/x-mixed-replace-example/x_mixed_replace_example.py

Make sure to also download the "frames" folder, or change the code to work with a camera from the start.

I am not sure whether it should be a feature in the lib itself - it is already quite big. Maybe I will make a PR based on the example above, as your use case seems like it might be common.

Please try the code above; I will be happy to help with any problems you encounter.

aguaviva commented 5 months ago

Thanks!!! I'll check it out later today.

aguaviva commented 5 months ago

This is awesome, and it works like a charm :) I have some minor comments:

I tried to combine the two calls below into one:

        self._send_bytes(
            self._request.connection, 
            bytes(f"{self._boundary}\r\n", "utf-8")
        )
        self._send_bytes(
            self._request.connection,
            bytes(f"Content-Type: {self._frame_content_type}\r\n\r\n", "utf-8"),
        )

but for some reason that caused a seemingly unrelated error:

Traceback (most recent call last):
  File "adafruit_httpserver/request.py", line 349, in __init__
  File "adafruit_httpserver/request.py", line 480, in _parse_request_header
  File "adafruit_httpserver/headers.py", line 59, in __init__
ValueError: need more than 1 values to unpack

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "adafruit_httpserver/server.py", line 408, in poll
  File "adafruit_httpserver/server.py", line 295, in _receive_request
  File "adafruit_httpserver/request.py", line 351, in __init__
ValueError: ('Unparseable raw_request: ', b'GET /live-feed HTTP/1.1\r\nHost: 192.168.1.57:5000\r\nUser-Agent: Wget/1.21.2\r\nAccept: */*\r\nAccept-Encoding: identity\r\nConnection: Keep-Alive\r\n\r\n')
Traceback (most recent call last):

Also, it would be great to minimize copies by moving the b"\r\n" into the _send_bytes call above (only from the second time it gets called):

self._send_bytes(self._request.connection, bytes(encoded_frame) + b"\r\n")

(note that the trailing b"\r\n" is a plain bytes literal, not encoded via "utf-8")

Also, this bit is not needed:

content_type="video/mp4"

Do you think the suggested changes would help? Otherwise this is great and gets decent throughput (140 Kb/s).

aguaviva commented 5 months ago

173 Kb/s by batching the b"\r\n" like this:

        if self.frame_idx > 0:
            self._send_bytes(
                self._request.connection,
                bytes(f"\r\n{self._boundary}\r\n", "utf-8")
            )
        else:
            self._send_bytes(
                self._request.connection,
                bytes(f"{self._boundary}\r\n", "utf-8")
            )

        self.frame_idx += 1

For some reason, merging the two _send_bytes calls doesn't work.

michalpokusa commented 5 months ago

  1. You are right about content_type="video/mp4", I don't remember putting it there; it indeed doesn't make sense, as this is not MP4. It is probably a leftover from an example I copied, or Copilot added it and I didn't spot it.

  2. I think wget-ing the live feed wouldn't work, as wget would have to keep downloading the file, and since the file is a JPEG, it probably does not know what to do with the next frames.

  3. You are right about minimizing copying of the encoded frame, although I wouldn't keep track of the current frame; you could simply do:

    self._send_bytes(self._request.connection, bytes(encoded_frame))
    self._send_bytes(self._request.connection, bytes("\r\n", "utf-8"))

    It keeps the code clear and does not really impact performance. ChunkedResponse already does it this way; the overhead is very, very minimal.

I could improve that example, but it was mainly a quickly written proof-of-concept.

  4. Regarding the ValueError, I believe the cause is the doubled \r\n at the end of the request: ...Connection: Keep-Alive\r\n\r\n

The headers parser uses splitlines and then splits each line to get the header name and value. I suspect it tried to split the empty line between the first "\r\n" and the second "\r\n", which resulted in the ValueError; the question is why the request contained a doubled newline at the end.
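
Something along these lines is what I suspect happens (a simplified sketch of the failure mode, not the library's actual parser code):

    raw_headers = "Host: 192.168.1.57:5000\r\nConnection: Keep-Alive\r\n\r\n"

    for line in raw_headers.splitlines():
        # The doubled "\r\n" at the end produces one empty line, and unpacking
        # it fails with "ValueError: need more than 1 values to unpack".
        name, value = line.split(": ", 1)

A simple "if not line: continue" guard before the split should be enough to make the parser tolerate it.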

I might add a check for that to the parser in the PR; thanks for spotting it.

When you finish your project, please share a link if you can; I always like to see new ways people use adafruit_httpserver. 👍

aguaviva commented 5 months ago

Your code and CircuitPython made this small setup work :) (image) The external antenna is highly recommended, as otherwise the camera causes crosstalk issues with the on-board antenna.

BTW, I can't believe how much better CircuitPython is compared to MicroPython. I love the web interface, and its software stack is great.

Oh, and it would be great to have a way to know when the connection gets closed (try/except maybe? or maybe a callback?), so that we could deinit the camera or whatever we are streaming.

michalpokusa commented 5 months ago

I do not have experience with MicroPython, but I agree that CircuitPython is very good and easy to use for the most part.

It might be worth presenting your setup on the Show and Tell that Adafruit hosts on Wednesdays; that way more people can see what you created.

When it comes to knowing when the connection is closed, I encourage you to check out the Websocket example from the docs; it stores the connection in global scope and uses async to handle new requests between sending frames. It might take some work to implement this behaviour, but it is not very hard; the examples show most of the functionality.

aguaviva commented 5 months ago

I looked at the Websocket example, but I couldn't figure out how to detect when the connection closes. What did you have in mind?

michalpokusa commented 5 months ago

What I meant is that when the client disconnects, sending further frames should result in a BrokenPipeError, which you can catch and use to mark the connection as closed; look at the fail_silently argument in Websocket.
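
In other words, something like this around the frame-sending call (only a sketch; adapt it to whatever your streaming code looks like):

    try:
        response._send_bytes(request.connection, frame_bytes)
        # frame went out, so the client is still connected
    except (BrokenPipeError, OSError):
        connected = False  # client went away: deinit the camera / stop streaming here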

I will have some time during the weekend, so I will try to make an async/await example that detects the closed connection. It is kind of necessary, as without it you would be stuck sending the one response, and the MCU wouldn't be able to do anything else until the client disconnects. I think I should have a working example by Monday.
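
Roughly, what I have in mind is something like this (only a sketch, not the final example; server is the usual Server instance, and send_frame / next_frame are hypothetical placeholders for the x-mixed-replace sending code and the camera capture):

    import asyncio

    stream = None  # the active streaming response, stored in global scope by the route handler

    async def handle_http_requests():
        while True:
            server.poll()  # keep accepting and handling new requests
            await asyncio.sleep(0)

    async def stream_frames():
        global stream
        while True:
            if stream is not None:
                try:
                    stream.send_frame(next_frame())  # hypothetical helpers
                except (BrokenPipeError, OSError):
                    stream = None  # client disconnected, stop streaming
            await asyncio.sleep(0.05)

    async def main():
        await asyncio.gather(handle_http_requests(), stream_frames())

    asyncio.run(main())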

michalpokusa commented 5 months ago

I am nearly done. I managed to make a live video feed to multiple devices at the same time, with detection of when the connection is dropped; this seems like a perfect example of how to create custom response types.

I will make a PR later with the cleaned-up code. For now, a little preview:

https://github.com/adafruit/Adafruit_CircuitPython_HTTPServer/assets/72110769/eaf42662-e705-4b4d-aaaa-0fd1464b9117

aguaviva commented 5 months ago

Wow, that is pretty cool! And there is no lag. Are you using an ESP32?

michalpokusa commented 5 months ago

I used an ESP32-S2 TFT for this.

The lag will probably be more noticeable with bigger frames. I suspect that under the hood the transfer is optimized somehow, as it seems too good to be true...

https://github.com/adafruit/Adafruit_CircuitPython_HTTPServer/assets/72110769/d61dac87-e77d-479e-8988-04288c29e122

aguaviva commented 2 weeks ago

Hi Michal, it's been a while :) Any chance of making this part of your great lib?

michalpokusa commented 2 weeks ago

> Hi Michal, it's been a while :) Any chance of making this part of your great lib?

To be honest, it completely slipped my mind. Thanks for reminding me.

The code I have is pretty much ready; I will add docs for the example and make a PR in a day or two.

This time, the sticky note is on my desk, so hopefully I will not forget. 😅

aguaviva commented 2 weeks ago

Many thanks :)