openfaas / python-flask-template

HTTP and Flask-based OpenFaaS templates for Python 3
MIT License

WIP template using FastAPI #69

Closed alexellis closed 1 year ago

alexellis commented 1 year ago

Description

Adds a WIP template using FastAPI

Motivation and Context

Kubiya reported an issue with FastAPI returning a 10MB payload through the OpenFaaS watchdog: the request appeared to time out.

This template uses the recommended HTTP server for FastAPI (uvicorn) and aims to be a minimal reproduction of the issue.
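
For context, the handler behind these tests might look roughly like the sketch below; the route, the random upper-case payload and the default size are assumptions reconstructed from the test output, not the template's actual code. The watchdog starts it as uvicorn main:app, per the logs further down.

# main.py - a minimal sketch of the kind of handler being exercised here
# (reconstructed from the test output; the real template code may differ).
import random
import string
import sys

from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def generate_payload(payload_size_mb: int = 1):
    # Log to stderr, which the of-watchdog forwards as "stderr: ..." lines.
    print(f"payload_size_mb: {payload_size_mb}", file=sys.stderr)

    # Build roughly payload_size_mb megabytes of random upper-case letters.
    payload = "".join(
        random.choices(string.ascii_uppercase, k=payload_size_mb * 1024 * 1024)
    )
    return {"payload_size_mb": payload_size_mb, "payload": payload}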

How Has This Been Tested?

faas-cli build
faas-cli local-run

alex@bq:~$ curl -s http://127.0.0.1:8080?payload_size_mb=1 > out-1.txt
alex@bq:~$ curl -s http://127.0.0.1:8080?payload_size_mb=2 > out-2.txt
alex@bq:~$ curl -s http://127.0.0.1:8080?payload_size_mb=10 > out-10.txt
alex@bq:~$ du -h out-1.txt 
1.1M    out-1.txt
alex@bq:~$ du -h out-2.txt 
2.1M    out-2.txt
alex@bq:~$ du -h out-10.txt 
11M     out-10.txt
alex@bq:~$ 
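
For anyone reproducing this, the function was presumably defined in a stack.yml along these lines. This is a sketch only: the lang value is a placeholder for the WIP template's name, and only the function name and image are taken from the output later in this thread.

# stack.yml - a sketch; the template (lang) name is hypothetical.
version: 1.0
provider:
  name: openfaas
  gateway: http://127.0.0.1:8080
functions:
  variable-payload:
    lang: python3-http-fastapi    # placeholder for the WIP FastAPI template
    handler: ./variable-payload
    image: ttl.sh/variable-payload:latest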
(Screenshot of the test output attached.)

Types of changes

New template


alexellis commented 1 year ago

@omerc7 and @shakedaskayo - please take a look at the example to make sure you're happy with the approach.

In each case (1MB, 2MB, and 10MB of data), the full payload was returned - no extended timeouts were needed.

Next, I ran it within a K8s cluster on Linode with OpenFaaS Standard:

alex@ae-m2 python-flask-template % DOCKER_HOST=ssh://192.168.1.14 faas-cli up

[0] < Pushing variable-payload [ttl.sh/variable-payload:latest] done.
[0] Worker done.

Deploying: variable-payload.

Deployed. 202 Accepted.
URL: http://127.0.0.1:8080/function/variable-payload

curl  -s -i http://127.0.0.1:8080/function/variable-payload > out-1.txt

1.0M    out-1.txt

curl  -s -i "http://127.0.0.1:8080/function/variable-payload?payload_size_mb=10" > out-10.txt

 10M    out-10.txt

faas-cli logs variable-payload

2023-05-30T16:41:23Z 2023/05/30 16:41:23 Version: 0.9.11        SHA: ae2f5089ae66f81a1475c4664cb8f5edb6c096bf
2023-05-30T16:41:23Z 2023/05/30 16:41:23 Forking: uvicorn, arguments: [main:app]
2023-05-30T16:41:23Z 2023/05/30 16:41:23 Started logging: stderr from function.
2023-05-30T16:41:23Z 2023/05/30 16:41:23 Started logging: stdout from function.
2023-05-30T16:41:23Z 2023/05/30 16:41:23 Watchdog mode: http    fprocess: "uvicorn main:app"
2023-05-30T16:41:23Z 2023/05/30 16:41:23 Timeouts: read: 30s write: 30s hard: 30s health: 30s
2023-05-30T16:41:23Z 2023/05/30 16:41:23 Listening on port: 8080
2023-05-30T16:41:23Z 2023/05/30 16:41:23 Writing lock-file to: /tmp/.lock
2023-05-30T16:41:23Z 2023/05/30 16:41:23 Metrics listening on port: 8081
2023-05-30T16:41:23Z 2023/05/30 16:41:23 stderr: INFO:     Started server process [10]
2023-05-30T16:41:23Z 2023/05/30 16:41:23 stderr: INFO:     Waiting for application startup.
2023-05-30T16:41:23Z 2023/05/30 16:41:23 stderr: INFO:     Application startup complete.
2023-05-30T16:41:23Z 2023/05/30 16:41:23 stderr: INFO:     Uvicorn running on http://127.0.0.1:5000 (Press CTRL+C to quit)
2023-05-30T16:41:47Z 2023/05/30 16:41:47 stderr: payload_size_mb: 1
2023-05-30T16:41:48Z 2023/05/30 16:41:48 stdout: INFO:     127.0.0.1:59366:0 - "GET / HTTP/1.1" 200 OK
2023-05-30T16:41:48Z 2023/05/30 16:41:48 GET / - 200 OK - ContentLength: 1.049MB (0.8005s)
2023-05-30T16:42:32Z 2023/05/30 16:42:32 stderr: payload_size_mb: 10
2023-05-30T16:42:40Z 2023/05/30 16:42:40 stdout: INFO:     127.0.0.1:58502:0 - "GET /?payload_size_mb=10 HTTP/1.1" 200 OK
2023-05-30T16:42:40Z 2023/05/30 16:42:40 GET /?payload_size_mb=10 - 200 OK - ContentLength: 10.49MB (8.0433s)

Looking at the HTTP request/response from within the container - with and without the watchdog middleware:

Direct:

curl -v -s localhost:5000/?payload_size_mb=10 &> 10mb-direct.txt

*   Trying ::1...
* TCP_NODELAY set
* Expire in 150000 ms for 3 (transfer 0x562255edd110)
* Expire in 200 ms for 4 (transfer 0x562255edd110)
* connect to ::1 port 5000 failed: Connection refused
*   Trying 127.0.0.1...
* TCP_NODELAY set
* Expire in 150000 ms for 3 (transfer 0x562255edd110)
* Connected to localhost (127.0.0.1) port 5000 (#0)
> GET /?payload_size_mb=10 HTTP/1.1
> Host: localhost:5000
> User-Agent: curl/7.64.0
> Accept: */*
> 
< HTTP/1.1 200 OK
< date: Tue, 30 May 2023 17:04:55 GMT
< server: uvicorn
< content-length: 10485815
< content-type: application/json
< 
{ [32768 bytes data]
["{\"payload_size_mb\": 10, \"payload\": \"FIGQLBKLOSMXYAJVHKVCUNK

Via watchdog:

 curl -v -s localhost:8080/?payload_size_mb=10 &> 10mb.txt

*   Trying ::1...
* TCP_NODELAY set
* Expire in 150000 ms for 3 (transfer 0x557be94a4110)
* Expire in 200 ms for 4 (transfer 0x557be94a4110)
* Connected to localhost (::1) port 8080 (#0)
> GET /?payload_size_mb=10 HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.64.0
> Accept: */*
> 
< HTTP/1.1 200 OK
< Content-Length: 10485815
< Content-Type: application/json
< Date: Tue, 30 May 2023 17:04:04 GMT
< Server: uvicorn
< X-Duration-Seconds: 8.176718
< 
{ [36703 bytes data]
["{\"payload_size_mb\": 10, \"payload\": \"CNMFTMYSB

Generating the string took 0.8s for the first (1MB) request, and 7.7s for 10MB.

My understanding is that you see a request that times out and never returns, but I've been unable to reproduce any issue with the same HTTP server and FastAPI.

Is there anything I could be missing that's different in your setup?

At this point, I'd suggest checking the following:

There may be an issue with your use of run_in_threadpool (see the generic sketch below).

I'm unsure whether it's related - the presence or absence of the watchdog would have no bearing on it.
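
For reference, run_in_threadpool (from Starlette, re-exported by FastAPI) is normally awaited directly with the blocking callable and its arguments. The sketch below is generic and is not the reporter's code; if the blocking call never completes, or is never awaited, the request can appear to hang regardless of whether the watchdog sits in front.

# A generic run_in_threadpool example - not the reporter's code.
import time

from fastapi import FastAPI
from starlette.concurrency import run_in_threadpool

app = FastAPI()

def build_large_payload(size_mb: int) -> str:
    # Stand-in for slow, blocking work that shouldn't run on the event loop.
    time.sleep(1)
    return "X" * (size_mb * 1024 * 1024)

@app.get("/blocking")
async def blocking_route(payload_size_mb: int = 1):
    # Hand the blocking work to a worker thread and await the full result.
    payload = await run_in_threadpool(build_large_payload, payload_size_mb)
    return {"payload_size_mb": payload_size_mb, "payload": payload}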

alexellis commented 1 year ago

@omerzamir

Seems like you don't have the timeouts set properly:

(Screenshot of the reporter's timeout settings attached.)

Please see the following: https://docs.openfaas.com/tutorials/expanded-timeouts/
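
For reference, the function-level timeouts from that tutorial are set as environment variables in stack.yml. The excerpt below is a sketch with illustrative values only; the gateway-side timeouts also need to be raised to match.

# stack.yml excerpt - illustrative values only; see the tutorial above.
functions:
  variable-payload:
    lang: python3-http-fastapi    # placeholder, as in the earlier sketch
    handler: ./variable-payload
    image: ttl.sh/variable-payload:latest
    environment:
      read_timeout: 2m
      write_timeout: 2m
      exec_timeout: 2m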

I'll get this closed.