AndySemenov opened this issue 2 years ago
@AndySemenov Thanks for the issue. Out of curiosity, if you remove --warm-containers EAGER,
does it work as expected? I am trying to narrow down where this might be stemming from.
Yes. Without the EAGER warm-up, the Lambda builds and runs:
2022-08-26 15:33:52,288 | Cleaning all decompressed code dirs
2022-08-26 15:33:55,192 | Exception on /my_endpoint [GET]
I'm experiencing the same problem on SAM CLI version 1.73.0. It only occurs when the
--warm-containers flag is set to EAGER.
Same issue on Windows 11 using SAM CLI version 1.78.0. It occurs regardless of any flags.
I'm experiencing something similar on Windows 10, SAM CLI 1.84.0. I press Ctrl+C to shut things down and I get the "Commands you can use next" message but not the "Terminate batch job (Y/N)?" prompt. One thing I noticed is that the issue occurs with --warm-containers EAGER but not with --warm-containers LAZY. However, if I send a request to the API, causing a few of the lazily loaded containers to mount, then I get the same issue as with EAGER.
If I don't use --warm-containers at all, then I can press Ctrl+C and get the terminate-batch-job prompt that allows me to shut everything down. I guess the issue is related to actually having things warm and mounted.
I'm wondering, do you observe the same in the latest SAM CLI version, v1.88.0?
I updated to SAM CLI 1.89.0. I tried --warm-containers LAZY and ran a few of the functions in my API. It allowed me to terminate, but not before giving me a lovely Python exception.
The exception was this:
Exception in thread Thread-7:
Traceback (most recent call last):
File "threading.py", line 932, in _bootstrap_inner
File "threading.py", line 870, in run
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\lib\utils\file_observer.py", line 267, in _watch_images_events
for event in self.events:
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\types\daemon.py", line 29, in __next__
return next(self._stream)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\api\client.py", line 344, in _stream_helper
yield from json_stream(self._stream_helper(response, False))
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\utils\json_stream.py", line 61, in split_buffer
for data in stream_as_text(stream):
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\utils\json_stream.py", line 17, in stream_as_text
for data in stream:
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\api\client.py", line 349, in _stream_helper
data = reader.read(1)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\urllib3\response.py", line 567, in read
data = self._fp_read(amt) if not fp_closed else b""
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\urllib3\response.py", line 533, in _fp_read
return self._fp.read(amt) if amt is not None else self._fp.read()
File "http\client.py", line 458, in read
File "http\client.py", line 492, in readinto
File "http\client.py", line 587, in _readinto_chunked
File "http\client.py", line 555, in _get_chunk_left
File "http\client.py", line 515, in _read_next_chunk_size
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\transport\npipesocket.py", line 224, in readinto
return self.sock.recv_into(buf)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\transport\npipesocket.py", line 25, in wrapped
return f(self, *args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\transport\npipesocket.py", line 152, in recv_into
return win32file.GetOverlappedResult(self._handle, overlapped, 0)
pywintypes.error: (109, 'GetOverlappedResult', 'The pipe has been ended.')
Terminate batch job (Y/N)? y
I get the same behavior and the same exception (for different thread identifiers) when I start it up with EAGER, regardless of whether I run my Lambdas or not.
I do not get an exception if I use LAZY and don't run any of my Lambdas.
At least it lets me terminate, which helps.
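The traceback above suggests the image-watcher thread keeps a blocking read open on Docker's events stream (over a Windows named pipe), and when Docker tears the pipe down during shutdown the read raises instead of ending the iteration cleanly. Below is a minimal, hypothetical sketch of the defensive pattern (the names watch_events and fake_stream are illustrative, not SAM CLI's actual code) that treats a broken stream as an end-of-events signal. Note that on Windows, docker-py's npipe transport raises pywintypes.error, which is not an OSError subclass, so a real fix would need that type in the except clause as well.

```python
import threading

def watch_events(events, handle, stop):
    """Consume an events iterator; treat a broken stream during
    shutdown as normal termination instead of letting it raise."""
    try:
        for event in events:
            if stop.is_set():
                break
            handle(event)
    except OSError:
        # The pipe/socket was closed underneath us (e.g. Docker
        # shutting down on Ctrl+C). Exit the thread quietly.
        # On Windows, docker-py raises pywintypes.error here,
        # which would also need to be caught.
        pass

# Simulate a stream that dies mid-iteration, like the npipe read did:
def fake_stream():
    yield {"status": "start"}
    raise OSError(109, "The pipe has been ended.")

seen = []
stop = threading.Event()
t = threading.Thread(target=watch_events,
                     args=(fake_stream(), seen.append, stop))
t.start()
t.join()
# The thread exits without printing a traceback; `seen` holds the
# one event delivered before the stream broke.
```

This only silences the noise on shutdown; it does not by itself fix the hang, which would also require the watcher thread not to block the process from exiting.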
Description:
After running sam local start-api, the shell is sent a Ctrl+C keystroke and the environment starts shutting down, but it hangs along with the underlying Python process.

Steps to reproduce:
sam --debug local start-api -n env_vars.json -t template.yaml --config-env qa --docker-network bridge --debug --warm-containers EAGER
Ctrl+C
Observed result:
The local environment has 4 layers attached. The environment hangs after printing the last line shown; it had been in the hanging state for more than an hour by the time this issue was filed.
Expected result:
The environment shuts down and returns control to the CMD application.
Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
sam --version: SAM CLI, version 1.55.0, installed from the official .msi package.