tiptop96 closed this issue 3 years ago
Hi @tiptop96,
The preserve operator doesn't preserve the content of the stream; it prevents the stream from being closed when exiting the streaming context. In order to preserve the content, you can add the data to a list and later chain this list with the preserved iterator:
import asyncio

import aiohttp
from aiostream import stream


async def main():
    async with aiohttp.ClientSession() as sesh:
        async with sesh.get("https://jsonplaceholder.typicode.com/todos/1") as resp:
            async with stream.preserve(resp.content).stream() as streamer:
                stack = []
                async for line in streamer:
                    stack.append(line)
                    await asyncio.sleep(1)
                    # Some criteria in the middle of content
                    if b"title" in line:
                        break
            # Chain the buffered lines with the still-open response stream
            preserved_content = stream.iterate(stack) + resp.content
            async with preserved_content.stream() as streamer:
                async for line in streamer:
                    print(line)
                    await asyncio.sleep(1)
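To make the distinction concrete, here is a minimal self-contained sketch that doesn't involve aiohttp at all (the numbers generator is just a stand-in for illustration, not part of the original answer): stream.preserve keeps the underlying source open across streaming contexts, but it does not replay items that were already consumed.

import asyncio

from aiostream import stream


async def numbers():
    for i in range(5):
        yield i


async def demo():
    source = numbers()
    preserved = stream.preserve(source)

    # First pass: consume 0, 1 and 2, then stop early.
    async with preserved.stream() as streamer:
        async for item in streamer:
            if item == 2:
                break

    # Second pass: the source was not closed, but the items consumed
    # above are gone, so this only prints 3 and 4.
    async with preserved.stream() as streamer:
        async for item in streamer:
            print(item)


asyncio.run(demo())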
You can also factorize this logic into a dedicated operator:
import asyncio

import aiohttp
from aiostream import operator, stream


@operator(pipable=True)
async def preserve_content(source, items):
    # Replay previously buffered items first
    for item in items:
        yield item
    # Then keep reading from the preserved source, buffering as we go
    async with stream.preserve(source).stream() as streamer:
        async for item in streamer:
            items.append(item)
            yield item


async def main():
    async with aiohttp.ClientSession() as sesh:
        async with sesh.get("https://jsonplaceholder.typicode.com/todos/1") as resp:
            items = []
            preserved = preserve_content(resp.content, items)
            async with preserved.stream() as streamer:
                async for line in streamer:
                    await asyncio.sleep(1)
                    # Some criteria in the middle of content
                    if b"title" in line:
                        break
            async with preserved.stream() as streamer:
                async for line in streamer:
                    print(line)
                    await asyncio.sleep(1)
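As a note on the pipable=True flag: it lets the operator participate in aiostream's pipe syntax alongside other operators. A small hedged example (assuming the same resp and items as in main above, with pipe.map used purely for illustration):

from aiostream import pipe

# Inside main(), after obtaining resp as above:
items = []
# Strip whitespace from each preserved line; any other pipable
# operator could be chained here instead.
preserved = preserve_content(resp.content, items) | pipe.map(bytes.strip)
async with preserved.stream() as streamer:
    async for line in streamer:
        print(line)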
As a side note, I also receive a warning as follows: [...]
The stream operators are classes that are created dynamically, so I'm not surprised that linters might struggle to detect their interface. There might be a way to help them figure out the right info; I'll try to look into it later.
Hope it helps :)
Ahh, my bad for misunderstanding the operator! But this is awesome, thanks so much for the thorough response and the awesome lib! 🔥
No problem :)
I am looking to preserve an aiohttp streaming response through their async iterator. The idea is to stream a response to check for a specific criterion that will appear somewhere in the first 5-20 lines of the response; if it is not matched we would close the stream, and if it is matched we would pipe it into a blob storage service.
However, I cannot seem to preserve the stream. I have tried this a few ways but cannot seem to get it right.
At first I figured it would be something as simple as this:
As the comments say, only one loop runs.
I also tried wrapping the stream in an aiostream.stream.operator, but it gave me an exception: [...]
As a side note, I also receive a warning as follows: [...]
If I remove the stream() call, it also raises an exception.
Any guidance on how to do this, or if it is even possible, is much appreciated. 👍
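To tie the answer back to this goal, here is a hedged sketch that reuses the preserve_content operator defined above; upload_to_blob is a hypothetical placeholder for whatever blob-storage client is in use, not a real API:

import asyncio

import aiohttp


# Hypothetical placeholder for the real blob storage client
async def upload_to_blob(lines):
    for line in lines:
        print("uploading", line)


async def main():
    async with aiohttp.ClientSession() as sesh:
        async with sesh.get("https://jsonplaceholder.typicode.com/todos/1") as resp:
            items = []
            matched = False
            preserved = preserve_content(resp.content, items)
            async with preserved.stream() as streamer:
                async for line in streamer:
                    # The criterion is expected within the first 5-20 lines
                    if b"title" in line:
                        matched = True
                        break
            if matched:
                # Replay the buffered lines plus the rest of the response
                async with preserved.stream() as streamer:
                    await upload_to_blob([line async for line in streamer])
            # If not matched, simply exiting the context closes the response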