Closed lewoudar closed 3 years ago
Yeah, that's not going to work. It would be the same as running CPU-heavy work in a proper async function. There is really no magic behind the async_ decorator: it makes your function behave like a real async function, with the same benefits and disadvantages.
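For CPU-bound work, the usual asyncio pattern is to hand the computation off to an executor so the event loop stays responsive. A minimal stdlib-only sketch (plain asyncio, not specific to this library):

```python
import asyncio


def fib(n: int) -> int:
    """CPU-bound work: naive recursive Fibonacci."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)


async def main() -> int:
    loop = asyncio.get_running_loop()
    # Passing None uses the loop's default ThreadPoolExecutor; the
    # event loop keeps serving other coroutines while fib runs in a
    # worker thread. A ProcessPoolExecutor would give true parallelism
    # for pure-Python CPU work.
    return await loop.run_in_executor(None, fib, 25)


if __name__ == "__main__":
    print(asyncio.run(main()))
```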
Ok, thanks for the answer. Another question, not related to the first one. When writing code like this:

with patch_blocking():
    import requests

will it just patch the concerned modules (ssl, socket, etc.) for the requests import, or will the modules stay patched for everything else until the end of the program execution?
The modules are patched while the code inside the with block runs. As soon as that ends, everything is restored to normal. Any imports that are issued in the with block will see the patched modules. Imports that are issued outside (either before or after the patch_blocking) will see the normal stuff.
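The patch-then-restore behavior can be pictured with a plain contextlib sketch. The names here are hypothetical stand-ins (the real patch_blocking swaps in green versions of pieces of ssl, socket, and friends), but the restore-on-exit mechanics are the same idea:

```python
import contextlib
import socket
import types


@contextlib.contextmanager
def patch_module_attr(module: types.ModuleType, name: str, replacement):
    """Temporarily replace module.<name>, restoring it on exit."""
    original = getattr(module, name)
    setattr(module, name, replacement)
    try:
        yield
    finally:
        # Restored even if the block raises, mirroring how everything
        # goes back to normal when the with block ends.
        setattr(module, name, original)


def fake_create_connection(*args, **kwargs):  # hypothetical stand-in
    raise RuntimeError("patched!")


with patch_module_attr(socket, "create_connection", fake_create_connection):
    pass  # code running here sees the patched attribute

# Outside the with block the original is back.
assert socket.create_connection is not fake_create_connection
```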
Keep in mind that once something is imported, a second import of the same thing will be ignored. For example:
import socket

with patch_blocking():
    import requests
This is not going to work. Requests will import socket at some point, but socket has already been imported in its original form, so the patch in this case will not work.
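The one-import rule comes from Python's module cache: a second import of the same module is a lookup in sys.modules, so the module body never runs again. A quick stdlib demonstration:

```python
import importlib
import sys

first = importlib.import_module("json")
# The second import is a cache hit on sys.modules: the module body does
# not execute again, so whatever form the module took on first import
# (patched or not) is what every later importer sees.
second = importlib.import_module("json")

assert first is second
assert sys.modules["json"] is first
```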
Many thanks for the clarification!
You should probably mention the subtlety about import and patch_blocking
in the documentation.
@miguelgrinberg Can you explain the idea behind this approach? I mean, why did you implement a context manager instead of a regular function to do the patching? As you said, a second import of the same module will be ignored.
@spumer this provides more control, and it is also a bit more explicit. But if you want to use a monkey patching function you can write one yourself. Something like this:
def monkey_patch_socket():
    with patch_blocking():
        import socket
Hi, I think I have the answer but just to be sure: is it a bad idea to run CPU-intensive code with async_ instead of running it in a thread pool? (My assumption is that the answer is yes.)