python-trio / trio

Trio – a friendly Python library for async concurrency and I/O
https://trio.readthedocs.io

trio.run() should take **kwargs in addition to *args #470

Open kennethreitz opened 6 years ago

kennethreitz commented 6 years ago

this would greatly enhance usability — especially for function declarations like:

async def request(method, url, *, pool=None, preload_content=False):
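
For context, trio.run() only forwards positional arguments today, so a keyword-only parameter such as pool cannot be passed directly. A minimal sketch of the functools.partial workaround this issue is trying to avoid (with request stubbed out purely for illustration):

import functools
import trio

async def request(method, url, *, pool=None, preload_content=False):
    return (method, url, pool, preload_content)

# Keyword arguments have to be bound up front, because trio.run()
# only accepts positional *args for the async function:
print(trio.run(functools.partial(request, "GET", "https://example.org", preload_content=True)))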

smurfix commented 4 years ago

@mentalisttraceur That was me, at https://github.com/python-trio/trio/issues/470#issuecomment-473648115 Thanks for the endorsement ;-)

I dunno about )(. Personally I don't have a problem with them. Also, if you use the call inside another wrapper, as in

nursery.start_soon(
    trio.to_thread.run_sync.options(cancellable=True),
    func, *args, **kwargs
)

having to attach .call or .do or … to the second line looks strange. But maybe we can simply let people use whatever style they prefer …

oremanj commented 4 years ago

Do you find that writing def __add__(self, other): ... or referencing obj.__name__, etc., grates on you and makes you dislike using python?

No -- but I do think of the "dunder namespace" as fundamentally belonging to the language implementation, not to library authors. Some libraries do use it, true, but generally as something that "looks like" how the implementation uses it: a vaguely "magical" method or attribute name. Seeing dunder keyword arguments feels especially weird to me -- if I saw that in some unfamiliar code, I'd wonder whether I should consult the target method's documentation or Python's. I'm sure I'll get used to it if it's the way things are, but I suspect I'm not the only one who will find it somewhat off-putting at first.

nursery.start_soon.options(...)(...)

I like this approach, and I do think we can type it using PEP 612. The implementation will need a bit of descriptor trickery to remember which nursery it should eventually call start_soon on. I agree that )( might be polarizing, but I think __keywords__ have a similar risk.
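
A rough sketch of the descriptor trickery being referred to, not Trio's actual implementation; the class names, FakeNursery, and the _start_soon_impl entry point are all invented for illustration:

import functools

class _StartSoonWithOptions:
    """Bound helper returned by the descriptor; it remembers its nursery."""

    def __init__(self, nursery):
        self._nursery = nursery

    def __call__(self, async_fn, *args, name=None):
        # Plain start_soon behaviour, forwarded to the made-up internal
        # scheduling entry point.
        return self._nursery._start_soon_impl(async_fn, *args, name=name)

    def options(self, *, name=None):
        def call(async_fn, *args, **kwargs):
            return self._nursery._start_soon_impl(
                functools.partial(async_fn, *args, **kwargs), name=name
            )
        return call

class _StartSoonDescriptor:
    def __get__(self, nursery, owner=None):
        if nursery is None:
            return self
        # The "descriptor trickery": attribute lookup on the instance hands
        # the helper a reference to that nursery.
        return _StartSoonWithOptions(nursery)

class FakeNursery:
    """Stand-in for trio's Nursery, only here to make the sketch runnable."""

    start_soon = _StartSoonDescriptor()

    def _start_soon_impl(self, async_fn, *args, name=None):
        print("would schedule", async_fn, args, "with name", name)

nursery = FakeNursery()
nursery.start_soon(print, "hello")
nursery.start_soon.options(name="worker")(print, "hello", sep=", ")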

jab commented 4 years ago

With the .options approach, could the docs mitigate the concern over )( by breaking up an initial example into multiple subexpressions?

custom_start_soon = nursery.start_soon.options(name="foo")
custom_start_soon(f, spam='eggs')

(Maybe the docs could then even briefly note that this can be combined into a 1-liner like nursery.start_soon.options(name="foo")(f, spam='eggs') if you're into that sort of thing.)

In the case of __keywords__, on the other hand, I can't think of any similar way to build up to their usage incrementally to reduce any "this looks weird" feelings.

mentalisttraceur commented 4 years ago

@njsmith Yeah, that is much simpler. Nice. The only downside I can think of with your example implementation is that help(trio.run) will describe an instance of takes_callable? help in the Python REPL is a common way that I quickly reference documentation for libraries I am using. Unfortunately after spending a few hours on it, there doesn't seem to be a comparably simple solution that gets nice docstrings everywhere - especially not while also preserving other I-presume-desirable properties like pickleability or introspectability.

I hear you on )(. It definitely takes an extra mental step, and I've worked with developers who I know would get confused at first the moment they see it. Part of me is tempted to just say that the way to solve that problem is teaching, not limiting ourselves. But I am also sympathetic that APIs are user interfaces and that unless there is good reason, user interfaces should be tuned to people as they are.

If the )( is ultimately decided to be too much of an issue, I second the suggestion by @smurfix to permit both - for example just put call = __call__ at the bottom of the takes_callable class.

smurfix commented 4 years ago

takes_callable could simply copy the wrapped function's docstring onto itself.

mentalisttraceur commented 4 years ago

@smurfix Yeah that's a good start. However:

  1. Do we only want a good docstring on run, or also on run.options? Maybe even run.options(...)? (The former is easy enough as a one-liner docstring like f"""Options setter for {wrapped.__module__}.{wrapped.__qualname__}""". Someone else can probably think of a good simple one for the latter.)

  2. help shows more than just the docstring - for class instances it'll display the whole class, including all of its methods.

But my point was just that it takes extra lines for each additional bit of functionality: if we want the result of .options(...) to be pickleable we can't use lambdas, so that's more lines; if we want the options to be programmatically accessible or visible in the docstring, that's extra lines; if we want the help of .options to say what it wraps (rather than just, generically, that it's a wrapper) and we can't use f-strings, that's maybe extra lines; etc.

All doable, but when it's all done every variant I've thought of is a lot more lines than just the very nice and simple variant up above. But of course it becomes simpler if we just say "we don't care about it being pickleable", "we don't care about help on the .options showing a docstring that gives any information about what it's options for", etc.

smurfix commented 4 years ago

Prettying up the __doc__string and the help() output is certainly doable but, well, perfect is the enemy of good and all that.

No, I have to admit that I don't care at all for callables to be pickle-able. Doing that is rarely if ever good design. Besides, except for trio.run itself (and why would you want to pickle that?) they're all on classes which can't be pickled anyway.

mentalisttraceur commented 4 years ago

Okay, if pickling is discarded, then I think a traditional function-style decorator is sufficient to cover all docstrings and help outputs nicely:

import functools
from functools import partial

def takes_callable(wrapped):
    @functools.wraps(wrapped)
    def wrapper(fn, /, *args, **kwargs):
        return wrapped(partial(fn, *args, **kwargs))

    name = f'{wrapped.__module__}.{wrapped.__qualname__}'

    def bind_options(**options):
        def call_with_options(fn, /, *args, **kwargs):
            return wrapped(partial(fn, *args, **kwargs), **options)

        # f-strings can't be used as docstrings, so set __doc__ explicitly:
        call_with_options.__doc__ = f"""Call {name} with options: {options}"""

        # Enable `).call(` as an alternative to `)(`:
        call_with_options.call = call_with_options

        return call_with_options

    bind_options.__doc__ = f"""Bind options to {name}"""

    # Plain attribute on the wrapper function, so no `self` is involved:
    wrapper.options = bind_options
    return wrapper

Which still feels a little too complex, but it's the best I can do so far while

  1. retaining docstrings and nice help output on wrapped, .options and .options(...) and
  2. supporting .options(...).call(...) as an equivalent alternative to .options(...)(...).
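
For completeness, a hedged usage sketch of what the decorator above would give if trio.run were wrapped with it (nothing here is existing Trio API):

import trio

run = takes_callable(trio.run)   # hypothetical wrapping

async def request(method, url, *, preload_content=False):
    return (method, url, preload_content)

run(request, "GET", "https://example.org", preload_content=True)
run.options(clock=None)(request, "GET", "https://example.org")
run.options(clock=None).call(request, "GET", "https://example.org")

help(run) would show trio.run's docstring (via functools.wraps), and help(run.options) would show the short "Bind options to ..." docstring set above.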

altendky commented 4 years ago

https://docs.python.org/3/reference/lexical_analysis.html#reserved-classes-of-identifiers

Any use of __*__ names, in any context, that does not follow explicitly documented use, is subject to breakage without warning.

(pretty sure this doesn't even qualify as two cents but...)

mentalisttraceur commented 4 years ago

@altendky I actually think this is very important to bring up! Up until now I thought Python just took a casual "we use dunder identifiers for internal stuff sometimes" position, but it sounds like Python is officially taking the Serious(tm) "we reserve all dunder identifiers for internal usage at any time" position.

dhirschfeld commented 3 years ago

A new PEP could be the best way forward in case anyone wants to comment: https://github.com/python/peps/blob/master/pep-0637.txt

I guess trio wouldn't be able to use it for quite a while though...

smurfix commented 3 years ago

Well, we could simply alias __getitem__ to options, which would transparently allow both trio.run.options(option=value)(func, *args, **kw) and trio.run[option=value](func, *args, **kw) (when the PR is implemented, assuming that it will be).

+1 from me.

smurfix commented 3 years ago

https://github.com/python/peps/pull/1615

brettcannon commented 3 years ago

PEP 637 was rejected, so that won't be an option.

tiangolo commented 2 years ago

I want to point out another thing to keep in mind related to usability and user experience:

Now that ParamSpec is accepted (and even supported by mypy), it enables a trick that provides autocompletion and tooling support for the arguments and return values of the functions passed to Trio functions. This means better tooling for users' code.

And as ParamSpec is also available in typing_extensions, all this works for current/older versions of Python (3.6+).


I'm making a PR here (at least as a conversation starter/re-starter): https://github.com/python-trio/trio/pull/2208

It proposes adding three new alternative functions that just wrap the current ones and add these typing tricks.

As an example, here's how it could look when sending tasks to a worker thread (similar for Nurseries; more examples in the PR):

[screenshots omitted]

main.py:13: error: Unsupported operand types for + ("str" and "int")
Found 1 error in 1 file (checked 1 source file)
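
A minimal sketch of the ParamSpec trick being described; run_sync_in_thread is an illustrative name, not the API proposed in the PR:

import functools
from typing import Callable, TypeVar

import trio
from typing_extensions import ParamSpec  # part of `typing` itself on Python 3.10+

P = ParamSpec("P")
R = TypeVar("R")

async def run_sync_in_thread(fn: Callable[P, R], *args: P.args, **kwargs: P.kwargs) -> R:
    # functools.partial does the actual kwarg forwarding; the ParamSpec
    # annotations are what let mypy and editors check and autocomplete
    # the arguments of `fn`.
    return await trio.to_thread.run_sync(functools.partial(fn, *args, **kwargs))

def add(a: int, b: int) -> int:
    return a + b

async def main() -> None:
    result = await run_sync_in_thread(add, 1, b=2)   # inferred as int
    # await run_sync_in_thread(add, "1", b=2)        # rejected by mypy
    print(result)

trio.run(main)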

noahbkim commented 1 year ago

Hi, I'm sorry if this is covered in the discussion above and I missed it, but why can't there be a simpler, underlying interface along the lines of:

def run_explicit(async_fn, args: Iterable[Any], kwargs: Dict[str, Any], clock=None, ...):

This introduces trivial overhead and would give a verbose option for developers who need the flexibility.

oremanj commented 1 year ago

run_explicit(fn, (1, 2), {"foo": 3}, clock=...) is more typing and harder to read than run(partial(fn, 1, 2, foo=3), clock=...), and would need to be replicated for every function in Trio that takes a function and arguments (there's a long list above).

noahbkim commented 1 year ago

Yes, that's why you still offer run() as is (but have it pass its args down to run_explicit). partial() is a no-go in the caching scenarios I'm currently dealing with because of the inline/anonymous functions. I think this would be a very straightforward change and I would be willing to write the PR.

oremanj commented 1 year ago

It increases the API surface to little benefit; I don't think there would be appetite for the change. I don't understand what's causing problems for you with partial(), though; maybe you could elaborate on the problems you're seeing?

smurfix commented 1 year ago

@oremanj Yes, partial is bad for you if you do any caching.

>>> from functools import partial
>>> a=partial(int,1.23)
>>> b=partial(int,1.23)
>>> a==b
False
>>> hash(a)
8775412814713
>>> hash(b)
8775412814723
>>> 
>>> a=(int,1.23)
>>> b=(int,1.23)
>>> a==b
True
>>> 

@noahbkim did I understand the problem correctly?

I do wonder whether there's a rationale for partial not offering reasonable __hash__ and __eq__ built-ins, other than "we didn't think of that" …

oremanj commented 1 year ago

I'm not really a fan of complicating Trio's API to support caching use cases that I think will be pretty uncommon, especially since it's easy to write run_explicit yourself.
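
To make that last point concrete, here is one hedged way a user could write run_explicit on top of today's API (the name and shape follow noahbkim's sketch above; nothing here is Trio API):

import functools
import trio

def run_explicit(async_fn, args=(), kwargs=None, **run_options):
    # run_options (e.g. clock=...) are forwarded to trio.run unchanged.
    return trio.run(functools.partial(async_fn, *args, **(kwargs or {})), **run_options)

async def request(method, url, *, preload_content=False):
    return (method, url, preload_content)

print(run_explicit(request, ("GET", "https://example.org"), {"preload_content": True}))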

arthur-tacca commented 1 year ago

@smurfix

I do wonder whether there's a rationale for partial not offering reasonable __hash__ and __eq__ built-ins, other than "we didn't think of that" …

There's a surprisingly good reason, described in CPython issue 3564: if partial objects tested for equality by comparing func, args and keywords then you could end up with a situation where two partial objects compare as equal but give different results in practice so really ought to be compared as different. That's because a function can tell the difference between e.g. 1 and 1.0 even though they compare as equal. For a more extreme example, consider partial(id, x) and partial(id, y) where x and y are different objects that compare equal.
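
A tiny illustration of that pitfall (not taken from the CPython issue itself):

from functools import partial

# 1 == 1.0, so a naive value-based comparison would call these "equal" ...
p1 = partial(str, 1)
p2 = partial(str, 1.0)
print(p1(), p2())   # ... yet they produce '1' and '1.0' respectively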

rafalkrupinski commented 2 months ago

I'd love to see this implemented the way @tiangolo suggested in #2208.

Properly type hinting partial is much more difficult than "inheriting" type hints from the wrapped function.