pnkfelix opened this issue 10 years ago
When you say "struct builders", are you talking about the builder pattern, or the struct literal syntax? If you're talking about augmenting the struct literal syntax with default values, then I'd still definitely endorse keyword and default arguments, since this is traditionally the kind of behavior that constructors are meant to handle (and using constructors allows the author to keep struct members private). It sounds like both these options will solve the same set of problems, and keyword and default arguments will lessen the overall amount of boilerplate, since they don't force users to always use structs.
Whether it's good to force users to always use structs is a matter of opinion, but this seems to me like arguing against tuples because "users should be using a struct". I would prefer to provide both tools and let users make informed decisions based on their use case.
@bfops What I'm talking about is
struct Foo {
    a: u32 = 300, // <- the `300` is the default value of the field `a`
    b: i32, // This field does not have a default value
    c: i32, // Neither does this one
}
fn my_function() {
    let var = Foo {
        // Note that I do not define the value of `a` here
        b: 42,
        c: 999,
        ... // Fill the rest of the struct with the default values
    };
    assert_eq!(var.a, 300); // var.a == 300
}
or something like that.
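For comparison, here is a minimal, runnable sketch of how close today's Rust already gets to this with a hand-written Default impl plus struct update syntax; the per-field "= 300" in the definition above is exactly the part that does not exist yet.
struct Foo {
    a: u32,
    b: i32,
    c: i32,
}

impl Default for Foo {
    fn default() -> Foo {
        // Hand-written defaults stand in for the proposed `a: u32 = 300`.
        Foo { a: 300, b: 0, c: 0 }
    }
}

fn main() {
    // `..Default::default()` plays the role of the proposed `...` above.
    let var = Foo { b: 42, c: 999, ..Default::default() };
    assert_eq!(var.a, 300);
}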
OK, so you’re basically saying that with enough syntactic sugar, structs can become simple enough that they are able to make default/named parameters superfluous?
i think there are several problems, the biggest of which is imho: there’s no clear relation between the function and its param struct
sure, you can create fn foo and struct FooParams, but you'll still have to define both separately, users have to import both separately, and you have to spell it out.
use fooer::{foo,FooParams};
foo(FooParams {...});
just doesn’t have the same simplicity as
use fooer::foo;
foo();
Agreed, both foo(FooParams {...}) and OpenBuilder().this().that().whatever() look like hacks around the fact that there's no support for optional/default arguments.
Agreed, both foo(FooParams {...}) and OpenBuilder().this().that().whatever() look like hacks around the fact that there's no support for optional/default arguments.
You could look at this from a different direction: languages that are not expressive enough to express the builder pattern in a comfortable way have kwargs to gloss over that.
I personally really enjoy using the builder pattern most of the time (although it does have some quirks regarding finalizers) and think it is strictly more expressive (you can’t express e.g. mutually exclusive options with kwargs).
I find the builder patterns very nice to work with too.
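For readers less familiar with the pattern, here is a minimal runnable sketch of the builder style being discussed, using a hypothetical Open/OpenBuilder pair modeled loosely on the OpenBuilder().this().that() example above; finish() is the finalizer quirk mentioned.
struct Open {
    read: bool,
    write: bool,
    create: bool,
}

struct OpenBuilder(Open);

impl OpenBuilder {
    fn new() -> OpenBuilder {
        // These literals play the role of default argument values.
        OpenBuilder(Open { read: true, write: false, create: false })
    }
    fn write(mut self, yes: bool) -> OpenBuilder {
        self.0.write = yes;
        self
    }
    fn create(mut self, yes: bool) -> OpenBuilder {
        self.0.create = yes;
        self
    }
    fn finish(self) -> Open {
        self.0
    }
}

fn main() {
    let opts = OpenBuilder::new().write(true).create(true).finish();
    assert!(opts.read && opts.write && opts.create);
}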
I think Builder is the most ugly thing I've ever seen. First time I thought it's some piece of brainfuck code.
you know what? why not combine this stuff?
add a trait Params: Default and syntactic sugar for invoking functions that have a Params-implementing last parameter.
also add syntactic sugar for automatically defining an anonymous params struct.
e.g.:
#[derive(Params,Default)]
struct ExplicitParams {
    herp: u8,
    derp: i8,
}
// use when you have many parameters
fn explicit(durr: &str, named: ExplicitParams) { ... }
// use when you only have few parameters
fn implicit(foo: &str = "xy", bar = 1u8) { ... }
fn main() {
    explicit("unnamed", derp = 2);
    // desugared: explicit("unnamed", ExplicitParams { derp: 2, ..Default::default() })
    implicit(bar = 2);
}
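For reference, the desugared form shown in the comment above already compiles today; a runnable sketch follows, with a made-up function body just so there is something to assert on.
#[derive(Default)]
struct ExplicitParams {
    herp: u8,
    derp: i8,
}

fn explicit(durr: &str, named: ExplicitParams) -> (usize, i8) {
    (durr.len(), named.derp)
}

fn main() {
    // What the sugared `explicit("unnamed", derp = 2)` would expand to:
    let r = explicit("unnamed", ExplicitParams { derp: 2, ..Default::default() });
    assert_eq!(r, (7, 2));
}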
I'm pretty new to Rust, so I haven't built complex macros, but I've seen others build really complex and powerful macros (even making a Python-like list comprehension in Rust!).
Is this something macros can already do and, if not, isn't something like default/keyword args really in the domain of macros? It seems like the exact sort of thing macros are intended to handle.
If it were macros, it would be great to have an easy way to declare it, maybe something like:
#[kwargs(y:5, z:MyStruct{x: 4, y: 7})]
fn myfunc(x: u32, y: u32, z: MyStruct) -> u32 { ... }
This would wrap myfunc with a macro. So you could still call the regular function:
let z = myfunc(1, 2, MyStruct{x:3, y:85}) // must specify all values
or you could call the macro/kwarg version:
let z = myfunc!(1, y:72) // don't need to specify y or z, y overridden here
// same as:
// let z = myfunc!(x:1, y:72)
This seems like the easiest way to be orthogonal to other rust features.
Edit: replaced = with : since = returns () in Rust.
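To make the idea concrete, here is a rough, runnable sketch of what such a kwargs-style wrapper can already express with a plain macro_rules! macro today; the scale function and its default factor of 2 are invented for the example.
fn scale(value: u32, factor: u32) -> u32 {
    value * factor
}

macro_rules! scale {
    ($value:expr) => { scale($value, 2) };                    // omitted kwarg -> default
    ($value:expr, factor: $f:expr) => { scale($value, $f) };  // kwarg overrides it
}

fn main() {
    assert_eq!(scale!(10), 20);
    assert_eq!(scale!(10, factor: 3), 30);
}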
@e-oz
I think Builder is the most ugly thing I've ever seen. First time I thought it's some piece of brainfuck code.
I encourage you to read the Rust Code of Conduct, in particular these two lines:
- Please be kind and courteous. There's no need to be mean or rude.
- Respect that people have differences of opinion and that every design or implementation choice carries a trade-off and numerous costs. There is seldom a right answer.
This comment does neither of these things. Please be respectful to others.
@vitiral that is my opinion, so please follow the second rule from the quotation.
Is this something macros can already do and, if not, isn't something like default/keyword args really in the domain of macros? It seems like the exact sort of thing macros are intended to handle.
i don’t think so. when i see a macro invocation, i know that more is going on than a simple function call, and there are usually only a few macros. if you now couldn’t learn the 1-4 macros of a library anymore because every function became one, that would be bad.
What I am looking for is a Qt binding in Rust (mainly Qt Widgets rather than QML). It seems that such a thing won't happen unless this issue is solved. Obviously the builder/macro approach would be too burdensome for this application. IMO a language that cannot wrap Qt is not very "multi-paradigm".
when i see a macro invocation, i know that more is going on than a simple function call
I guess that was my whole point - kwargs are always going to have more going on than a function call, aren't they? The creation of variables you don't see is going to take up more bytes in the binary certainly. I have a hard time believing they are ever a "zero cost abstraction", except in the case where the function is inlined.
Also, it would only "increase the number of macro creations / invocations" when you wanted kwargs. In my opinion kwargs are little more than convenience (and coming from python, they are VERY convenient), which is what macros are supposed to do.
As for static analysis, I think an analyzer could follow all uses that use the kwargs attribute I suggested, certainly.
@zhou13 why would macros be too burdensome?
One of the benefits of macros is that they let your kwargs have arbitrary code, so you can use constructors, etc for them. For example:
#[kwargs(y: 5, z: MyStruct::new!(4))] // `new` can even use kwargs itself! In this case `y: 7`
fn myfunc(x: u32, y: u32, z: MyStruct) -> u32 { ... }
This is a much more ergonomic way to declare kwargs I would think, and I think it would probably be a mistake to allow this kind of behavior in non-macros.
Edit: This gets more significant when you have a "default struct" that you want for one of your values. Ideally, you could just use a reference to a global variable for the default, but there might be cases where you can't. For example, suppose I wanted to make a read function for a socket where you can pass in your own buffer to conserve on memory allocations:
fn read(socket: &mut Socket, size: u32, buf: &mut Vec<u8>)
The buf variable is there to give the user an optional performance boost, but you don't want the user to need to include it if they don't care that much about performance. Therefore you declare the kwargs:
#[kwargs(buf: &mut vec!())]
This gives your api that flexibility.
Doing arbitrary code without macros sounds non-rusty to me, so doing kwargs any way other than macros, while keeping this ability, is not possible.
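For comparison, here is a sketch of what the non-macro status quo looks like for the read/buf example above, using Option to signal "use the default"; read_into and its behaviour are invented purely for illustration.
fn read_into(size: u32, buf: Option<&mut Vec<u8>>) -> usize {
    let mut local = Vec::new();
    let buf = buf.unwrap_or(&mut local); // allocate only when the caller passed nothing
    buf.resize(size as usize, 0);
    buf.len()
}

fn main() {
    assert_eq!(read_into(4, None), 4); // let the function allocate
    let mut mine = Vec::with_capacity(64);
    assert_eq!(read_into(8, Some(&mut mine)), 8); // reuse the caller's buffer
}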
The creation of variables you don't see is going to take up more bytes in the binary certainly
not more than adding code to it like let arg_val = match opt_arg { Some(v) => v, None => default };
you also can’t reference macros, and they’ll have no connection to “their” function other than the name.
sorry, i don’t think this is a good substitution at all.
@flying-sheep What do you mean "you can't reference macros"?
well, you can create references to functions
@flying-sheep Ah, I see what you mean.
This is actually a very good question -- would it be even possible to have zero-cost abstractions for function pointers/references with kwargs?
I ask, because the problem seems non-trivial to me. Say you have fn A(x: u8, y=18: u16) and fn B(x: u8, y=49: u16). If you have a function with function type fn(u8, u16), you simply wouldn't be able to use the kwarg functionality. How would the compiler know which value you want for u16 when you can take either A or B at runtime?
You might be able to write the function type to be fn(x: u8, y=18: u16), and then this would only be able to accept A but not B, and you would be able to use kwarg functionality -- but this seems very limiting and not all that useful.
Without this restriction though, I don't think using kwargs with function pointers is even possible without incurring a cost (like carrying around the default value with every kwarg function, forcing every function call to check if the function is a kwarg function to apply defaults, etc). Again, the macro makes it explicit that you don't care about the extra cost and you just want the convenience.
the macro option also has the benefit that it doesn't create two classes of functions, functions that have at least one kwarg implementation for them and functions that don't. There will always be a runtime penalty if a function could potentially take kwargs (i.e. if you had fn C(x: u8, y: u16) and then you defined A or B, using references to C would now be slower since the compiler would have to check whether there were kwargs).
A macro is clearly separate from a function, so it does not create this separation.
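To illustrate the function-pointer concern concretely, here is a small runnable sketch; the bodies of a and b are made up, and their defaults exist only in comments because the feature does not exist.
fn a(x: u8, y: u16) -> u32 { x as u32 + y as u32 } // imagine y defaulting to 18
fn b(x: u8, y: u16) -> u32 { x as u32 * y as u32 } // imagine y defaulting to 49

fn main() {
    // Both coerce to the same fn pointer type; the (hypothetical) defaults are gone.
    let f: fn(u8, u16) -> u32 = if true { a } else { b };
    // A call omitting y through `f` could only work if fn pointers carried their
    // defaults at runtime - exactly the cost described above.
    assert_eq!(f(1, 2), 3);
}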
why would macros be too burdensome?
@vitiral Because Qt heavily uses optional args and function overloading. If we want to use macros, then you almost need to add ! to the end of every basic function. More importantly, I think you cannot use macros in trait methods.
Actually it looks like you cannot even use macros as parts of crates or modules. I.e. I cannot write
use mylib;
mylib::!mymacro()
This is a problem for my proposal (and actually a problem with macros; I'm curious to know why the devs chose this). I did not realize the full extent of this until now.
The fact that it is not easy to import and use macros seems like a problem with macros. I personally would like macros to be scoped like everything else, and don't see why that's not possible in theory.
Without scoped macros, though, obviously using macros for kwargs doesn't make sense.
This is a problem for my proposal (and actually a problem with macros; I'm curious to know why the devs chose this). I did not realize the full extent of this until now.
Because macro expansion happens early in the compilation process.
I would be interested in keyword-based / default parameters. One concern I have seen raised often is the fact that the parameter names become part of the API and thus changing it becomes a breaking change. Keyword-parameters should therefore be opt-in.
I am not sure about what designs have been proposed thus far, I skimmed through this issue and an old rfc but this is one idea I had:
// Normal function with positional parameters
pub fn positional(a: i32, b: i32) -> i32;
// Function with keyword parameters
pub fn keyword(pub a: i32, pub b: i32) -> i32;
// Function with default keyword parameters
pub fn default_keyword(pub a: i32 = 0, pub b: i32 = 2) -> i32;
// Mix of all
pub fn mixed(a: i32, pub b: i32, pub c: i32 = 0);
pub is already a keyword. Using it here would totally coincide with its meaning: "making something public / part of the API". The use of pub would make it very clear that this parameter is facing the outside world, making it part of the API and everything that that implies.
I thought I would write it down here so that others could share their opinions about this idea :)
Concerning the builder pattern, I think it is not a bad pattern. It definitely has its use cases and they certainly do partly overlap with default / keyword parameters. But I don't think that should be a reason to exclude keyword parameters. When they fit on one line they are far less bulky than the builder pattern. In my opinion, lib authors could even propose both at the same time!
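For a sense of the boilerplate the proposal would remove, here is a sketch of what a call like the proposed default_keyword(a: 3) has to look like in today's Rust, with Option standing in for "use the default"; the function body is invented just to make it runnable.
fn default_keyword(a: Option<i32>, b: Option<i32>) -> i32 {
    let a = a.unwrap_or(0); // stands in for the proposed `pub a: i32 = 0`
    let b = b.unwrap_or(2); // stands in for the proposed `pub b: i32 = 2`
    a + b
}

fn main() {
    assert_eq!(default_keyword(Some(3), None), 5); // vs. the proposed `default_keyword(a: 3)`
    assert_eq!(default_keyword(None, None), 2);    // vs. the proposed `default_keyword()`
}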
Wow @azerupi that's a fantastic idea for the syntax. I really like it.
@azerupi That's a solid suggestion!
@azerupi, nice idea. My only worry is that this syntax cannot be extended to a future RFC about default values for struct fields.
Sorry, I am new here, so I may miss some nuances, but I really do not like this use of "pub" - it makes the signature noisier. Yes, exposed parameter names can increase the probability of breaking changes, but I doubt that parameter names will often change without changing their meaning in a stable API. Besides, we have semantic versioning. And if you change behavior, then it will even be helpful to break compilation.
Additionally, it will be library authors' responsibility to choose between "stability" or convenience. I suppose library users can choose that for themselves. Moreover, library authors can already influence that by not creating functions with too many parameters, so users will not be tempted to use named arguments without reason. (:
Additionally, it will be library authors' responsibility to choose between "stability" or convenience. I suppose library users can choose that for themselves.
@DarkEld3r With enough users, somebody will end up relying on it. Then when you decide that a more expressive name would be nice, it becomes a breaking change.
If it is exposed, it becomes part of the API. Library authors bear the responsibility of tracking breaking changes and alerting users when they occur, so library authors should have the ability to limit what they expose.
If it is exposed, it becomes part of the API. Library authors bear the responsibility of tracking breaking changes and alerting users when they occur, so library authors should have the ability to limit what they expose.
I don't really understand what's the big fuss about exposing keyword argument names. The function name itself is already 'exposed' anyway, so changing it to a more expressive one would also break the API. Say, Python has had keyword/default arguments pretty much since the start and that's been a great help for library developers rather than the opposite.
If you as a library developer would want to rename a keyword argument in a nice way, you could always support both the old and the new name for one release cycle and use #[deprecated] to trigger a compilation warning if an old name is used.
The function name itself is already 'exposed' anyway
And that was explicitly decided by using pub before the function. My understanding of @DarkEld3r's comment was that every pub function should have all of its parameter names be publicly available for use with keyword arguments. Fair point though; while this would add to the API a pub function exposes (you used scare quotes around exposed; is there a better word?), it's not a big deal to make sure that you're as confident in your parameter names as you are in the function name and (most importantly) the types when you make it pub.
Say, Python has had keyword/default arguments pretty much since the start and that's a great help for library developers rather than the opposite.
I feel like Rust has a somewhat different stance than Python. Rust makes some strong static guarantees, is often conservative in its defaults (e.g. immutability and privacy), and generally allows for a fair amount of control (e.g. pub struct vs. pub members¹).
I suppose library users can choose [between "stability" or convenience] for themselves.
doesn't seem to fit with that. That said, I'm in no way opposed to either keyword or default arguments. I just want to make sure we're thinking things through.
If you as a library developer would want to rename a keyword argument in a nice way, you could always support both the old and the new name for one release cycle and use #[deprecated] to trigger a compilation warning if an old name is used.
How would that work?
1: This is different from keyword arguments because it exposes more than just names.
Here are some thoughts on each of these ideas:
Keyword-Based Parameters
The biggest problem with keyword-based parameters is that they make it easier to write functions that take a large number of parameters. Ruby on Rails is a good example, with a number of functions taking 5+ parameters. I think most Rust developers would agree that this is an anti-pattern.
It also makes parameter names part of the public interface of every public function. While this may be less onerous, it is still a surprising change. It is also a major change of expectations for existing Rust code. Even if it's not considered a breaking change itself, it may be undesirable from the perspective of a current Rust library maintainer.
Variable-Arity Functions
Rust has macros, which can already be used to implement this sort of thing. Rust has a solid track record of keeping features orthogonal, and any non-macro-based system for this would be a strictly less-useful system re-implementing an ability already effectively available in the language.
Optional/Default Parameters
I don't have a strong opinion on this one. It seems like it's a new complication for very little gain, but maybe there are good arguments for it. All the ones I've seen so far in here have been aesthetic, which isn't a particularly compelling angle.
I don't think named parameters are an anti-pattern, unless you consider the builder pattern one too, since it covers a very similar use case. Having 5+ params in a function may just be bad design, or it may be needed.
Variable-arity functions can be worked around by using vecs or similar, but this is still something that should be core to the language, rather than requiring a macro for every new function you create.
Optional/Default Parameters are great at providing intent within the signature.
All these 3 features would be a great addition.
Sorry, I realize now the wording was ambiguous. I meant that having functions with a large number of parameters is an anti-pattern.
As for variable-argument function support, do you really need it for every function?
Not every function, just ones where it makes sense and improves readability.
Yes you could use a macro, but each time you have a different function you would need to create a new macro to support it.
As I commented above, you really can't have anything but a macro, because otherwise there is no way to have initial values that are structs or made through (for example) a new() method. Having initial values is great, but kind of pointless without that ability (in my opinion).
The problem with macros is that they aren't well supported from a namespace standpoint, and so having a macro-per-kwarg-function is not acceptable.
Once macros are fixed to work inside namespaces though, I think that is the way to go here.
I've definitely wanted optional/default args. In most cases, it's "meh, good enough" to just have an extra function with a different name or allow None or Default::default() for the extra args, but in at least one case, I iterated over several solutions before deciding to take a single Into<AlgoRef> arg and implemented the trait for both &str and the tuple (&str, Into<Version>) so I can call algo("foo/bar/0.1") or algo(("foo/bar", Version::Minor(0,1))). It definitely feels hackish/unnatural to need the extra parentheses to accomplish this, but for the particular requirements, I preferred this trade-off (and have followed this issue ever since).
As for variable-arity, I haven't actually encountered a need, though I do think it'd be elegant if method-level macros existed to call things like logger.info!("foo: {}", foo) - but I won't pretend to understand the implications.
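Here is a sketch of that Into-based approach, with simplified stand-in AlgoRef and Version types rather than the real library's:
struct Version(u32, u32);
struct AlgoRef(String);

impl<'a> From<&'a str> for AlgoRef {
    fn from(s: &'a str) -> AlgoRef {
        AlgoRef(s.to_string())
    }
}

impl<'a> From<(&'a str, Version)> for AlgoRef {
    fn from((s, v): (&'a str, Version)) -> AlgoRef {
        AlgoRef(format!("{}/{}.{}", s, v.0, v.1))
    }
}

fn algo<A: Into<AlgoRef>>(a: A) -> AlgoRef {
    a.into()
}

fn main() {
    assert_eq!(algo("foo/bar/0.1").0, "foo/bar/0.1");           // plain &str
    assert_eq!(algo(("foo/bar", Version(0, 1))).0, "foo/bar/0.1"); // tuple with a version
}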
It's probably easiest to leave variable-arity to macros, since it is needed quite rarely.
I wonder if named arguments could be achieved with very ergonomic anonymous structs? You might call it like f({ .. }) because it's anonymous. I dunno if named arguments are wise, but the extra { .. } might help discourage their abuse.
I think care should be taken with default arguments since they could be used by mistake. A priori, I'd suggest the programmer write a placeholder like f(a, , b) or f(a, !, b).. I suppose f(a, _, b) sucks because _ might denote a lambda.
Another option is simply giving default parameters a specific name, so maybe an enum, but probably a macro or inlined, and maybe scoped to the function. It offers the advantage that multiple defaults can be specified, but abusing defaults is discouraged.
It absolutely doesn't matter what is an anti-pattern and what is not. A programming language is a tool, not a vice squad. Yes, I don't like functions with many arguments, but sometimes we need them. I also don't like "unwrap", but sometimes it's needed. If something is just a matter of taste, it's not an argument.
Variable Arity and named parameters are also great predecessors to more FP stuff like currying.
It sounds like everyone is opposed to functions that do too much, and by extension, functions that may have way too many args, but I don't think that keyword/default args or variable arity should be on the chopping block because they could be misused.
Many argued strongly in favour of using structs, and I'm using structs as args extensively in a library I'm working on; but that's because the data should be grouped together like position: Xyz{x: f64, y: f64, z: f64}. However, this is a far cry from being a perfect solution in all cases.
To address @AndrewBrinker,
Variable-Arity Functions
Rust has macros, which can already be used to implement this sort of thing. Rust has a solid track record of keeping features orthogonal, and any non-macro-based system for this would be a strictly less-useful system re-implementing an ability already effectively available in the language.
I am strongly in favour of re-using existing tools if possible, but only if they're a good match for the job. I'm not even sure if macros could solve the following problem; regardless, their documentation doesn't make them look trivial to use/understand/maintain (and I'm no stranger to meta-programming and manipulating an AST for codegen). Whereas in the following code, with variable-arity functions allowed, it's trivial for even a novice to understand what's happening.
After doing a bunch of programming in Erlang/Elixir, I really miss the ergonomics of variable arity/default arg functions at times like below:
pub struct SceneGraph<'a> {
    pub tree: RefCell<RoseTree<GraphNode<'a>>>,
    root: GraphNode<'a>,
}
impl<'a> SceneGraph<'a> {
    pub fn add_node(&mut self, parent: NodeIndex, node: GraphNode<'a>) -> NodeIndex {
        ...... // Extra logic removed
        self.tree.borrow_mut().add_node(parent, node)
    }
    pub fn add_node(&mut self, node: GraphNode<'a>) -> NodeIndex {
        self.add_node(self.root, node)
    }
}
A: add_node/3, where node: Option<GraphNode<'a>>
B: rename add_node/2 to add_node_root/2.
C: if default args are powerful enough, they could do self.root; but let's imagine that we need to do some computation or something beyond the ability of a default arg.
If I understand correctly, A would carry a runtime penalty to unwrap and test.
B entails an additional method name and the ergonomic issues that come with it when using the library. Alternative B also embeds unnecessary implementation details in its public interface; this poses a challenge if an internal implementation detail changes**, because the method then no longer accurately reflects its purpose, prompting me to rename it to add_node_wubba_lubba_dub_dub/2. For the sake of this discussion, please don't get pedantic about this example, and focus on the spirit of the use-case (it can be useful for multiple functions to share the same name, because externally they do the same thing, and your public interface shouldn't leak implementation details).
**With no change to the end user because :heart: semver.
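A minimal sketch of alternative A with heavily simplified stand-in types, showing where the "unwrap and test" cost mentioned above lives:
struct NodeIndex(usize);

struct SceneGraph {
    root: usize,
    nodes: Vec<&'static str>,
}

impl SceneGraph {
    fn add_node(&mut self, parent: Option<usize>, node: &'static str) -> NodeIndex {
        let _parent = parent.unwrap_or(self.root); // runtime check on every call
        self.nodes.push(node);
        NodeIndex(self.nodes.len() - 1)
    }
}

fn main() {
    let mut graph = SceneGraph { root: 0, nodes: vec!["root"] };
    let idx = graph.add_node(None, "child"); // None means "attach to the root"
    assert_eq!(idx.0, 1);
}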
If this gets supported, +1 for @azerupi's syntax suggestion, repeated below (although it might need some refinement to also work with variable-arity; Erlang/Elixir are good examples of doing both variable arity and default args).
Keyword args/defaults should be opt-in on a per variable basis, for the reasons @reddraggone9 mentioned.
// Mix of all
pub fn mixed(a: i32, pub b: i32, pub c: i32 = 0)
// Variable Arity
pub fn mixed(a: i32 = 0)
Would create functions mixed/0, mixed/1, mixed/2, mixed/3, where mixed/0,1 invoke the latter definition, while mixed/2,3 invoke the former. These need to be mutually exclusive sets, unless we want ambiguous function choices from the compiler a la Java et al, which IMO should be a compiler error.
Note: NAME/0 denotes a function called NAME with arity 0 (takes 0 arguments).
Can't the compiler just reorder keyword arguments and fill omitted argument places with default values (or raise an error when an argument does not have a default value and is not specified)? Consider the following function:
fn foo(a: i32, b: i32, c: i32 = 3, d: i32 = 4)
A programmer may call it using:
foo(1, 2, 24, 71) - just like it works right now
foo(5, 6, c: 17) - gets translated into foo(5, 6, 17, 4)
foo(5, 6, d: 48) - gets translated into foo(5, 6, 3, 48)
foo(5, 6) - gets translated into foo(5, 6, 3, 4)
foo(b: 35, a: 18) - gets translated into foo(18, 35), which, in turn, gets translated into foo(18, 35, 3, 4)
foo(b: 35) - throws a compile error, because a is not specified and does not have a default value
I believe this will not have an impact on runtime performance, because argument reordering and so on will be performed during compilation.
@zmoshansky How do you propose handling a case like this:
pub fn add_node(&mut self, node=GraphNode.origin(): GraphNode<'a>) -> NodeIndex {}
Clearly (to me at least) you cannot do this in anything but a macro -- you are calling a method in the declaration.
@vitiral, Thanks for the feedback. As I understand it (GraphNode.origin() being GraphNode::origin()), this would be easiest to do with variable-arity functions. By defining an add_node/1, this could be easily achieved as follows. Essentially add_node/1 would call on add_node/2, which would call upon add_node/3.
pub fn add_node(&mut self) -> NodeIndex {
    self.add_node(GraphNode::origin())
}
I think variable arity is particularly useful in cases like this where the default is too complex/not able to be done at compile time.
@vitiral You can just restrict the default values to whatever is assignable to const, including const fn, problem solved.
@e-oz, encouraging anti-patterns by providing fancy syntax for them is not something that is aligned with Rust's design philosophy.
@zmoshansky ah, I see. I see now that variable arity was one of the options given in the thread title - I was pretty focused on kwargs.
Variable arity is certainly possible. I'm honestly not sure why it's not implemented already, since it is so simple. I could see it causing problems with function documentation (since you now have multiple functions), as well as lending a hand to making confusing APIs. It might help in some places though.
@aldanor I would see that as being of pretty low usefulness, but maybe that is just me. Having complex structs with new defined, but not actually being able to pass any variables in, would get annoying. You would have to reimplement new for every value.
Coming from python it seemed at first glance like this would be really useful for dynamic string formatting (#642)
let template = String::from(".. {id} .. {id} .. {id} ..");
template.format(id="foo");
... but that won't actually work. In Python, keyword arguments are just hashmaps, which seems like a nonstarter for Rust.
Regarding keywords this changes my previously tacit :+1: to :-1:, unless I misunderstood and keywords aren't supposed to be just syntactic compile time checked sugar for positional arguments.
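For what it's worth, the closest runtime equivalent today is indeed a hashmap-driven substitution; here is a sketch using a hypothetical format_named helper (not a real standard-library API):
use std::collections::HashMap;

fn format_named(template: &str, vars: &HashMap<&str, &str>) -> String {
    let mut out = template.to_string();
    for (key, value) in vars {
        // Replace every "{key}" occurrence with its value.
        out = out.replace(&format!("{{{}}}", key), value);
    }
    out
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("id", "foo");
    assert_eq!(format_named(".. {id} .. {id} ..", &vars), ".. foo .. foo ..");
}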
A portion of the community (and of the core team) sees one or more of the following features as important for programmer ergonomics: keyword-based parameters, optional/default parameter values, and variable-arity functions.
This issue is recording that we want to investigate designs for this, but not immediately. The main backwards-compatibility concern is about premature commitment to library APIs that would be simplified if one adds one or more of the above features. Nonetheless, we believe we can produce a reasonable 1.0 version of Rust without support for this.
(This issue is also going to collect links to all of the useful RFC PR's and/or Rust issues that contain community discussion of these features.)