Closed anuragsoni closed 7 years ago
Thanks for the PR!
Now that I see the actual diff, I'm not sure this is going to be worth it. It looks like we're just trading having to type `Some(...)` everywhere for having to type `.as_str()` everywhere.

`as_str` looks like it's necessary because using this new generic form of the error constructor function stops the deref coercion (`&String` to `&str`). Is that correct? That's what I concluded from playing around with this in a toy program:
```rust
fn main() {
    let e1 = error(None);
    let e2 = error(Some("some str"));
    let e3 = error("bare str");
    let e4 = error(Some(&format!("some string")));
    let e5 = error(&format!("bare string"));
}

fn error<'a, S>(message: S) -> ApiError where S: Into<Option<&'a str>> {
    ApiError {
        error: match message.into() {
            Some(message) => message.to_string(),
            None => "An error occurred.".to_string(),
        },
    }
}

struct ApiError {
    error: String,
}
```
```
$ rustc foo.rs
error[E0277]: the trait bound `std::option::Option<&str>: std::convert::From<std::option::Option<&std::string::String>>` is not satisfied
 --> foo.rs:5:14
  |
5 |     let e4 = error(Some(&format!("some string")));
  |              ^^^^^ the trait `std::convert::From<std::option::Option<&std::string::String>>` is not implemented for `std::option::Option<&str>`
  |
  = help: the following implementations were found:
  = help:   <std::option::Option<T> as std::convert::From<T>>
  = note: required because of the requirements on the impl of `std::convert::Into<std::option::Option<&str>>` for `std::option::Option<&std::string::String>`
  = note: required by `error`

error[E0277]: the trait bound `std::option::Option<&str>: std::convert::From<&std::string::String>` is not satisfied
 --> foo.rs:6:14
  |
6 |     let e5 = error(&format!("bare string"));
  |              ^^^^^ the trait `std::convert::From<&std::string::String>` is not implemented for `std::option::Option<&str>`
  |
  = help: the following implementations were found:
  = help:   <std::option::Option<T> as std::convert::From<T>>
  = note: required because of the requirements on the impl of `std::convert::Into<std::option::Option<&str>>` for `&std::string::String`
  = note: required by `error`

error: aborting due to 2 previous errors
```
Am I understanding this trade off correctly, or is there a way to have our cake and eat it too?
/cc @ruma/contributors
From what I gathered, auto-deref does not kick in during trait matching.
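A minimal sketch (not from the PR) of that behavior: deref coercion turns `&String` into `&str` when the parameter type is a concrete `&str`, but not while the compiler is resolving a generic bound like `Into<Option<&str>>`, which is why the explicit `.as_str()` shows up:

```rust
// Deref coercion (&String -> &str) fires at a concrete parameter type,
// but is not attempted while resolving a generic trait bound.

fn concrete(message: &str) -> usize {
    message.len()
}

fn generic<'a, S: Into<Option<&'a str>>>(message: S) -> usize {
    message.into().map_or(0, str::len)
}

fn main() {
    let owned = String::from("hello");

    // Compiles: `&owned` (a `&String`) is deref-coerced to `&str`.
    let a = concrete(&owned);

    // Does NOT compile: `Option<&str>: From<&String>` is unimplemented,
    // because no coercion happens during trait matching.
    // let b = generic(&owned);

    // Works once the coercion is written out explicitly:
    let b = generic(owned.as_str());

    assert_eq!(a, b);
}
```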
I might be missing a few cases, but from what I could tell in the code, in most cases the call to the `ApiError` methods is made using `format!`, like in the snippet below. In that case, if we try to accept an `Option<&str>`, the caller has to write `.as_str()` or `&String[..]`.

But again, like you mentioned, it might not be worth it to switch to using the `Into`-based approach.
```rust
fn error<S>(message: S) -> ApiError where S: Into<Option<String>> {
    ApiError {
        error: match message.into() {
            Some(message) => message,
            None => "An error occurred.".to_string(),
        },
    }
}

struct ApiError {
    error: String,
}

fn main() {
    let _e1 = error(None);
    let _e4 = error(Some(format!("some string")));
    let _e5 = error(format!("bare string"));
}
```
Thanks for those links; that was very helpful. I think you're right that a lot of the awkwardness comes from trying to use references when, at most call sites, we're creating `String`s, and the final struct value ultimately stores the message as a `String`. Let's go ahead and change the signature to the one in your code example:
```rust
fn error<S>(message: S) -> ApiError where S: Into<Option<String>>
```
It feels like a better compromise to have to call `to_string` at the call site in the event that a string literal is passed in instead of a formatted `String`.
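To make that trade-off concrete, here is a small runnable sketch of the agreed-upon signature (the message strings are hypothetical): a `format!`-produced `String` passes straight through, and only a bare literal pays the `.to_string()`:

```rust
// Sketch of the `Into<Option<String>>` signature: formatted Strings pass
// through unchanged; string literals need an explicit `.to_string()`.

struct ApiError {
    error: String,
}

fn error<S>(message: S) -> ApiError
where
    S: Into<Option<String>>,
{
    ApiError {
        error: message
            .into()
            .unwrap_or_else(|| "An error occurred.".to_string()),
    }
}

fn main() {
    let formatted = error(format!("room {} not found", 42)); // no conversion needed
    let literal = error("bad request".to_string()); // literal pays to_string()
    let default = error(None);

    assert_eq!(formatted.error, "room 42 not found");
    assert_eq!(literal.error, "bad request");
    assert_eq!(default.error, "An error occurred.");
}
```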
Okay. I should be able to push the change for review sometime today!
Sweet! Thanks very much for your work.