This doesn't really work. `Enum` does not implement generic arithmetic. `Enum#+(Enum)` is not defined.

What you need to do here is to sum up the values: `commands.sum(&.value)`
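For concreteness, here is a minimal sketch of that suggestion, using a hypothetical `Command` enum (the original snippet isn't reproduced in this thread):

```crystal
enum Command
  Forward = 1
  Down
  Up
end

commands = [Command::Forward, Command::Down, Command::Up]

# Sum the underlying integer values rather than the enum members themselves;
# Enum#value returns the Int32 backing value, and Int32 already sums fine.
commands.sum(&.value) # => 6
```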
> I would love Crystal to either let me tell it what's the zero-value of my Enum.

That's totally possible. It just won't lead very far, for the reason mentioned above. But you can implement `Command.zero` for a generic zero value (alternatively, you can also call it `Command.additive_identity`). Or simply pass the zero value as an argument to `#sum`.
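A sketch of those two alternatives with the same hypothetical `Command` enum; as noted above, neither gets far on its own because `Enum#+(Enum)` is still missing:

```crystal
enum Command
  Forward = 1
  Down
  Up

  # Alternative 1: give the type a generic zero value
  # (naming it Command.additive_identity should be picked up the same way).
  def self.zero : Command
    Command::Forward
  end
end

commands = [Command::Forward, Command::Down]

# Alternative 2: pass the starting value directly to #sum:
#   commands.sum(Command::Forward)
#
# Either way, the reduction still needs Command#+(Command), which Enum
# does not define, so neither call compiles yet.
```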
> Or it could assume it's the first.

Again, for the reason mentioned above, this won't happen. `Enum` is not a numerical type.
Gotcha, @straight-shoota. I thought that because my `Enum` was of the standard type (some variation of `Int`, if I recall correctly), Crystal would do some "magic" to make it work for me. All the ideas proposed sound plausible to me. :D

Definitely not a bug then. Closing this, sorry for the noise and thanks a lot for the suggestions.
@straight-shoota if `Enum` does not implement generic arithmetic, then I am surprised that `sum` is even a method on `Enum` at all. I would have expected a compiler error that `sum` is not defined. Instead you get a confusing error that `zero` is not defined.
> then I am surprised that sum is even a method on Enum at all

I think the catch is that there isn't, as it's called on the `Array` type. But that implementation expects `T` to have either `.zero` or `.additive_identity` defined. At least when using the argless overload.
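To illustrate that distinction, here's a simplified, hypothetical sketch of the two overload shapes (not the actual stdlib source):

```crystal
# A toy container standing in for the real Enumerable implementation.
struct Bag(T)
  def initialize(@items : Array(T))
  end

  # Argless overload: the starting value has to come out of T itself,
  # which is why T needs a .zero (or .additive_identity) class method.
  def sum
    sum(T.zero)
  end

  # Overload with an explicit starting value: no .zero needed,
  # though T still needs #+ for the reduction.
  def sum(initial)
    @items.reduce(initial) { |acc, elem| acc + elem }
  end
end

Bag.new([1, 2, 3]).sum # => 6
```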
Oh, my apologies. For some reason I misread and thought `sum` was defined on the `Enum`, but this is on `Array`. Even so, the error message is not clear to me, and I think there might be a nicer way to present it.
> But that implementation expects `T` to have either `.zero` or `.additive_identity` defined.

And the addition operation itself.
Yes, this kind of compile-time error is difficult to understand, because the actual error happens somewhere down the stack and doesn't immediately point out the problem. It's hard to improve that, though.

I suppose we would either need some kind of `rescue` for catching and rewriting compiler errors, or check all possible error conditions at the entry point, in this case `Enumerable#sum`, which would be a lot of duplication.
I think this brings us back to `--error-trace` being insufficient in its current form. If more of the stack was shown, the user would have known why `.zero` was being called.

Interestingly, I was looking for the old PR where asterite was working on it, and he gave this exact example: https://github.com/crystal-lang/crystal/pull/11756#issuecomment-1019326187. I still contend that while we couldn't come to full agreement there, we still need to do something about `--error-trace`.
I was about to comment exactly this but I'm getting tired of sounding like a broken machine.
I think we should do this. The simplest thing we can do is to remove the flag and always show the full error trace. Unfortunately I might be the only core team member who thinks that way.
Mind that I am new to the Crystal world, but why can't the compiler tell me `cannot call Array(T)#sum when type T doesn't implement X`, where `X` is an `AdditiveIdentity` interface/signature or something like this? It seems like a great case for applying `abstract class`, from the little I know about Crystal.

Indeed, as @asterite wrote on his original issue, this error is so cryptic to a beginner like me. The compiler should report an error on my code, not on `enumerable.cr`.
> where X is AdditiveIdentity interface/signature

There are a couple of issues with that:
For point 2, we want something like:

```crystal
module Enumerable(T)
  def sum where T < Additive
  end
end
```

that is, putting a restriction on the `T` that's in `Enumerable(T)`: the type isn't on any argument.
And even after making it match an `Additive` module, you'll get an error saying "Hey, T must be Additive". But that isn't enough to know what you need to implement. You'll have to jump to the `Additive` docs, read them, etc.
In Crystal you get an error similar to what you would get in Ruby: "Hey, you need to implement a `zero` class method for this type in order to sum it." And I personally think that's a clear error message: if you want to sum a bunch of values, you have to start with the value zero, and then you need to define what the value "zero" is for your type.

Once you do that, I think the compiler will tell you to implement `+` on your type. So the compiler is already guiding you toward a solution, one method at a time.
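Following that guided path for the hypothetical `Command` enum from above, the end state might look like this (what "zero" should mean for such an enum is debatable, which is exactly the earlier objection):

```crystal
enum Command
  Forward = 1
  Down
  Up

  # Step 1: the compiler first asks for a zero value for the type...
  def self.zero : Command
    Command.new(0)
  end

  # Step 2: ...and then for an addition between two Commands.
  def +(other : Command) : Command
    Command.new(value + other.value)
  end
end

[Command::Forward, Command::Down].sum # => Command::Up (0 + 1 + 2 = 3)
```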
That said, this can definitely be improved; it's just not clear how.
That is #3298. My preferred syntax would be something like:

```crystal
# no bounds = unconstrained generic
module Enumerable(T) forall T; end

# partial specialization
module Enumerable(T) forall T <= Additive
  def sum; end
end
```
But I don't like the idea that types must explicitly include `Additive`, because such a module is unlikely to have anything other than abstract defs. So ultimately we would probably be looking for some kind of trait / concept types, or #2549.
Bug Report
Playing around with the language on Advent of Code, I noticed that when you call `Enumerable#sum` on an `Array` of an `Enum` that starts at 1, the compiler errors out because it cannot figure out the zero-value of the `Enum`. Example:

Running this will trigger:
I think the error makes sense. I would love Crystal to either let me tell it what's the zero-value of my Enum. Or it could assume it's the first.
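The original snippet and error output aren't included above; a minimal hypothetical reproduction of the same failure might look like:

```crystal
enum Command
  Forward = 1
  Down
  Up
end

commands = [Command::Forward, Command::Down]

# This fails to compile: the argless Enumerable#sum needs a zero value for
# Command, and the error points into enumerable.cr rather than at this line.
commands.sum
```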