yashpola opened 7 months ago
This is from the textbook, @yashpola.
Personally I'm also having doubts when doing this question, as the textbook only says "Use enumerations when a certain variable can take only a small number of finite values."
How do we define "small" in this case? 10001 is small compared to `Integer.MAX_VALUE`. In that case, if I created an enumeration that takes the values `ZERO`, `ONE`, `TWO`, ..., `TEN_THOUSAND`, would that be acceptable?
Personally I'd say 10001 falls under "common sense": you don't want to spend time at your job typing out 10001 lines for an enum (remember that in Java you have to write out every enum constant, and they must reside in the source code), and of course such code wouldn't get approved.
That said, "can take only a small number of finite values" feels like an incorrect explanation. A more common-sense rule would be: use an enum when the names are meaningful, and use an integer when the integer value itself is meaningful.
E.g. even if you assume "each person can have at most 3 children", it would not make much sense to create `enum NumberOfChildren { ZERO, ONE, TWO, THREE };`, since if you want to add one child you'd need a switch-case to increment it.
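The friction described above can be sketched in Python's `enum` module (the thread's example is Java, where a full switch would be needed, but the same awkwardness shows up; the `add_child` helper is hypothetical):

```python
from enum import Enum

class NumberOfChildren(Enum):
    ZERO = 0
    ONE = 1
    TWO = 2
    THREE = 3

def add_child(n: NumberOfChildren) -> NumberOfChildren:
    # "Adding one child" needs a round-trip through .value and a
    # lookup back into the enum -- with a plain int this would just
    # be n + 1 (and going past THREE raises ValueError).
    return NumberOfChildren(n.value + 1)
```

The point being: the quantity, not the name, is what the code actually manipulates, so the enum only gets in the way.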
In particular, always use enum for nominal types, and don't use enum for interval/ratio types. (https://en.wikipedia.org/wiki/Level_of_measurement)
Technically, in Python you can dynamically generate an enum with a large number of values. But at the moment I can't think of a valid use case, since `__getattr__` exists anyway.
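For illustration, Python's functional `Enum` API can build the members programmatically, so a 10001-member enum is at least *possible* without 10001 source lines (the `Number`/`N{i}` names here are made up for the sketch):

```python
from enum import Enum

# Generate members N0..N10000 from a dict of name -> value.
# (Member names can't start with a digit, hence the N prefix.)
Number = Enum('Number', {f'N{i}': i for i in range(10001)})
```

Whether that's ever a good idea is exactly the question in this thread.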
Is it because the values don't "mean" anything per se, unlike `Priority`, for instance, which has a `HIGH`, `MEDIUM`, `LOW` of 0, 1, 2? That is, we can just check whether the integer is in the range 0 to 10000 before assigning it, instead of using an enum. Could it also be inappropriate to use an enum for 10000 values? It seems extremely cumbersome.
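The contrast in that question can be sketched in Python (the `Priority` values are the ones given above; `validate_count` is a hypothetical range-check helper):

```python
from enum import IntEnum

class Priority(IntEnum):
    # The names carry the meaning; 0/1/2 are incidental.
    HIGH = 0
    MEDIUM = 1
    LOW = 2

def validate_count(n: int) -> int:
    # For a quantity, a plain range check does the job --
    # no 10001-member enum required.
    if not 0 <= n <= 10000:
        raise ValueError(f"out of range: {n}")
    return n
```

This matches the nominal-vs-ratio distinction suggested earlier: `Priority` is nominal (names matter), while a bounded count is a quantity (the number matters).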