I want this to work correctly, but it doesn't:
The problem is that the compiler casts the value 'str' to 'type' instead of using Value(str). Of course, fixing this means that, for any argument which is a type, we would always have to pass a Value rather than a bare 'type', and that is perhaps more specific than we want in some cases.
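To make the distinction concrete, here is a small plain-Python sketch; `Value` is an illustrative stand-in for the compiler's own wrapper, and the two callees model the two calling conventions rather than any real API:

```python
# Illustrative sketch only: `Value` stands in for the compiler's wrapper
# class, and the two functions model the two ways a type argument can be
# passed.

class Value:
    """Carries a specific type object, e.g. Value(str)."""
    def __init__(self, typ: type) -> None:
        self.typ = typ

def callee_with_erased_arg(t: type) -> str:
    # The parameter is declared as plain 'type'; the caller only has to
    # supply "some type", which is what casting 'str' down to 'type' gives.
    return f"some type named {t.__name__}"

def callee_with_value_arg(v: Value) -> str:
    # The parameter demands the specific type wrapped as a value,
    # i.e. Value(str), so nothing about the type is lost.
    return f"the concrete type {v.typ.__name__}"

print(callee_with_erased_arg(str))        # fine with the cast-to-'type' view
print(callee_with_value_arg(Value(str)))  # requires Value(str) explicitly
```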
To fix this, we may need to add some additional control to the type signature, allowing us to specify what level of detail the compiler (and also the interpreter) should assume for a given argument.
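One shape such control could take, sketched with hypothetical markers (`Erased` and `Concrete` are invented names, not an existing feature of the type signature):

```python
# Hypothetical sketch: `Erased` and `Concrete` are invented marker names,
# not part of the actual signature syntax.

from typing import Annotated

class Erased:
    """The callee only needs to know that the argument is *a* type."""

class Concrete:
    """The callee needs the specific type as a value, e.g. Value(str)."""

def is_a_type(x: Annotated[type, Erased]) -> bool:
    # Never asks *which* type it was given, so casting the argument
    # down to 'type' would be perfectly adequate here.
    return isinstance(x, type)

def type_name(t: Annotated[type, Concrete]) -> str:
    # Uses the specific type, so the compiler should keep it as a value
    # instead of casting it to 'type'.
    return t.__name__

print(is_a_type(str), type_name(str))
```

With markers like these, the default could stay at the plain 'type' level, and only arguments that genuinely need the concrete type would require the Value wrapping.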