InsertCreativityHere opened 1 year ago
Do other languages have type alias restrictions like this?
Rust doesn't really have inheritance, but it does have a restriction similar to what I'm proposing. Trying to use a type alias as a supertrait (the closest thing in Rust to interface inheritance):

```rust
trait Trait {}
type Alias = Trait;
trait Test: Alias {}
```

gives the following error:

```
error[E0404]: expected trait, found type alias `Alias`
 --> src/lib.rs:6:13
  |
6 | trait Test: Alias {}
  |             ^^^^^ type aliases cannot be used as traits

For more information about this error, try `rustc --explain E0404`.
```
C# will allow you to inherit through type aliases, so this code compiles just fine:
```csharp
using Alias = Main;

class Main {}
class Sub : Alias {}
```
But C# doesn't let you mark aliases as optional or apply attributes to them, so inheriting through them is always safe in C#. Whereas in Slice, you can mark typealiases as optional or apply attributes on them, and this is precisely where the problem with allowing this syntax arises for us.
> Whereas in Slice, you can mark typealiases as optional

See #474.

> or apply attributes on them.

Which attribute can you apply on the typealias for an interface?

```slice
[????]
typealias MyFoo = Foo
```
Currently, the Slice compiler will allow inheritance to occur through type-aliases like so:
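A minimal sketch of such a declaration (the identifiers `Foo`, `MyFoo`, and `Bar` are taken from the surrounding discussion, and the exact syntax is approximate):

```slice
interface Foo {}

typealias MyFoo = Foo

// The compiler currently accepts inheriting through the alias:
interface Bar : MyFoo {}
```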
This is equivalent to having `Bar` inherit from `Foo` directly. This works, but I think it's not very useful and invites users to do some strange things.
## It's Not Very Useful
The motivation for typealiases is to be able to describe a long and complex type once, and use a simpler shorthand for it everywhere else, de-cluttering your Slice. For example:
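A sketch of that intended use (the alias name, the aliased type, and the operation below are all illustrative, and the built-in type names may differ from actual Slice):

```slice
// Describe the long, complex type once...
typealias StringsById = Dictionary<int32, Sequence<string>>

interface Catalog {
    // ...and use the shorthand everywhere else.
    getAll() -> StringsById
}
```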
This goal is completely inapplicable to inheritance though:
1. It is illegal to place attributes within an inheritance list.
2. Inheritance is unlikely to occur as often as referencing a type.
3. You just specify an identifier; there's no complexity like with sequences/dictionaries.
The only place where using a typealias could meaningfully shorten your code is if the type you're inheriting from is in a deeply nested namespace. But the real fix for that is a `using` statement (which we plan to add: #74). So I don't think that using `typealias` as a poor man's `using` is a compelling reason.

## It Invites Bad Syntax
While the first example is totally fine, type-aliases don't know what contexts they'll be used in, so it's perfectly valid for users to change the typealias like so:
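For example, the alias might later be made optional (a syntax sketch; the names are illustrative, and an attribute on the alias would cause the same problem):

```slice
typealias MyFoo = Foo?     // the alias is now an optional type

interface Bar : MyFoo {}   // inheriting through it no longer makes sense
```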
But now, this inheritance is bogus; it's meaningless to inherit from an optional type, or one with attributes on it.
## It Adds Complexity
Obviously, the machinery needed to resolve type-aliases and validate that they're the right kind of type (classes can only inherit from classes), non-optional, and free of attributes complicates the compiler.
Normally I'm all for language flexibility and features, but in this case, supporting this is at best useless and at worst a pitfall for users. Disallowing it is also more true to the name `typealias`: it's intended to be used for types, which is orthogonal to inheritance. For instance, `exception`s support inheritance, but weren't types until very recently.