python / typing

Python static typing home. Hosts the documentation and a user help forum.
https://typing.readthedocs.io/

Mark Shannon's presentation at the 2017 Language Summit #432

Closed gvanrossum closed 6 years ago

gvanrossum commented 7 years ago

@ilevkivskyi @markshannon.

Mark observed that the typing module uses classes to represent types. This can be expensive, since e.g. the type List[int] really ought to be the tuple (List, int) but it's actually a class object which has a fair amount of overhead (though not as much as early versions of typing.py, since we now cache these class objects).

If we changed to tuples (or at least objects simpler than class object), we'd have a problem: The simpler object couldn't be subclassed from. But who subclasses List[int]? Then again, maybe simpler objects aren't the point?

Mark also pointed out that after

from typing import List
class C(List[int]): pass
print(C.__mro__)

we find that C.__mro__ has 17 items!

I confirmed this. The roughly equivalent code using collections.abc

from collections.abc import MutableMapping
class C(MutableMapping): pass
print(C.__mro__)

has only 7 items. And subclassing builtins.list

class C(list): pass
print(C.__mro__)

has only three.
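The comparison above can be reproduced with a short script. Note that the exact counts quoted (17/7/3) come from the 2017-era typing module; modern Python versions give much smaller numbers for the typing case (and collections.abc.Collection, added in 3.6, inserts an extra entry in the ABC case), so only the builtin case is asserted exactly here:

```python
# Sketch comparing MRO lengths for the three kinds of base classes above.
# Counts other than the builtin case vary by Python version.
from typing import List
from collections.abc import MutableMapping

class FromTyping(List[int]): pass
class FromABC(MutableMapping): pass      # abstract; defining it is enough
class FromBuiltin(list): pass

for klass in (FromTyping, FromABC, FromBuiltin):
    print(klass.__name__, len(klass.__mro__))

assert len(FromBuiltin.__mro__) == 3     # FromBuiltin, list, object
```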

This affects performance, e.g. which is faster?

from typing import Sequence

class C(list, Sequence[int]): pass
C().append(1)
class D(Sequence[int], list): pass
D().append(1)

One append() call is 10% faster than the other.

JukkaL commented 7 years ago

I agree with Mark that this is a problem and can be a blocker for adopting type annotations for some projects. We are mostly shielded from the problem at Dropbox because we use comment annotations, but once we migrate to Python 3 we'd potentially have problems with the current approach as well.

Inheriting from List[int] (or, say, Dict[str, Foo]) is occasionally useful and probably worth supporting, but Mark's suggestion to use a different syntax for this seems like a reasonable compromise. I think his suggestion was to do something like the following, since List[int] is not a class:

from typing import implements, List

@implements(List[int])
class MyList(list): ...

It's unclear whether this would also affect user-defined generic classes. For example, consider this code that works currently:

class Box(Generic[T]): ...

class SpecialBox(Box[int]): ...

class AnotherBox(Box[T]): ...

def do_stuff(box: Box[int], box2: AnotherBox[str]) -> None: ...

If I understood things right, this would be written like this based on Mark's proposal:

class Box(Generic[T]): ...

@implements(Box[int])
class SpecialBox(Box): ...

@implements(Box[T])
class AnotherBox(Box): ...

def do_stuff(box: Box[int], box2: AnotherBox[str]) -> None: ... # No change

This would still mean that user-defined classes may trigger metaclass conflicts due to Generic. This would mostly happen with user-defined generic classes -- just inheriting from Iterable[str], for example, would no longer imply any metaclass restrictions.

These things should probably also work:

@implements(Tuple[int, str])
class MyTuple(tuple): ...

@implements(Tuple[int, ...])
class MyUniformTuple(tuple): ...

@implements('List[int]')  # Still slightly more efficient
class MyList(list): ...
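For what it's worth, @implements was only a proposal and never landed in typing. As a rough illustration of the runtime side, such a decorator could simply record the declared type on the class without putting any typing machinery into the MRO (the __declared_type__ attribute name here is invented for this sketch):

```python
from typing import List

def implements(tp):
    """Hypothetical decorator from the discussion above: record the
    declared type on the class; the class's MRO stays untouched."""
    def deco(cls):
        cls.__declared_type__ = tp   # invented attribute, for illustration
        return cls
    return deco

@implements(List[int])
class MyList(list): ...

assert MyList.__declared_type__ == List[int]
assert MyList.__mro__ == (MyList, list, object)   # typing stays out of the MRO
```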
gvanrossum commented 7 years ago

Personally I think this part of the proposal is not workable. E.g. inheriting from Sequence (an existing ABC repurposed as a type) also adds some concrete default implementations, such as __iter__ and __contains__. You'd have to write

@implements(typing.Sequence[T])
class MyList(collections.abc.Sequence):
    ...
JukkaL commented 7 years ago

Maybe that could be written like this:

@implements(typing.Sequence[T])
class MyList(typing.Sequence):
    ...

There would still be a duality -- Sequence and other ABCs would be both classes (for things like inheritance, isinstance, etc.) and types. However, indexing a generic class would result in an object that is only a type, not a class. So these would all be true:

This is still a usability regression and a backward compatibility break, as defining subclasses of generic classes would become different and slightly harder. However, maybe we can live with this, since once we have protocols, using Iterable[x] and other common ABCs as base classes would no longer be necessary. Also, by making the distinction between types and classes explicit, things may be less confusing for users.

Finally, generic types such as Dict[int, T] would have to be real objects, since we need to be able to index them, due to generic type aliases.

markshannon commented 7 years ago

Does anyone actually define their own generics, or even inherit from a type?

As a data point, Zulip makes no use of inheritance to define a class and a type at the same time. In other words, there is no code of the form class C(T): ..., where T is a class from the typing module, anywhere in the Zulip code base.

https://lgtm.com/query/1957210066/project:140750185/lang:python/

dmoisset commented 7 years ago

I've done both things (defining my own generics, and inheriting from something like Dict[str, str]) in the context of writing stubs for popular libraries, and I've done the second (inheriting from a specific realization of a generic) in application code.

markshannon commented 7 years ago

How often, though? I expect that it is a very rare thing to do. For the first case (stub files): Stub files are special, we never run them, so it might be OK to allow a type as a base class in stub files. For the second (application code): Is the code open source? It would be good to have some real world examples.

JukkaL commented 7 years ago

Mypy itself has three examples:

$ ag 'class.*\(Dict' mypy
mypy/binder.py
18:class Frame(Dict[Key, Type]):

mypy/nodes.py
2351:class SymbolTable(Dict[str, SymbolTableNode]):

mypy/test/testsemanal.py
215:class TypeInfoMap(Dict[str, TypeInfo]):

Also our internal Dropbox codebases have dozens of examples of dict subclasses. Subclassing list seems pretty rare though.

JelleZijlstra commented 7 years ago

As a user, I'm not too bothered with the current state of things, since it's rare that I need to subclass a generic. (I found only a single inheritance from Generic[T] in my codebase.) I hope we won't switch to suboptimal APIs like @implements just to improve performance in rare cases.

markshannon commented 7 years ago

@JelleZijlstra It isn't just performance. Less code means fewer bugs. Also, keeping types and classes distinct in the implementation helps users maintain that mental separation.

markshannon commented 7 years ago

@JukkaL class Frame(Dict[Key, Type]): would be better (IMO) as

@implements(Mapping[Key, Type])
class Frame(dict):

That way the type (mapping) can be kept separate from the implementation (dict).

Similarly for the other two examples.

JukkaL commented 7 years ago

@markshannon What would then happen to methods defined in dict but not in Mapping, such as __init__? In particular:

  1. Are they available through Frame (I suppose so, as otherwise there's no way to construct an instance)?
  2. What will the signature of __init__ etc. be (with respect to type variables of dict)?

@gvanrossum I wonder if there would be a way to simplify the MROs even if we inherit from List[int] etc.? Could we filter out the cruft, and maybe have a separate __typed_mro__ attribute or something that would have all the generic base classes? Then we could use an inspection API to access the full MRO. This wouldn't directly help with the space usage of generic type objects, though, but maybe this would help with the slower method calls.

ilevkivskyi commented 7 years ago

Here are my comments:

JukkaL commented 7 years ago

Yeah, migration would already be tricky. On the other hand, two years from now it would be much harder still, so if we are going to change something fundamental, we should do it as soon as possible.

markshannon commented 7 years ago

First of all a bit of history. In the run up to accepting PEP 484, in an email thread with @gvanrossum and @JukkaL I specifically requested that all isinstance() and issubclass() implementations be removed. Looking back, however, it looks as if I didn't make that public. My bad.

I have consistently been of the opinion that equating types and classes is a bad idea. What has changed is that we now have evidence that doing so is complex and slow and that inheriting from a type is very rare.

@ilevkivskyi The specific proposal is that no classes representing types should inherit from type. This would simplify the typing module and reduce its performance impact by a large amount. But performance is not the only problem; coupling types to classes impairs understanding of an already subtle topic.

As the use of type hints spreads, I expect that applications that declare types will remain a (small?) minority, but that applications that use at least one module that uses type hints will become common. Therefore, most applications will pay the performance cost of loading the typing module plus the cost of creating types in their library code. We should keep that cost as small as possible, ideally zero.

gvanrossum commented 7 years ago

FWIW isinstance() was indeed removed, per your request. issubclass() remains because there were problems with removing it. There's still an issue open about removing it.

ilevkivskyi commented 7 years ago

issubclass() remains because there were problems with removing it. There's still an issue open about removing it.

issubclass() was removed in September (by me).

dmoisset commented 7 years ago

@markshannon :

How often, though? I expect that it is a very rare thing to do. For the first case (stub files): Stub files are special, we never run them, so it might be OK to allow a type as a base class in stub files.

The problem here is that it gets harder in practice to have a generic in a stub and a non-generic in the library. For example, working on the numpy stubs, I defined ndarray as generic (in the element type) so you can write x: ndarray[float], but to run that code you are forced to use comment or quoted annotations; otherwise the x declaration will fail at runtime.

Examples of inheriting from things like Dict appear in my django stubs, e.g. QueryDict (representing vars in HTTP requests), which inherits from a dict with specific keys/values, and the cookie jar (also a dict with string keys; this is essentially the same code as in the stdlib's http.cookies module).

For the second (application code): Is the code open source? It would be good to have some real world examples.

This code is not open source, but the use case is essentially a kind of generic container (representing results of API calls that are collections, but not necessarily a Pythonic container).

markshannon commented 7 years ago

@dmoisset How is ndarray any different from list? Presumably for numpy we would need a NdArray type analogous to List.

JukkaL commented 7 years ago

I talked about this with Mark in person, and he had another idea that would not break compatibility (at least not as much). Not sure if I understood it fully but I'm attempting to capture the idea here so that we don't completely lose track of it. @markshannon please let me know in which ways I misunderstood you :-)

We could make types like List[int] be regular (non-type) objects but still make them valid in base class lists in a future Python version, by adding a __type__ magic method for type-like objects that returns the corresponding normal type object; it would take effect when the type is used as a base class. For example, List[int].__type__() could return list. Thus this would still be okay:

class MyList(List[int]): ...

However, the MRO of MyList would only include regular type objects like list and object, not generic type objects. Maybe we could preserve the full MRO which may include generic types and such in a separate type object attribute.
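(For context: a hook along these lines never shipped under the name __type__; PEP 560 later standardized a close relative, __mro_entries__. The substitution the idea describes can be sketched by hand with type(); the __type__ protocol here is the invented one from this discussion:)

```python
class ListOfInt:
    """Stand-in for a type object such as List[int]; deliberately not a class."""
    def __type__(self):
        # Invented hook from the discussion above: the runtime class to use
        # when this type object appears in a bases list.
        return list

# A future interpreter would apply this automatically inside the class
# statement; here the substitution is done manually:
base = ListOfInt()
MyList = type("MyList", (base.__type__(),), {})

assert MyList.__mro__ == (MyList, list, object)
```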

ilevkivskyi commented 7 years ago

Here are some more comments:

One append() call is 10% faster than the other.

This situation (list and Sequence[int] in different orders) looks rather like an extreme case. For typical things like:

class C(List[int]):
    ...

list is third in C.__mro__. This is the same for other types, Dict, Set, etc.: __extra__ is typically inserted near the start. Moreover, such situations are quite rare, so the actual overhead in real code could be very small (less than 1%). Are there any realistic benchmarks?

I looked at my old profiling results and things I have found really slow are instantiation of user defined generic classes (up to 10x slower), and valid isinstance checks (3-6x slower) like this:

from typing import Generic, TypeVar

T = TypeVar('T')

class C(Generic[T]):
    ...
c: C[int] = C()
isinstance(c, C)

The first situation (instantiation) can be made a few times faster by "inlining" _gorg() and _geqv() (these two turn out to be super-hot, although there is no point in re-calculating them). The second one (the instance check) is slow because all generics are ABCs, and instance and class checks for ABCs are in principle much slower than for normal classes.
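The ABC slowdown comes from isinstance() routing through ABCMeta.__instancecheck__ instead of the C fast path for ordinary classes. A rough way to observe it (a sketch; exact ratios vary by Python version and cache warmth, so no particular ratio is asserted):

```python
# Sketch: isinstance() via ABCMeta.__instancecheck__ vs the fast path
# for an ordinary class. Ratios vary by interpreter version.
import timeit
from collections.abc import Sequence

class Plain:
    pass

plain_obj, seq_obj = Plain(), [1, 2, 3]
assert isinstance(plain_obj, Plain)
assert isinstance(seq_obj, Sequence)    # list is a registered virtual subclass

t_plain = timeit.timeit(lambda: isinstance(plain_obj, Plain), number=100_000)
t_abc = timeit.timeit(lambda: isinstance(seq_obj, Sequence), number=100_000)
print(f"plain: {t_plain:.4f}s  abc: {t_abc:.4f}s")
```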

@JukkaL

We are mostly shielded from the problem at Dropbox because we use comment annotations

I don't understand this point. As I understand, you are still heavily using subclassing generic classes even in Python 2, and Mark claims that this is the main problem. Do you see any slow-down in Dropbox code when you make classes generic (or inherit from List[int] etc)?

We could make types like List[int] be regular (non-type) objects but make them valid in base class lists in a future Python version by adding a __type__ magic method

Interesting. This is exactly what I first thought when this thread started. This approach, however, should be well thought out. For example, when should the fallback to __type__ happen? There are four possible scenarios: 1) metaclass conflict:

class C(int, 1): # This fails with metaclass conflict
    pass

2) special TypeError:

class C(int, object()): # This fails with TypeError: bases must be types
    pass

3) bad signature:

class A:
    pass
class C(A()): # This fails with TypeError: object() takes no parameters
    pass

4) good signature:

class A:
    def __init__(*args, **kwargs):
        pass
class C(A()): # This actually works!
    pass

However, I think if we figure this out, then this will fix many performance problems (including the original one, and the two I mention above) while preserving full backwards compatibility.

JukkaL commented 7 years ago

I don't understand this point. As I understand, you are still heavily using subclassing generic classes even in Python 2, and Mark claims that this is the main problem.

Subclassing is just one of the problems that I've heard being talked about. Another potential issue is the memory use of annotations, as generic types take some space. I'm aware that identical types are shared, so it's much better now than it used to be. It can still be a problem for memory-constrained environments such as microcontrollers (MicroPython) and very large applications, perhaps. Startup overhead is another worry some users have. This should be easy to measure, though.

ilevkivskyi commented 7 years ago

It can still be a problem for memory-constrained environments such as microcontrollers (MicroPython) and very large applications, perhaps. Startup overhead is another worry some users have. This should be easy to measure, though.

This makes sense, I just tried this on my laptop, here is what I get:

I was thinking a bit more about __type__ and I think it should work as an override, i.e. the Python runtime should prefer it if found, even if the class definition would succeed without it. Maybe we should then call it __type_override__?

Another question is should this be a function or just an attribute pointing to the actual class object? At the time of subscripting of a generic class we already know that it should just point to the original class. (Btw this attribute will also allow us to remove completely _gorg and _geqv mentioned in my previous comment.)

There is another thing that would help speed things up: a flag that ignores all annotations for modules, classes, and functions (like -OO does for docstrings currently). Both this flag and __type_override__ are easy to implement. It is now just a question of making a decision.

gvanrossum commented 7 years ago

Lots of people seem interested in runtime processing annotations. E.g. https://github.com/ericvsmith/dataclasses

ilevkivskyi commented 7 years ago

Lots of people seem interested in runtime processing annotations.

Good point! Ignoring annotations flag will break NamedTuple, TypedDict, and all similar constructs.

JukkaL commented 7 years ago

Good point! Ignoring annotations flag will break NamedTuple, TypedDict, and all similar constructs.

It's still possible to use the functional forms of these even if annotations are ignored. Also, we could perhaps populate annotations dictionaries in classes but use None instead of the actual types, for example.

gvanrossum commented 7 years ago

A global flag would break libraries that use the non-functional notation (which is much nicer anyway).

ilevkivskyi commented 7 years ago

@gvanrossum

A global flag would break libraries that use the non-functional notation (which is much nicer anyway).

But what do you think about the idea of this flag setting all annotations to None? This would keep NamedTuple etc. working and would only affect runtime type checkers (but I expect that the set of people who use runtime type checkers and the set of people who would use the optimization flag are probably non-overlapping).

gvanrossum commented 7 years ago

I worry about library developers that use type annotations, e.g. to construct JSON schemas. We already have problems with -O and -OO.

What problem are we really trying to solve, other than Mark complaining?

JukkaL commented 7 years ago

I guess we are not doing things in a logical order -- first we should do benchmarking to determine if there is a significant problem right now. In addition to @ilevkivskyi 's results, these data points would be interesting and hopefully not difficult to generate (though they are still a significant amount of work):

Perhaps we could extrapolate that the relative overhead for mypy would be similar for other fully annotated, medium-sized programs. Not sure if we can extrapolate to much larger codebases, but at least the extrapolation could give a rough estimate for larger codebases as well.

markshannon commented 7 years ago

@gvanrossum What we are trying to solve is:

All the above apply to CPython as well as MicroPython.

ilevkivskyi commented 7 years ago

What problem are we really trying to solve, other than Mark complaining?

Good question!

first we should do benchmarking to determine if there is a significant problem right now

I did many benchmarks, but I think they are all unreliable, since they are micro-benchmarks focused on the speed of specific things like generic class instantiation, isinstance calls, generic class subscription, generic class subclassing, etc. What we really need is to benchmark some medium-size real programs. I agree mypy is a good candidate for this. In addition to the items mentioned by Jukka I would add:

Here are some speculations: implementing __type_override__ in CPython should be simple. The corresponding backward compatible changes in typing are also not difficult to implement (e.g. Union was very easy to convert from a class to an object). My expectation for the speed gain for a typical project like mypy would be 3-5%.

markshannon commented 7 years ago

Can we flip this discussion? Instead of demanding justification for types not being metaclasses, can someone justify why types need to be metaclasses? Making a class inherit from (the class) type is hardly standard.

As to maintaining backwards compatibility, @JukkaL misinterpreted what I meant, but I prefer his approach, so let's go with that :smile: I'm not sure about the name __type__ though; as a method that a type implements to return a class, I would prefer __runtime__ or similar.

JimJJewett commented 7 years ago

Why is micropython a concern? They already make plenty of other changes to support a radically smaller memory footprint, including leaving out most of the standard library by default.

Since the primary use case of typing is for static checks, it would presumably be run only during development (on a more powerful machine), and NOT on the small devices. I think the default would be to not even make the annotations available on the small device; if they do choose to support some sort of typing for run-time checks, a lighter-weight version than CPython uses would be less of a compatibility issue than some of the other changes already are.

(Making typing more efficient, particularly when not used, is still a good goal, but I don't think micropython in particular is an important use case.)

ilevkivskyi commented 7 years ago

Why is micropython a concern?

It is not the only concern. There are several aspects where typing can be made significantly faster (making types and classes more distinct also makes sense). TBH, I am interested in playing with this, but I don't have much time for it now. Also there is one "conceptual" problem: most probably this will require the __type_override__ mentioned above to allow subclassing non-classes, but that can only be done in Python 3.7, so we would end up with two versions of typing: one slow "backport" version, and one fast version for Python 3.7+. Not everyone will be happy with that prospect.

ilevkivskyi commented 7 years ago

Here is PR #439 with some short term solutions I mentioned above (as I predicted this gives around 5% speed-up for real applications that use typing).

ncoghlan commented 7 years ago

The idea of a __subclass_base__ method to allow an object to specify a different base class that gets used when it appears as a nominal base class in a class definition seems interesting. Should that be filed separately as an RFE on bugs.python.org?

Or do you want to experiment with mutating the bases list in typing.TypingMeta.__new__ first? (That will presumably be necessary anyway if you want to ensure consistent behaviour across Python versions)

ilevkivskyi commented 7 years ago

@ncoghlan The idea is indeed interesting, since it would not only remove the performance concerns but also avoid metaclass conflicts such as #449 (by using __init_subclass__), and it would probably make the typing.py code simpler while preserving the public API.

Should that be filed separately as an RFE on bugs.python.org?

I am quite sure now that this is a reasonable idea. We have tried different variations of bases substitution in GenericMeta.__new__, and although they give some speed wins, this still looks suboptimal. I don't have much time now, but since these issues keep coming up, I think this is a priority, so I will come up with a POC implementation soon (my idea is to just patch __build_class__, so this should be quite simple).
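(For context: __init_subclass__, from PEP 487 and available since Python 3.6, is what makes dropping the metaclass plausible here. It lets a base class hook subclass creation without any custom metaclass, and hence without metaclass conflicts. A minimal sketch:)

```python
# Sketch: reacting to subclass creation via __init_subclass__ instead of
# a metaclass, so no metaclass conflicts with third-party base classes.
class GenericBase:
    _subclasses = []

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        GenericBase._subclasses.append(cls.__name__)

class Sub(GenericBase):
    pass

assert GenericBase._subclasses == ["Sub"]
assert type(Sub) is type          # no custom metaclass involved
```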

ncoghlan commented 7 years ago

OK, cool. In that case, I think the most important part of my comment is the suggested method name: __subclass_base__. My rationale for that:

  1. We want the word class in there somewhere, since the initial purpose is to convert a conceptual type into a concrete runtime class
  2. I'd like to have the word base in there, since our main known use case is to be able to include conceptual types in a list of base classes without having them actually appear in the runtime MRO
  3. I think calling it __subclass_base__ aligns nicely with __subclass_init__: where __subclass_init__ lets a base class do something when a new subclass is defined, __subclass_base__ is instead a way to say "Nope, you don't want me, you want this other class instead"
JelleZijlstra commented 7 years ago

Minor point: it's __init_subclass__, not __subclass_init__ (https://docs.python.org/3/reference/datamodel.html#object.__init_subclass__).

ilevkivskyi commented 7 years ago

OK, I have a POC implementation. Here are observations:

Concerning the last point, there are two possible options:

a) Go with a simple solution (it will already give a great speed-up) and keep GenericMeta (I will document it then). There is a problem with this solution: many libraries use metaclasses, which means that users who want generic classes that subclass library classes will need to manually pass metaclass=... to all such classes. I can imagine this is annoying, and I have already seen this complaint several times.

b) We could modify PyObject_GetItem, inserting a fallback for classes right before the "object is not subscriptable" error. For example something like this (plus some safety checks):

...
PyObject_GetItem(PyObject *o, PyObject *key)
{
    ...
    if (PyType_Check(o)) {
        fallback = PyObject_GetAttrString(o, "__class_getitem__");
        if (fallback == NULL) {
            goto error;
        }
        else {
            /* pack 'o' and 'key' into 'args' */
            return _PyObject_FastCall(fallback, args, 2);
        }
    }
    ...
}

My idea is that people rarely subscript random things inside try: ... except TypeError: ..., so the speed penalty will be negligible.

@gvanrossum @ncoghlan what do you think? Should we go with option (a) or (b)? (I like (b) a bit more since it is quite simple, however it introduces a new dunder.)

ncoghlan commented 7 years ago

I'm not sure I'm entirely following the problem:

ilevkivskyi commented 7 years ago

@ncoghlan GenericMeta.__getitem__ is needed to make this work:

class Custom(Generic[T]): ...
Custom[int]  # Should be OK
class Another(Custom[T]): ...
Another[int]  # Should be also OK

Everything else seems to be possible without a metaclass (only with __init_subclass__).
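(For context: the reason subscription needs a metaclass here is that Custom[int] looks up __getitem__ on the type of the object, which for a class is its metaclass. A minimal sketch, with a tuple standing in for a real parameterized type object:)

```python
# Sketch: why GenericMeta.__getitem__ is needed for Custom[int] --
# subscription of a class resolves __getitem__ on the metaclass.
class GenericMetaSketch(type):
    def __getitem__(cls, item):
        # Stand-in for building a parameterized type such as Custom[int]
        return (cls, item)

class Custom(metaclass=GenericMetaSketch):
    pass

assert Custom[int] == (Custom, int)
```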

Concerning the performance there are two major slow-downs currently:

Both of the above problems would be fixed by __base_subclass__; the metaclass problem is orthogonal, but the point is that if we go with __base_subclass__ then avoiding the metaclass is possible (and easy), whereas otherwise it would be a non-starter.

ilevkivskyi commented 7 years ago

Two additional notes:

ncoghlan commented 7 years ago

Regarding the name, while Jelle's right that the implemented name is __init_subclass__ (we went back and forth enough times during the design process that I often forget where we ended up), the new API should still be __subclass_base__, as __base_subclass__ sounds like we're requesting a subclass of the base class, which isn't what's happening.

The TypeVar case is an interesting one, but it seems to me that it could potentially be addressed by:

  1. Always calling __subclass_base__ on GenericMeta instances
  2. Duplicating the current _check_generic call from getitem, and returning the class itself from __subclass_base__ if it's actually still generic

If the isinstance check also proves to be too slow (or otherwise impractical), then I'd suggest we look at ways of optimising that before going down the __class_getitem__ path.

ilevkivskyi commented 7 years ago

@ncoghlan

Regarding the name ...

OK

The TypeVar case is an interesting one, but it seems to me that it could potentially be addressed by...

Yes, it works perfectly if we keep GenericMeta; the only problem is that keeping GenericMeta will cause metaclass conflicts, which is why I am not 100% happy with it. But anyway, it seems to me the best strategy is to do this in two steps:

gvanrossum commented 7 years ago

I'm a little lost. Ivan, if you have an implementation somewhere, can you link to it? I presume it's modifications to the C code of CPython? If we're going that route, what will happen on Python 3.5 and before? (Or on 3.6 and before if we decide this is too invasive to go into CPython 3.6.3.) I suppose you can fall back to metaclasses.

If we want Custom[int] without a metaclass, and we're changing C code anyways, could we add a __getitem__ implementation to type itself that defers to __class_getitem__?
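(This is essentially what Python 3.7 later adopted via PEP 560: type subscription falls back to __class_getitem__, so no metaclass is needed. A minimal sketch of the protocol, written with an explicit @classmethod so it works on 3.7+; the tuple again stands in for a real parameterized type object:)

```python
# Sketch: class subscription without a metaclass, via __class_getitem__.
class Custom:
    @classmethod
    def __class_getitem__(cls, item):
        # Stand-in for returning a parameterized type object
        return (cls, item)

assert Custom[int] == (Custom, int)   # no metaclass involved
assert type(Custom) is type
```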

Another solution to the 3rd party metaclass problem (which is real) could be to just recommend people inherit their metaclass from abc.ABCMeta instead of directly from type -- would that work in most cases? (I realize it would slow things down.)

Why is this still in the "Mark Shannon" thread? I think I missed a part of the conversation.

ilevkivskyi commented 7 years ago

@gvanrossum

I'm a little lost

Sorry, we probably went too fast. Here is a short summary:

If we're going that route, what will happen on Python 3.5 and before? (Or on 3.6 and before if we decide this is too invasive to go into CPython 3.6.3.) I suppose you can fall back to metaclasses.

Most probably we will need a separate source file for newer versions (like we now have for Python 2 and 3). I think there would be so many fallbacks that the code would be hard to read. I expect that __subclass_base__ will really simplify the code. I am going to invest more time to show how it will look.

...could we add a __getitem__ implementation to type itself that defers to __class_getitem__?

This is actually another possibility that I was thinking about. Maybe it is even better (it is a more "local" change anyway).

recommend people inherit their metaclass from abc.ABCMeta instead of directly from type -- would that work in most cases?

It will probably fix the vast majority of metaclass conflicts. I think we should start with a simple solution (i.e. very minimal changes to CPython, keeping GenericMeta); this will already fix most performance issues. Then (if people continue to complain about metaclass conflicts) we may remove GenericMeta; that is a quite independent problem.

gvanrossum commented 7 years ago

I've gotta focus elsewhere for a while, but I'd like to note that I was probably wrong about recommending people inherit their metaclasses from ABCMeta -- it'll still be a metaclass conflict.

ilevkivskyi commented 7 years ago

it'll still be a metaclass conflict

Yes, sorry, you are right, I was confused by the fact that this works:

class C(typing.List[int], collections.abc.Sequence): ...

Anyway, I am still not sure what to do. If you think we might go with __class_getitem__, then I will come up with an extended POC implementation.

ncoghlan commented 7 years ago

(It probably makes sense to break this tangent out into a new issue, but I'll continue here for now)

I think it makes sense to break exploration of this idea into 3 phases:

  1. See how far you can get by doing something like this in typing.TypingMeta.__new__ before calling super().__new__:
    new_bases = tuple(getattr(base, "__subclass_base__", lambda: base)()
                      for base in bases)
    if new_bases != bases:
        # Bases list changed, check if that changes the metaclass
        orig_meta = type(cls)
        unhinted_meta, __, __ = types.prepare_class(name, bases)
        if orig_meta is unhinted_meta:
            # Original metaclass matched the one derived from the bases list, so recalculate it
            new_meta, __, __ = types.prepare_class(name, new_bases)
            if new_meta is not orig_meta:
                # Start the class creation over again with the new metaclass
                # and no keyword arguments (disallowing `typing.TypingMeta` subclasses)
                return new_meta(name, new_bases, namespace)

That is, allowing non-classes in a subclass bases list would be a feature of typing.TypingMeta, not a generally available Python level feature. As a result, it can't have a performance impact on standard class definitions, at the price of making derivation from typing.TypingMeta a bit slower.

1a. Potentially look at exposing better building blocks (e.g. a types.recalculate_metaclass function) for metaclasses wanting to get up to these kinds of tricks (OTOH, it's not exactly the sort of thing we want to encourage, since it can break in all sorts of interesting and exciting ways if you're not careful with it)

  2. Look at how feasible it would be to make __subclass_base__ a standard feature of the type system in 3.7+, rather than something specific to typing.TypingMeta. This would avoid the triple calculation of the derived metaclass from the list of bases, the double execution of parts of the metaclass instantiation process, and the incompatibility between the use of __subclass_base__ and keyword arguments in class definitions.

  3. Look at how feasible it would be to add a type.__getitem__ implementation in 3.7+ that delegates to __class_getitem__ on the instance (potentially eliminating the need for typing.GenericMeta entirely).