yrodiere opened 1 year ago
/cc @Sanne (hibernate-orm), @gsmet (hibernate-orm)
One challenge is how to retrieve such beans. Some time ago Arc changed the way it behaved so that you can't just use `.select(TenantResolver.class)`; it just wouldn't return any bean if `TenantResolver` is a generic type.
Well, `Instance#select(TenantResolver.class)` is an equivalent of `@Inject TenantResolver`, and if `TenantResolver` is a parameterized type, then the following CDI typesafe resolution rule applies: "A parameterized bean type is considered assignable to a raw required type if the raw types are identical and all type parameters of the bean type are either unbounded type variables or java.lang.Object." In other words, `TenantResolver<String>` would not match. However, you can use a `TypeLiteral` instead of the raw type to emulate `@Inject TenantResolver<?>`, in which case `TenantResolver<String>` would match (according to the rules).
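For illustration, a minimal sketch of the two lookups being contrasted here, assuming a hypothetical `TenantResolver<T>` interface; only the `TypeLiteral` variant would match a `TenantResolver<String>` bean:

```java
import jakarta.enterprise.inject.Instance;
import jakarta.enterprise.inject.spi.CDI;
import jakarta.enterprise.util.TypeLiteral;

// Hypothetical stand-in for the Quarkus interface discussed in this issue.
interface TenantResolver<T> {
}

class TenantResolverLookup {

    Instance<TenantResolver> rawLookup() {
        // Equivalent of @Inject TenantResolver: a TenantResolver<String> bean
        // does NOT match this raw required type, per the rule quoted above.
        return CDI.current().select(TenantResolver.class);
    }

    Instance<TenantResolver<?>> wildcardLookup() {
        // Equivalent of @Inject TenantResolver<?>: a TenantResolver<String>
        // bean DOES match this required type.
        return CDI.current().select(new TypeLiteral<TenantResolver<?>>() {});
    }
}
```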
That would be perfect, thank you.
From what I understand here, I would need to use `new TypeLiteral<TenantResolver<T>>() {}`, where `T` is unbounded and not just a wildcard, if I want the raw type `TenantResolver` to match as well? It's not an absolute requirement but would be nice for backwards compatibility.
> A raw bean type is considered assignable to a parameterized required type if the raw types are identical and all type parameters of the required type are either unbounded type variables or java.lang.Object.
> From what I understand here, I would need to use `new TypeLiteral<TenantResolver<T>>() {}`, where `T` is unbounded and not just a wildcard, if I want the raw type `TenantResolver` to match as well? It's not an absolute requirement but would be nice for backwards compatibility.
So if you do `io.quarkus.hibernate.orm.runtime.tenant.TenantResolver` => `TenantResolver<T>`, then a class compiled against the `TenantResolver` should have the raw type in its set of bean types (although I'm not 100% sure it would). Then yes, if you need to obtain the `TenantResolver`, use `new TypeLiteral<TenantResolver<T>>() {}`.
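As a side note, spelling `TenantResolver<T>` in a `TypeLiteral` requires a type variable in scope, so such a literal is typically built in a generic helper method; a minimal sketch (the helper name and the `TenantResolver` stand-in are made up):

```java
import jakarta.enterprise.util.TypeLiteral;

interface TenantResolver<T> {
}

class TypeLiterals {

    // T is declared on the method so it can appear in the literal;
    // at runtime it reflects as an unbounded type variable.
    static <T> TypeLiteral<TenantResolver<T>> tenantResolverTypeLiteral() {
        return new TypeLiteral<TenantResolver<T>>() {};
    }
}
```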
Actually it would seem that `new TypeLiteral<TenantResolver<T>>() {}` does not match implementations of `TenantResolver` with any type argument.
I created a simpler example with Hibernate Validator here: #36843. It's about `ValueExtractor` instead of `TenantResolver`, but the idea is the same.

The relevant changes are here: https://github.com/quarkusio/quarkus/pull/36843/files#diff-b452e79d9b446d0ef3bd595149c583c58f1e03a82231bbe6d036b47a02e4238dR41-R166
To reproduce:

1. Put a breakpoint at `io/quarkus/hibernate/validator/runtime/HibernateValidatorRecorder.java:165`.
2. Run `mvn clean install -pl extensions/hibernate-validator/deployment -Dtest=SingletonCustomValueExtractorTest -Dmaven.surefire.debug`.
Then at the breakpoint you'll see that running `Arc.container().beanManager().createInstance().select(valueExtractorTypeLiteral())` does not return any bean, even though there is a bean implementing `ValueExtractor` declared in this test (`io.quarkus.hibernate.validator.test.valueextractor.SingletonCustomValueExtractorTest.SingletonContainerValueExtractor`) and it hasn't been removed.
I also tried with `new TypeLiteral<ValueExtractor<?>>() {}`, and it didn't seem to work either.
Do you think I did something wrong @mkouba? Or did we interpret the spec wrong? Or should I file an issue?
> Do you think I did something wrong @mkouba? Or did we interpret the spec wrong? Or should I file an issue?
So the `SingletonContainerValueExtractor` implements `ValueExtractor<Container<@ExtractedValue ?>>`, which is not a legal bean type according to CDI: "A parameterized type that contains a wildcard type parameter is not a legal bean type." That's why the type is ignored during bean discovery. See also Bean types of a managed bean.

A `TypeLiteral<ValueExtractor<T>>` or `TypeLiteral<ValueExtractor<Object>>` required type should work fine for raw types, i.e. for `ValueExtractor`.
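To spell out that raw-type case, a sketch assuming a stand-in `ValueExtractor` interface: a bean implementing the raw interface is assignable to the `ValueExtractor<Object>` required type.

```java
import jakarta.enterprise.inject.spi.CDI;
import jakarta.enterprise.util.TypeLiteral;

// Stand-in for the real jakarta.validation ValueExtractor.
interface ValueExtractor<T> {
}

// A bean implementing the *raw* interface...
class RawExtractor implements ValueExtractor {
}

class RawLookup {

    // ...matches the required type ValueExtractor<Object>, per the
    // raw/parameterized assignability rule quoted earlier in the thread.
    static ValueExtractor<Object> lookup() {
        return CDI.current()
                .select(new TypeLiteral<ValueExtractor<Object>>() {})
                .get();
    }
}
```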
CC @manovotn @Ladicek
Ok, thanks for the explanation.
This does mean however that Quarkus extensions can never rely on `.select()` for parameterized types, since we have no control over what implementations use as type arguments. E.g. for Hibernate ORM they could be using `TenantResolver<List<? extends Number>>` for all I know.
Why on earth would the spec say "those types are illegal" rather than simply "the behavior for such types is implementation-specific"... it's really annoying, because it means we can't go further than the spec in Arc :/
Unless, maybe, we had a way (build item) to say to Arc: "for `ValueExtractor` and `TenantResolver` implementations, please consider all parameterized types as legal bean types, even if the CDI spec says they're illegal"?

Or, maybe more realistic: "for `ValueExtractor` and `TenantResolver` implementations, please always include the raw type in the bean types, even if the CDI spec says you shouldn't"?

I mean it would still feel odd, but at least it requires less code for extensions that need to retrieve beans implementing a generic interface...
In CDI Full, I would say you could use an extension that hooks into something like `ProcessBeanAttributes` and add the raw type as their bean type. However, that's not something we can do in Arc yet (and CDI build compatible extensions can't do that either, yet). I don't know if there is a technical limitation to it; I'd rather say we didn't need it so far. Maybe we should look into allowing that?
> This does mean however that Quarkus extensions can never rely on `.select()` for parameterized types, since we have no control over what implementations use as type arguments. E.g. for Hibernate ORM they could be using `TenantResolver<List<? extends Number>>` for all I know.
Yes, that is correct. If the impl class can have arbitrary types, you may not be able to perform typesafe resolution based on the impl type alone.

One thing you could do is have all of those classes implement some non-generic marker interface; you could then use CDI to resolve based on the marker interface type, i.e. `public interface ValueExtractor<X> extends MarkerInterface`. Then you could use things like `ArcContainer#listAll(MarkerInterface.class)`. But I understand that this isn't likely to be a solution for your case :)
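A minimal sketch of that marker-interface idea (all names hypothetical), using ArC's `listAll`:

```java
import java.util.List;

import io.quarkus.arc.Arc;
import io.quarkus.arc.InstanceHandle;

// Non-generic marker: every implementation gets a legal,
// non-parameterized bean type regardless of its type arguments.
interface MarkerInterface {
}

interface ValueExtractor<X> extends MarkerInterface {
}

class MarkerLookup {

    static List<InstanceHandle<MarkerInterface>> allExtractors() {
        // Resolves on the marker type, side-stepping generics entirely.
        return Arc.container().listAll(MarkerInterface.class);
    }
}
```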
Alternatively, we'd have to have some Arc annotation such as `@IncludeRawTypes` which you could add onto a bean (via annotation transformer, for example) and Arc would then deliberately add raw types into the set of bean types as well. But that's not very nice from a CDI perspective and I am pretty sure it won't get past @mkouba :D
I think that we could introduce a new build item that will be used to mark a type for which a raw type variant should be included in the set of bean types. Something like a `RawBeanTypeBuildItem` with a `Predicate<org.jboss.jandex.Type>`. For any bean type that would match the predicate, the set of bean types would contain the actual type, e.g. `ValueExtractor<String>`, and also the raw type `ValueExtractor`.

The consequence would be that a bean definition like:

```java
public class MyValueExtractor implements ValueExtractor<String> {}
```

could be injected in an injection point like `@Inject ValueExtractor<T>` or `@Inject ValueExtractor<Object>`. Which is currently not the case if I'm not mistaken. Am I? @manovotn @Ladicek
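To make the proposal concrete, a sketch of how an extension might produce such a build item; `RawBeanTypeBuildItem` does not exist, so both the class and the build step are hypothetical:

```java
import java.util.function.Predicate;

import org.jboss.jandex.DotName;
import org.jboss.jandex.Type;

import io.quarkus.deployment.annotations.BuildStep;

class HibernateValidatorRawTypesProcessor {

    private static final DotName VALUE_EXTRACTOR = DotName.createSimple(
            "jakarta.validation.valueextraction.ValueExtractor");

    // Hypothetical: RawBeanTypeBuildItem is the proposed build item, not an
    // existing ArC API. Any parameterized ValueExtractor bean type matching
    // the predicate would additionally get its raw type as a bean type.
    @BuildStep
    RawBeanTypeBuildItem rawValueExtractorType() {
        Predicate<Type> predicate = type -> type.kind() == Type.Kind.PARAMETERIZED_TYPE
                && type.name().equals(VALUE_EXTRACTOR);
        return new RawBeanTypeBuildItem(predicate);
    }
}
```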
> I think that we could introduce a new build item that will be used to mark a type for which a raw type variant should be included in the set of bean types. Something like a `RawBeanTypeBuildItem` with a `Predicate<org.jboss.jandex.Type>`. For any bean type that would match the predicate, the set of bean types would contain the actual type, e.g. `ValueExtractor<String>`, and also the raw type `ValueExtractor`.
Hm, that's not unlike what I talked about in my previous comment. Maybe we should instead look into a solution that allows manipulating bean attributes as such? I.e. an equivalent of `ProcessBeanAttributes`?

Your idea definitely works but it's limited to this use case, whereas having a more general hook into bean attributes might come in handy for a wider range of cases. That being said, I haven't yet looked into the code to see how intrusive such a change would be.
> could be injected in an injection point like `@Inject ValueExtractor<T>` or `@Inject ValueExtractor<Object>`. Which is currently not the case if I'm not mistaken. Am I? @manovotn @Ladicek
Yes. That shouldn't be an issue here.
FWIW the extensions I work with would definitely rather use a build item than an annotation, since the relevant interfaces (i.e. `ValueExtractor`) come from external libraries.

Though I guess you could have `RawBeanTypeBuildItem` consumed by a build step that modifies bytecode to add whatever annotations you need on relevant beans... it feels like an unnecessary extra step but I don't have the full picture in mind.
> Though I guess you could have `RawBeanTypeBuildItem` consumed by a build step that modifies bytecode to add whatever annotations you need on relevant beans... it feels like an unnecessary extra step but I don't have the full picture in mind.
We shouldn't modify bytecode. Any way we approach this, we'd just alter the metadata that ArC operates with (similar to annotation transformations, for example). If we go for the build item approach, that would just be something ArC consumes to internally take note of a set of classes where we deliberately add raw types into the set of bean types.
> > I think that we could introduce a new build item that will be used to mark a type for which a raw type variant should be included in the set of bean types. Something like a `RawBeanTypeBuildItem` with a `Predicate<org.jboss.jandex.Type>`. For any bean type that would match the predicate, the set of bean types would contain the actual type, e.g. `ValueExtractor<String>`, and also the raw type `ValueExtractor`.
>
> Hm, that's not unlike what I talked about in my previous comment. Maybe we should instead look into a solution that allows manipulating bean attributes as such? I.e. an equivalent of `ProcessBeanAttributes`? Your idea definitely works but it's limited to this use case, whereas having a more general hook into bean attributes might come in handy for a wider range of cases. That being said, I haven't yet looked into the code to see how intrusive such a change would be.
I'm not so sure about a solution for manipulating bean attributes. It would be very powerful indeed but also too easy to shoot yourself in the foot.
> I'm not so sure about a solution for manipulating bean attributes. It would be very powerful indeed but also too easy to shoot yourself in the foot.
Can't argue with that; it's basically the definition of CDI extensions :)
Hey @manovotn @mkouba , any news on this? Do you still think it would make sense?
The proposal:
> I think that we could introduce a new build item that will be used to mark a type for which a raw type variant should be included in the set of bean types. Something like a `RawBeanTypeBuildItem` with a `Predicate<org.jboss.jandex.Type>`. For any bean type that would match the predicate, the set of bean types would contain the actual type, e.g. `ValueExtractor<String>`, and also the raw type `ValueExtractor`.
>
> The consequence would be that a bean definition like:
>
> ```java
> public class MyValueExtractor implements ValueExtractor<String> {}
> ```
I created a playground project where I have:
```java
public interface MyIface<T> {
    void consume(T value);
}
```

```java
@ApplicationScoped
public class MyImpl1 implements MyIface<String> {

    @Override
    public void consume(String value) {
    }

    @Override
    public String toString() {
        return "MyImpl1";
    }
}
```

```java
@ApplicationScoped
public class MyImpl2 implements MyIface<Number> {

    @Override
    public void consume(Number value) {
    }

    @Override
    public String toString() {
        return "MyImpl2";
    }
}
```

```java
@ApplicationScoped
public class MyImpl3 implements MyIface<List<?>> { // `MyIface<List<?>>` is an illegal bean type

    @Override
    public void consume(List<?> value) {
    }

    @Override
    public String toString() {
        return "MyImpl3";
    }
}
```

```java
@Singleton
public class MyIfaceConsumer {

    @Inject
    @All
    List<MyIface<?>> all;

    void print() {
        System.out.println(all);
    }
}
```
The following code

```java
System.out.println(Arc.container().select(new TypeLiteral<MyIface<?>>() {}).stream().toList());
Arc.container().instance(MyIfaceConsumer.class).get().print();
```

prints

```
[MyImpl1, MyImpl2]
[MyImpl1, MyImpl2]
```
So using an (unbounded) wildcard should solve this issue (aside: it is also possible to use a wildcard with a bound, and it will do what you'd expect), with the exception of illegal bean types (and with the exception of raw types: if I make `class MyImpl4 implements MyIface`, it won't work, by specification). My personal opinion is that if the implementations of the interfaces in question are supposed to be beans, then using a type argument that makes the entire bean type illegal should be prohibited.
The idea of transforming the set of bean types to always include an erasure seems reasonable. I have this feeling that having 2 distinct types whose erasures are identical in the set of bean types is dangerous, but I cannot really articulate why. I'm probably being overly cautious. It would require a bean attributes transformation facility in ArC, which currently doesn't exist.
> My personal opinion is that if the implementations of the interfaces in question are supposed to be beans, then using a type argument that makes the entire bean type illegal should be prohibited.
Right, well... The thing is, in all the cases I'm interested in, CDI is actually not required for things to work; it's just something that is allowed in the (JPA/Bean Validation) spec as a "bonus", an alternative way of plugging things in.

Now, in Quarkus, we try to use CDI everywhere we can, and get rid of the "original" way of plugging things in (through properties, ...).

If we start telling users "yeah well that's not a valid bean type, duh", we need to provide them with a solution. In the case of Hibernate Validator, they simply have to use wildcards, so they have to use invalid bean types. The only solution would then be to let them plug in their components through something other than CDI, for example configuration.

We end up duplicating our effort, just for the sake of staying true to the spec, even for use cases where the spec actually gets in the way.

I don't know about you, but to me, this looks like a good reason to try to find a workaround.
> I have this feeling that having 2 distinct types whose erasures are identical in the set of bean types is dangerous, but I cannot really articulate why. I'm probably being overly cautious. It would require a bean attributes transformation facility in ArC, which currently doesn't exist.
Note I'm not suggesting to do this for all beans. Just for beans implementing types that are retrieved by extensions, and even then only if the extension decides it's necessary. For Validator it absolutely is, for Hibernate ORM it might be.
@yrodiere I was thinking about this some more and I think there's an approach that you could take even with existing APIs.
You'd need to:

1. Inspect `BeanInfo#getUnrestrictedTypes()` and filter them for the interface types you are looking for (i.e. `ValueExtractor`).
2. Record the identifier (a `String`) of each such bean; you can obtain that via `BeanInfo#getIdentifier`.
3. Pass those `String` values from build time to runtime - a recorder, I guess?
4. At runtime, look up `InjectableBean<T> injectableBean = Arc.container().bean(stringId)` and `T = Arc.container().instance(injectableBean).get()`.

Does that make sense? Is it missing some of the requirements you have?
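A sketch of the runtime half of that recipe, assuming the bean identifiers collected at build time were passed to a recorder (the helper name and parameters are made up):

```java
import java.util.ArrayList;
import java.util.List;

import io.quarkus.arc.Arc;
import io.quarkus.arc.InjectableBean;

class BeanIdLookup {

    // beanIds come from BeanInfo#getIdentifier at build time,
    // recorded and passed to runtime through a recorder.
    static List<Object> lookupByIds(List<String> beanIds) {
        List<Object> instances = new ArrayList<>();
        for (String id : beanIds) {
            InjectableBean<Object> bean = Arc.container().bean(id);
            instances.add(Arc.container().instance(bean).get());
        }
        return instances;
    }
}
```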
@manovotn Thanks, it makes sense, but it looks very similar to what we currently do with Jandex in the Validator extension. See this part of the issue description:
> So, the question is: how would we retrieve all implementations of `TenantResolver` regardless of their generics? Do we need to collect those at build-time and mark them somehow, like we did for Hibernate Validator's ValueExtractors? :/
To clarify, the goal would be to make it simple to use Arc to retrieve components from Quarkus extensions, in a way that is at least as feature-complete as non-CDI solutions (i.e. doesn't ignore any components for CDI-specific reasons). I tried to contribute something like that in #36843, but that was rejected, and TBH I'm not sure it even works.

If we could ensure that those beans can also be retrieved from CDI by libraries, that would be nice as well, though I haven't encountered a situation where this is a problem so far. But I suspect it might be in Hibernate ORM/Search, since these libraries can retrieve CDI beans by type during bootstrap: e.g. in Hibernate Search it's possible we'd end up retrieving a bean by its raw type `ValueBridge` and a user-provided name (`@Named` qualifier).
> If we could ensure that those beans can also be retrieved from CDI by libraries, that would be nice as well, though I haven't encountered a situation where this is a problem so far. But I suspect it might be in Hibernate ORM/Search, since these libraries can retrieve CDI beans by type during bootstrap
So we're talking libraries that could rely on purely CDI APIs to do that? I am asking because otherwise we could also look into having something like `Arc.container().selectByUnrestrictedTypes()`, but that obviously requires direct Arc usage.

To be clear, those beans (the impls of the given generic interfaces) are still CDI beans that are resolvable; it's just the raw type that's missing from the set of their types - would users try to resolve based on the raw type, or is that just a thing Hibernate needs internally?

I guess I am not clear on what's the expected use case from the user perspective here (maybe I just forgot since it's been a long time since we discussed this; sorry if that's the case). Is there always just one impl of the generic interface (say, `ValueExtractor`)? Assuming there are multiple implementations, you cannot perform typesafe resolution based on the raw type alone anyway - that'd be ambiguous. Or does the user expect to use the raw type just to get to all the existing impls and then somehow iterate over them?
> So we're talking libraries that could rely on purely CDI APIs to do that?
In the case of Hibernate ORM, yes. Though with JPA it can get really weird: https://github.com/hibernate/hibernate-orm/blob/d18cdbec350c83df53d4867868cbf2dbb8a9f66c/hibernate-core/src/main/java/org/hibernate/resource/beans/container/internal/JpaCompliantLifecycleStrategy.java#L131-L137 There's also a more sensible implementation (non-compliant with the JPA spec) that Hibernate Search relies on: https://github.com/hibernate/hibernate-orm/blob/8f8ae50e0b278b85000451bde91ba746f22e43d9/hibernate-core/src/main/java/org/hibernate/resource/beans/container/internal/ContainerManagedLifecycleStrategy.java#L202
In the case of Hibernate Validator I don't remember, but I believe the Quarkus extension does the retrieval, not Hibernate Validator. At least in the case of `ValueExtractor`.
> To be clear, those beans (the impls of the given generic interfaces) are still CDI beans that are resolvable; it's just the raw type that's missing from the set of their types
It's a bit more complicated. The raw type is missing indeed, but any other type that includes wildcards is also missing. So for some beans, there is actually no way to select those beans by type.
> would users try to resolve based on the raw type, or is that just a thing Hibernate needs internally?
If by users you mean application developers, I imagine they could, but no, that's not the main concern here. Beans are generally retrieved by the library in the scenario I'm interested in. The library generally uses a `Class` instead of a type literal because of integration constraints (e.g. references specified through annotations, library-specific APIs, ...). So it's sometimes necessary to use raw types. And of course, as mentioned above, for some beans even type literals wouldn't work.
> I guess I am not clear on what's the expected use case from the user perspective here
Users want to implement an interface and have an instance retrieved by the Quarkus extension or library, through CDI. If that interface contains generics, it must work too, even if the implementation uses wildcards (e.g. `MyClass extends LibraryInterface<List<? extends Number>>`).
> Is there always just one impl of the generic interface (say, `ValueExtractor`)?
Generally, there can be multiple implementations.
> Assuming there are multiple implementations, you cannot perform a typesafe resolution based on the raw type alone anyway - that'd be ambiguous. Or does the user expect to use the raw type just to get to all the existing impls and then somehow iterate over them?
It depends.

Sometimes the library will just iterate over all implementations and derive metadata through reflection -- in that case we make sure it happens on static init.

Sometimes the application developer references a bean by name (in annotations, in a library API, ...), in which case we use `.select(<the raw type>, new NamedQualifierLiteral(name))`, which supposedly resolves the ambiguity.
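For reference, a sketch of that by-name lookup; `NamedQualifierLiteral` above is shorthand, the portable equivalent being `jakarta.enterprise.inject.literal.NamedLiteral` (and, as discussed, the raw-type part only works when the raw type is among the bean types; `ValueBridge` here is a stand-in):

```java
import jakarta.enterprise.inject.literal.NamedLiteral;
import jakarta.enterprise.inject.spi.CDI;

// Stand-in for a library interface such as Hibernate Search's ValueBridge.
interface ValueBridge<V, F> {
}

class NamedLookup {

    @SuppressWarnings("rawtypes")
    static ValueBridge byName(String name) {
        // The @Named qualifier disambiguates between multiple implementations.
        return CDI.current().select(ValueBridge.class, NamedLiteral.of(name)).get();
    }
}
```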
Another thing to consider is that we need to register such beans as unremovable, and as mentioned above, we can't rely on CDI bean types for that (as it's sometimes unreliable, e.g. when wildcards are involved).
Out of curiosity, I tried to implement @manovotn's suggestion (find all the beans during build based on unrestricted bean types, pass the set of bean IDs to runtime, look up all the beans at runtime based on those IDs). I had to massage the Hibernate Validator extension a little bit, and I added 2 small methods to ArC's `BeanStream` to make the resulting code nicer. Here goes: https://github.com/Ladicek/quarkus/commits/value-extractors-rewrite/ The latest commit is relevant to this discussion; the previous 3 commits are preparation. I didn't test this extensively, just executed the tests in the `hibernate-validator/deployment` module.
I confirm that the suggestion is basically what the Hibernate Validator extension already does, except a bit more streamlined. I think it's a fairly reasonable approach.
As @manovotn mentioned, we could possibly transfer the unrestricted bean types to runtime (`InjectableBean.getUnrestrictedTypes()`, `Arc.container().selectByUnrestrictedType()` etc.), but I'm not sure that's something CDI would like to standardize. I wish I knew more about the motivation for the concept of illegal bean types (maybe proxying?), but I don't.
> I confirm that the suggestion is basically what the Hibernate Validator extension already does, except a bit more streamlined. I think it's a fairly reasonable approach.
I agree it's a tad cleaner, though not quite as simple as I'd hoped for when I opened #36843.
It seems odd to have such edge cases (beans using wildcards being ignored) and require rather convoluted code for a use case that I'd expect to be rather widespread: have an extension retrieve CDI beans implemented by the user... which may or may not include wildcards, the extension just doesn't know (and shouldn't need to know, IMO).
Even my PR wasn't great, TBH. It requires extension maintainers to be aware of the problem, whereas in an ideal world an extension maintainer would just express "beans of this type are retrieved programmatically, please make that work" and it would... work. Without weird edge cases.
`UnremovableBeansBuildItem` has been used for this use case, but the semantics of that build item are more specific, and we're discovering now there's more to this than just marking beans as unremovable. I'd suggest introducing a new build item, but at this point this is unlikely to get traction unless we also make some very unrealistic changes (deprecations on widely-used APIs).
> As @manovotn mentioned, we could possibly transfer the unrestricted bean types to runtime (`InjectableBean.getUnrestrictedTypes()`, `Arc.container().selectByUnrestrictedType()` etc.), but I'm not sure that's something CDI would like to standardize.
I also find it unlikely to be standardized, and if not, it probably won't solve the whole problem, as libraries most likely won't depend on Arc. Even if it was standardized, it feels too obscure to get wide adoption, so extension maintainers or users unaware of the problem will still be affected.
> I wish I knew more about the motivation for the concept of illegal bean types (maybe proxying?), but I don't.
So would I. This arbitrary limitation is really annoying. And I'm pretty sure it affects some users out there as well; not many, since it's a relatively obscure edge case, but I can't be the only one to ever have used `List<?>` as a generic type parameter.
Wouldn't there be a way to alter Arc's behavior in a way that's both compliant with the spec and fixes this problem? E.g.:

- Interpret the sentence "A parameterized type that contains a wildcard type parameter is not a legal bean type" from the spec as a non-recursive constraint? I.e. `ValueExtractor<?>` or `ValueExtractor<? extends Number>` are not legal bean types (makes sense, you can't implement that anyway), but `ValueExtractor<List<?>>` and `ValueExtractor<List<? extends Number>>` are... because they do not really "contain" a wildcard, only their type parameters do? (See the sketch after this list.) Note that in this very section, the spec takes care of making the recursion explicit for array types: "An array type whose component type is not a legal bean type". One can wonder why the recursion would be implicit for parameterized types. If such a change made Arc fail the TCK, maybe a challenge to the TCK could work?
- Add some "fallback" mechanism in Arc so that a bean that implements `ValueExtractor<? extends Number>` would get assigned the raw bean type `ValueExtractor`, at least?
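The sketch mentioned in the first suggestion, with a stand-in `ValueExtractor` interface, showing the nested-wildcard case that is legal Java but currently an illegal bean type:

```java
import java.util.List;

// Stand-in for the real extraction contract.
interface ValueExtractor<T> {
}

// Legal Java: the wildcard is nested inside the type argument, yet CDI
// treats ValueExtractor<List<?>> as an illegal bean type all the same.
// (A top-level wildcard, ValueExtractor<?>, cannot be implemented at all,
// so rejecting it is uncontroversial.)
class NestedWildcardExtractor implements ValueExtractor<List<?>> {
}
```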
> Interpret the sentence "A parameterized type that contains a wildcard type parameter is not a legal bean type" from the spec as a non-recursive constraint? I.e. `ValueExtractor<?>` or `ValueExtractor<? extends Number>` are not legal bean types (makes sense, you can't implement that anyway), but `ValueExtractor<List<?>>` and `ValueExtractor<List<? extends Number>>` are... because they do not really "contain" a wildcard, only their type parameters do? Note that in this very section, the spec takes care of making the recursion explicit for array types: "An array type whose component type is not a legal bean type". One can wonder why the recursion would be implicit for parameterized types.
I think that it was intentional because the spec does not cover a parameterized bean type with a wildcard in Assignability of raw and parameterized types, i.e. only the cases where the required type (e.g. injected type) has a wildcard are covered.
Add some "fallback" mechanism in Arc so that a bean that implements ValueExtractor<? extends Number> would get assigned the raw bean type ValueExtractor, at least?
+0. Keep in mind that this would mean it would be eligible for injection into `ValueExtractor<T>` and `ValueExtractor<Object>`. I'm not so sure about the `ValueExtractor` raw type?
> I think that it was intentional because the spec does not cover a parameterized bean type with a wildcard in Assignability of raw and parameterized types, i.e. only the cases where the required type (e.g. injected type) has a wildcard are covered.
A shame, since the rules for type variables could be easily extended to cover wildcards:

> the required type parameter is a wildcard, the bean type parameter is a type variable and the upper bound of the type variable is assignable to or assignable from the upper bound, if any, of the wildcard and assignable from the lower bound, if any, of the wildcard, or
Add some "fallback" mechanism in Arc so that a bean that implements ValueExtractor<? extends Number> would get assigned the raw bean type ValueExtractor, at least?
+0. Keep in mind that this would mean it would be eligible for injection into
ValueExtractor<T>
andValueExtractor<Object>
.
Damn.

Alright then... We can introduce a synthetic, dangling type variable. `ValueExtractor<List<?>>` would become `ValueExtractor<List<T>>`, and `ValueExtractor<List<? extends Number>>` would become `ValueExtractor<List<T>>` where `T` is defined as `T extends Number`.

But I guess at this point we're just considering types with wildcards as legal bean types :/
> I think that it was intentional because the spec does not cover a parameterized bean type with a wildcard in Assignability of raw and parameterized types, i.e. only the cases where the required type (e.g. injected type) has a wildcard are covered.
Wait, these rules can actually be interpreted to resolve the case we're interested in... depending on what the definition of "actual type" is. A parameterized type with a wildcard in its parameters is an "actual type", right?
Example: required type `ValueExtractor<?>`, bean type `ValueExtractor<List<?>>`.

> the required type parameter and the bean type parameter are actual types with identical raw type, and, if the type is parameterized, the bean type parameter is assignable to the required type parameter according to these rules, or

=> Are the parameters assignable according to these rules?
=> Required: `?`, bean type: `List<?>`.

> the required type parameter is a wildcard, the bean type parameter is an actual type and the actual type is assignable to the upper bound, if any, of the wildcard and assignable from the lower bound, if any, of the wildcard, or

=> Is `List<?>` assignable from/to the bounds of `?`?
=> There are no bounds.
=> Yes... ?
Description
Hibernate ORM changed a few things to allow tenant identifiers to be any type now, instead of just strings. In particular:

- `MultiTenantConnectionProvider` becomes `MultiTenantConnectionProvider<T>`
- `CurrentTenantIdentifierResolver` becomes `CurrentTenantIdentifierResolver<T>`

As a first step we'll probably just force `T` to `String` in the Quarkus implementations, but eventually we'll need to add generics in Quarkus-specific interfaces as well (a sketch follows this list):

- `io.quarkus.hibernate.orm.runtime.tenant.TenantResolver` => `TenantResolver<T>`
- `io.quarkus.hibernate.orm.runtime.tenant.TenantConnectionResolver` => `TenantConnectionResolver<T>`
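The sketch mentioned above: what the generified Quarkus-specific interface might look like. The exact shape is not settled in this issue; the method names follow the current `TenantResolver`.

```java
// Hypothetical generified variant of
// io.quarkus.hibernate.orm.runtime.tenant.TenantResolver.
public interface TenantResolver<T> {

    // The identifier of the default tenant, now of arbitrary type T
    // instead of String.
    T getDefaultTenantId();

    // The identifier of the tenant for the current request.
    T resolveTenantId();
}
```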
Implementation ideas
One challenge is how to retrieve such beans. Some time ago Arc changed the way it behaved so that you can't just use `.select(TenantResolver.class)`; it just wouldn't return any bean if `TenantResolver` is a generic type.

So, the question is: how would we retrieve all implementations of `TenantResolver` regardless of their generics? Do we need to collect those at build-time and mark them somehow, like we did for Hibernate Validator's ValueExtractors? :/

@mkouba, @Ladicek maybe you have a better idea? Some feature or build item that would make it easy to retrieve all beans implementing a given generic type regardless of type arguments?