Closed — rahulkhanna2 closed this issue 9 months ago
@sdelamo I am experiencing the same issue on 4.2.0. Will this be addressed soon?
Can't really track down what is causing the issue, as we are importing `implementation("io.micronaut.kafka:micronaut-kafka:5.2.0")` directly, but also using `io.micronaut.platform:micronaut-platform:4.2.0`, which also imports micronaut-kafka.
This error occurs when Micronaut Kafka is configured in such a way that its internal configuration classes are disabled: the AOP interceptor for `@KafkaClient` methods never gets applied, and yet initialization of the `@KafkaClient` bean is still triggered, as happens here by way of the dependency being injected into the controller. In the case of the sample app, setting `kafka.enabled=true` removes this error.
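For instance, a minimal sketch of that setting in the importing application's `application.yml` (the `kafka.enabled` property is Micronaut Kafka's standard toggle; the file layout is the stock Micronaut convention):

```yaml
# application.yml of the importing application
kafka:
  enabled: true   # when false, the Kafka configuration beans (and the
                  # @KafkaClient interceptor) are disabled entirely
```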
For debugging these sorts of issues, it is quite useful to turn on logging of conditional beans so that you can spot when things are unintentionally disabled, by adding

`<logger name="io.micronaut.context.condition" level="DEBUG"/>`

to `logback.xml`.
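As a sketch of where that logger declaration goes, a minimal `logback.xml` (the appender setup here is just a typical console default; only the `io.micronaut.context.condition` logger line is the relevant addition):

```xml
<configuration>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- Logs why each conditional bean was enabled or disabled -->
    <logger name="io.micronaut.context.condition" level="DEBUG"/>

    <root level="INFO">
        <appender-ref ref="STDOUT"/>
    </root>
</configuration>
```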
In general, I would suggest some improvements to your classpath structure:
For a module that you will be importing into a main server application, reduce the dependencies to only what is needed by the actual module. For example, the sample `mn-foobar` module does not need `http-server-netty`, but the `mn-injection` server application does.
I would also suggest disabling the Maven Shade plugin for such modules, otherwise you end up with duplicate classes on the classpath in the importing application. You can do this with:
```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <configuration>
        <skip>true</skip>
    </configuration>
</plugin>
```

in the `pom.xml` of the module.
@jeremyg484 / @sdelamo So does that mean that if we use this approach, it is mandatory to keep Kafka enabled?
That does not seem feasible, because different environments have different requirements.
@twinklearora2 No, I don't think so.
To be clear, I believe you would get the same error in a non-modular application if you were trying to inject a `@KafkaClient` bean into another bean when your configuration has `kafka.enabled=false`, as was happening here; that seems correct to me. (I have re-worded my comment above to make what is going on here clearer, for the record.)
I would envision that if you have an app running in an environment where Kafka is unavailable, then you would use an interface combined with either environment configuration or something such as `@Requires(property = "kafka.enabled", value = "false")` to inject a different implementation that doesn't require Kafka, and that will all work fine.
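A minimal Kotlin sketch of that pattern, assuming micronaut-kafka and a Jakarta injection runtime are on the classpath (the `EventPublisher`, `KafkaEventPublisher`, and `NoopEventPublisher` names are hypothetical, invented here for illustration):

```kotlin
import io.micronaut.configuration.kafka.annotation.KafkaClient
import io.micronaut.configuration.kafka.annotation.KafkaKey
import io.micronaut.configuration.kafka.annotation.Topic
import io.micronaut.context.annotation.Requires
import jakarta.inject.Singleton

// Common abstraction: collaborators inject this, never the @KafkaClient directly.
interface EventPublisher {
    fun publishEvent(key: String?, value: String?)
}

// Kafka-backed implementation, only loaded when Kafka is enabled.
@Requires(property = "kafka.enabled", notEquals = "false")
@KafkaClient
interface KafkaEventPublisher : EventPublisher {
    @Topic("\${topic}")
    override fun publishEvent(@KafkaKey key: String?, value: String?)
}

// Fallback for environments where Kafka is unavailable.
@Requires(property = "kafka.enabled", value = "false")
@Singleton
class NoopEventPublisher : EventPublisher {
    override fun publishEvent(key: String?, value: String?) {
        // Intentionally a no-op; could log or buffer instead.
    }
}
```

With this wiring, only one implementation of `EventPublisher` exists in the context for a given `kafka.enabled` value, so the `@KafkaClient` bean is never initialized in environments where its interceptor is disabled.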
@twinklearora2 We ended up using the following annotation:

```kotlin
import io.micronaut.context.annotation.Requires

@Retention(AnnotationRetention.RUNTIME)
@Target(AnnotationTarget.CLASS, AnnotationTarget.FILE)
@Requires(property = "kafka.enabled", notEquals = "false")
annotation class RequiresKafka
```
Then we applied that to the client class, and it worked like a charm:

```kotlin
@RequiresKafka   // <--- the custom annotation
@KafkaClient
interface InternalKafkaClient {
    @Topic("\${topic}")
    fun publishEvent(@KafkaKey key: String?, value: String?)
}
```
Expected Behavior
It should be able to send a message to Kafka.
Actual Behaviour
When Kafka is defined in a dependency JAR created using Micronaut and imported into another Micronaut project, it gives an interceptor issue:

```
At least one @Introduction method interceptor required, but missing for method: void sendEvents(String key,String value,List<RecordHeader> headers). Check if your @Introduction stereotype annotation is marked with @Retention(RUNTIME) and @InterceptorBean(..) with the interceptor type. Otherwise do not load @Introduction beans if their interceptor definitions are missing! client is defined
```
Steps To Reproduce
Environment Information
No response
Example Application
https://github.com/rahulkhanna2/mn-foobar
Version
4.1.4