spring-projects / spring-boot

Spring Boot helps you to create Spring-powered, production-grade applications and services with absolute minimum fuss.
https://spring.io/projects/spring-boot
Apache License 2.0

Interoperability between R2DBC and JPA/JDBC configurations #28025

Closed LifeIsStrange closed 3 years ago

LifeIsStrange commented 3 years ago

Edit, context and TL;DR: This issue started as a question about Spring Data R2DBC interop/compatibility in a project that already uses JPA/JDBC. It turned out to be the more general problem of using multiple Spring Data repository modules, which has several solutions; I chose to specify the respective module packages in the @Enable… annotations. One problem remained: R2DBC deactivates the JDBC/JPA auto-configuration, as documented, and one needs to re-import it with the @Import annotation.
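
A minimal sketch of the final setup described above (the package names and application class name are placeholders; adjust to your project):

@SpringBootApplication
// Keep the blocking JPA repositories and the reactive R2DBC repositories in
// separate, non-overlapping packages (placeholder package names).
@EnableJpaRepositories("com.example.app.repositories")
@EnableR2dbcRepositories("com.example.app.reactiverepos")
// The R2DBC auto-configuration makes DataSourceAutoConfiguration back off,
// so re-import it to keep the JDBC/JPA side working.
@Import(DataSourceAutoConfiguration.class)
class MyApplication {
}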

Original issue: Like many projects, I don't want to color all of my controller routes with Flux/Flow and only have a few specific, legitimate use cases for R2DBC. However, it is not clear at all (and it should be) whether one can integrate R2DBC into an existing app and keep using JPA/JDBC for most routes, using R2DBC only where appropriate (our legitimate use case being an SSE stream using Postgres NOTIFY/LISTEN). In case there are interop issues, I'd like to clarify that my use of R2DBC is read-only, while all writes go through JPA.

LifeIsStrange commented 3 years ago

@mp911de friendly ping

LifeIsStrange commented 3 years ago

According to someone on the internet it is possible, but that is neither a guarantee nor the kind of sound reasoning that the developers of R2DBC (or of the Spring integration) could provide: https://stackoverflow.com/questions/62253297/is-it-possible-to-use-both-spring-data-r2dbc-and-spring-data-jpa-in-a-single-spr

LifeIsStrange commented 3 years ago

I'm currently hitting this interop issue: https://stackoverflow.com/questions/64874752/reactive-and-non-reactive-repository-in-a-spring-boot-application-with-h2-databa?noredirect=1&lq=1

mp911de commented 3 years ago

I'm not sure what you're asking for. JDBC uses JDBC driver technology and R2DBC uses non-blocking drivers. An R2DBC driver cannot be used in JDBC mode and vice versa, hence you need to include both technologies if you want to use both stacks.

LifeIsStrange commented 3 years ago

I understand that I need to include both technologies! But say I have one blocking JPA repository for RequestModel and another, reactive R2DBC repository for the same RequestModel. My question is whether this is supported, or whether there will be side effects/synchronization differences between the two connections. Since my use of R2DBC is read-only for now, I don't think there can be synchronization issues, but if I did writes, might the writes not respect the sequential order of the code? More importantly, introducing R2DBC into my JPA project throws an exception (like in the linked Stack Overflow post):

Parameter 0 of constructor in com.brainflow.brainflowserver.services.UserService required a bean of type 'com.brainflow.brainflowserver.repositories.UserRepository' that could not be found.

where my blocking repository is no longer recognized by Spring. I am currently looking for what I'm doing differently, since this repository apparently combines both stacks without the issue: https://github.com/hantsy/spring-puzzles/tree/master/jpa-r2dbc

LifeIsStrange commented 3 years ago

Literally just including the R2DBC dependencies (the Spring module and the Postgres driver) throws the exception

Parameter 0 of constructor in com.brainflow.brainflowserver.services.UserService required a bean of type 'com.brainflow.brainflowserver.repositories.UserRepository' that could not be found.

even with zero R2DBC code in my app and the R2DBC-specific application.properties entries commented out.

mp911de commented 3 years ago

Thanks for the background, it wasn't immediately clear from the initial description. Using multiple Spring Data modules in a single project enables strict repository detection mode, meaning that entities or repositories must clearly indicate which module they belong to. That can happen either by annotating entities with @Entity/@Table or by using the module-specific repository interface (JpaRepository, R2dbcRepository) as the repository super-interface.

You should see some log output that indicates how many repositories were found by Spring Data for a particular module (Could not safely identify store assignment for repository candidate …, Multiple Spring Data modules found, entering strict repository configuration mode!, Finished Spring Data repository scanning …).

Check out the documentation on this topic: https://docs.spring.io/spring-data/jpa/docs/current/reference/html/#repositories.multiple-modules
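
To illustrate the strict-mode rules above (all class and package names here are made up), either the entity annotations or the repository super-interface must identify the owning module:

// Assigned to Spring Data JPA: JPA-annotated entity + JpaRepository super-interface.
@javax.persistence.Entity
public class User {
    @javax.persistence.Id
    private Long id;
}

public interface UserRepository extends JpaRepository<User, Long> {
}

// Assigned to Spring Data R2DBC: Spring Data Relational's @Table + R2dbcRepository.
@org.springframework.data.relational.core.mapping.Table("users")
public class UserEvent {
    @org.springframework.data.annotation.Id
    private Long id;
}

public interface UserEventRepository extends R2dbcRepository<UserEvent, Long> {
}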

LifeIsStrange commented 3 years ago

Thanks, I had forgotten about this multi-module concern!

However, after specifying the (non-overlapping) module packages:

@EnableJpaRepositories("com.brainflow.brainflowserver.repositories")
@EnableR2dbcRepositories("com.brainflow.brainflowserver.reactiveRepos")
class BrainflowServerApplication {

(note: com.brainflow.brainflowserver.reactiveRepos is currently an empty folder)

It still triggers an error:

Parameter 0 of constructor in com.brainflow.brainflowserver.services.UserService required a bean named 'entityManagerFactory' that could not be found.

The injection point has the following annotations:
    - @org.springframework.beans.factory.annotation.Autowired(required=true)

Action:

Consider defining a bean named 'entityManagerFactory' in your configuration.

So I supposed that @EnableJpaRepositories("com.brainflow.brainflowserver.repositories") disabled the Spring Boot JPA auto-configuration, breaking the entity manager, but when I specify only the JPA module and remove the R2DBC dependencies there is no problem..

Edit: you were right about the logs:

Multiple Spring Data modules found, entering strict repository configuration mode!
Bootstrapping Spring Data JPA repositories in DEFAULT mode.
Finished Spring Data repository scanning in 46 ms. Found 9 JPA repository interfaces.
Multiple Spring Data modules found, entering strict repository configuration mode!
Bootstrapping Spring Data R2DBC repositories in DEFAULT mode.
Finished Spring Data repository scanning in 0 ms. Found 0 R2DBC repository interfaces.
Multiple Spring Data modules found, entering strict repository configuration mode!
Bootstrapping Spring Data R2DBC repositories in DEFAULT mode.
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Spring Data R2DBC - Could not safely identify store assignment for repository candidat
Finished Spring Data repository scanning in 16 ms. Found 0 R2DBC repository interfaces
Multiple Spring Data modules found, entering strict repository configuration mode!
Bootstrapping Spring Data Redis repositories in DEFAULT mode.
Spring Data Redis - Could not safely identify store assignment for repository candidat
Spring Data Redis - Could not safely identify store assignment for repository candidat
Spring Data Redis - Could not safely identify store assignment for repository candidat
Spring Data Redis - Could not safely identify store assignment for repository candidat
Spring Data Redis - Could not safely identify store assignment for repository candidat
Spring Data Redis - Could not safely identify store assignment for repository candidat
Spring Data Redis - Could not safely identify store assignment for repository candidat
Spring Data Redis - Could not safely identify store assignment for repository candidat
Spring Data Redis - Could not safely identify store assignment for repository candidat
Finished Spring Data repository scanning in 7 ms. Found 0 Redis repository interfaces.
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.BrainflowRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcRepository.
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.LinkRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcRepository.
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.NodeRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcRepository.
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.ProjectMemberRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcReposit
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.ProjectRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcRepository.
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.RequestRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcRepository.
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.TagRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcRepository.
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.UserNodeRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcRepository.
.RepositoryConfigurationExtensionSupport : Spring Data R2DBC - Could not safely identify store assignment for repository candidate interface com.brainflow.brainflowserver.repositories.UserRepository. If you want this repository to be a R2DBC repository, consider annotating your entities with one of these annotations: org.springframework.data.relational.core.mapping.Table (preferred), or consider extending one of the following types with your repository: org.springframework.data.r2dbc.repository.R2dbcRepository.
.s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 16 ms. Found 0 R2DBC repository interfaces.

I don't understand why Spring Data R2DBC looks into the repositories package; I explicitly tell Spring to only look in the reactive package with @EnableR2dbcRepositories("com.brainflow.brainflowserver.reactiveRepos").

Edit: adding @EnableRedisRepositories("com.brainflow.brainflowserver.repositories") does not solve the problem.

mp911de commented 3 years ago

I think the problem is that DataSourceAutoConfiguration backs off when ConnectionFactory is on the class path and a ConnectionFactory bean is being provided:

   DataSourceAutoConfiguration:
      Did not match:
         - @ConditionalOnMissingBean (types: io.r2dbc.spi.ConnectionFactory; SearchStrategy: all) found beans of type 'io.r2dbc.spi.ConnectionFactory' connectionFactory (OnBeanCondition)

Since this is becoming a Spring Boot issue, I'd suggest moving this ticket to the Boot project.

You can address this issue by providing your own DataSource bean.
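
For example, a minimal sketch of such a DataSource bean (the app.datasource.* property names are made up):

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JdbcDataSourceConfig {

    // Defining the DataSource ourselves means the JDBC/JPA stack no longer
    // depends on DataSourceAutoConfiguration, which backs off once a
    // ConnectionFactory bean exists.
    @Bean
    public DataSource dataSource(@Value("${app.datasource.url}") String url,
                                 @Value("${app.datasource.username}") String username,
                                 @Value("${app.datasource.password}") String password) {
        return DataSourceBuilder.create()
                .url(url)
                .username(username)
                .password(password)
                .build();
    }
}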

odrotbohm commented 3 years ago

Moved to Spring Boot as per @mp911de's request.

wilkinsona commented 3 years ago

From a Spring Boot perspective, this is working as designed and documented:

When a ConnectionFactory bean is available, the regular JDBC DataSource auto-configuration backs off. If you want to retain the JDBC DataSource auto-configuration, and are comfortable with the risk of using the blocking JDBC API in a reactive application, add @Import(DataSourceAutoConfiguration.class) on a @Configuration class in your application to re-enable it.

LifeIsStrange commented 3 years ago

Thanks, that was indeed the issue I was facing (I should have read the docs ><'), and as always thank you for your excellent professionalism!

arshiya-maersk commented 1 year ago

These are the changes I made to get this working. This is my R2DBC config class:

@Configuration
public class R2dbcConfig {

    @Value("${spring.r2dbc.url}")
    private String url;

    @Value("${spring.r2dbc.name}")
    private String name;

    @Value("${spring.r2dbc.username}")
    private String username;

    @Value("${spring.r2dbc.password}")
    private String password;

    @Bean
    public ConnectionFactory connectionFactory() {
        // Note: the value of spring.r2dbc.url is passed to host(), so here it
        // must contain just the database host rather than a full r2dbc:// URL.
        return new PostgresqlConnectionFactory(
                PostgresqlConnectionConfiguration.builder()
                        .host(url)
                        .database(name)
                        .username(username)
                        .password(password)
                        .build()
        );
    }

    @Bean
    DatabaseClient databaseClient(ConnectionFactory connectionFactory) {
        return DatabaseClient.builder()
                .connectionFactory(connectionFactory)
                .namedParameters(true)
                .build();
    }
}
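
For completeness, a small usage sketch of the DatabaseClient bean above for a read-only reactive query (the service name, table, and column are made up):

import org.springframework.r2dbc.core.DatabaseClient;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Flux;

@Service
public class ReactiveUserNames {

    private final DatabaseClient databaseClient;

    public ReactiveUserNames(DatabaseClient databaseClient) {
        this.databaseClient = databaseClient;
    }

    // Read-only reactive query running next to the blocking JPA repositories.
    public Flux<String> userNames() {
        return databaseClient.sql("SELECT name FROM users")
                .map(row -> row.get("name", String.class))
                .all();
    }
}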

Then I also define my JPA config class:

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackages = "com.artemis.repositories")
@EntityScan("com.artemis.entities")
@Slf4j
public class JpaConfig implements EnvironmentAware {

    private static final String ENV_HIBERNATE_DIALECT = "hibernate.dialect";
    private static final String ENV_HIBERNATE_HBM2DDL_AUTO = "hibernate.hbm2ddl.auto";
    private static final String ENV_HIBERNATE_SHOW_SQL = "hibernate.show_sql";
    private static final String ENV_HIBERNATE_FORMAT_SQL = "hibernate.format_sql";
    private Environment env;

    @Bean
    public DataSource dataSource() {
        return new DriverManagerDataSource(
                env.getProperty("datasource.url"),
                env.getProperty("datasource.username"),
                env.getProperty("datasource.password")
        );
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(dataSource);
        emf.setPackagesToScan(ArtemisApplication.class.getPackage().getName());
        emf.setPersistenceProvider(new HibernatePersistenceProvider());
        emf.setJpaProperties(jpaProperties());
        return emf;
    }

    private Properties jpaProperties() {
        Properties extraProperties = new Properties();
        extraProperties.put(ENV_HIBERNATE_FORMAT_SQL, env.getProperty(ENV_HIBERNATE_FORMAT_SQL));
        extraProperties.put(ENV_HIBERNATE_SHOW_SQL, env.getProperty(ENV_HIBERNATE_SHOW_SQL));
        extraProperties.put(ENV_HIBERNATE_HBM2DDL_AUTO, env.getProperty(ENV_HIBERNATE_HBM2DDL_AUTO));
        if (log.isDebugEnabled()) {
            log.debug(" hibernate.dialect @" + env.getProperty(ENV_HIBERNATE_DIALECT));
        }
        if (env.getProperty(ENV_HIBERNATE_DIALECT) != null) {
            extraProperties.put(ENV_HIBERNATE_DIALECT, env.getProperty(ENV_HIBERNATE_DIALECT));
        }
        return extraProperties;
    }

    @Bean
    public PlatformTransactionManager transactionManager(LocalContainerEntityManagerFactoryBean entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory.getObject());
    }

    @Override
    public void setEnvironment(Environment environment) {
        this.env = environment;
    }
}

Here is my service class that listens on a Postgres notification channel (fired by a trigger):

class myService {

    final PostgresqlConnection connection;

    public myService(ConnectionFactory connectionFactory) {
        this.connection = Mono.from(connectionFactory.create())
                .cast(PostgresqlConnection.class).block();
    }

    @PostConstruct
    private void postConstruct() {
        connection.createStatement("LISTEN my_channel").execute()
                .flatMap(PostgresqlResult::getRowsUpdated).subscribe();
        connection.getNotifications().subscribe(myService::catchTrigger);
    }

    private static void catchTrigger(Notification notification) {
        System.out.println(notification.getName());
        System.out.println(notification.getParameter());
    }
}
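
Since the original question was about an SSE stream over Postgres NOTIFY/LISTEN, here is a rough sketch of exposing those notifications as server-sent events; it assumes the listening PostgresqlConnection set up above is made available as a bean, and the endpoint path is made up:

import io.r2dbc.postgresql.api.Notification;
import io.r2dbc.postgresql.api.PostgresqlConnection;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
class NotificationSseController {

    private final PostgresqlConnection connection;

    NotificationSseController(PostgresqlConnection connection) {
        this.connection = connection;
    }

    // Streams the payload of every NOTIFY received on the listened channel
    // to connected SSE clients.
    @GetMapping(path = "/notifications", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    Flux<String> notifications() {
        return connection.getNotifications().map(Notification::getParameter);
    }
}
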
andrevka commented 7 months ago

I got it working only by adding annotations. First, I added a configuration class that re-imports the default JDBC DataSource auto-configuration and points the Spring Data JDBC repository scan at a non-existent package so it does not scan the project's base package for repositories; otherwise it will throw an exception that reactive repositories are not allowed. Next, I specified the packages separately for the JPA and R2DBC repositories.

@Configuration
@Import(DataSourceAutoConfiguration.class)
@EnableJdbcRepositories(basePackages = "ignore")
@EnableJpaRepositories(basePackages = "your.jpa.repositories.package")
@EnableR2dbcRepositories(basePackages = "your.r2dbc.repositories.package")
public class PersistenceConfig {

}

I also had to change my entity classes. Before, I used only R2DBC and the entity classes only needed annotations from org.springframework.data, but now I have duplicate annotations like:

@org.springframework.data.annotation.Id
@jakarta.persistence.Id

@jakarta.persistence.Table(schema = "schema", name = "table")
@org.springframework.data.relational.core.mapping.Table(schema = "schema", name = "table")

@org.springframework.data.annotation.Transient
@jakarta.persistence.Transient
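
Put together, a dual-annotated entity looks roughly like this (class, field, schema, and table names are only illustrative); in real code you typically import one set of annotations and fully qualify the other:

// Entity shared by the blocking JPA stack and the reactive R2DBC stack.
@jakarta.persistence.Entity
@jakarta.persistence.Table(schema = "schema", name = "table")
@org.springframework.data.relational.core.mapping.Table(schema = "schema", name = "table")
public class SharedEntity {

    @jakarta.persistence.Id
    @org.springframework.data.annotation.Id
    private Long id;

    // Ignored by both mapping layers.
    @jakarta.persistence.Transient
    @org.springframework.data.annotation.Transient
    private String derivedValue;
}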