aws / aws-advanced-jdbc-wrapper

The Amazon Web Services JDBC Driver has been redesigned as an advanced JDBC wrapper. This wrapper is complementary to and extends the functionality of an existing JDBC driver to help an application take advantage of the features of clustered databases such as Amazon Aurora.

HikariCP can't pass targetDataSourceProperties #1049

Open ucjonathan opened 1 week ago

ucjonathan commented 1 week ago

Describe the bug

While using the wrapper and following the provided example, I noticed that the properties specified in targetDataSourceProps show up as a whole in the log output:

2024-06-29 07:00:36,167 DEBUG com.zaxxer.hikari.util.PropertyElf - Try to create a new instance of "{wrapperLogLevel=ALL, defaultRowFetchSize=10000, wrapperPlugins=failover,efm2}"
2024-06-29 07:00:36,168 DEBUG com.zaxxer.hikari.util.PropertyElf - Class "{wrapperLogLevel=ALL, defaultRowFetchSize=10000, wrapperPlugins=failover,efm2}" not found or could not instantiate it (Default constructor)

This made me concerned that the properties were not making it to the actual underlying data source. Looking into the HikariCP source on GitHub, I found that this output appears when a property value cannot be passed through as-is:

https://github.com/brettwooldridge/HikariCP/blob/0a6ccdb334b2ecde25ae090034669d534736a0de/src/main/java/com/zaxxer/hikari/util/PropertyElf.java#L152
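For reference, the code path behind those two DEBUG lines looks roughly like the paraphrased sketch below (a simplification, not a verbatim copy of the linked source; the exact exception handling may differ). When a setter's parameter type is neither a primitive nor a String, HikariCP first tries to interpret the value's toString() as a class name, and only if that fails does it fall back to invoking the setter with the original object:

import java.lang.reflect.Method;

// Paraphrased sketch of the PropertyElf fallback branch linked above
// (assumption: simplified, not copied verbatim from HikariCP).
final class PropertyElfFallbackSketch {
  static void setNonStringProperty(Object target, Method writeMethod, Object propValue) throws Exception {
    try {
      // Logs: Try to create a new instance of "{...}"
      // The value's toString() is treated as a class name and instantiated via its default constructor.
      writeMethod.invoke(target, Class.forName(propValue.toString()).getDeclaredConstructor().newInstance());
    } catch (ReflectiveOperationException e) {
      // Logs: Class "{...}" not found or could not instantiate it (Default constructor)
      // Fallback: the setter is invoked with the original value object (e.g. the Properties instance).
      writeMethod.invoke(target, propValue);
    }
  }
}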

Expected Behavior

The properties should be passed down to the underlying data source properly.

What plugins are used? What other connection properties were set?

failover,efm2

Current Behavior

I'm seeing this error output:

2024-06-29 07:00:36,167 DEBUG com.zaxxer.hikari.util.PropertyElf - Try to create a new instance of "{wrapperLogLevel=ALL, defaultRowFetchSize=10000, wrapperPlugins=failover,efm2}"
2024-06-29 07:00:36,168 DEBUG com.zaxxer.hikari.util.PropertyElf - Class "{wrapperLogLevel=ALL, defaultRowFetchSize=10000, wrapperPlugins=failover,efm2}" not found or could not instantiate it (Default constructor)

Reproduction Steps

// Setup the PG Sql writer Data Source
HikariConfig config = new HikariConfig();

config.setDataSourceClassName(AwsWrapperDataSource.class.getName());
config.setUsername("myusername");
config.setPassword("mypassword");
config.setConnectionTestQuery("SELECT 1");
config.setMaximumPoolSize(25);
config.setMinimumIdle(5);
config.setRegisterMbeans(false);

String pgSqlWriterHostname = "test.cluster-123456789.us-east-1.rds.amazonaws.com";

config.addDataSourceProperty("jdbcProtocol", "jdbc:postgresql:");
config.addDataSourceProperty("serverName", pgSqlWriterHostname);
config.addDataSourceProperty("serverPort", "5432");
config.addDataSourceProperty("database", "mydatabase");
config.addDataSourceProperty("user", "myusername");
config.addDataSourceProperty("password", "mypassword");

Properties targetDataSourceProps = new Properties();

// Configuring PGSimpleDataSource if needed:
targetDataSourceProps.setProperty("defaultRowFetchSize", "10000");

// Enable the failover and host monitoring connection plugins.
targetDataSourceProps.setProperty("wrapperPlugins", "failover,efm2");
targetDataSourceProps.setProperty("wrapperLogLevel", "ALL");

config.addDataSourceProperty("targetDataSourceProperties", targetDataSourceProps);

// The failover plugin throws failover-related exceptions that need to be handled explicitly by HikariCP,
// otherwise connections will be closed immediately after failover. Set `ExceptionOverrideClassName` to provide
// a custom exception class.
config.setExceptionOverrideClassName("software.amazon.jdbc.util.HikariCPSQLException");

// Specify the driver-specific data source for AwsWrapperDataSource:
config.addDataSourceProperty("targetDataSourceClassName", "org.postgresql.ds.PGSimpleDataSource");

return new HikariDataSource(config);

Possible Solution

HikariCP's PropertyElf class has no concept of how to handle a Properties object. It seems like the only solution would be an AwsWrapperDataSource.setTargetDataSourceProperties overload that takes a String and parses it into the Properties object.
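For illustration only, such a String-based entry point could parse a java.util.Properties-style string and delegate to the existing Properties-based setter; both the helper below and the expected input format are hypothetical, not part of the current wrapper API:

import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Hypothetical sketch only: how a String-valued setter could translate its input
// into the Properties object the wrapper already accepts. The format is an assumption.
final class TargetDataSourcePropertiesParser {
  static Properties parse(String propsAsString) throws IOException {
    Properties props = new Properties();
    // e.g. "defaultRowFetchSize=10000\nwrapperPlugins=failover,efm2"
    props.load(new StringReader(propsAsString));
    return props;
  }
}

A setTargetDataSourceProperties(String) overload on AwsWrapperDataSource could run its argument through something like this and then call the existing setTargetDataSourceProperties(Properties).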

Additional Information/Context

Maybe I'm missing something here, but it seems like a configuration-driven HikariCP setup (not just a manually wired HikariDataSource) is not possible. Can we get a full-blown example of a HikariCP configuration that works with the failover plugin, etc.? I also need to make sure I can pass down PostgreSQL JDBC driver properties such as "defaultRowFetchSize", since some of the result sets are quite large.
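In the meantime, one possible all-String workaround is sketched below. It is untested and rests on two assumptions: that the wrapper honors query parameters in the jdbcUrl the same way the DriverManager examples do, and that defaultRowFetchSize can be passed to the PostgreSQL driver as a URL parameter. The point is simply to keep every data source property a plain String so PropertyElf can map each value onto a setter:

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import software.amazon.jdbc.ds.AwsWrapperDataSource;

// Untested sketch: a HikariConfig-only setup where every data source property is a String,
// so HikariCP's PropertyElf can pass each value straight through to a setter.
public class HikariConfigOnlySketch {
  public static HikariDataSource create() {
    HikariConfig config = new HikariConfig();
    config.setDataSourceClassName(AwsWrapperDataSource.class.getName());
    config.setUsername("myusername");
    config.setPassword("mypassword");
    config.setExceptionOverrideClassName("software.amazon.jdbc.util.HikariCPSQLException");

    // Plain String values map directly onto AwsWrapperDataSource setters.
    config.addDataSourceProperty("targetDataSourceClassName", "org.postgresql.ds.PGSimpleDataSource");
    // Assumption: wrapper and driver parameters are accepted as URL query parameters.
    config.addDataSourceProperty("jdbcUrl",
        "jdbc:aws-wrapper:postgresql://test.cluster-123456789.us-east-1.rds.amazonaws.com:5432/mydatabase"
            + "?wrapperPlugins=failover,efm2&defaultRowFetchSize=10000");

    return new HikariDataSource(config);
  }
}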

The AWS Advanced JDBC Driver version used

2.3.7

JDK version used

1.8.0_412

Operating System and version

Ubuntu 22

ucjonathan commented 1 week ago

The HikariFailoverExample exhibits the same error message. I've been able to correct that by instantiating an AwsWrapperDataSource directly and then calling HikariDataSource.setDataSource. I also added an example of configuring the SSL root certificate on the PostgreSQL driver so that the connection uses SSL.

import com.zaxxer.hikari.HikariDataSource;
import software.amazon.jdbc.ds.AwsWrapperDataSource;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;

public class HikariFailoverExample {
  private static final String USER = "username";
  private static final String PASSWORD = "password";
  private static final String DATABASE_NAME = "postgres";
  private static final String ENDPOINT = "db-identifier.cluster-XYZ.us-east-2.rds.amazonaws.com";

  public static void main(String[] args) throws SQLException, IOException {
    try (HikariDataSource ds = new HikariDataSource()) {
      // The failover plugin throws failover-related exceptions that need to be handled explicitly by HikariCP,
      // otherwise connections will be closed immediately after failover. Set `ExceptionOverrideClassName` to provide
      // a custom exception class.
      ds.setExceptionOverrideClassName("software.amazon.jdbc.util.HikariCPSQLException");

      Properties targetDataSourceProps = new Properties();

      // Configuring PGSimpleDataSource if needed:
      // Download the RDS regional certificate bundle from the AWS website; this example loads us-east-1-bundle.pem from the classpath.
      byte[] rootCert;
      try (InputStream inputStream = HikariFailoverExample.class.getResourceAsStream("us-east-1-bundle.pem");
           ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {

        if (inputStream == null) {
          throw new IOException("Resource not found: us-east-1-bundle.pem");
        }

        byte[] buffer = new byte[4096];
        int bytesRead;
        while ((bytesRead = inputStream.read(buffer)) != -1) {
          outputStream.write(buffer, 0, bytesRead);
        }

        rootCert = outputStream.toByteArray();
      } catch (IOException e) {
        e.printStackTrace();
        rootCert = null;
      }
      if (rootCert != null) {
        // Write out the temp file
        File rootCertFile = File.createTempFile("rdsroot", ".pem");
        rootCertFile.deleteOnExit();
        // Write the bytes to the temporary file
        try (FileOutputStream fos = new FileOutputStream(rootCertFile)) {
          fos.write(rootCert);
        }

        targetDataSourceProps.setProperty("sslmode", "verify-full");
        targetDataSourceProps.setProperty("sslrootcert", rootCertFile.getAbsolutePath());
      }

      // Enable the failover and host monitoring connection plugins.
      targetDataSourceProps.setProperty("wrapperPlugins", "failover,efm2");

      AwsWrapperDataSource awsWrappedDataSource = new AwsWrapperDataSource();

      // Specify the driver-specific data source and pass the target properties directly to the wrapper:
      awsWrappedDataSource.setTargetDataSourceClassName("org.postgresql.ds.PGSimpleDataSource");
      awsWrappedDataSource.setTargetDataSourceProperties(targetDataSourceProps);

      // Configure the connection pool:
      awsWrappedDataSource.setUser(USER);
      awsWrappedDataSource.setPassword(PASSWORD);

      // Configure AwsWrapperDataSource:
      awsWrappedDataSource.setJdbcProtocol("jdbc:postgresql:");
      awsWrappedDataSource.setServerName(ENDPOINT);
      awsWrappedDataSource.setServerPort("5432");
      awsWrappedDataSource.setDatabase(DATABASE_NAME);

      // Alternatively, the AwsWrapperDataSource can be configured with a JDBC URL instead of individual properties as
      // seen above.
//      awsWrappedDataSource.setJdbcUrl("jdbc:aws-wrapper:postgresql://db-identifier.cluster-XYZ.us-east-2.rds.amazonaws.com:5432/postgres");

      // Hand the fully configured AwsWrapperDataSource to HikariCP:
      ds.setDataSource(awsWrappedDataSource);

      // Attempt a connection:
      try (final Connection conn = ds.getConnection();
           final Statement statement = conn.createStatement();
           final ResultSet rs = statement.executeQuery("SELECT * from aurora_db_instance_identifier()")) {
        while (rs.next()) {
          System.out.println(rs.getString(1));
        }
      }
    }
  }
}