spring-attic / jdbc

Apache License 2.0

jdbc source is ignoring datasource properties #27

Closed robertovillarejo closed 5 years ago

robertovillarejo commented 5 years ago

I've been trying to configure a stream for reading data from a MySQL DB and Oracle DB but the jdbc source app is ignoring the datasource properties.

For example, this stream should try to connect to the host named 'oracle-server', but instead it executes the query against localhost, i.e. on the Data Flow server host.

jdbc --max-messages=10 --password=oracle --query='SELECT * FROM STREAM_DEFINITIONS' --fixed-delay=5 --max-rows-per-poll=10 --time-unit=SECONDS --initial-delay=5 --url=jdbc:oracle:thin@oracle-server:1521:xe --username=system --driver-class-name=oracle.jdbc.driver.OracleDriver --initialize=false | log
artembilan commented 5 years ago

Please tell us which jdbc-source version you are using.

artembilan commented 5 years ago

Does it work, BTW, if you specify the fully qualified property names:

--spring.datasource.url=jdbc:oracle:thin@oracle-server:1521:xe --spring.datasource.username=system --spring.datasource.driver-class-name=oracle.jdbc.driver.OracleDriver

?
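
Applied to the stream from the original report, the fully qualified form would look like this (a sketch assembled from the values above; note the colon before '@', which Oracle thin URL syntax requires and which the original definition was missing):

```
jdbc --max-messages=10 --spring.datasource.password=oracle --query='SELECT * FROM STREAM_DEFINITIONS' --fixed-delay=5 --max-rows-per-poll=10 --time-unit=SECONDS --initial-delay=5 --spring.datasource.url='jdbc:oracle:thin:@oracle-server:1521:xe' --spring.datasource.username=system --spring.datasource.driver-class-name=oracle.jdbc.driver.OracleDriver --initialize=false | log
```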

robertovillarejo commented 5 years ago

I'm using version 1.3.1.RELEASE for RabbitMQ. Actually, I'm using this yml from the Spring Cloud Data Flow repository.

I used the GUI to create the stream; that's why I did not use the fully qualified property names. After reading your comment I tried the fully qualified names, but the problem remains.

artembilan commented 5 years ago

OK. Any guidelines on how to reproduce? Also, which Spring Cloud Data Flow version are you using, please?

robertovillarejo commented 5 years ago

I noticed it only happens when I use MySQL instead of the H2 database for the dataflow-server. It seems the stream inherits the datasource properties from the dataflow-server.

version: '3'

services:
  oracle:
    image: sath89/oracle-12c:latest
    environment:
      - WEB_CONSOLE=false
      - DBCA_TOTAL_MEMORY=1024
    ports:
      - "1521:1521"
    expose:
      - "1521"
  mysql:
    image: mysql:5.7.20
    environment:
      - MYSQL_DATABASE=dataflow
      - MYSQL_USER=root
      - MYSQL_ROOT_PASSWORD=root
    ports:
      - "3306:3306"
    expose:
      - "3306"
  rabbitmq:
    image: rabbitmq:3.7
    expose:
      - "5672"
  dataflow-server:
    image: springcloud/spring-cloud-dataflow-server-local:1.6.3.RELEASE
    container_name: dataflow-server
    ports:
      - "9393:9393"
      - "9000-9010:9000-9010"
      - "9090:9090"
    environment:
      - spring.cloud.dataflow.applicationProperties.stream.spring.rabbitmq.host=rabbitmq
      - spring_datasource_url=jdbc:mysql://mysql:3306/dataflow
      - spring_datasource_username=root
      - spring_datasource_password=root
      - spring_datasource_driver-class-name=org.mariadb.jdbc.Driver
    depends_on:
      - rabbitmq
      - mysql
  app-import:
    image: alpine:3.7
    depends_on:
      - dataflow-server
    command: >
      /bin/sh -c "
        while ! nc -z dataflow-server 9393;
        do
          sleep 1;
        done;
        wget -qO- 'http://dataflow-server:9393/apps' --post-data='uri=http://bit.ly/Celsius-SR3-stream-applications-rabbit-maven&force=true';
        echo 'Stream apps imported'
        wget -qO- 'http://dataflow-server:9393/apps' --post-data='uri=http://bit.ly/Clark-GA-task-applications-maven&force=true';
        echo 'Task apps imported'"
robertovillarejo commented 5 years ago

I've been modifying my docker-compose file, and I noticed that when I use underscore-delimited properties (as seen in the Spring tutorial) on the dataflow-server instance, the jdbc source in the stream uses those properties as the values for its 'dataSource' bean.

This is the infrastructure:

version: '3'

services:
  mysql-origin:
    image: mysql:5.7.20
    environment:
      - MYSQL_DATABASE=legacy
      - MYSQL_USER=root
      - MYSQL_ROOT_PASSWORD=root
    ports:
      - "3306:3306"
    expose:
      - "3306"
  mysql:
    image: mysql:5.7.20
    environment:
      - MYSQL_DATABASE=dataflow
      - MYSQL_USER=root
      - MYSQL_ROOT_PASSWORD=root
    expose:
      - "3306"
  rabbitmq:
    image: rabbitmq:3.7
    expose:
      - "5672"
  dataflow-server:
    image: springcloud/spring-cloud-dataflow-server-local:1.6.3.RELEASE
    ports:
      - "9393:9393"
      - "9000-9010:9000-9010"
      - "9090:9090"
    environment:
      - spring.cloud.dataflow.applicationProperties.stream.spring.rabbitmq.host=rabbitmq
      - spring_datasource_url=jdbc:mysql://mysql:3306/dataflow
      - spring_datasource_username=root
      - spring_datasource_password=root
      - spring_datasource_driver-class-name=org.mariadb.jdbc.Driver
    depends_on:
      - rabbitmq
      - mysql
  app-import:
    image: alpine:3.7
    depends_on:
      - dataflow-server
    command: >
      /bin/sh -c "
        while ! nc -z dataflow-server 9393;
        do
          sleep 1;
        done;
        wget -qO- 'http://dataflow-server:9393/apps' --post-data='uri=http://bit.ly/Celsius-SR3-stream-applications-rabbit-maven&force=true';
        echo 'Stream apps imported'
        wget -qO- 'http://dataflow-server:9393/apps' --post-data='uri=http://bit.ly/Clark-GA-task-applications-maven&force=true';
        echo 'Task apps imported'"

In the mysql-origin instance I ran:

CREATE TABLE IF NOT EXISTS `project` (
  `id` BIGINT(20) NOT NULL AUTO_INCREMENT,
  `name` VARCHAR(255) NOT NULL,
  `description` VARCHAR(255) NULL DEFAULT NULL,
  PRIMARY KEY (`id`))
ENGINE = InnoDB
AUTO_INCREMENT = 3
DEFAULT CHARACTER SET = utf8;

INSERT INTO project(name, description) VALUES('project1','description1');

I created a stream:

jdbc --max-messages=10 --spring.datasource.password=root --query='SELECT * FROM STREAM_DEFINITIONS' --fixed-delay=5 --max-rows-per-poll=10 --time-unit=SECONDS --initial-delay=5 --spring.datasource.url='jdbc:mysql://mysql-origin:3306/legacy' --spring.datasource.username=root --spring.datasource.driver-class-name='org.mariadb.jdbc.Driver' --initialize=false | log

I got an SQLException from the jdbc source app indicating that it is looking for the project table in the dataflow database:

Caused by: java.sql.SQLSyntaxErrorException: (conn:22) Table 'dataflow.project' doesn't exist

But when I use dots instead of underscores in the environment properties, I get the data as I wanted:

2018-10-23 17:49:38.844 INFO 84 --- [jdbc.STREAM_9-1] log-sink : {id=3, name=project1, description=description1}
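
For reference, this is the dataflow-server environment block that made the stream work, with the underscore-delimited entries from the compose file above rewritten using dotted property names (a sketch; the values are the ones used throughout this report):

```yaml
    environment:
      - spring.cloud.dataflow.applicationProperties.stream.spring.rabbitmq.host=rabbitmq
      - spring.datasource.url=jdbc:mysql://mysql:3306/dataflow
      - spring.datasource.username=root
      - spring.datasource.password=root
      - spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
```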

chrisjs commented 5 years ago

Thanks for pointing this out; the docs need to be updated. Over time, relaxed property conversion was tightened up and underscore-delimited names were no longer being passed down. Closing this issue in favor of https://github.com/spring-cloud/spring-cloud-dataflow/issues/2526, which tracks updating the docker-compose docs for the Spring properties.
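
For anyone else hitting this, the different spellings of the same property behave differently under Spring Boot's property binding; a sketch of the three forms as compose environment entries (the value is illustrative):

```yaml
    environment:
      # lowercase with underscores: relied on the relaxed conversion that was
      # later tightened, producing the behavior described in this issue
      - spring_datasource_url=jdbc:mysql://mysql:3306/dataflow
      # exact dotted property name: works, as confirmed above
      - spring.datasource.url=jdbc:mysql://mysql:3306/dataflow
      # canonical uppercase environment-variable form recognized by Spring Boot
      - SPRING_DATASOURCE_URL=jdbc:mysql://mysql:3306/dataflow
```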