spring-cloud / spring-cloud-dataflow-samples

Sample starter applications and code for use with the Spring Cloud Data Flow project
http://cloud.spring.io/spring-cloud-dataflow/

DB credentials exposed as part of job parameters when executing a task from SCDF #137

Open Srkanna opened 4 years ago

Srkanna commented 4 years ago

I have a custom-built SCDF Docker image running in OpenShift, referenced in server-deployment.yaml. I use an external Oracle database to store the task metadata. All of the database properties are passed in a ConfigMap, with the base64-encoded DB password stored as a Secret. SCDF uses these details to store the task metadata.

SCDF passes these datasource properties, including the DB password from the ConfigMap, to the executing job as job parameters. They are then printed in the logs and stored in the batch_job_execution_params table.

I thought storing the password as a Secret referenced from the ConfigMap would prevent this, but it does not. Below are a log excerpt and a table snippet showing the job parameters being printed.

How can I avoid passing these DB properties as job parameters to the executing job, so that the credentials are not exposed?

```
12-06-2020 18:12:38.540 [main] INFO org.springframework.batch.core.launch.support.SimpleJobLauncher.run - Job: [FlowJob: [name=Job]] launched with the following parameters: [{ -spring.cloud.task.executionid=8010, -spring.cloud.data.flow.platformname=default, -spring.datasource.username=ACTUAL_USERNAME, -spring.cloud.task.name=Alljobs, Job.ID=1591985558466, -spring.datasource.password=ACTUAL_PASSWORD, -spring.datasource.driverClassName=oracle.jdbc.OracleDriver, -spring.datasource.url=DATASOURCE_URL, -spring.batch.job.names=Job_1}]
```

Pod created for the job execution (OpenShift screenshot): all of these properties are read from the ConfigMap. [image]

Database table snippet of the job parameters. [image]

Custom SCDF Dockerfile.yaml

```dockerfile
FROM maven:3.5.2-jdk-8-alpine AS MAVEN_BUILD

COPY pom.xml /build/
COPY src /build/src/

WORKDIR /build/
RUN mvn package

FROM openjdk:8-jre-alpine

WORKDIR /app
COPY --from=MAVEN_BUILD /build/target/BatchAdmin-0.0.1-SNAPSHOT.jar /app/

ENTRYPOINT ["java", "-jar", "BatchAdmin-0.0.1-SNAPSHOT.jar"]
```

Deployment.yaml

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: scdf-server
  labels:
    app: scdf-server
spec:
  selector:
    matchLabels:
      app: scdf-server
  replicas: 1
  template:
    metadata:
      labels:
        app: scdf-server
    spec:
      containers:
        # (container spec truncated in the original snippet)
        - name: SPRING_CLOUD_DATAFLOW_FEATURES_TASKS_ENABLED
          value: 'true'
```
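As a sketch of one standard Kubernetes pattern (not SCDF-specific, and assuming the `oracle` Secret shown later in this issue), the container can source an environment variable directly from the Secret via `secretKeyRef`, so the plain-text password never has to appear in the ConfigMap. The variable name `SPRING_DATASOURCE_PASSWORD` is Spring Boot's relaxed-binding form of `spring.datasource.password`:

```yaml
# Hypothetical fragment of the scdf-server container spec:
# the password is injected from the Secret at pod startup.
env:
  - name: SPRING_DATASOURCE_PASSWORD
    valueFrom:
      secretKeyRef:
        name: oracle                # Secret defined in oracle-secrets.yaml
        key: oracle-root-password   # key holding the base64-encoded value
```

Kubernetes decodes the Secret value before exposing it to the container, so the variable holds the plain-text password in-process only.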

server-config.yaml

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: scdf-server
  labels:
    app: scdf-server
data:
  application.yaml: |-
    spring:
      cloud:
        dataflow:
          task:
            platform:
              kubernetes:
                accounts:
                  default:
                    limits:
                      memory: 1024Mi
                      cpu: 2
                    entry-point-style: exec
                    image-pull-policy: always
      datasource:
        url: jdbc:oracle:thin:@db_url
        username: BATCH_APP
        password: ${oracle-root-password}
        driver-class-name: oracle.jdbc.OracleDriver
        testOnBorrow: true
        validationQuery: "SELECT 1"
      flyway:
        enabled: false
      jpa:
        hibernate:
          use-new-id-generator-mappings: true
```

oracle-secrets.yaml

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: oracle
  labels:
    app: oracle
data:
  oracle-root-password: a2xldT3ederhgyzFCajE4YQ==
```

Any help would be much appreciated. Thanks.
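As an aside, note that the `data` values in a Kubernetes Secret are base64-encoded, not encrypted, so they are trivially reversible. A value can be prepared for (or inspected from) a Secret manifest from the shell; the password below is a placeholder:

```shell
# Encode a placeholder password for the Secret manifest
# (-n avoids encoding a trailing newline)
echo -n 'changeme' | base64
# → Y2hhbmdlbWU=

# Decode it again to verify the round trip
echo -n 'Y2hhbmdlbWU=' | base64 --decode
# → changeme
```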

Srkanna commented 4 years ago

@sabbyanandan, do you have any input on this?

sabbyanandan commented 4 years ago

Hi, @Srkanna. This appears to be the same topic discussed at https://github.com/spring-cloud/spring-cloud-dataflow/issues/3985.

Within SCDF itself, we have a redaction service that masks sensitive credentials in responses from the UI, API, and Shell.

However, in Kubernetes, anyone who can describe the deployment or review the pod spec can still see these values, even when they come from a ConfigMap or Secret.

Feel free to engage in the primary issue, and please take a stab at addressing it if you can.