jvm-operators / abstract-operator

Library/SDK for creating operators for Kubernetes and OpenShift.
Apache License 2.0

Make the CRD operation mode the default one. #36

Closed. jkremser closed this 5 years ago

jkremser commented 5 years ago

Description

If the CRD environment variable was empty, the previous default behavior was to create watchers for ConfigMaps. This PR inverts the logic, so that by default we create custom resource (CR) watchers. Users who want ConfigMaps need to explicitly set CRD=false.
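
In other words, when CRD is unset the operator now assumes custom resources. A minimal sketch of that resolution, where the helper name and parsing details are assumptions for illustration rather than the library's actual code:

// Illustrative sketch only, not the actual abstract-operator code.
static boolean isCrdMode() {
    String crd = System.getenv("CRD");
    if (crd == null || crd.isEmpty()) {
        return true;                      // new default: watch custom resources
    }
    return Boolean.parseBoolean(crd);     // CRD=false -> watch ConfigMaps instead
}

With this logic, export CRD=false is the only way to get the old ConfigMap-based behavior.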

jkremser commented 5 years ago

@elmiko could you PTAL?

elmiko commented 5 years ago

i tested this and ran into something that i'm not sure about.

steps for testing

  1. build abstract-operator and install
  2. build spark-operator against new abstract-operator
  3. run spark-operator
  4. export CRD=false
  5. run spark-operator

the default worked as expected, here is the output from that run:

$ echo $CRD                                                                          

[mike@opb-ultra] master ~/workspace/spark-operator                                                                                               
$ java -jar target/spark-operator-0.3.6-SNAPSHOT.jar                                                                                             
2019-03-15 09:58:40 INFO  Entrypoint:56 - Starting..                                        
2019-03-15 09:58:41 INFO  Entrypoint:215 - https://shift.opb.studios:8443/oapi returned 200. We are on OpenShift.                                
2019-03-15 09:58:41 INFO  Manifests:193 - 10 attributes loaded from 1 stream(s) in 2ms, 10 saved, 0 ignored: ["Build-Jdk", "Built-By", "Created-By", "Implementation-Build", "Implementation-Title", "Implementation-URL", "Implementation-Vendor-Id", "Implementation-Version", "Main-Class", "Manifest-Version"]
2019-03-15 09:58:41 INFO  Entrypoint:236 -                                                                                                       
Operator has started in version 0.3.6-SNAPSHOT.                                                                                                   

2019-03-15 09:58:41 INFO  Entrypoint:239 - Git sha: d5e35a42                   
2019-03-15 09:58:41 INFO  Entrypoint:241 - ==================                                                                                    

2019-03-15 09:58:41 INFO  Entrypoint:74 - OpenShift environment detected.
2019-03-15 09:58:42 INFO  AbstractOperator:246 - Starting 'SparkApplication' operator for namespace abs                                          
2019-03-15 09:58:42 INFO  Entrypoint:174 - full reconciliation for 'SparkApplication' operator scheduled (periodically each 180 seconds)         
2019-03-15 09:58:42 INFO  Entrypoint:175 - the first full reconciliation for 'SparkApplication' operator is happening in 2 seconds               
2019-03-15 09:58:42 INFO  AbstractOperator:246 - Starting 'SparkCluster' operator for namespace abs                                              
2019-03-15 09:58:42 INFO  AbstractOperator:42 - SparkCluster operator default spark image = quay.io/jkremser/openshift-spark:2.4.0               
2019-03-15 09:58:42 INFO  AbstractOperator:264 - 'SparkApplication' operator running for namespace abs
2019-03-15 09:58:42 INFO  Entrypoint:154 - 'SparkApplication' operator started in namespace abs
2019-03-15 09:58:42 INFO  AbstractWatcher:157 - CustomResource watcher running for kinds SparkApplication
2019-03-15 09:58:42 INFO  Entrypoint:174 - full reconciliation for 'SparkCluster' operator scheduled (periodically each 180 seconds)
2019-03-15 09:58:42 INFO  Entrypoint:175 - the first full reconciliation for 'SparkCluster' operator is happening in 3 seconds
2019-03-15 09:58:42 INFO  AbstractOperator:264 - 'SparkCluster' operator running for namespace abs
2019-03-15 09:58:42 INFO  Entrypoint:154 - 'SparkCluster' operator started in namespace abs
2019-03-15 09:58:42 INFO  AbstractWatcher:157 - CustomResource watcher running for kinds SparkCluster            
2019-03-15 09:58:45 INFO  AbstractOperator:96 - Running full reconciliation for namespace abs and kind SparkCluster..                             
2019-03-15 09:58:46 INFO  AbstractOperator:166 - no change was detected during the reconciliation    

but when i set export CRD=false i was expecting to see the output confirm that the operator was looking for ConfigMap. instead the output is the same as for CRD.

 $ export CRD=false
[mike@opb-ultra] master ~/workspace/spark-operator
$ java -jar target/spark-operator-0.3.6-SNAPSHOT.jar
2019-03-15 09:59:03 INFO  Entrypoint:56 - Starting..
2019-03-15 09:59:04 INFO  Entrypoint:215 - https://shift.opb.studios:8443/oapi returned 200. We are on OpenShift.
2019-03-15 09:59:04 INFO  Manifests:193 - 10 attributes loaded from 1 stream(s) in 2ms, 10 saved, 0 ignored: ["Build-Jdk", "Built-By", "Created-By", "Implementation-Build", "Implementation-Title", "Implementation-URL", "Implementation-Vendor-Id", "Implementation-Version", "Main-Class", "Manifest-Version"]
2019-03-15 09:59:04 INFO  Entrypoint:236 -
Operator has started in version 0.3.6-SNAPSHOT.

2019-03-15 09:59:04 INFO  Entrypoint:239 - Git sha: d5e35a42
2019-03-15 09:59:04 INFO  Entrypoint:241 - ==================

2019-03-15 09:59:04 INFO  Entrypoint:74 - OpenShift environment detected.
2019-03-15 09:59:06 INFO  AbstractOperator:246 - Starting 'SparkApplication' operator for namespace abs
2019-03-15 09:59:06 INFO  Entrypoint:174 - full reconciliation for 'SparkApplication' operator scheduled (periodically each 180 seconds)
2019-03-15 09:59:06 INFO  Entrypoint:175 - the first full reconciliation for 'SparkApplication' operator is happening in 2 seconds
2019-03-15 09:59:06 INFO  AbstractOperator:246 - Starting 'SparkCluster' operator for namespace abs
2019-03-15 09:59:06 INFO  AbstractOperator:42 - SparkCluster operator default spark image = quay.io/jkremser/openshift-spark:2.4.0
2019-03-15 09:59:06 INFO  AbstractOperator:264 - 'SparkApplication' operator running for namespace abs
2019-03-15 09:59:06 INFO  Entrypoint:154 - 'SparkApplication' operator started in namespace abs
2019-03-15 09:59:06 INFO  AbstractWatcher:157 - CustomResource watcher running for kinds SparkApplication
2019-03-15 09:59:06 INFO  Entrypoint:174 - full reconciliation for 'SparkCluster' operator scheduled (periodically each 180 seconds)
2019-03-15 09:59:06 INFO  Entrypoint:175 - the first full reconciliation for 'SparkCluster' operator is happening in 3 seconds
2019-03-15 09:59:06 INFO  AbstractOperator:264 - 'SparkCluster' operator running for namespace abs
2019-03-15 09:59:06 INFO  Entrypoint:154 - 'SparkCluster' operator started in namespace abs
2019-03-15 09:59:06 INFO  AbstractWatcher:157 - CustomResource watcher running for kinds SparkCluster
2019-03-15 09:59:09 INFO  AbstractOperator:96 - Running full reconciliation for namespace abs and kind SparkCluster..
2019-03-15 09:59:09 INFO  AbstractOperator:166 - no change was detected during the reconciliation
jkremser commented 5 years ago

good catch! I should have tested it better. It's because of the default value of the crd parameter in the @Operator(...crd=true) annotation.
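
For readers less familiar with Java annotation defaults, this is roughly what such a default looks like; a simplified sketch only, the real @Operator annotation has more parameters than shown here:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Simplified sketch; only the crd parameter (default true) is relevant to this bug.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface Operator {
    boolean crd() default true;   // this default value was masking the CRD env variable
}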

jkremser commented 5 years ago

now, it should be fixed

the crd parameter from the @Operator annotation is used only if the env variable CRD is empty
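
A minimal sketch of that precedence, with a hypothetical helper name rather than the library's actual code: the env variable wins whenever it is set, and the annotation value is only the fallback.

// Hypothetical helper illustrating the fixed precedence; names are assumptions.
static boolean resolveCrdMode(boolean crdFromAnnotation) {
    String env = System.getenv("CRD");
    if (env == null || env.isEmpty()) {
        return crdFromAnnotation;         // fall back to @Operator(crd = ...)
    }
    return Boolean.parseBoolean(env);     // CRD=false now forces ConfigMap watchers
}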

elmiko commented 5 years ago

ack, i'll give it a test

jkremser commented 5 years ago

thanks @elmiko