Closed rstpv closed 4 years ago
Thanks, @rstpv. I was able to reproduce the behavior.
We will have a look.
Indeed this is easy to demonstrate with
time | log --expression="payload + 'foo \n bar'"
Ok. I had a hell of a time recreating this but I have now.
Easiest way for me was (using a Kafka setup):
time --date-format="hh \n mm" | log
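The failure mode can be sketched in plain Java (the names here are illustrative, not actual SCDF code): once the escaped `\n` has been turned into a real newline character, rendering the property value back into single-quoted DSL text produces a definition that spans two lines.

```java
public class NewlineInDsl {

    // Hypothetical re-rendering of a stream property into DSL text,
    // mimicking what happens after the escaped \n has already been
    // converted into a real line break.
    static String toDsl(String app, String prop, String value) {
        return app + " --" + prop + "='" + value + "'" + " | log";
    }

    public static void main(String[] args) {
        // The user typed "hh \n mm"; by deployment time the value
        // holds an actual newline character.
        String value = "hh \n mm";
        String dsl = toDsl("time", "date-format", value);

        // The regenerated definition is now split across two lines,
        // which breaks later parsing of the stored DSL text.
        System.out.println(dsl);
        System.out.println("lines: " + dsl.split("\n", -1).length);
    }
}
```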
Here is the JSON output if I curl the streams list endpoint. A3 has been deployed whilst A4 hasn't yet - so you can see the A4 DSL is still OK whilst A3 is busted because it has the newline directly in it.
{
  "name": "A3",
  "dslText": "time --trigger.date-format='mm \n mm' | log",
  "status": "deployed",
  "statusDescription": "The stream has been successfully deployed",
  "_links": {
    "self": {
      "href": "http://localhost:9393/streams/definitions/A3"
    }
  }
},
{
  "name": "A4",
  "dslText": "time --date-format='hh \\n mm' | log",
  "status": "undeployed",
  "statusDescription": "The app or group is known to the system, but is not currently deployed",
  "_links": {
    "self": {
      "href": "http://localhost:9393/streams/definitions/A4"
    }
  }
},
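In Java string-literal terms (just to make the distinction concrete; this is not SCDF code), A3's `dslText` holds an actual line break, while A4's still holds the two characters backslash and `n`:

```java
public class DslTextComparison {
    public static void main(String[] args) {
        // A3: the backslash was consumed, leaving a real newline in the text.
        String a3 = "time --trigger.date-format='mm \n mm' | log";
        // A4: the escape survived, so the text contains '\' followed by 'n'.
        String a4 = "time --date-format='hh \\n mm' | log";

        System.out.println("A3 contains a real newline: " + a3.contains("\n"));
        System.out.println("A4 contains a real newline: " + a4.contains("\n"));
    }
}
```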
The escaping of the \ in the string is lost when the stream is rebuilt in DefaultStreamService.doDeployStream(). The manifest in the Release object in that method looks like this (this is a snippet):
"metadata":
  "name": "time"
"spec":
  "resource": "maven://org.springframework.cloud.stream.app:time-source-kafka:jar"
  "resourceMetadata": "maven://org.springframework.cloud.stream.app:time-source-kafka:jar:jar:metadata:2.1.1.RELEASE"
  "version": "2.1.1.RELEASE"
  "applicationProperties":
    "spring.metrics.export.triggers.application.includes": "integration**"
    "spring.cloud.dataflow.stream.app.label": "time"
    "spring.cloud.stream.metrics.key": "A1.time.${spring.cloud.application.guid}"
    "spring.cloud.stream.bindings.output.producer.requiredGroups": "A1"
    "spring.cloud.stream.metrics.properties": "spring.application.name,spring.application.index,spring.cloud.application.*,spring.cloud.dataflow.*"
    "spring.cloud.stream.bindings.output.destination": "A1.time"
    "spring.cloud.dataflow.stream.name": "A1"
    "spring.cloud.dataflow.stream.app.type": "source"
    "trigger.date-format": "hh \n mm"
  "deploymentProperties":
    "spring.cloud.deployer.group": "A1"
When that is read in by SpringCloudDeployerApplicationManifestReader.read() and converted into a definition by StreamDefinitionToDslConverter.toDsl(), the property value is a string with a line break in it. This is written over the existing (working) definition, and then we are broken.
By the time the controller sees the DSL it has been escaped properly. It looks like the manifest is storing the 'correct data for the deployment'. Does that just mean we need to add escaping to the StreamDefinitionToDslConverter.toDsl() method, so that when converting actual property values back into DSL text we escape the \n as \\n?
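A minimal sketch of the kind of escaping toDsl() would need (a hypothetical helper, not the code from the PR): before quoting a property value back into DSL text, double any backslashes and turn real newlines back into the two-character sequence `\n`.

```java
public class DslEscaper {

    // Hypothetical escaping step for values being rendered back into
    // DSL text: restore literal backslashes and newlines to their
    // escaped two-character forms so the definition stays on one line.
    static String escapeForDsl(String value) {
        return value.replace("\\", "\\\\")
                    .replace("\n", "\\n");
    }

    public static void main(String[] args) {
        // By deployment time the property value holds a real newline.
        String deployedValue = "hh \n mm";
        String escaped = escapeForDsl(deployedValue);

        // The regenerated value is single-line again, so it can be
        // written back into the definition without splitting it.
        System.out.println(escaped);                 // prints: hh \n mm
        System.out.println(escaped.contains("\n"));  // prints: false
    }
}
```

Note the order matters: backslashes must be doubled first, otherwise the backslash introduced by the newline replacement would itself get doubled.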
I created this PR https://github.com/spring-cloud/spring-cloud-dataflow/pull/3657 - it does fix it. Is it the right fix in the right place? I'll leave that to folks who know that code better than me :)
Description: If any parameter of the deployed apps has a multi-line value, the deploy might work, but any attempt to update or redeploy the application will be broken by the multiple lines.
Release versions:
Steps to reproduce: Try to create this from the create stream section:
time --testingmultilineparameter="a \n a" | log
Create the stream and deploy it (I know testingmultilineparameter doesn't exist for the time app, but it is just to show the issue; another example might be the scriptable transformer). After it deploys, try to update the stream. It will show only the time app, and if you access the free text it will show only the first line of the multi-line parameter.