adityamaru opened 1 year ago
cc @cockroachdb/disaster-recovery
Another dimension to this issue is that CDC query params all need to be lowercase to pass in a `CREATE EXTERNAL CONNECTION` statement, and also in a `CREATE CHANGEFEED` statement.
To be clear on what I’ve found, to create an external connection: `COCKROACH_LOCALITY` will work (same for `S3_STORAGE_CLASS`). `TOPIC_NAME`, `SASL_CLIENT_ID`, `SASL_CLIENT_SECRET`, `SASL_ENABLED`, `SASL_MECHANISM`, and `SASL_TOKEN_URL` will not work, but the all-lowercase forms of the same will. Also, for changefeeds, if you pass uppercase params directly in your URI to a `CREATE CHANGEFEED` statement, it will not work. E.g.:
```
CREATE EXTERNAL CONNECTION kafka AS 'kafka://10.142.0.214:9093?TOPIC_NAME=vehicles&SASL_CLIENT_ID=redact&SASL_CLIENT_secret=redact&SASL_ENABLED=true&SASL_MECHANISM=OAUTHBEARER&SASL_TOKEN_URL=redact';
ERROR: failed to construct External Connection details: invalid changefeed sink URI: unknown kafka sink query parameters: SASL_CLIENT_secret, SASL_ENABLED, SASL_MECHANISM, SASL_TOKEN_URL, TOPIC_NAME, SASL_CLIENT_ID
```
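If my reading of the behavior above is right, the same statement with all-lowercase parameter names (same placeholder values, redacted as above) should be accepted:

```
CREATE EXTERNAL CONNECTION kafka AS 'kafka://10.142.0.214:9093?topic_name=vehicles&sasl_client_id=redact&sasl_client_secret=redact&sasl_enabled=true&sasl_mechanism=OAUTHBEARER&sasl_token_url=redact';
```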
These were just part of my testing, but all changefeed params act this way (besides `S3_STORAGE_CLASS`).
Just clarifying, do we want to allow mixed case as well? E.g. `SASL_enabled`, `SASL_mechanism`. Or is it going to be either completely uppercase or completely lowercase? IMO it's fine to allow mixed case.
cc: @stevendanna
Today, when a user passes in a cloud URI, we "consume" the parameters that we know how to process depending on the cloud provider. We do not normalize the parameters passed in by the user and so unless they match the capitalization of our baked in param names, the URI validation function will throw an error.
E.g.: `s3_storage_class` is not a valid parameter in an S3 URI, but `S3_STORAGE_CLASS` is. This is annoying, and the error thrown could make the user question what they're doing wrong without any obvious next steps.

Jira issue: CRDB-28198