Azure / azure-functions-kafka-extension

Kafka extension for Azure Functions

Support for certificate and key in pem string format instead of location to the file #310

Open vkhose opened 2 years ago

vkhose commented 2 years ago

Confluent.Kafka and librdkafka support supplying the authentication certificate as a PEM string via the "ssl.certificate.pem" and "ssl.key.pem" attributes. With this, you do not have to maintain the certificate as a file within the application or in a certificate store. This extension does not support these ways of specifying the certificate. It would be great if we could add this support to the extension. https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
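For context, the plain Confluent.Kafka .NET client already exposes these librdkafka settings as string-valued properties. A minimal sketch, assuming a recent Confluent.Kafka 1.x release where ssl.certificate.pem and ssl.key.pem map to the SslCertificatePem and SslKeyPem properties (the broker address and environment variable names are placeholders):

    using System;
    using Confluent.Kafka;

    var config = new ConsumerConfig
    {
        BootstrapServers = "broker:9093",          // placeholder
        GroupId = "default",
        SecurityProtocol = SecurityProtocol.Ssl,
        // PEM contents passed as strings, no file on disk needed
        SslCertificatePem = Environment.GetEnvironmentVariable("KAFKA_CLIENT_CERT_PEM"),
        SslKeyPem = Environment.GetEnvironmentVariable("KAFKA_CLIENT_KEY_PEM"),
    };

    using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();

The ask is for the Functions extension to surface the same pass-through.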


shrohilla commented 1 year ago

This will cause an issue with the connected service of the Scale Controller, so we can't support this functionality in the Kafka extension.

HaroldH76 commented 1 year ago

I don't understand why this is not possible, but it is very inconvenient. Is there any guidance or an example of an easy way to upload certificates to a function app?

s-pilo commented 1 year ago

Agreed. We are having an issue because Key Vault only allows a PFX to be attached to a function app. We tried running in a Linux plan, but that only imports a .p12 into the Linux container, and the Kafka package needs either a PEM string or a PEM file. This forces us to store the private key PEM in the repo and deploy it as part of the package, which is not desirable, by a long shot.

HaroldH76 commented 1 year ago

I was thinking about something like this:

  • I think it is easy for the KafkaTrigger to support the ssl.key.pem and ssl.certificate.pem settings. The KafkaTrigger only needs to pass them through to the Confluent library
  • then we can use IConfiguration to set these in the KafkaTrigger
  • and we can set the contents of the PEM in appsettings.json or environment variables or whatever IConfiguration supports
  • and the IConfiguration settings can come from a KeyVault secret (not a KeyVault certificate but a KeyVault secret) ...

And then we can specify the KafkaTrigger like:

        public async Task ReceiveTest(
            [KafkaTrigger(
                "%KafkaSettingsBootstrapServers%",
                "%KafkaSettingsTopicsTestTopic%",
                Protocol = BrokerProtocol.Ssl,
                AuthenticationMode = BrokerAuthenticationMode.Plain,
                //SslCertificateLocation = "%KafkaSettingsCertificateLocation%",
                //SslKeyLocation = "%KafkaSettingsKeyLocation%",
                SslCertificatePem = "%KafkaSettingsCertificatePem%",
                SslKeyPem = "%KafkaSettingsKeyPem%",
                ConsumerGroup = "default")] Microsoft.Azure.WebJobs.Extensions.Kafka.KafkaEventData<string> eventData)

And sample settings:

    "KafkaSettingsCertificatePem": "MIIDvjCCAqag....",
    "KafkaSettingsKeyPem": "MIIDvjCCAqag....",

And then we don't need to mount a storage account file share to the Azure Function and upload a key and certificate to it.

In the end this is what I had to do, btw. It turned out to be not that big of an issue once I found the correct Bicep.

Create a file share in the storage account of the Azure Function:

resource fileShare 'Microsoft.Storage/storageAccounts/fileServices/shares@2021-04-01' = {
  name: '${storageAccount.name}/default/filemount'
}
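The certificate and key files can then be uploaded to that share, for example with the Azure CLI (storage account name, key, and file names here are placeholders):

    az storage file upload --account-name <storageAccountName> --account-key <storageAccountKey> \
      --share-name filemount --source ./cert.pem --path cert.pem
    az storage file upload --account-name <storageAccountName> --account-key <storageAccountKey> \
      --share-name filemount --source ./key.pem --path key.pem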

Mount the file share to the Azure Function:

resource storageSetting 'Microsoft.Web/sites/config@2021-01-15' = {
  parent: functionApp
  name: 'azurestorageaccounts'
  properties: {
    filemount: {
      type: 'AzureFiles'
      shareName: 'filemount'
      mountPath: '/filemount'
      accountName: storageAccount.name     
      accessKey: storageAccount.listKeys().keys[0].value   
    }
  }
}
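With the share mounted at /filemount, the existing location-based properties of the trigger can point at the uploaded files. A minimal sketch, assuming the files were uploaded as cert.pem and key.pem (file and setting names are placeholders):

        public async Task ReceiveTest(
            [KafkaTrigger(
                "%KafkaSettingsBootstrapServers%",
                "%KafkaSettingsTopicsTestTopic%",
                Protocol = BrokerProtocol.Ssl,
                AuthenticationMode = BrokerAuthenticationMode.Plain,
                SslCertificateLocation = "/filemount/cert.pem",
                SslKeyLocation = "/filemount/key.pem",
                ConsumerGroup = "default")] Microsoft.Azure.WebJobs.Extensions.Kafka.KafkaEventData<string> eventData)
        {
            // handle the received event here
        }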

But it is easier and more approachable to use IConfiguration like we do for all other settings.

spilo-apex commented 1 year ago

I was thinking about something like this:

  • I think it is easy for the KafkaTrigger to support the ssl.key.pem and ssl.certificate.pem settings. The KafkaTrigger only needs to pass them through to the Confluent library
  • then we can use IConfiguration to set these in the KafkaTrigger
  • and we can set the contents of the PEM in appsettings.json or environment variables or whatever IConfiguration supports
  • and the IConfiguration settings can come from a KeyVault secret (not a KeyVault certificate but a KeyVault secret) ...

But it is easier and more approachable to use IConfiguration like we do for all other settings.

The only issue I have with the storage approach is that it is not necessarily secure. In our case we got most of the way there by using a Linux app plan, but Azure does not let us attach a Key Vault-stored PEM to an app service plan (only PFX is supported). The Linux container does get a .p12, which the Kafka trigger doesn't like, so our next attempt is to convert it to PEM during function startup (fingers crossed).
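A rough sketch of that startup conversion, assuming .NET 5+ APIs, an RSA client key, and that writing unencrypted PEM files to a writable path is acceptable (the helper name, file names, and password handling are assumptions, not part of the extension):

    using System.IO;
    using System.Security.Cryptography;
    using System.Security.Cryptography.X509Certificates;

    static class PemConverter
    {
        // Converts a mounted .p12/.pfx into the cert.pem / key.pem pair that librdkafka accepts.
        public static (string certPath, string keyPath) ConvertP12ToPem(
            string p12Path, string password, string outputDir)
        {
            using var cert = new X509Certificate2(p12Path, password, X509KeyStorageFlags.Exportable);

            // Public certificate as PEM
            string certPem = new string(PemEncoding.Write("CERTIFICATE", cert.Export(X509ContentType.Cert)));

            // Private key as unencrypted PKCS#8 PEM (assumes an RSA key)
            using RSA key = cert.GetRSAPrivateKey();
            string keyPem = new string(PemEncoding.Write("PRIVATE KEY", key.ExportPkcs8PrivateKey()));

            string certPath = Path.Combine(outputDir, "cert.pem");
            string keyPath = Path.Combine(outputDir, "key.pem");
            File.WriteAllText(certPath, certPem);
            File.WriteAllText(keyPath, keyPem);
            return (certPath, keyPath);
        }
    }

The returned paths could then be pointed to by the trigger's SslCertificateLocation / SslKeyLocation settings.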

This would be a whole lot simpler if we could either A) use PEM string secrets as noted above, or B) Azure supported PEM files from Key Vault in App Services.

Option B doesn't help Windows-hosted services (non-container), so PEM string secrets are really the preferred method.
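If option A were supported, the PEM strings would never need to touch disk; for example, the sample settings shown earlier could be standard App Service Key Vault references (vault and secret names here are placeholders):

    "KafkaSettingsCertificatePem": "@Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/kafka-client-cert-pem/)",
    "KafkaSettingsKeyPem": "@Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/kafka-client-key-pem/)",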