andsel opened this issue 4 years ago
cc @robbavey for your information.
this might be an issue with the JDK itself - reading PKCS#8 encrypted private keys. going to follow up with an isolated reproducer.
@andsel I couldn't reproduce the same behaviour, but what I did observe was:

- Converting the key with `-nocrypt` worked on 6.0.7 and 6.0.9.
- `openssl pkcs8 -topk8 -inform PEM -in instance.key -out instancekey.pkcs8`, supplying a password and adding it to the configuration with `ssl_key_passphrase`, worked on 6.0.7 and 6.0.9.
- `openssl pkcs8 -topk8 -inform PEM -in instance.key -out instancekey.pkcs8`, supplying an empty password and leaving the defaults in the configuration, did not work in 6.0.9 and only appeared to work in 6.0.7: when data started coming in, the same error happened in 6.0.7. This is likely because the SSLContext was created per connection in 6.0.7, and only once in 6.0.9, during plugin registration (see the sketch after this list).
- Setting `ssl_key_passphrase` worked in 6.0.7 and 6.0.9.

There is an existing integration test that uses `ssl_key_passphrase` and this still works - updating it to use an empty password or `-nocrypt` shows the same behaviour as described above.
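A minimal JRuby sketch of the Netty call involved (the certificate/key paths and the passphrase below are placeholder assumptions, not taken from the plugin's code): in 6.0.9 this context is built once at registration, so a key Netty cannot parse aborts the pipeline at startup rather than on the first TLS connection.

```ruby
# sketch only: build a server SslContext directly with Netty, the way the plugin ultimately does
load 'vendor/jar-dependencies/io/netty/netty-all/4.1.30.Final/netty-all-4.1.30.Final.jar'

ctx = Java::io.netty.handler.ssl.SslContextBuilder.forServer(
  java.io.File.new('/tmp/certs/instance/instance.crt'),     # certificate chain (placeholder path)
  java.io.File.new('/tmp/certs/instance/instancekey.pkcs8'), # PKCS#8 private key (placeholder path)
  'asd'                                                      # key passphrase, nil for an unencrypted key
).build
# with a PBES2-encrypted PKCS#8 key this fails with
# java.lang.IllegalArgumentException: File does not contain valid private key: ...
```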
What was the error you saw in your tests?
When I said "work with 6.0.7" I meant that LS started without problems; I hadn't sent any data to the LS beats input yet.
The results of my tests are:

|       | --nocrypt | empty pwd | "asd" pwd |
|-------|-----------|-----------|-----------|
| 6.0.7 | v         | v         | v         |
| 6.0.9 | v         | x         | x         |

v = LS started without error
x = LS logged an error during startup

My full log is:
andrea@kalispera logstash-7.6.1 $ bin/logstash -f /home/andrea/workspace/logstash_andsel/andseldev/input_beats_pcks.conf
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/jars/jruby-complete-9.2.9.0.jar) to method sun.nio.ch.NativeThread.signal(long)
WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to /home/andrea/workspace/elastic_products/logstash-7.6.1/logs which is now configured via log4j2.properties
[2020-03-31T12:37:01,327][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-03-31T12:37:01,462][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.6.1"}
[2020-03-31T12:37:02,496][INFO ][org.reflections.Reflections] Reflections took 21 ms to scan 1 urls, producing 20 keys and 40 values
[2020-03-31T12:37:02,886][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2020-03-31T12:37:02,903][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/home/andrea/workspace/logstash_andsel/andseldev/input_beats_pcks.conf"], :thread=>"#<Thread:0x1b1bda65 run>"}
[2020-03-31T12:37:03,472][INFO ][logstash.inputs.beats ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-03-31T12:37:03,806][ERROR][logstash.javapipeline ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>java.lang.IllegalArgumentException: File does not contain valid private key: /home/andrea/workspace/elastic_products/elasticsearch-7.6.0/instance/instancekey.pkcs8, :backtrace=>[
" io.netty.handler.ssl.SslContextBuilder.keyManager(io/netty/handler/ssl/SslContextBuilder.java:270)",
" io.netty.handler.ssl.SslContextBuilder.forServer(io/netty/handler/ssl/SslContextBuilder.java:90)",
" org.logstash.netty.SslContextBuilder.buildContext(org/logstash/netty/SslContextBuilder.java:104)",
" jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
" jdk.internal.reflect.NativeMethodAccessorImpl.invoke(jdk/internal/reflect/NativeMethodAccessorImpl.java:62)",
" jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)",
" java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)",
" org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:426)",
" org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:293)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.create_server(/home/andrea/workspace/elastic_products/logstash-7.6.1/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb:181)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.RUBY$method$create_server$0$__VARARGS__(home/andrea/workspace/elastic_products/logstash_minus_7_dot_6_dot_1/vendor/bundle/jruby/$2_dot_5_dot_0/gems/logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java/lib/logstash/inputs//home/andrea/workspace/elastic_products/logstash-7.6.1/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.register(/home/andrea/workspace/elastic_products/logstash-7.6.1/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb:157)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.RUBY$method$register$0$__VARARGS__(home/andrea/workspace/elastic_products/logstash_minus_7_dot_6_dot_1/vendor/bundle/jruby/$2_dot_5_dot_0/gems/logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java/lib/logstash/inputs//home/andrea/workspace/elastic_products/logstash-7.6.1/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb:200)",
" org.jruby.RubyArray.each(org/jruby/RubyArray.java:1814)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb:199)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$register_plugins$0$__VARARGS__(home/andrea/workspace/elastic_products/logstash_minus_7_dot_6_dot_1/logstash_minus_core/lib/logstash//home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.start_inputs(/home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb:310)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_inputs$0$__VARARGS__(home/andrea/workspace/elastic_products/logstash_minus_7_dot_6_dot_1/logstash_minus_core/lib/logstash//home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb:270)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_workers$0$__VARARGS__(home/andrea/workspace/elastic_products/logstash_minus_7_dot_6_dot_1/logstash_minus_core/lib/logstash//home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.run(/home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb:154)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$run$0$__VARARGS__(home/andrea/workspace/elastic_products/logstash_minus_7_dot_6_dot_1/logstash_minus_core/lib/logstash//home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb)",
" home.andrea.workspace.elastic_products.logstash_minus_7_dot_6_dot_1.logstash_minus_core.lib.logstash.java_pipeline.start(/home/andrea/workspace/elastic_products/logstash-7.6.1/logstash-core/lib/logstash/java_pipeline.rb:109)",
" org.jruby.RubyProc.call(org/jruby/RubyProc.java:274)",
" java.lang.Thread.run(java/lang/Thread.java:834)"], "pipeline.sources"=>["/home/andrea/workspace/logstash_andsel/andseldev/input_beats_pcks.conf"], :thread=>"#<Thread:0x1b1bda65 run>"}
[2020-03-31T12:37:03,827][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2020-03-31T12:37:04,010][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-03-31T12:37:09,078][INFO ][logstash.runner ] Logstash shut down.
extracted reproducer (~ pretty much what Netty does under the hood):

```ruby
keyFile = ...
keyPassword = 'pass'

# NOTE: reading PKCS8 fails regardless of whether we register BC :
# require 'jopenssl/load'
# java.security.Security.addProvider(org.bouncycastle.jce.provider.BouncyCastleProvider.new)

def generateKeySpec(password, key)
  if password.nil?
    return java.security.spec.PKCS8EncodedKeySpec.new(key)
  end

  encryptedPrivateKeyInfo = javax.crypto.EncryptedPrivateKeyInfo.new(key)
  puts encryptedPrivateKeyInfo
  puts " algName: #{encryptedPrivateKeyInfo.getAlgName()}"
  # "1.2.840.113549.1.5.13" (the PBES2 OID, not a JCE algorithm name such as "PBEWithMD5AndDES")
  algName = encryptedPrivateKeyInfo.getAlgName()
  #algName = "PBEWithMD5AndDES"
  keyFactory = javax.crypto.SecretKeyFactory.getInstance(algName)
  # java.security.NoSuchAlgorithmException: 1.2.840.113549.1.5.13 SecretKeyFactory not available
  #   <init> at javax/crypto/SecretKeyFactory.java:122
  #   getInstance at javax/crypto/SecretKeyFactory.java:168
  pbeKeySpec = javax.crypto.spec.PBEKeySpec.new(password)
  pbeKey = keyFactory.generateSecret(pbeKeySpec)
  cipher = javax.crypto.Cipher.getInstance(algName)
  cipher.init(javax.crypto.Cipher::DECRYPT_MODE, pbeKey, encryptedPrivateKeyInfo.getAlgParameters())
  return encryptedPrivateKeyInfo.getKeySpec(cipher)
end

load 'vendor/jar-dependencies/io/netty/netty-all/4.1.30.Final/netty-all-4.1.30.Final.jar'

# read the PEM-encoded (encrypted) PKCS#8 key bytes via Netty's PemReader
encodedKeyBuf = Java::io.netty.handler.ssl.PemReader.readPrivateKey File.new(keyFile).to_input_stream
encodedKey = Java::byte[ encodedKeyBuf.readableBytes() ].new
encodedKeyBuf.readBytes(encodedKey).release()

encodedKeySpec = generateKeySpec(keyPassword == nil ? nil : keyPassword.to_java.toCharArray(), encodedKey)

puts "generating private key ..."
key = java.security.KeyFactory.getInstance("RSA").generatePrivate(encodedKeySpec)
puts "generated private-key: #{key}"
```
JDK does not recognise the OID (and even if we help it by supplying an algorithm name, it won't work):
java.security.NoSuchAlgorithmException: 1.2.840.113549.1.5.13 SecretKeyFactory not available
only way I was able to read the key is using BC specific (OpenSSL) APIs (sketched below). given @robbavey's observation I believe that these PEM protected keys never really worked*, it's only that the error was delayed in previous versions until TLS connection initialization started.

* maybe with an empty password - not sure whether the key storage format is a bit more readable in that case.
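For illustration, a rough JRuby sketch of the BC route (an assumption about the exact calls, reusing `keyFile`/`keyPassword` from the reproducer above and assuming the bcprov/bcpkix jars are on the classpath, e.g. via jruby-openssl's bundled jars):

```ruby
# sketch only: decode the encrypted PKCS#8 PEM with BouncyCastle instead of javax.crypto
require 'jopenssl/load' # pulls in the BC jars shipped with jruby-openssl (assumption)
java.security.Security.addProvider(org.bouncycastle.jce.provider.BouncyCastleProvider.new)

pem = org.bouncycastle.openssl.PEMParser.new(java.io.FileReader.new(keyFile))
obj = pem.readObject # PKCS8EncryptedPrivateKeyInfo for a "BEGIN ENCRYPTED PRIVATE KEY" block
builder = org.bouncycastle.pkcs.jcajce.JcePKCSPBEInputDecryptorProviderBuilder.new
decryptor = builder.setProvider('BC').build(keyPassword.to_java.toCharArray)
key_info = obj.decryptPrivateKeyInfo(decryptor) # BC understands the PBES2 envelope the JDK rejects
key = org.bouncycastle.openssl.jcajce.JcaPEMKeyConverter.new.setProvider('BC').getPrivateKey(key_info)
puts "BC-decoded private key: #{key}"
```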
Decrypting the key seems to work when it is encrypted with a compatible cipher (https://community.snowflake.com/s/article/Private-key-provided-is-invalid-or-not-supported-rsa-key-p8--data-isn-t-an-object-ID):

```
openssl pkcs8 -topk8 -inform PEM -v1 PBE-SHA1-RC4-128 -out rsa_key.p8
```
I don't know if it's related to #364; I suspect not, since it worked with version 6.0.7. The check was done with Logstash 7.6.1.
Version: >= 6.0.8
Operating System: Linux
Config File: from Elasticsearch we use the `elasticsearch-certutil` tool and then `openssl` to convert the key to PEM; a sketch of the pipeline configuration is shown below.
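A minimal beats input configuration along these lines reproduces it (the paths and passphrase here are placeholders, not the exact values used):

```
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/tmp/certs/instance/instance.crt"   # placeholder path
    ssl_key => "/tmp/certs/instance/instancekey.pkcs8"      # the converted PKCS#8 key
    ssl_key_passphrase => "asd"                             # omit/comment out for an unencrypted key
  }
}

output {
  stdout { codec => rubydebug }
}
```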
Steps to Reproduce:
- create a PKCS#8 PEM private key, with and without passphrase
- run Logstash with the previous configuration and see the errors in the log for versions >= 6.0.8, while they don't appear in version 6.0.7. In 6.0.8 the implementation was changed from `org.logstash.netty.SslSimpleBuilder` to `org.logstash.netty.SslContext`.
Create a private key:
1. `./bin/elasticsearch-certutil ca --pem`, going with the defaults, then `unzip elastic-stack-ca.zip`
2. `./bin/elasticsearch-certutil cert --pem --ca-key ca/ca.key --ca-cert ca/ca.crt`, going with the default values, then `unzip certificate-bundle.zip` into `/tmp/certs/`
3. `openssl pkcs8 -topk8 -inform PEM -in instance.key -out instancekey.pkcs8`, putting a password (empty or whatever string, then use it in the pipeline config)
4. run Logstash: `bin/logstash -f config/input_beats_pcks.conf`
Now, to prove that it works with beats plugin version 6.0.7, remove the plugin and install the old version (see the commands below), then re-run the config and see that the plugin works correctly.
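Something like the standard plugin manager commands should do it:

```
bin/logstash-plugin remove logstash-input-beats
bin/logstash-plugin install --version 6.0.7 logstash-input-beats
```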
If we use a PKCS#8 private key file without passphrase (see the command below) and comment out the `ssl_key_passphrase` setting, then it works with both plugin versions.
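The passphrase-less conversion would be the same openssl command as above with `-nocrypt` added, for example:

```
openssl pkcs8 -topk8 -inform PEM -in instance.key -out instancekey.pkcs8 -nocrypt
```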