Closed gnarlyman closed 6 years ago
@gnarlyman thanks for the issue and the good eye. I'll get this fixed asap.
But please note that the use of verify_certs is deprecated.
Please try creating an ssl_context object and set the verification mode on the context.
import ssl
from elasticsearch import Elasticsearch
from elasticsearch.connection import create_ssl_context

# use `cafile`, `capath`, or `cadata` to set your CA or CAs
ssl_context = create_ssl_context()
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE

es = Elasticsearch('localhost', ssl_context=ssl_context, timeout=60)
thank you for the workaround!
I gave the workaround a try and I could not make it work. I tried two variations, one with a cafile (from certifi) and one without a cafile when creating the SSL context (I also explicitly set verify_certs to False).
This is my test program:
import ssl
from elasticsearch import Elasticsearch
from elasticsearch.connection import create_ssl_context
def main():
    # no cafile!
    ssl_context = create_ssl_context()
    ssl_context.check_hostname = False
    ssl_context.verify_mode = ssl.CERT_NONE

    es = Elasticsearch(hosts=[{'host': 'localhost', 'port': 39200}],
                       scheme="https",
                       # to ensure that it does not use the default value `True`
                       verify_certs=False,
                       ssl_context=ssl_context,
                       http_auth=("rally", "rally-password"))
    es.info()

if __name__ == '__main__':
    main()
It always fails with:
elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777))
When running this in the REPL, I noticed that ssl_context.verify_mode had been set to VerifyMode.CERT_REQUIRED again after the (failing) call to es.info().
Tbh, I did not completely debug the issue. I think the reason is that when creating Urllib3HttpConnection, ca_cert is always set, and further down the line urllib3 overrides the verification mode again when a certificate is provided.
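The flip described above can be reproduced with the stdlib ssl module alone; this sketch (an illustration, not the library's actual code) shows that verify_mode is an ordinary mutable attribute, so any later code that re-applies a cert requirement, as urllib3 can when given a CA bundle, silently undoes CERT_NONE:

```python
import ssl

# A default context starts out verifying certificates.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED

# check_hostname must be disabled before verify_mode can drop to CERT_NONE,
# otherwise Python raises ValueError.
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
assert ctx.verify_mode == ssl.CERT_NONE

# Anything downstream that re-applies a cert requirement flips the very
# same context object back, which matches the CERT_REQUIRED observed
# after the failing es.info() call:
ctx.verify_mode = ssl.CERT_REQUIRED
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```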
@danielmitterdorfer
Can you give me some version info please?
OS, Python, elasticsearch-py version, ES version.
How did you create your certs?
I just created a new instance of ES running on AWS redhat7, ES version 6.1.2. I used certgen bundled with Elasticsearch to create a CA cert for my instance.
Locally to test I'm using python2.7.13 (running in a docker container, python:2) on archlinux.
I followed your exact steps:
In [24]: host = "ip.address.xx.xx"
In [25]: port=9200
In [26]: user='elastic'
In [27]: password='somepassword'
In [34]: def main():
...: ssl_context = create_ssl_context()
...: ssl_context.check_hostname = False
...: ssl_context.verify_mode = ssl.CERT_NONE
...:
...: es = Elasticsearch(hosts=[{'host': host, 'port': 9200}],
...: scheme="https",
...: # to ensure that it does not use the default value `True`
...: verify_certs=False,
...: ssl_context=ssl_context,
...: http_auth=(user, password))
...: return es.info()
...:
In [35]: print(main())
/usr/local/lib/python2.7/site-packages/urllib3/connectionpool.py:858: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
InsecureRequestWarning)
{u'cluster_name': u'py-example', u'cluster_uuid': u'Uio1sfKxQlOj3sOQqmbNSQ', u'version': {u'build_date': u'2018-01-10T02:35:59.208Z', u'minimum_wire_compatibility_version': u'5.6.0', u'build_hash': u'5b1fea5', u'number': u'6.1.2', u'minimum_index_compatibility_version': u'5.0.0', u'build_snapshot': False, u'lucene_version': u'7.1.0'}, u'name': u'2pQh1Ls', u'tagline': u'You Know, for Search'}
I'm struggling to reproduce this issue locally :(
Also please note that verify_certs is deprecated. Since you're using ssl_context, you don't need to use verify_certs.
That being said, I tried both (verify_certs and ssl_context) and ssl_context without verify_certs.
Both work for me. I'm hoping by dialing in the versions of everything I can get a reproducible environment.
Can you give me some version info please?
Sure:
OS (uname -a): Darwin io 16.7.0 Darwin Kernel Version 16.7.0: Mon Nov 13 21:56:25 PST 2017; root:xnu-3789.72.11~1/RELEASE_X86_64 x86_64
how did you create your certs?
I also generated the certificates with certgen
. They are static and only used for benchmarking (hence not meant to provide any security): https://github.com/elastic/rally-teams/tree/master/plugins/x_pack/security/config/x-pack/ca
Also please note that verify_certs is deprecated. Since you're using ssl_context you don't need to use verify_certs.
I know, and originally I did not set it. I just noticed that the default value is True and thus I explicitly set it to False.
I tried the sample program above now again in two clean virtualenvs:
The problem boils down to the presence of certifi. Everything works fine without certifi. As soon as I install it, its certificates are automatically chosen and I get the failure that I described above. Did you install certifi in your environment?
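Which CA bundle is in play can be inspected without the client at all. This sketch (the helper name ca_bundle_in_use is hypothetical) probes for certifi and otherwise falls back to the platform defaults, mirroring why merely installing certifi changes the outcome:

```python
import ssl

def ca_bundle_in_use():
    """Return the CA bundle path likely preferred: certifi's bundle
    if it is importable, else the platform's default locations."""
    try:
        import certifi  # optional dependency; its mere presence changes behavior
        return certifi.where()
    except ImportError:
        paths = ssl.get_default_verify_paths()
        return paths.cafile or paths.capath

print(ca_bundle_in_use())
```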
For completeness, pip list shows the same list in both virtualenvs:
certifi (2018.1.18)
elasticsearch (6.1.1)
pip (9.0.1)
setuptools (38.4.0)
urllib3 (1.22)
wheel (0.30.0)
I went through the debugger a bunch and found that verify_certs is ignored if ca_certs is None or set to some value (None is taken as "use defaults", which results in certs being set to required). Simply set this to a False value of some sort that isn't None and it should work.
es = Elasticsearch("https://user:pass@myelasticsearch",
ca_certs=False,
verify_certs=False)
This seems to be an issue with the underlying Python library, but it's difficult to figure that out due to the way keyword args are passed around in the Elasticsearch library.
I'm working on a PR for this to clean things up. After a discussion we decided it was going to be better not to replace the SSL kwargs with SSLContext, and instead just provide the possibility of adding an SSLContext; if you use that, all other SSL kwargs will be ignored.
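The "None means use defaults" pitfall described above can be condensed into a toy resolver (a hypothetical function for illustration, not the library's real code):

```python
import ssl

def resolve_verify(verify_certs, ca_certs):
    """Toy sketch of the reported kwarg handling: ca_certs=None is
    read as "use the default CA bundle", which forces verification
    on regardless of verify_certs."""
    if ca_certs is None:
        return ssl.CERT_REQUIRED  # the surprising case: verify_certs=False is ignored
    return ssl.CERT_REQUIRED if verify_certs else ssl.CERT_NONE

# verify_certs=False alone is overridden...
assert resolve_verify(False, None) == ssl.CERT_REQUIRED
# ...but a falsy-yet-not-None ca_certs lets it take effect (the workaround above):
assert resolve_verify(False, False) == ssl.CERT_NONE
```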
There is definitely some change that has occurred with Python 3.6 or urllib3 after 1.19 to cause this. With Python 3.5 and urllib3 1.19, verify_certs=False just works. None of the workarounds provided above worked for me using Python 3.6 and urllib3 1.22.
Did anyone manage to make it work with python3.5 and latest urllib3?
closed via: #714
Hi, sorry, I'm still confused: the issue was closed but I still can't make it work (as others have recently pointed out too).
I tried:
option 1
es = Elasticsearch("http://172.16.12.23:9200", ca_certs=False, verify_certs=False)
option 2
es = Elasticsearch("http://172.16.12.23:9200")
option 3
es = Elasticsearch(hosts=[{"host":"172.16.12.23", "port":"9200"}],
scheme="http", use_ssl=False, verify_certs=False)
option 4
ssl_context = create_ssl_context()
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE
es = Elasticsearch(hosts=[{'host': '172.16.12.23', 'port': 9200}], scheme="http", verify_certs=False, ssl_context=ssl_context)
always getting
org.apache.spark.SparkException: An exception was raised by Python:
Traceback (most recent call last):
File "/opt/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/streaming/util.py", line 68, in call
r = self.func(t, *rdds)
File "/opt/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/streaming/dstream.py", line 161, in <lambda>
File "/home/CT6CFE5N/soron/src/Spark/OpenstackEventProcessor_Streaming.py", line 114, in processEvents
def processEvents(events):
File "/home/CT6CFE5N/.local/lib/python2.7/site-packages/elasticsearch/client/utils.py", line 76, in _wrapped
return func(*args, params=params, **kwargs)
File "/home/CT6CFE5N/.local/lib/python2.7/site-packages/elasticsearch/client/__init__.py", line 319, in index
_make_path(index, doc_type, id), params=params, body=body)
File "/home/CT6CFE5N/.local/lib/python2.7/site-packages/elasticsearch/transport.py", line 318, in perform_request
status, headers_response, data = connection.perform_request(method, url, params, body, headers=headers, ignore=ignore, timeout=timeout)
File "/home/CT6CFE5N/.local/lib/python2.7/site-packages/elasticsearch/connection/http_urllib3.py", line 178, in perform_request
raise SSLError('N/A', str(e), e)
SSLError: ConnectionError([SSL: UNKNOWN_PROTOCOL] unknown protocol (_ssl.c:618)) caused by: SSLError([SSL: UNKNOWN_PROTOCOL] unknown protocol (_ssl.c:618))
Using python 2.7.5, elasticsearch 6.3.1, urllib3 1.24.1, kafka-python 1.4.4.
Thanks Chris
@kcris Can you please tell me why you're trying to use SSL with http?
I'm fairly certain the reason you're having issues is because you're mixing SSL with a non-SSL protocol.
@fxdgear
Hi, I'm trying to use plain http, and not https.
What am I doing wrong here?
The http url is fine, I checked via curl/browser.
Thanks
If you're doing plain http then you shouldn't need to do anything with any of the SSL stuff.
I'd say try doing this:
es = Elasticsearch("172.16.12.23")
@fxdgear thanks a lot, it works!
I am trying to load several json files to my "secured" ES instance (search guard certs). I set the code like this:
es = Elasticsearch( hosts = [{'host': 'https://admin:admin@localhost:9200'}], use_ssl = False, verify_certs = False )
but still receive this error:
requests.exceptions.SSLError: HTTPSConnectionPool(host='localhost', port=9200): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, u'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:726)'),))
how to fix it?
i'm using python 2.7, elastic 6.7.1.
I experienced a similar problem and the following way seems to work for elastic 6.3.1 and urllib3 1.25.3.
from elasticsearch import Elasticsearch, RequestsHttpConnection
es = Elasticsearch([{'host': 'https://admin:admin@localhost:9200'}],
verify_certs=False,
connection_class=RequestsHttpConnection)
In a nutshell, the default connection class is Urllib3HttpConnection, which raises the exception below:
elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:720)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:720))
If the connection class is set to RequestsHttpConnection, just a warning message will appear:
UserWarning: Connecting to https://localhost:9200 using SSL with verify_certs=False is insecure.
If you do want to verify requests and you have certifi installed, you can also do:
import certifi
from elasticsearch import Elasticsearch

es = Elasticsearch([{'host': 'https://admin:admin@localhost:9200'}],
                   verify_certs=True,
                   ca_certs=certifi.where())
I'm still having trouble with this. I have the OpenDistro ElasticSearch docker running on port 9200. I can get through to it with curl:
jack@Tower:~$ curl https://admin:admin@localhost:9200 --insecure
{
"name" : "70dcf08de37f",
"cluster_name" : "docker-cluster",
"cluster_uuid" : "Ag1T5K2MR-aX8DbgWeO0AQ",
"version" : {
"number" : "7.1.1",
"build_flavor" : "oss",
"build_type" : "tar",
"build_hash" : "7a013de",
"build_date" : "2019-05-23T14:04:00.380842Z",
"build_snapshot" : false,
"lucene_version" : "8.0.0",
"minimum_wire_compatibility_version" : "6.8.0",
"minimum_index_compatibility_version" : "6.0.0-beta1"
},
"tagline" : "You Know, for Search"
}
Note that the connection is HTTPS, but the --insecure option tells curl not to check the certificate. However, this code fails in Python:
from elasticsearch import Elasticsearch
client = Elasticsearch(
"https://admin:admin@localhost:9200",
verify_certs=False
)
client.indices.create(index="sessions")
yielding the same error as in the OP:
elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056))
I'm using the latest version of elasticsearch as far as I know, I installed it with pip on Python 3.7 just today.
Hi @geajack ,
Have you tried the following code?
from elasticsearch import Elasticsearch, RequestsHttpConnection
es = Elasticsearch([{'host': 'https://admin:admin@localhost:9200'}],
verify_certs=False,
connection_class=RequestsHttpConnection)
find more details here https://github.com/elastic/elasticsearch-py/issues/712#issuecomment-497251933
For me, it only worked after removing list and dict, and simply using the raw connection string. ES version 6.6.0 and elasticsearch6 (6.4.2) python package.
from elasticsearch import Elasticsearch, RequestsHttpConnection
es = Elasticsearch('https://admin:admin@localhost:9200',
verify_certs=False,
connection_class=RequestsHttpConnection)
Man that took me so long to figure it out. Thanks a lot!
Sorry to revive this issue, I'm stuck.
I'm trying to connect to an ES DB, if I perform the connection via chrome I can do it 100%, with a secure connection logo.
Using either CURL or python-elasticsearch I cannot.
elasticsearch.exceptions.SSLError: ConnectionError(HTTPSConnectionPool(host='****my-public-domain****', port=9200): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)')))) caused by: SSLError(HTTPSConnectionPool(host='****my-public-domain****', port=9200): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)'))))
I have now tried to:
The code:
try:
import certifi
self.db = Elasticsearch(hostname,
scheme=scheme,
http_auth=(username, password),
connection_class=RequestsHttpConnection,
# enable SSL
use_ssl=True,
# verify SSL certificates to authenticate
verify_certs=True,
# path to the certificates
ca_certs=certifi.where())
# sniff_on_start=True,
# sniff_on_connection_fail=True,
# sniffer_timeout=60)
except es.exceptions.ConnectionError: # pragma: no cover
print("Failed to establish connection to {}!"
.format(hostname)) # pragma: no cover
# 502 Bad Gateway
return 502
except es.exceptions.AuthenticationException:
print("Failed to authenticate {} @ {}"
.format(username, hostname))
return 401 # 401 Unauthorized
except Exception as error:
print("Elasticsearch Client Error:", error)
return 200 # 200 OK
print(self.db.ping()) # returns false and it shouldn't
self.db.info() # raises the exception elasticsearch.exceptions.SSLError: ConnectionError(HTTPSConnectionPool...
Any help would be appreciated. I'm running Windows and the ES DB is a docker container @elasticsearch:7.9.2:
client machine (windows 10):
server docker container @elasticsearch:7.9.2 running outside of my client machine.
PS: if I disable the verify_certs flag it works, but with a warning.
Best regards, André Carvalho
Hi @fxdgear @valeriocos @JamesHutchisonPremise
I am using python 2.7.16 and Elasticsearch 6.8.12 with TLS enabled.
I am getting output on my server/host if I run curl --insecure $CREDS -X GET "localhost:9210/metricbeat*/_search" | python -m json.tool
If I use the below code in my python script:
from elasticsearch import Elasticsearch, RequestsHttpConnection
url = Elasticsearch(['https://' + options.controller + ':9210/metricbeat*/_search'], verify_certs=False, connection_class=RequestsHttpConnection)
Error: collecting system information... /Users/Library/Python/2.7/lib/python/site-packages/elasticsearch/connection/http_requests.py:141: UserWarning: Connecting to https://x.x.x.x:9210 using SSL with verify_certs=False is insecure. % self.host waiting for query results... error: failed to query elastic search data: No connection adapters were found for u"<Elasticsearch([{u'url_prefix': '/metricbeat*/_search', u'use_ssl': True, u'host': 'x.x.x.x', u'port': 9210}])>"
[option.controller is my server/host]
I am still facing the same issue :( TlsError: TLS error caused by: TlsError(TLS error caused by: SSLError([SSL: SSLV3_ALERT_BAD_CERTIFICATE] sslv3 alert bad certificate (_ssl.c:2635)))
Python version 3.8.11 Elastic search version 8.1.0
es = Elasticsearch("https://localhost:9300", ca_certs=False, verify_certs=False)
I have set ca_certs to False and also verify_certs to False. Not understanding why this is happening.
@vibha0411 , you can try downgrading elasticsearch to 7.9.1. It helped in my case. The new version is too strict.
Yes, this solved my problem after downgrading and adding verify_certs=False and connection_class=RequestsHttpConnection, thanks.
Try this if you have Elastic on a host and don't have a port:
pip install elasticsearch7
# your python script
from elasticsearch7 import Elasticsearch
client = Elasticsearch(
"https://elastic:your_password@your_elastic_host.com",
)
client.info()
I also have the same problem. This topic was opened 6 years ago and there still isn't an official solution for turning off certificate verification (apparently even setting the flag does nothing).
How is this possible?
In elasticsearch version 6.6.1 and elasticsearch-dsl version 6.1.0, SSL verification seems to ignore the verify_certs option. When set to False, the cert is still verified and fails on self-signed certs.
In elasticsearch version 5.5.1 and elasticsearch-dsl version 5.4.0, the verify_certs option works as expected.
client = Elasticsearch(hosts=['localhost'], verify_certs=False, timeout=60)
elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777))