dutchcoders / marija

Data exploration and visualisation for Elasticsearch and Splunk.
GNU Affero General Public License v3.0

Unable to use marija with elasticsearch 5.0.2 over ssh tunnel #42

Open ckuethe opened 7 years ago

ckuethe commented 7 years ago

I'm connecting to a remote Elasticsearch 5.0.2 instance over an SSH tunnel, and the tunnel itself works.
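For reference, a typical port-forwarding setup looks like this (the user, host, and ports are illustrative, not the reporter's actual values):

```shell
# Forward local port 9200 to Elasticsearch on the remote host
# ("user" and "my-remote-server" are hypothetical).
ssh -N -L 9200:localhost:9200 user@my-remote-server

# In another terminal, verify the tunnel end to end:
curl -s 'http://127.0.0.1:9200/'
```

If the curl returns the cluster-info JSON, the tunnel is forwarding correctly and the problem lies elsewhere.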

Attempting to "use the refresh icon to refresh the list of available fields" or "add the fields [...] to use as nodes" gives an error

Error executing query: The connection was closed abnormally, e.g., without sending or receiving a Close control frame

In my terminal the following messages are visible:

$ ./marija 
Marija server started, listening on address 127.0.0.1:8080.
2016/11/30 13:46:47 Connection upgraded.
2016/11/30 13:46:53 Connection closed
2016/11/30 13:46:56 Connection upgraded.
2016/11/30 13:47:04 Connection closed
2016/11/30 13:47:07 Connection upgraded.

Any suggestions on what to do next?

ckuethe commented 7 years ago
{
  "name" : "my-remote-server",
  "cluster_name" : "my-elk-stack",
  "cluster_uuid" : "KmBuokUjSnmsO7ZUYGygOA",
  "version" : {
    "number" : "5.0.2",
    "build_hash" : "f6b4951",
    "build_date" : "2016-11-24T10:07:18.101Z",
    "build_snapshot" : false,
    "lucene_version" : "6.2.1"
  },
  "tagline" : "You Know, for Search"
}
nl5887 commented 7 years ago

@ckuethe did you use the latest commit? Or a release version?

nl5887 commented 7 years ago

Just pushed a version that disables sniffing (sniffing resolves the IPs of the Elasticsearch nodes, which was probably causing your issue).
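One way to check whether sniffing is the culprit (assuming the tunnel's local end is 127.0.0.1:9200) is to ask the cluster which addresses it advertises; a sniffing client swaps the configured URL for these, so if they are the server's internal IPs they are unreachable through the tunnel:

```shell
# Show the HTTP addresses the cluster advertises for its nodes.
# A private publish_address here would explain the dropped
# connections when sniffing is enabled.
curl -s 'http://127.0.0.1:9200/_nodes/http?pretty' | grep publish_address
```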

ckuethe commented 7 years ago

Thanks for looking into this. Unfortunately, even with the latest commit I still see this behavior. I'll clear my browser storage and do some more testing in the morning.

nl5887 commented 7 years ago

I haven't tested it with ES 5 over a tunnel yet, but it works with older versions. We'll set some things up to see if we can reproduce it. Keep us posted; we're eager to solve this issue.

nl5887 commented 7 years ago

Something I noticed before: sometimes there are too many fields in the index (we'll look into this issue). Could you remove all but the relevant indexes and retry refreshing the fields?
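To narrow the test set without deleting data, the indices can be listed and the irrelevant ones temporarily closed (the endpoint assumes the local tunnel; "noisy-index" is a hypothetical name):

```shell
# List all indices with document counts and sizes.
curl -s 'http://127.0.0.1:9200/_cat/indices?v'

# Close an index instead of deleting it, so it disappears from
# queries but the data is preserved and can be reopened later.
curl -s -XPOST 'http://127.0.0.1:9200/noisy-index/_close'
```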

ckuethe commented 7 years ago

There are about 125 fields in each of the indices, as I'm exploring syslog with a bunch of different hosts, log types, and fields courtesy of filebeat and logstash.

nl5887 commented 7 years ago

Could you export the mapping so we can try to reproduce the environment?

ckuethe commented 7 years ago

https://gist.github.com/ckuethe/a0c0ce3033c8250eb68e2362ffb30a85

ckuethe commented 7 years ago

Oops - 142 fields.
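A number like that can be sanity-checked with a crude count over the exported mapping: every mapped leaf field declares a "type" key (this slightly overcounts if multi-fields are present). Shown here on a tiny sample; point it at the real exported file instead (file names hypothetical):

```shell
# Build a tiny sample mapping file (stand-in for the real export).
cat > sample_mapping.json <<'EOF'
{"properties":{"host":{"type":"keyword"},"message":{"type":"text"},"ts":{"type":"date"}}}
EOF

# Count occurrences of the "type" key, one per mapped leaf field.
grep -o '"type"' sample_mapping.json | wc -l   # 3 leaf fields in the sample
```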

(screenshot attached: screenshot_2016-12-01_12-33-06)

nl5887 commented 7 years ago

thx, we'll look into this issue