TiNico22 closed this issue 9 years ago.
Are you trying to share with someone who has access to your elasticsearch cluster? If so, that's as easy as sending your url.
There's also this little button:
That said, I personally don't think sharing a visualization from one cluster to another would be too useful, since the other person will have different data, indices, fields, etc.
It might be useful for tutorials, or blog posts, but even then the author could provide just the hash part of the url.
Thoughts?
Cross-cluster sharing may not work, but sharing the URL with people who have access to the same cluster/index is handy!
I was talking about the export schema feature from Kibana 3, and likewise the load feature from Kibana 3. The same applies to visualizations and dashboards.
We share dashboard schemas with other people without having network access to each other's ELK servers. This helps, as with blog post and tutorial samples, to start a new deployment from a friend's sample without having to rewrite everything.
As a workaround you can go to Settings -> Objects -> Visualizations, copy all the fields from there, and paste them into another saved object (create a new one and save it), even on another ELK server.
@robozu thanks for the advice. It's not as easy as the export schema from K3, though it also works for dashboards. With K3, exporting a dashboard was the equivalent of exporting a dashboard plus all of its related visualizations in K4. The major issue, if I have a lot of visualizations, is exporting them one by one. I would rather have an "export all my visualizations" feature.
Also, if I have multiple environments like Dev, Test and Production with separate ELK infrastructure, this feature would be nice so we can migrate visualizations between environments. Unless there is another way to accomplish the same task.
+1 for the dashboard export feature - really handy for being able to move between dev, test and prod
+1, for the same reason as mentioned by @abh1nav
+1
+1
related issues: #2100 and #2695
Copying comment from related ticket: URLs are a great way to share but can get messy. Some users are used to manipulating dashboards through JSON, which makes it easy to modify parameters cleanly. Some info here: http://blog.trifork.com/2014/05/20/advanced-kibana-dashboard/ Additionally, it can be useful to share searches and visualizations. Thanks!
+1
We really need the ability to import and export dashboards to a gist. We have an open source project that uses ES, and it's really nice for us to be able to send customers who host locally a nice-looking dashboard.
+1 to help with dev/test/prod synchronization
:+1: @spenceralger: re:
"I personally don't think sharing a visualization from one cluster to another would be too useful, since the other person will have different data, indices, fields, etc."
For our use case, we have several Elasticsearch instances that share the same indices and fields, but different data. So while the data changes, the format won't. This would be super helpful.
Thanks all
:+1:
FYI: We were able to work around this by using Elasticdump -- simply pull the .kibana mapping and data from one instance and shove it into another :)
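For anyone who wants to script the same trick without Elasticdump, here is a minimal Python sketch of the idea: pull every document out of one cluster's `.kibana` index and bulk-index it into another. The host URLs are placeholders, and the sketch naively assumes fewer than 1000 saved objects (no scroll API):

```python
import json

# Placeholder hosts -- adjust for your environment.
SOURCE = "http://localhost:9200"
TARGET = "http://other-host:9200"

def hits_to_bulk(hits):
    """Convert search hits from the .kibana index into an NDJSON bulk body."""
    lines = []
    for hit in hits:
        action = {"index": {"_index": ".kibana",
                            "_type": hit["_type"],
                            "_id": hit["_id"]}}
        lines.append(json.dumps(action))
        lines.append(json.dumps(hit["_source"]))
    return "\n".join(lines) + "\n"

def copy_kibana_index():
    # Requires the third-party `requests` package and network access.
    import requests
    resp = requests.get(SOURCE + "/.kibana/_search", params={"size": 1000})
    hits = resp.json()["hits"]["hits"]
    requests.post(TARGET + "/_bulk", data=hits_to_bulk(hits),
                  headers={"Content-Type": "application/x-ndjson"})

if __name__ == "__main__":
    copy_kibana_index()
```

This is the same shape of operation Elasticdump performs; the bulk body alternates an action line and a source line for each saved object.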
I really liked the Kibana 3 ability to have a set of "static" dashboards stored in JSON files on disk and managed via config management, rather than everyone being able to change everything inside the Elasticsearch index.
+1
+1
:+1: for backup/restore and environment migration via export/import
+1 for the export feature - this is useful for migrations across clusters that share the same data.
You may find this useful for backup/restore/deploy in kibana4. It's still experimental but is working well in our tests: https://github.com/godaddy/kibana4-backup. Feedback is appreciated!
+1
Had a discussion with @rashidkpc today about exporting a dashboard and all of its associated objects. Adding a comment here at his suggestion.
I think what would be nice would be a way in the UI to export a dashboard's schema, along with all of the related objects' schemas, for transport between environments. This would let you develop visualizations, dashboards and such, and then promote them through to production (and also track them in git or similar). We also discussed exposing this as an API call that would return the schemas.
Along with this would need to be a way to import this dashboard definition.
++1
+1 this is a bit of a problem for us going through a dev-test-staging CI process where all the indexes and data are the same.
In fact, +2
+1
+1 for export and +1 for import
+1
I've had success copying my Kibana definitions from one environment to another by simply backing up and then restoring the .kibana index from one machine to another using the documented Elasticsearch method.
Would be interested to know if this is the 'correct' way of doing it; it certainly worked for me, but YMMV.
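Assuming the documented method above refers to Elasticsearch's snapshot/restore API, the sequence of calls can be sketched as data in Python. The repository name, snapshot name, and filesystem location are placeholders:

```python
def snapshot_requests(repo="kibana_backup", snap="snap_1",
                      location="/mnt/backups/kibana"):
    """Build (method, path, body) triples for snapshotting only .kibana.

    Repo/snapshot names and the shared-filesystem location are placeholders;
    the location must be registered in `path.repo` on every node.
    """
    return [
        # 1. Register a shared-filesystem snapshot repository.
        ("PUT", "/_snapshot/%s" % repo,
         {"type": "fs", "settings": {"location": location}}),
        # 2. Snapshot only the .kibana index.
        ("PUT", "/_snapshot/%s/%s?wait_for_completion=true" % (repo, snap),
         {"indices": ".kibana"}),
        # 3. Restore on the target cluster (close or delete .kibana first).
        ("POST", "/_snapshot/%s/%s/_restore" % (repo, snap),
         {"indices": ".kibana"}),
    ]
```

Each triple can be sent with curl or any HTTP client against the source cluster (steps 1 and 2) and the target cluster (steps 1 and 3).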
@rmoff that is the best way to move an entire installation from one cluster to another, but it won't allow you to export a subset of your objects and import them into an existing install without removing all of the objects that already exist there. @lukasolson is working on a solution that will do just that for 4.1
+1
+1, feature is absolutely necessary
+1
+1
FWIW, for those who are dumping the .kibana index (without ES snapshot/restore), you only want 3 _types: dashboard, visualization, and search. You may want to manually edit the dump and remove the _types index-pattern and config. That'll make a restore successful even if you upgrade Kibana (the config is version-dependent), and it'll keep you from importing possibly invalid mapping caches (stored in the index-pattern).
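That pruning step is easy to script rather than edit by hand. A minimal sketch, assuming a newline-delimited dump with one JSON document per line, each carrying a `_type` field (the format elasticdump-style tools produce):

```python
import json

# Only these saved-object types survive an upgrade cleanly.
KEEP = {"dashboard", "visualization", "search"}

def prune_dump(lines):
    """Drop config and index-pattern docs from a .kibana dump.

    Assumes one JSON document per line with a top-level `_type` field.
    Returns the surviving lines unchanged, ready to be restored.
    """
    kept = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        doc = json.loads(line)
        if doc.get("_type") in KEEP:
            kept.append(line)
    return kept

if __name__ == "__main__":
    import sys
    with open(sys.argv[1]) as f:
        print("\n".join(prune_dump(f)))
```

Run it over the dump file before restoring into the new cluster.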
:+1: * 1000
+1
+1
+1
+1 for distributed setups that parse the same type of data instead of using a shared/central setup.
I had a question similar to this issue:
I need to generate 100+ search objects, each followed by a line chart, and then pull the line charts into a dashboard. The 100+ search objects are almost identical except for the "search keyword". So what I had to do was manually key in each keyword, save it as a search, and give it a name. Then, from Visualize, I loaded the saved search, made it a line chart, and saved it again. I did this 100+ times.
I think a more efficient way would be to get the JSON of one search and visualization, use a script to replace the "search keyword", and then put it back, maybe using curl or elasticsearch.py. But is it possible to do that in Kibana 4 without an export function? Thanks
@t0mst0ne absolutely. All of kibana's saved objects are just elasticsearch documents, and while they aren't always super easy to work with (since they have embedded json and such) you can absolutely modify the objects outside of kibana.
After creating the first search source, take a look in the Kibana index and find the saved object that was created (http://localhost:9200/.kibana/_search). Then write a script that indexes documents just like it, but with modified kibanaSavedObjectMeta.searchSourceJSON values. The same is true for visualizations and dashboards. They are all just Elasticsearch documents, but it might take a bit of reverse engineering.
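To illustrate the scripted approach, here is a hedged Python sketch that builds one saved-search document per keyword and indexes it into `.kibana`. The title scheme, index pattern, and columns are assumptions for this sketch; the key detail from the comment above is that `searchSourceJSON` is a JSON *string* embedded inside the document:

```python
import json

def build_saved_search(keyword):
    """Build a Kibana 4-style saved-search document for one keyword.

    Field layout follows the saved-object structure described above;
    the "logstash-*" index pattern and title scheme are assumptions.
    """
    search_source = {
        "index": "logstash-*",
        "query": {"query_string": {"query": keyword}},
        "filter": [],
    }
    return {
        "title": "search-" + keyword,
        "columns": ["_source"],
        "sort": ["@timestamp", "desc"],
        "kibanaSavedObjectMeta": {
            # Note the embedded JSON: the search source is stored as a string.
            "searchSourceJSON": json.dumps(search_source),
        },
    }

def index_all(keywords, host="http://localhost:9200"):
    # Requires the third-party `requests` package and a running cluster.
    import requests
    for kw in keywords:
        requests.put("%s/.kibana/search/search-%s" % (host, kw),
                     json=build_saved_search(kw))
```

With 100+ keywords in a list, `index_all` replaces all of the manual save-and-rename steps; the line-chart visualizations can be cloned the same way by rewriting their `savedSearchId`.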
+1
+1000 :)
:+1:
+1
+1
As visualization configuration can be more complex in Kibana 4 than in 3, and visualizations are part of dashboards, an import/export feature would be great for sharing visualizations with other people.
Update: request applies to dashboard as well, see #1551