openintegrity / openintegrity-metrics

Discussing, designing and building the next steps for the open integrity index.

Collect and make available metrics data #2

Open jmatsushita opened 9 years ago

jmatsushita commented 9 years ago

What is the minimum viable data structure — the simplest useful first step for collecting and publishing metrics data? Versioned JSON files on GitHub? A NoSQL database? Open scrapers on https://morph.io? CKAN?
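
As a rough sketch of the "versioned JSON files on GitHub" option — the record fields, file layout, and URLs below are purely hypothetical, since the schema is exactly what this issue is meant to decide:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical shape for one metric observation; none of these field
# names are agreed on yet.
record = {
    "project": "example-project",
    "metric": "contributors",
    "value": 42,
    "collected_at": datetime.now(timezone.utc).isoformat(),
    "source": "https://morph.io/some/scraper",  # illustrative URL
}

# One file per project, committed to a git repo so commit history
# doubles as the version log.
out = Path("metrics") / "example-project.json"
out.parent.mkdir(exist_ok=True)
out.write_text(json.dumps(record, indent=2) + "\n")
```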

jmatsushita commented 9 years ago

Also http://dat-data.com/ ?

andrew commented 9 years ago

I was pondering this over the weekend. Elasticsearch seems like a good fit for the kind of data modelling planned: it has a very flexible schema, can easily grow to handle more data, offers replication options, and has some very powerful "Aggregations": https://www.elastic.co/guide/en/elasticsearch/reference/current/search-aggregations.html
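
For a concrete sense of those aggregations, here is a minimal sketch against the REST API — the `metrics` index and the `metric`/`value` fields are made up for illustration:

```python
import requests

# Terms aggregation: count documents per metric name, with an average
# value per bucket. size: 0 skips the raw hits and returns only the
# aggregation results.
query = {
    "size": 0,
    "aggs": {
        "per_metric": {
            "terms": {"field": "metric"},
            "aggs": {"avg_value": {"avg": {"field": "value"}}},
        }
    },
}

resp = requests.post("http://localhost:9200/metrics/_search", json=query)
for bucket in resp.json()["aggregations"]["per_metric"]["buckets"]:
    print(bucket["key"], bucket["doc_count"], bucket["avg_value"]["value"])
```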

Also, since v1.7 you can export whole snapshots of the data for sharing publicly: https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-snapshots.html
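
The snapshot flow is roughly: register a repository, then cut a snapshot into it. A sketch using the filesystem repository type (repository name and path are placeholders; the location must be whitelisted via `path.repo` in elasticsearch.yml):

```python
import requests

ES = "http://localhost:9200"

# Register a shared-filesystem snapshot repository.
requests.put(f"{ES}/_snapshot/public_dumps", json={
    "type": "fs",
    "settings": {"location": "/mnt/backups/public_dumps"},
})

# Take a snapshot of all indices and wait for it to finish before
# publishing the directory contents.
requests.put(
    f"{ES}/_snapshot/public_dumps/snapshot_1",
    params={"wait_for_completion": "true"},
)
```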

jmatsushita commented 9 years ago

Elasticsearch as the main store? It would probably be the smoothest way to get started. Have you seen this about resilience?

I'm quite curious how well the MySQL/MongoDB combo is working in practice for @gousiosg with GHTorrent.

Another option is to lean on managed infrastructure like BigQuery, Redshift, or Cloud Dataflow.

gousiosg commented 9 years ago

The MySQL and MongoDB combo is working quite well; scaling is only just starting to become an issue. The real problem is keeping the two stores consistent with each other.
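
To make that consistency problem concrete, the simplest symptom is drift between the two stores. A naive cross-store check might look like the sketch below — connection details, database names, and the `commits` collection/table are all assumptions for illustration, not GHTorrent's actual tooling:

```python
import pymongo
import mysql.connector

# Raw events land in one store (MongoDB here), a relational extract
# lives in the other (MySQL). Diverging counts are the cheapest way
# to detect that the two have fallen out of sync.
mongo = pymongo.MongoClient("mongodb://localhost:27017")
mongo_count = mongo["ghtorrent"]["commits"].count_documents({})

db = mysql.connector.connect(user="ghtorrent", database="ghtorrent")
cur = db.cursor()
cur.execute("SELECT COUNT(*) FROM commits")
(mysql_count,) = cur.fetchone()

print(f"mongo={mongo_count} mysql={mysql_count} "
      f"drift={mongo_count - mysql_count}")
```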