This change adds the ability to upload some Sightglass measurement data
to an ElasticSearch server. To do so, the parts of the data that are
environment-specific (machine, engine, benchmark) are "fingerprinted,"
or given a unique ID to differentiate them from other data points. The
"fingerprinting" also serves to unify data points more safely. E.g., if
the same benchmark is run on two different machines with the same
engine, the database will contain two machine entries and a single entry
each for the engine and benchmark; the measurement entries will
reference the de-duplicated ID.
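As a rough sketch of the idea (the types and fields below are
illustrative, not the actual Sightglass structures), a fingerprint is
essentially an ID derived by hashing the environment-specific fields:

    use std::collections::hash_map::DefaultHasher;
    use std::hash::{Hash, Hasher};

    // Illustrative only: the real engine/machine/benchmark records
    // carry more fields than this.
    #[derive(Hash)]
    struct Engine {
        name: String,
        version: String,
    }

    // Hash the environment-specific fields into an ID so that
    // identical engines collapse to a single database entry.
    fn fingerprint(engine: &Engine) -> String {
        let mut hasher = DefaultHasher::new();
        engine.hash(&mut hasher);
        format!("{:016x}", hasher.finish())
    }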
For example, to upload some measurements to the (default) localhost server:
$ sightglass-cli upload -f measurements.json
The mechanism for uploading the various data points to an HTTP endpoint
is specific to ElasticSearch, in anticipation of a future commit that defines
how to run such a database. With some work, this upload functionality
could be made more generic or adapted to other database types.
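In rough terms (this is a sketch, not the actual implementation, and
the host and index names are placeholders), the ElasticSearch-specific
part boils down to POSTing each JSON document to an index endpoint:

    use reqwest::blocking::Client;

    // Sketch: send one document to a (placeholder) local ElasticSearch
    // index via its document-indexing endpoint.
    fn upload(doc: &serde_json::Value) -> Result<(), Box<dyn std::error::Error>> {
        let client = Client::new();
        client
            .post("http://localhost:9200/measurements/_doc")
            .json(doc)
            .send()?
            .error_for_status()?;
        Ok(())
    }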
This commit also adds the ability to upload measurement data at a
later time. This is helpful if the ElasticSearch endpoint is not
available from wherever the measurements are collected. In this scenario
(which I have often been in), the fingerprinted, timestamped
measurements can be "packaged up" into a JSON file using the --dry-run
flag and moved to where they can subsequently be uploaded. For example:
$ sightglass-cli upload --dry-run -f measurements.json > package.json
[move package.json to some other place]
$ sightglass-cli upload --from-package package.json
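Conceptually, the deferred flow is just serialization plus the same
upload step; something like the following hypothetical sketch (the file
layout and names are assumptions, not the actual package format):

    use std::fs::File;
    use std::io::BufReader;

    // Sketch: read previously packaged documents back from disk and
    // push them with the upload routine sketched earlier.
    fn upload_package(path: &str) -> Result<(), Box<dyn std::error::Error>> {
        let reader = BufReader::new(File::open(path)?);
        let docs: Vec<serde_json::Value> = serde_json::from_reader(reader)?;
        for doc in &docs {
            upload(doc)?;
        }
        Ok(())
    }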