This repo is the app_frame/jekyll/pym version of the eGallon tool originally built in 2013. The original repo is found here.
IMPORTANT: When editing the gh-pages branch of this repository, you will be editing the content that lives at energy.gov/egallon.
Read the blogs if you're curious about how this came into being. The CMS map itself is located here. Read about it here; the methodology can be found here.
More articles:
The most important part of this project is knowing where the data comes from and how it is collated. All of the data is from EIA. The electricity data is updated monthly and can be found in tabular form here. The gasoline data is updated weekly but is not present for every state, so some states are grouped by region; it can be found here. Both sets of data are gathered from the EIA API by a script on energy.gov's servers (electricity price, gas price). The script runs weekly, and the resulting data is served at https://energy.gov/api/egallon/current/combined.json.
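The exact schema of combined.json is not documented here, so as an illustration only, here is a small lookup helper over a hypothetical shape for its entries (the field names `state`, `electricityPrice`, and `gasPrice` are assumptions; check the live file before relying on them):

```javascript
// Hypothetical shape for entries in combined.json -- the real field
// names may differ; inspect the live API response before using these.
const sample = [
  { state: "California", electricityPrice: 0.19, gasPrice: 4.85 },
  { state: "Texas",      electricityPrice: 0.12, gasPrice: 3.10 }
];

// Look up one state's record by name (case-insensitive); null if absent.
function findState(data, name) {
  return data.find(d => d.state.toLowerCase() === name.toLowerCase()) || null;
}

console.log(findState(sample, "texas"));
```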
A weekly report of this data is emailed to Atiq Warriach and Ernest Ambrose to ensure that it is working properly. If they do not receive it, something is wrong.
NOTE: For the time being, the JavaScript at js/script.js references a local copy of the data at js/combined.json rather than https://energy.gov/api/egallon/current/combined.json. This is because energy.gov/api needs to allow energyapps.github.io as an allowable origin for cross-domain serving. We are working to resolve this through a CORS module addition to the energy.gov Drupal platform; the ticket is in Chauncey's hands.
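Once that CORS module is in place, responses from energy.gov/api would need to include an `Access-Control-Allow-Origin` header covering the GitHub Pages origin, roughly as below (the exact configuration depends on the Drupal module used, so treat this as a sketch):

```http
Access-Control-Allow-Origin: https://energyapps.github.io
```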
To provide weekly data updates to eGallon, follow these steps:

1. Replace the `js/combined.json` file with the latest combined.json file (linked above).

When CORS is enabled for energyapps.github.io on energy.gov/api, do the following: in `js/script.js`, change

```javascript
$.getJSON("js/combined.json",function(result){
```

to

```javascript
$.getJSON("https://energy.gov/api/egallon/current/combined.json",function(result){
```
The eGallon data is updated automatically and served at https://energy.gov/api/egallon/current/combined.json. This is achieved through a Jenkins job that scrapes the above data sources once a week and then adds the result to our server. Though the job is hosted privately, it is similar in setup to this utilities scraper. Questions about the location and functionality of the scraper should be addressed to Chauncey.
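The scraper itself is not public. Purely as a sketch of the collation step (every name and field here is an assumption, not the real script), a weekly job might join the two EIA feeds by state into one combined record per state like this:

```javascript
// Hypothetical input shapes -- the real EIA API responses differ.
const electricity = [{ state: "Texas", price: 0.12 }];
const gasoline    = [{ state: "Texas", price: 3.10 }];

// Join the two feeds on state name into one combined record per state.
function combine(elec, gas) {
  const gasByState = new Map(gas.map(g => [g.state, g.price]));
  return elec.map(e => ({
    state: e.state,
    electricityPrice: e.price,
    // Some states only have regional gasoline data, so a match may be missing.
    gasPrice: gasByState.get(e.state) ?? null
  }));
}

console.log(JSON.stringify(combine(electricity, gasoline)));
```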
The CSS is heavily modified, but one essential part is based on this countdown clock.