thegreenwebfoundation / co2.js

An npm module for accessing the green web API, and estimating the carbon emissions from using digital services

refactor(hosting): remove https package for fetch #72

Closed drydenwilliams closed 2 years ago

drydenwilliams commented 2 years ago

Removing the https package to use cross-fetch for both node and the browser
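For context, the change boils down to something like this sketch (illustrative only - the exact call sites in the PR may differ), using cross-fetch's shared fetch API in place of the node-only https module:

const fetch = require("cross-fetch");

// the same fetch call now works in both node and the browser,
// so the node-specific https branch can go away
async function checkAgainstAPI(domain) {
  const res = await fetch(
    `https://api.thegreenwebfoundation.org/greencheck/${domain}`
  );
  const body = await res.json();
  return body.green;
}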

drydenwilliams commented 2 years ago

@mrchrisadams just a little PR for you here 👍

mrchrisadams commented 2 years ago

hi thanks for this Dryden!

I see the appeal of using cross-fetch, but we went to some effort before to make co2.js have as few dependencies as possible, and cross-fetch depends on a few modules that have their own sets of dependencies we'd need to remember to audit.

I agree with the idea of not including node specific code, but would you consider doing it this way instead?

If we did, I think we'd still achieve the cross platform compatibility that the PR seems to be aiming for, without introducing any new dependencies.

If we keep the number of dependencies down, then it means we're not asking people to trust so much code that we don't have control over - particularly on the server side.

Add a small function to let us check for nodejs at runtime

Something like this might work - do please investigate if there's a nicer way that doesn't introduce new dependencies

/**
 * check if we're running inside a node environment by looking for the
 * existence of a process object, with accompanying properties
 * @return {Boolean}
 */
function isNodejs() {
  return typeof process === 'object' &&
    typeof process.versions === 'object' &&
    typeof process.versions.node !== 'undefined';
}

Change the getBody function to either use the existing node code, or otherwise fall back to native fetch, returning a string ready to parse as JSON

If we keep the function, but tweak it slightly to take into account which runtime it's being run inside, we don't need to throw it out and replace it with a whole set of new modules brought in by cross-fetch.


/**
 * Accept a url and perform an http request returning the body
 * for parsing as JSON
 *
 * @param {string} url
 * @return {Promise<string>}
 */
async function getBody(url) {
  // check what kind of runtime we have.
  // use the native node https libraries, otherwise fallback to fetch
  if (isNodejs()) {
    // Do the usual node thing, wrapping it in a Promise,
    // that resolves to a string of text that we can JSON.parse()

    return new Promise(function (resolve, reject) {
      // Do async job
      const req = https.get(url, function (res) {
        if (res.statusCode < 200 || res.statusCode >= 300) {
          log(
            "Could not get info from the Green Web Foundation API, %s for %s",
            res.statusCode,
            url
          );
          return reject(new Error(`Status Code: ${res.statusCode}`));
        }
        const data = [];

        res.on("data", (chunk) => {
          data.push(chunk);
        });

        res.on("end", () => resolve(Buffer.concat(data).toString()));
      });
      req.end();
    });
  }
  // otherwise we're in a webby environment like a browser or Deno.
  // These have native fetch - so use that instead, returning the
  // same Promise that resolves to a text string we can JSON.parse()
  else {
    const res = await fetch(url);
    return res.text();
  }
}

Adjust the checkAgainstAPI and checkDomainsAgainstAPI functions to JSON.parse the results of the getBody function

Finally I see why you'd use the convenient response.json() provided by fetch - it is a nice API.

Would you mind if we kept the JSON parsing to use the global JSON object? It's pretty much the same length.

This way we can keep using the getBody function, and avoid relying on external dependencies that we can't control for our polyfill.

async function checkAgainstAPI(domain) {
  const checkRequest = await getBody(`https://api.thegreenwebfoundation.org/greencheck/${domain}`)
  const res = JSON.parse(checkRequest)
  return res.green;
}
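checkDomainsAgainstAPI would follow the same pattern. A rough sketch only - the batch endpoint and return shape here are placeholders for whatever the current implementation already does:

async function checkDomainsAgainstAPI(domains) {
  // the URL below is illustrative - keep whichever batch greencheck
  // endpoint the current implementation already calls
  const checkRequest = await getBody(
    `https://api.thegreenwebfoundation.org/v2/greencheckmulti/${JSON.stringify(
      domains
    )}`
  );
  return JSON.parse(checkRequest);
}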

I'd be open to adding cross-fetch later if we end up using more of the features it offers, but in this case, all we seem to be doing is parsing text as JSON after making a GET request.

This seems like something we already have working code for: node can use its existing standard library, and for other platforms we can rely on native fetch, as your PR suggests.

Let me know what you think 👍🏽

mrchrisadams commented 2 years ago

If it helps, this stackoverflow post gives some more background on detecting nodejs versus other runtimes and platforms

https://stackoverflow.com/questions/17575790/environment-detection-node-js-or-browser

And this zero dependency npm module provides an expanded API for detecting other runtimes, if we need to resort to including another dependency. https://github.com/flexdinesh/browser-or-node

But before we do that, the specific node detecting code is here - it's literally 4 short lines long: https://github.com/flexdinesh/browser-or-node/blob/master/src/index.js#L4-L7
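For reference, those lines amount to roughly the same check as the isNodejs function above - something along these lines:

export const isNode =
  typeof process !== "undefined" &&
  process.versions != null &&
  process.versions.node != null;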

drydenwilliams commented 2 years ago

Closing to keep the current implementation.