Green-Software-Foundation / projects

The Green Software Foundation's Projects repository
Creative Commons Attribution 4.0 International

[Project] Software Carbon Efficiency Rating (SCER) #44

Closed mvaltas closed 1 year ago

mvaltas commented 1 year ago

Working Title: Software Carbon Efficiency Rating (SCER)

Related issues or discussions:

Tagline: A consumer-friendly rating for software carbon intensity

Abstract:

This project aims to develop a standard for a benchmark platform and test metrics for evaluating the carbon efficiency of software. The initiative will provide comparable scores for software with the same functionality, informing procurement decisions and potentially shaping regulations similar to ENERGY STAR or EPA Fuel Economy Ratings.

Quote:

"The Software Carbon Efficiency Rating standard is a game-changer. It guides us towards more sustainable software choices, aligning our digital transformation with our commitment to environmental stewardship. Now, we can make procurement decisions that not only enhance operational efficiency but also help reduce our carbon footprint." - Alex Smith, CTO of NexTech Innovations

"As a consumer concerned about my carbon footprint, the Software Carbon Efficiency Rating (SCER) is an invaluable tool. It allows me to make informed choices about the digital products I use daily, knowing their environmental impact. SCER empowers us all to contribute to a sustainable future." - Emily Jones, Innovations User

Audience:

This standard is targeted toward individuals who use software either as a standalone product or as a service. By providing a rating system for carbon efficiency, consumers will be able to make informed decisions when choosing between similar offerings, such as music streaming services. Large organizations' procurement departments will also benefit from this rating system when making decisions about service contracts or software purchases. Additionally, software companies can use this rating system as a marketing tool to compete based on the efficiency of their offerings.

Governance: Which working group(s) do you think should govern this project?

Problem:

  1. Currently, there are no widely recognized standards for measuring the carbon efficiency of software. As a result, consumers are left in the dark when it comes to choosing software that is environmentally friendly.
  2. Many companies lack sufficient information to make decisions regarding large-scale contracts for software services, especially when it comes to ensuring that these contracts align with the company's goals for reducing carbon emissions.
  3. To ensure comparability, certain parameters must be controlled when implementing our Software Carbon Intensity (SCI) standard; without such controls, companies cannot credibly advertise their SCI scores as a competitive differentiator.

There are no broadly accepted standards that implement a consumer-friendly rating for software carbon intensity. In Germany, the Blue Angel (DE-UZ 215, Resources and Energy-Efficient Software Products) is the closest existing proposal, but it lacks adoption and does not address software other than desktop applications with graphical user interfaces.

Possible strategies and approaches have been proposed in academic research, for example in "Sustainable software products - Towards assessment criteria for resource and energy efficiency". This indicates growing interest in software ratings that address sustainability.

Our proposed Software Carbon Efficiency Rating (SCER) presents a solution by establishing a comprehensive, globally applicable standard. SCER focuses on software's core functionality across various platforms, not just desktops. It enables comparability, aiding consumers and corporations alike in making environmentally conscious decisions. By creating a controlled testing environment, SCER ensures fair competition among software providers and promotes transparency and accountability in the software industry. Ultimately, SCER is the next step in harmonizing digital innovation with environmental sustainability.

Solution:

Our solution builds on Aveva's research, which was presented in the Standards WG. Aveva developed a test platform that uses standard hardware to assess the energy consumption of different software configurations. We believe we can establish specific parameters, to a reasonable degree, for software of a particular category (e.g., databases) so that products can be compared on this platform.
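
To illustrate the kind of controlled test run such a platform implies, here is a minimal sketch of a benchmark harness. Everything here is hypothetical rather than Aveva's actual platform: it assumes a `read_joules()` callable backed by an external energy meter, and a category-specific `workload` callable supplied by the standard.

```python
import time
from dataclasses import dataclass
from typing import Callable

@dataclass
class RunResult:
    category: str          # software category, e.g. "database"
    workload_name: str     # name of the standardized workload
    energy_joules: float   # energy consumed during the run
    duration_s: float      # wall-clock duration of the run

def run_standard_workload(category: str,
                          workload_name: str,
                          workload: Callable[[], None],
                          read_joules: Callable[[], float]) -> RunResult:
    """Run one standardized workload on fixed reference hardware and
    record the energy it consumed (the meter interface is an assumption)."""
    energy_before = read_joules()
    started = time.monotonic()
    workload()  # the category-specific standard workload
    duration_s = time.monotonic() - started
    energy_joules = read_joules() - energy_before
    return RunResult(category, workload_name, energy_joules, duration_s)
```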

We'll define standard workloads for different categories of software and measure them accordingly. This involves categorizing software and setting a standard functional unit, in the sense of the SCI standard, for each category. For instance, for a music streaming service, the functional unit could be "per non-cached minute of streamed music". The challenge, however, is to develop guidelines that help us make informed decisions without being overwhelmed by the complexity of modern software architecture.
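
To make the functional-unit idea concrete, here is a minimal sketch that turns such a measurement into a score per functional unit, using the formula from the GSF SCI specification, SCI = ((E * I) + M) per R. The streaming numbers are invented placeholders for illustration.

```python
def sci_per_functional_unit(energy_kwh: float,
                            grid_intensity_gco2_per_kwh: float,
                            embodied_gco2: float,
                            functional_units: float) -> float:
    """SCI = ((E * I) + M) / R, following the GSF SCI specification.

    E: energy consumed by the software (kWh)
    I: carbon intensity of the grid (gCO2eq/kWh)
    M: embodied emissions amortized to this run (gCO2eq)
    R: number of functional units served by the run
    """
    operational = energy_kwh * grid_intensity_gco2_per_kwh
    return (operational + embodied_gco2) / functional_units

# Hypothetical example: a test run of a music streaming service that
# served 10,000 non-cached minutes of streamed music.
score = sci_per_functional_unit(
    energy_kwh=1.2,                     # measured on the test platform
    grid_intensity_gco2_per_kwh=442.0,  # fixed reference grid intensity
    embodied_gco2=35.0,                 # hardware share attributed to the run
    functional_units=10_000.0,          # non-cached minutes streamed
)
print(f"{score:.4f} gCO2eq per non-cached minute")
```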

Closure:

The project's success can be measured by how widely the standard is accepted among consumers and technology companies. To drive consumer adoption, third-party entities can use our guidelines to evaluate software and publish scores; such a third party could even build a business around offering these evaluation services, which would itself be a measure of success. Likewise, if technology companies cite their scores in marketing campaigns, that would also count as success for the project.
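
As a sketch of how a third-party evaluator might translate raw scores into a consumer-friendly label in the spirit of ENERGY STAR or EU energy labels, consider a simple banding scheme. The thresholds below are hypothetical placeholders, not part of any SCER proposal; real bands would be set per software category.

```python
# Hypothetical banding: a lower SCI score within a category earns a
# better grade. Thresholds are placeholders in gCO2eq per functional unit.
BANDS = [
    (0.001, "A"),
    (0.005, "B"),
    (0.020, "C"),
    (0.050, "D"),
]

def scer_grade(sci_score: float) -> str:
    """Map a per-functional-unit SCI score onto a letter grade."""
    for threshold, grade in BANDS:
        if sci_score <= threshold:
            return grade
    return "E"

print(scer_grade(0.0037))  # -> "B"
```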

Henry-WattTime commented 1 year ago

WG discussion notes:

- Adrian: seems valuable to have. The TPC benchmark includes energy, but it is a moribund part of the standard - optional and not actively used. Are we interested in getting them to resuscitate it? It is a good measure of energy methodology, but it measures maximum performance. https://www.tpc.org/tpc_energy/default5.asp, http://www.tpc.org/tpc_energy/presentations/tpc-energy-2.pdf
- Henry: maybe the first step is to create a report/test for a few applications using this approach, to see how viable the process is and to test reception.
- Navveen: spec -> reporting -> benchmark? as the process.

WG approves project.

Start may be delayed due to leave.

jawache commented 1 year ago

@tmcclell / @seanmcilroy29 - to be clear, is this issue closed or open? Has it been approved?

seanmcilroy29 commented 1 year ago

@jawache - The Standards WG has approved this, and it has now been submitted to the OC for review and approval

jawache commented 1 year ago

Thanks, I'll add an agenda item to the next meeting to clarify the process here. The intention wasn't to add months to the startup time of an incubation project or to take decision making out of the hands of the WG; it was more to give the OC a chance to object than to approve. We should also make the objection step async, since the OC meets infrequently.

seanmcilroy29 commented 1 year ago

Project Approved