Is training in scope for AI carbon intensity? - https://www.rockpool.tech/posts/is-training-in-scope-for-ai-sci
MoM - Henry opens the meeting
Meeting Notes in brief
The GSF Standards Working Group meeting focused on extending the SCI to AI, addressing the challenges of measuring AI carbon emissions. Key points included the need for boundary definitions for AI, data challenges, and tooling extensions. Asim emphasized the importance of real-life computations in informing the spec. Gadhu highlighted the significance of measuring the entire AI life cycle. Chris Xie discussed the development of a software carbon labeling system to enhance transparency and align with policy mandates. The group agreed to design a workshop to draft the AI specification, with a tentative timeline of January 2025. The meeting also covered project updates on Real-Time Cloud and Software Carbon Labeling.
Minutes

Extending SCI to AI: Navveen's Update
Navveen discusses the need to focus on boundary definitions for AI, both classical and generative. He emphasizes the importance of functional-unit rates for AI applications. Navveen highlights the challenges in data measurement for AI, including the need for proxies such as token size. He outlines the tooling needed to extend the Impact Framework for AI and the importance of breaking the project into manageable parts.
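For context, the functional unit Navveen refers to is the denominator R in the published SCI formula, where E is the energy consumed by the software, I is the carbon intensity of that energy, and M is the embodied emissions of the hardware:

```latex
\mathrm{SCI} = \frac{(E \times I) + M}{R}
```

For AI workloads, R might be a unit such as a training run, an inference request, or a number of tokens generated; choosing R is part of the boundary question the group is discussing.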
Key challenges in measuring the entire AI life cycle include:
- Measuring the impact of the data collection and preprocessing stages for training AI models. The Green AI committee discussed the significant impact of these early stages in the AI life cycle.
- Measuring the impact of the training process, which can be computationally intensive and carry a large carbon footprint. There were discussions about whether training should be included in the SCI metric or whether a separate metric is needed.
- Measuring the impact of using managed AI services, where the underlying infrastructure and model details may not be transparent to the end user. The group discussed the need for a reference architecture and estimating proxy metrics in such cases.
To address these challenges, the group proposed:
- Involving AI experts in the workshop to define the boundaries and scope of what should be measured in the AI life cycle.
- Experimenting with real-life computations of AI emissions to inform the specification and uncover practical measurement challenges.
- Designing the workshop collaboratively between the Green AI committee and the Standards Working Group to ensure the right expertise is involved.
- Considering nesting the SCI for AI specification under the overall SCI standard to leverage existing frameworks and processes.
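The token-size proxy mentioned above can be illustrated with a minimal sketch. The per-token energy and grid-intensity constants are hypothetical placeholders chosen for illustration, not values discussed by the group:

```python
# Illustrative sketch: using token count as a proxy for the operational
# emissions of one inference request. Both constants are assumptions.
ENERGY_PER_TOKEN_KWH = 3e-6       # assumed energy per generated token (kWh)
GRID_INTENSITY_G_PER_KWH = 400.0  # assumed grid carbon intensity (gCO2e/kWh)

def estimate_inference_emissions(tokens: int) -> float:
    """Return estimated operational emissions (gCO2e) for one request."""
    energy_kwh = tokens * ENERGY_PER_TOKEN_KWH
    return energy_kwh * GRID_INTENSITY_G_PER_KWH
```

A real specification would need measured per-token energy figures and a location-aware intensity value; the point of the sketch is only that a token proxy reduces the measurement problem to two estimable factors.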
Discussion on Real-Time Cloud and Software Carbon Labeling
Asim suggests real attempts to compute AI emissions to inform the specification. Henry and Asim discuss the need for a workshop to design the specification and the involvement of AI experts. Gadhu shares insights from the Green AI committee on defining green AI and the importance of measuring the entire AI life cycle. Chris Xie and Asim discuss the challenges of measuring managed services and the need for a reference architecture.
Workshop Planning and Next Steps
Henry proposes designing a workshop to draft the AI specification and discusses the involvement of the Green AI committee. Asim suggests involving Russ from the Green AI committee in creating the workshop. The group agrees to dedicate the next Standards Working Group meeting on October 24 to workshop planning. Sean raises the need for a timeline and suggests nesting the AI specification under the SCI.
Project Updates: Real-Time Cloud and SCER
Pindy provides an update on the Real-Time Cloud project, including data collection and estimating missing metrics. Sean Mcilroy outlines the plan to test and adjust the RTC data set over six months. Chris Xie and Asim discuss the progress of the Software Carbon Labeling project, including the focus on transparency and business value. The group considers the need for a timestamp on the label to indicate when the score was calculated.
Key ways the SCER project can ensure it provides meaningful and transparent information to businesses:
- Focus on disclosure and evidence-based labeling: The group discussed shifting the project's focus to more information disclosure rather than just a comparative rating. The emphasis should be on providing the evidence and data behind the carbon footprint claims.
- Align the label with business needs: The group discussed designing the label to help businesses meet their carbon emission mandates and policy alignment goals. This would make the label more valuable and meaningful to the target audience.
- Incorporate a timestamp or validity period: The group recognized the need to include a timestamp or validity period on the label to indicate when the underlying data was calculated. This would help businesses understand how current and relevant the information is.
- Leverage the manifest file for transparency: Linking the label to a detailed manifest file that provides the evidence and methodology behind the carbon footprint calculation was seen as crucial for transparency.
- Consider a modular approach: The group suggested the possibility of creating a family of disclosure-focused labels (e.g., for SCI, ENERGY STAR, etc.) that follow a similar template. This could make the overall system more transparent and easier to understand.
- Involve stakeholders in the design: Continuing to engage with businesses, policymakers, and other stakeholders in the design of the label and disclosure system would help ensure it meets their needs.

The key is to focus on providing meaningful, evidence-based information that businesses can use to make informed decisions, rather than just a simplistic rating or comparison.
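A disclosure-focused label carrying a timestamp, a validity window, and a link to its manifest file, as proposed above, might look like the following sketch. The field names and values are illustrative assumptions, not a GSF-defined schema:

```python
# Hypothetical disclosure label record. All field names and values are
# illustrative; only the ideas (timestamp, validity, manifest link) come
# from the discussion above.
import json
from datetime import datetime, timezone

label = {
    "metric": "SCI",
    "score": 0.42,                       # illustrative gCO2e per functional unit
    "functional_unit": "per API call",   # illustrative choice of R
    "calculated_at": datetime(2024, 10, 10, tzinfo=timezone.utc).isoformat(),
    "valid_until": "2025-04-10",         # assumed six-month validity window
    "manifest_url": "https://example.org/labels/manifest.yaml",  # placeholder
}
print(json.dumps(label, indent=2))
```

Linking `manifest_url` to the evidence behind the score is what makes the label auditable rather than a bare rating, which matches the group's emphasis on disclosure over comparison.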
Final Discussions and Next Steps
Henry suggests considering organizational structure and renaming projects for clarity. Asim proposes renaming the Software Carbon Labeling project to reflect its purpose better. The group agrees to continue discussing the workshop and project updates in the next meeting. Henry wraps up the meeting, noting the need to address the Toss project in the next meeting.
Action Items
2024.10.10 Agenda/Minutes
Time 1600 (GMT) - See the time in your timezone
Antitrust Policy
Joint Development Foundation meetings may involve participation by industry competitors, and the Joint Development Foundation intends to conduct all of its activities in accordance with applicable antitrust and competition laws. It is, therefore, extremely important that attendees adhere to meeting agendas and be aware of and not participate in any activities that are prohibited under applicable US state, federal or foreign antitrust and competition laws.
If you have questions about these matters, please contact your company counsel or counsel to the Joint Development Foundation, DLA Piper.
Recordings
The WG agreed to record all meetings. This meeting recording will be available until the next scheduled meeting.
Roll Call
Please add 'Attended' to this issue during the meeting to denote attendance.
Any untracked attendees will be added by the GSF team below:
Agenda
Announcement
Diversifying and Developing the SCI Standard - Navveen and Henry
Project Review updates
Articles
For Review
Note: WG use case template submission - After submitting this issue, your use case will be submitted to the WG Agenda for discussion. Article submission - Once you submit this issue, it will be assigned to the GSF Editor for review.
Future meeting Agenda submissions
Next Meeting
Adjourn
Standing Agenda / Future Agenda submissions