owid / etl

A compute graph for loading and transforming OWID's data
https://docs.owid.io/projects/etl
MIT License

typo in epoch dataset #2865

Closed · veronikasamborska1994 closed this issue 1 week ago

owidbot commented 1 week ago
Quick links (staging server): Site · Admin · Wizard

Login: `ssh owid@staging-site-ai-fix`

chart-diff: ✅ No charts for review.
data-diff: ❌ Found differences

```diff
~ Dataset garden/artificial_intelligence/2024-06-19/epoch_compute_intensive
  - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
  ? ^^
  + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
  ? ^^
  ~ Table epoch_compute_intensive (changed metadata)
    - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
    ? ^^
    + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
    ? ^^
    ~ Column domain (changed metadata)
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
    ~ Column organization_categorization (changed metadata)
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
    ~ Column parameters (changed metadata)
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
    ~ Column publication_date (changed metadata)
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
    ~ Column training_computation_petaflop (changed metadata)
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
    ~ Column training_dataset_size__datapoints (changed metadata)
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
= Dataset garden/artificial_intelligence/2024-06-19/epoch_compute_intensive_countries
  = Table epoch_compute_intensive_countries
    ~ Column cumulative_count (changed metadata)
      - - Refers to the location of the primary organization with which the authors of a large-scale AI systems are affiliated. An AI system can have multiple authors, each potentially affiliated with different institutions, thus contributing to the count for multiple countries. The 2024 data is incomplete and was last updated 19 June 2024.
      ? ^^
      + + Refers to the location of the primary organization with which the authors of a large-scale AI systems are affiliated. An AI system can have multiple authors, each potentially affiliated with different institutions, thus contributing to the count for multiple countries. The 2024 data is incomplete and was last updated 20 June 2024.
      ? ^^
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
    ~ Column yearly_count (changed metadata)
      - - Refers to the location of the primary organization with which the authors of a large-scale AI systems are affiliated. An AI system can have multiple authors, each potentially affiliated with different institutions, thus contributing to the count for multiple countries. The 2024 data is incomplete and was last updated 19 June 2024.
      ? ^^
      + + Refers to the location of the primary organization with which the authors of a large-scale AI systems are affiliated. An AI system can have multiple authors, each potentially affiliated with different institutions, thus contributing to the count for multiple countries. The 2024 data is incomplete and was last updated 20 June 2024.
      ? ^^
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
= Dataset garden/artificial_intelligence/2024-06-19/epoch_compute_intensive_domain
  = Table epoch_compute_intensive_domain
    ~ Column cumulative_count (changed metadata)
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      - - The count of large-scale AI models AI systems per domain is derived by tallying the instances of machine learning models classified under each domain category. It's important to note that a single machine learning model can fall under multiple domains. The classification into domains is determined by the specific area, application, or field that the AI system is primarily designed to operate within. System domains with less than 10 systems are grouped under "Other."
      ? --------------------------------------------------------------------
      + + The count of large-scale AI models AI systems per domain is derived by tallying the instances of machine learning models classified under each domain category. It's important to note that a single machine learning model can fall under multiple domains. The classification into domains is determined by the specific area, application, or field that the AI system is primarily designed to operate within.
    ~ Column yearly_count (changed metadata)
      - - A dataset that tracks compute-intensive AI models, with training compute over 1023 floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      + + A dataset that tracks compute-intensive AI models, with training compute over 10²³ floating point operations (FLOP). This corresponds to training costs of hundreds of thousands of dollars or more.
      ? ^^
      - - The count of large-scale AI models AI systems per domain is derived by tallying the instances of machine learning models classified under each domain category. It's important to note that a single machine learning model can fall under multiple domains. The classification into domains is determined by the specific area, application, or field that the AI system is primarily designed to operate within. System domains with less than 10 systems are grouped under "Other."
      ? --------------------------------------------------------------------
      + + The count of large-scale AI models AI systems per domain is derived by tallying the instances of machine learning models classified under each domain category. It's important to note that a single machine learning model can fall under multiple domains. The classification into domains is determined by the specific area, application, or field that the AI system is primarily designed to operate within.
```

Legend: +New ~Modified -Removed =Identical

Hint: Run this locally with `etl diff REMOTE data/ --include yourdataset --verbose --snippet`

Automatically updated datasets matching `_weekly_wildfires|excess_mortality|covid|fluid|flunet|country_profile|garden/ihme_gbd/2019/gbd_risk_` are not included.
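The underlying fix is a one-character metadata change: the descriptions should read "10²³ floating point operations (FLOP)" rather than "1023". As a rough illustration only (the directory and file names below are assumptions, not confirmed paths in the repo), a quick local scan for the garbled exponent before re-running the garden step could look like this:

```python
# Rough sketch, assuming the step's descriptions live in YAML metadata files under a
# garden step directory; the path below is illustrative, not a confirmed repo path.
import re
from pathlib import Path

META_DIR = Path("etl/steps/data/garden/artificial_intelligence/2024-06-19")  # assumed location
TYPO = re.compile(r"training compute over 1023\b")  # the garbled form of 10²³

for meta_file in sorted(META_DIR.glob("*.meta.yml")):
    for lineno, line in enumerate(meta_file.read_text(encoding="utf-8").splitlines(), start=1):
        if TYPO.search(line):
            print(f"{meta_file}:{lineno}: found '1023'; should read '10²³'")
```

Once the metadata is corrected and the step re-run, the data-diff above is expected to come back clean.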

Edited: 2024-06-20 19:08:08 UTC · Execution time: 26.82 seconds