microbiomedata / sheets_and_friends

Enhance a LinkML model with imported and optionally modified slots

Google Sheets reads per user per minute exceeded?! #61

Closed: turbomam closed this issue 2 years ago

turbomam commented 2 years ago

from @subdavis

poetry run cogs fetch
Traceback (most recent call last):
  File "/home/brandon/.pyenv/versions/3.9.10/envs/sheets_friends/bin/cogs", line 8, in <module>
    sys.exit(main())
  File "/home/brandon/.pyenv/versions/3.9.10/envs/sheets_friends/lib/python3.9/site-packages/cogs/cli.py", line 265, in main
    args.func(args)
  File "/home/brandon/.pyenv/versions/3.9.10/envs/sheets_friends/lib/python3.9/site-packages/cogs/cli.py", line 353, in run_fetch
    fetch(verbose=args.verbose)
  File "/home/brandon/.pyenv/versions/3.9.10/envs/sheets_friends/lib/python3.9/site-packages/cogs/fetch.py", line 318, in fetch
    cells = get_cell_data(cogs_dir, sheet)
  File "/home/brandon/.pyenv/versions/3.9.10/envs/sheets_friends/lib/python3.9/site-packages/cogs/fetch.py", line 115, in get_cell_data
    resp = request.execute()
  File "/home/brandon/.pyenv/versions/3.9.10/envs/sheets_friends/lib/python3.9/site-packages/googleapiclient/_helpers.py", line 131, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/home/brandon/.pyenv/versions/3.9.10/envs/sheets_friends/lib/python3.9/site-packages/googleapiclient/http.py", line 937, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 429 when requesting https://sheets.googleapis.com/v4/spreadsheets/1RACmVPhqpfm2ELm152CzmiEy2sDmULmbN9G0qXK8NDs?ranges=%27enums%27&fields=sheets%28data%28rowData%28values%28%2A%29%29%29%29&alt=json returned "Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute per user' of service 'sheets.googleapis.com' for consumer 'project_number:306297140388'.". Details: "[{'@type': 'type.googleapis.com/google.rpc.ErrorInfo', 'reason': 'RATE_LIMIT_EXCEEDED', 'domain': 'googleapis.com', 'metadata': {'service': 'sheets.googleapis.com', 'quota_limit': 'ReadRequestsPerMinutePerUser', 'quota_metric': 'sheets.googleapis.com/read_requests', 'consumer': 'projects/306297140388'}}]">
Makefile:30: recipe for target '.cogs/tracked/placeholders_awaiting_auto_imp.tsv' failed
make: *** [.cogs/tracked/placeholders_awaiting_auto_imp.tsv] Error 1
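The 429 above is Google's per-user, per-minute read quota for the Sheets API, so simply re-running the fetch after a pause usually succeeds. One way to automate that is to wrap the `cogs fetch` call with retries and a growing delay. This is a hypothetical sketch, not part of the repo: `cogs` itself has no retry flag as far as I know, and the constants (`MAX_ATTEMPTS`, `BASE_DELAY`) and `fetch_with_backoff` are illustrative names.

```python
# Sketch: retry `poetry run cogs fetch` when the Sheets API rate-limits us.
# Assumption: a 429 surfaces as RATE_LIMIT_EXCEEDED / "429" on stderr,
# as in the traceback above.
import subprocess
import time

MAX_ATTEMPTS = 5
BASE_DELAY = 60  # seconds; the exceeded quota is "read requests per minute"


def backoff_delay(attempt):
    """Linear backoff: wait one quota window longer on each retry."""
    return BASE_DELAY * attempt


def fetch_with_backoff():
    for attempt in range(1, MAX_ATTEMPTS + 1):
        result = subprocess.run(
            ["poetry", "run", "cogs", "fetch"],
            capture_output=True, text=True,
        )
        if result.returncode == 0:
            return
        if "RATE_LIMIT_EXCEEDED" not in result.stderr and "429" not in result.stderr:
            # Some other failure; don't mask it behind retries.
            raise RuntimeError(result.stderr)
        delay = backoff_delay(attempt)
        print(f"Sheets API 429; sleeping {delay}s (attempt {attempt}/{MAX_ATTEMPTS})")
        time.sleep(delay)
    raise RuntimeError("cogs fetch still rate-limited after retries")
```

Calling `fetch_with_backoff()` in place of the bare `poetry run cogs fetch` Makefile step would give the quota window time to reset between attempts.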
turbomam commented 2 years ago

I did the following:

subdavis commented 2 years ago

I've managed to get past this a few times. It still fails about 50% of the time, but sometimes it works.

I'm now getting to the end of make all and finding nothing new in the DataHarmonizer/template dir.

~$ make all
rm -rf DataHarmonizer/template/soil_emsl_jgi_mg
rm -rf artifacts/*yaml
rm -rf docs/*
rm -rf logs/*log
rm -rf project/*py
rm -rf project/docs/*
rm -rf project/excel/*
rm -rf project/graphql/*
rm -rf project/java/*
rm -rf project/jsonld/*
rm -rf project/jsonschema/*
rm -rf project/owl/*
rm -rf project/prefixmap/*
rm -rf project/protobuf/*
rm -rf project/shacl/*
rm -rf project/shex/*
rm -rf project/sqlschema/*
rm -rf target/*log
rm -rf target/*tsv
rm -rf target/*txt
poetry run cogs fetch
poetry run sheets2linkml -o artifacts/from_sheets2linkml.yaml .cogs/tracked/cornerstone.tsv .cogs/tracked/new_terms.tsv .cogs/tracked/enums.tsv .cogs/tracked/sections_as_classes.tsv .cogs/tracked/placeholders_awaiting_auto_imp.tsv 2>> logs/sheets2linkml.log
poetry run do_shuttle --config_tsv .cogs/tracked/import_slots_regardless.tsv --yaml_output artifacts/with_shuttles.yaml --recipient_model artifacts/from_sheets2linkml.yaml 2>> logs/do_shuttle.log
poetry run mod_by_path \
    --config_tsv .cogs/tracked/modifications_long.tsv \
    --yaml_input artifacts/with_shuttles.yaml \
    --yaml_output artifacts/nmdc_dh.yaml  2>> logs/mod_by_path.log
poetry run gen-project --exclude shacl --exclude owl --dir project artifacts/nmdc_dh.yaml 2>> logs/gen-project.log
ALL_SCHEMAS = ['artifacts/nmdc_dh.yaml']
# metamodel_version: 1.7.0
// metamodel_version: 1.7.0
# metamodel_version: 1.7.0
/* metamodel_version: 1.7.0 */
/* metamodel_version: 1.7.0 */
/* metamodel_version: 1.7.0 */
poetry run linkml2dataharmonizer --model_file artifacts/nmdc_dh.yaml --selected_class soil_emsl_jgi_mg 2> logs/linkml2dataharmonizer.log
rm -rf artifacts/from_sheets2linkml.yaml
rm -rf artifacts/with_shuttles.yaml
grep env_broad_scale .cogs/tracked/envo_terms_for_mixs_env_triad.tsv | grep soil | cut -f2 > target/soil-env_broad_scale-extracted_list.txt
java -jar bin/robot.jar query --input downloads/envo.owl --query sparql/sco.sparql target/envo_sco.tsv
ERROR Input ontology contains 1 triple(s) that could not be parsed:
 - <https://www.wikidata.org/wiki/Q2306597> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> _:genid2147499456.

java -jar bin/robot.jar query --input downloads/envo.owl --query sparql/labels.sparql target/envo_labs.tsv
ERROR Input ontology contains 1 triple(s) that could not be parsed:
 - <https://www.wikidata.org/wiki/Q2306597> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> _:genid2147499456.

poetry run hident \
    --sco_tab_file_name target/envo_sco.tsv \
    --lab_tab_file_name target/envo_labs.tsv \
    --curie_file_name target/soil-env_broad_scale-extracted_list.txt \
    --pad_char _ \
    --pad_count 2 \
    --parent_term 'broad-scale environmental context' \
    --indented_tsv target/soil-env_broad_scale-indented.tsv
grep env_local_scale .cogs/tracked/envo_terms_for_mixs_env_triad.tsv | grep soil | cut -f2 > target/soil-env_local_scale-extracted_list.txt
poetry run hident \
    --sco_tab_file_name target/envo_sco.tsv \
    --lab_tab_file_name target/envo_labs.tsv \
    --curie_file_name target/soil-env_local_scale-extracted_list.txt \
    --pad_char _ \
    --pad_count 2 \
    --parent_term 'local environmental context' \
    --indented_tsv target/soil-env_local_scale-indented.tsv
grep env_medium .cogs/tracked/envo_terms_for_mixs_env_triad.tsv | grep soil | cut -f2 > target/soil-env_medium-extracted_list.txt
poetry run hident \
    --sco_tab_file_name target/envo_sco.tsv \
    --lab_tab_file_name target/envo_labs.tsv \
    --curie_file_name target/soil-env_medium-extracted_list.txt \
    --pad_char _ \
    --pad_count 2 \
    --parent_term 'environmental medium' \
    --indented_tsv target/soil-env_medium-indented.tsv
poetry run promote_to_select \
    --promote 'broad-scale environmental context' \
    --promote 'local environmental context' \
    --promote 'environmental medium' \
    --data_tsv_in target/data.tsv \
    --extra_row_files target/soil-env_broad_scale-indented.tsv \
    --extra_row_files target/soil-env_local_scale-indented.tsv \
    --extra_row_files target/soil-env_medium-indented.tsv \
    --data_tsv_out target/data_promoted.tsv
broad-scale environmental context
local environmental context
environmental medium
target/soil-env_broad_scale-indented.tsv
target/soil-env_local_scale-indented.tsv
target/soil-env_medium-indented.tsv
mkdir -p DataHarmonizer/template/soil_emsl_jgi_mg
# copy target/data_promoted.tsv to DataHarmonizer/template/soil_emsl_jgi_mg/data.tsv
# so we can generate the data.js in its canonical place
cp target/data_promoted.tsv DataHarmonizer/template/soil_emsl_jgi_mg/data.tsv
# copy other files required by make_data.py
cp -r artifacts/for_data_harmonizer_template/* DataHarmonizer/template/soil_emsl_jgi_mg
# generate DataHarmonizer/template/soil_emsl_jgi_mg/data.js
cd DataHarmonizer/template/soil_emsl_jgi_mg && poetry run python ../../script/make_data.py 2> make_data.log && cd -
vocabulary field: Analysis/Data Type
vocabulary field: observed biotic relationship
vocabulary field: current land use
vocabulary field: DNA Container Type
vocabulary field: DNAse Treatment DNA
vocabulary field: DNA Sample Format
vocabulary field: drainage classification
vocabulary field: ecosystem
vocabulary field: ecosystem_category
vocabulary field: ecosystem_subtype
vocabulary field: ecosystem_type
vocabulary field: environmental package
vocabulary field: soil_taxonomic/FAO classification
vocabulary field: growth facility
vocabulary field: oxygenation status of sample
vocabulary field: profile position
vocabulary field: relationship to oxygen
vocabulary field: sample type
vocabulary field: soil horizon
vocabulary field: specific_ecosystem
vocabulary field: storage conditions
vocabulary field: history/tillage
vocabulary field: broad-scale environmental context
vocabulary field: local environmental context
vocabulary field: environmental medium
/home/brandon/github.com/sheets_and_friends
# move all of the DataHarmonizer submodule into docs/, for GH pages to see
cp -r DataHarmonizer/* docs
# restore the DataHarmonizer submodule to "the way we found it"
rm -rf DataHarmonizer/template/soil_emsl_jgi_mg
# move in the main.js that Brandon and Mark have modified vs https://github.com/cidgoh/DataHarmonizer/blob/master/script/main.js
cp -r artifacts/for_data_harmonizer_scripts/* docs/script
# move artifacts, including the example valid data file into the docs directory (monitored by GH pages)
# cp -r artifacts docs
# remove DH stuff that's not relevant to interacting with the NMDC interface
rm -rf docs/README.md
rm -rf docs/images
rm -rf docs/requirements.txt
rm -rf docs/script/make_data.py
rm -rf docs/template/canada_covid19
rm -rf docs/template/export.js
rm -rf docs/template/gisaid
rm -rf docs/template/grdi
rm -rf docs/template/pha4ge
rm -rf docs/template/phac_dexa
rm -rf docs/template/reference_template.html
rm -rf docs/template/soil_emsl_jgi_mg/reference_template.html
#
# rm -rf docs/script/exampleInput docs/script/reference_template.html # how are these getting in there?
poetry run compare_enums \
    --left_model artifacts/nmdc_dh.yaml \
    --right_model mixs-source/model/schema/mixs.yaml \
    --yaml_output artifacts/compare_enums.yaml
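The env-triad extraction repeated three times in the log above (`grep <slot> … | grep soil | cut -f2`) follows one pattern: for each MIxS slot, keep rows of `envo_terms_for_mixs_env_triad.tsv` that mention both the slot name and `soil`, and write out column 2 (the ENVO CURIE). A sketch of that pattern in Python, with the caveat that the TSV layout here is an assumption read off the shell pipeline, not the file's documented schema:

```python
# Sketch of the grep|grep|cut steps: substring matching mirrors grep's
# behavior (a match anywhere in the row counts), and row[1] mirrors `cut -f2`.
import csv
from pathlib import Path

SLOTS = ["env_broad_scale", "env_local_scale", "env_medium"]


def extract_curies(tsv_path, slot, biome="soil"):
    """Return column 2 of rows containing both the slot name and the biome."""
    curies = []
    with open(tsv_path, newline="") as handle:
        for row in csv.reader(handle, delimiter="\t"):
            line = "\t".join(row)
            if slot in line and biome in line and len(row) > 1:
                curies.append(row[1])
    return curies


def write_extracted_lists(tsv_path, out_dir="target"):
    """Write one soil-<slot>-extracted_list.txt per env-triad slot."""
    for slot in SLOTS:
        out = Path(out_dir) / f"soil-{slot}-extracted_list.txt"
        out.write_text("\n".join(extract_curies(tsv_path, slot)) + "\n")
```

The per-slot output files then feed `hident`, which indents the CURIEs into the hierarchy that `promote_to_select` merges into `data.tsv`.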
turbomam commented 2 years ago

@subdavis

> nothing new in the DataHarmonizer/template dir.

Look in docs/template

turbomam commented 2 years ago

I was spelling 'minified' wrong. This looks good to me now.

https://microbiomedata.github.io/sheets_and_friends/main.html?minified=true

I deleted a previous post saying that minify wasn't working.

turbomam commented 2 years ago

I have not experienced this in several weeks.