ebarti opened 1 year ago

Hello there, it's me again.

We have quite a messy GCP infra with thousands of projects that require deletion. As a result, the `recommendations_workflow_process_recommendations` workflow fails after 100,000 steps. Although this is a GCP limitation, I think it could be handled with some fault recovery. Any thoughts?
Actually, I did not notice this at first, but the BigQuery execution limit is reached on retry, so it's not really something that can be handled with error handling. I think we should consider using a different data store, as BigQuery standard tables are subject to 1,500 table modifications per table per day.
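One possible alternative to switching data stores, sketched here as an idea rather than anything from this repo: BigQuery streaming inserts have their own throughput quotas but do not count toward the 1,500 table-modifications-per-table-per-day limit, which only covers load, copy, and DML operations. A minimal Python sketch; the table name and row shape are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()
# Hypothetical table; adjust to the real findings table.
table_id = "my-project.recommendations.findings"

rows = [
    {"project_id": "proj-123", "state": "NEW"},
    {"project_id": "proj-456", "state": "CLAIMED"},
]

# Streaming inserts do not count as table modifications,
# unlike load jobs and DML statements.
errors = client.insert_rows_json(table_id, rows)
if errors:
    raise RuntimeError(f"streaming insert failed: {errors}")
```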
🙈 This is indeed quite exceptional. Do you think requesting an increase to that 1,500 BQ table modifications quota could help?
Hey @binamov, thanks for your response. I think we might still hit the 100k step limit even if we increase the table modifications quota, as we process both new and claimed findings.
@ebarti since you're looking to process an exceptional amount of recommendations, it makes sense that you would be hitting the default service quota limits. For all the limits mentioned so far, you can request increases for these quotas, and in this case it feels like you ought to.
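Picking up the fault-recovery idea from the original post: one way to stay under the 100,000-step-per-execution limit would be to shard the project list and fan it out to child workflow executions, so a failed batch can be retried on its own instead of rerunning everything. A minimal sketch using the `google-cloud-workflows` Python client; the workflow argument shape and batch size are assumptions for illustration, not this repo's design:

```python
import json

from google.cloud.workflows import executions_v1


def fan_out(project: str, location: str, workflow: str,
            project_ids: list[str], batch_size: int = 500) -> None:
    """Start one child workflow execution per batch of projects so that
    no single execution has to iterate past the 100,000-step limit."""
    client = executions_v1.ExecutionsClient()
    # parent: projects/{project}/locations/{location}/workflows/{workflow}
    parent = client.workflow_path(project, location, workflow)

    for start in range(0, len(project_ids), batch_size):
        batch = project_ids[start:start + batch_size]
        # Each child execution receives only its own batch as the
        # workflow argument (a JSON string).
        client.create_execution(
            request={
                "parent": parent,
                "execution": {"argument": json.dumps({"project_ids": batch})},
            }
        )
```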