Closed mariaeduaruda closed 1 year ago
@mariaeduaruda you have to check case by case. This pipeline run fails, but some runs succeed, as can be seen in the image below:
Is there a specific data asset? Fully fixing this DAG involves solving several different cataloging problems.
Exactly, @mariaeduaruda. The DAG is failing, but that does not mean all of it is failing. It runs the cataloging of every Snowflake data asset, so a failure on a single one is enough for the whole run to show up as failed. We need to look at each data asset there and understand why it is failing. It does not necessarily impact all customers.
@rafaelsantanaep Did you manage to grab any of the failed logs? I could not download any here; Airflow keeps fetching the log and never returns anything.
@samirleao unfortunately this DAG really does behave that way. I cannot say the cause, but I believe it is related to the volume of logs.
The last time this happened, I debugged it using the logs stored directly in S3.
@rafaelsantanaep @samirleao The data assets belong to the MAPA customer.
@rafaelsantanaep @cicerojmm Investigating this DAG task by task, I found a variety of problems. Here is a sample of the issues per customer name:
-- task 0 -- (dadosferademo)
[2023-02-07 06:01:35,918] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("ALSO_BUY") as varchar) as min,cast(max("ALSO_BUY") as varchar) as max,count(distinct("ALSO_BUY")) as unique_count, case when count() > 0 then cast(cast(count(distinct("ALSO_BUY")) as double) / count() as double) else null end as unique_rate ,count_if("ALSO_BUY" is null) as missing_count, case when count() > 0 then cast(count_if("ALSO_BUY" is null) / count() as double) else null end as missing_rate from "STAGING"."TB_AMAZON_REVIEWS_ALSO_BOUGHT_DATASET"
[2023-02-07 06:01:35,918] {logging_mixin.py:115} INFO - Reason: 002016 (22000): 01aa2a49-0602-fed7-0072-be8307061b5a: SQL compilation error: Function MIN does not support ARRAY argument type
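The `MIN does not support ARRAY` failures suggest the profiler builds the same min/max query for every column regardless of its Snowflake type. A minimal sketch of one possible guard, assuming the column's data type is available from the information schema (names here are illustrative, not the actual DAG code):

```python
# Hypothetical sketch: skip min/max profiling for Snowflake types that do
# not support those aggregates (e.g. ARRAY, OBJECT), keeping only the
# counts that work for any type. Not the actual snowflake_catalog.py code.

UNSUPPORTED_MINMAX_TYPES = {"ARRAY", "OBJECT", "VARIANT", "GEOGRAPHY"}

def build_profile_query(schema: str, table: str, column: str, data_type: str) -> str:
    """Return a profiling query, omitting min/max for unsupported types."""
    base = (
        f'count(distinct("{column}")) as unique_count, '
        f'count_if("{column}" is null) as missing_count'
    )
    if data_type.upper() not in UNSUPPORTED_MINMAX_TYPES:
        base = (
            f'cast(min("{column}") as varchar) as min, '
            f'cast(max("{column}") as varchar) as max, ' + base
        )
    return f'select {base} from "{schema}"."{table}"'
```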
-- task 1 -- (basealpha)
[2023-02-07 06:08:44,361] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("IMAGE_URL") as varchar) as min,cast(max("IMAGE_URL") as varchar) as max,count(distinct("IMAGE_URL")) as unique_count, case when count() > 0 then cast(cast(count(distinct("IMAGE_URL")) as double) / count() as double) else null end as unique_rate ,count_if("IMAGE_URL" is null) as missing_count, case when count() > 0 then cast(count_if("IMAGE_URL" is null) / count() as double) else null end as missing_rate from "STAGING"."TB_AMAZON_REVIEWS_BASE_DATASET"
[2023-02-07 06:08:44,361] {logging_mixin.py:115} INFO - Reason: 002016 (22000): 01aa2a50-0602-fed7-0072-be830706386e: SQL compilation error: Function MIN does not support ARRAY argument type
[2023-02-07 06:08:45,927] {logging_mixin.py:115} INFO - /api/catalog/column-metadata/
[2023-02-07 06:08:45,937] {logging_mixin.py:115} INFO - [["This field may not be null."],["This field may not be null."],{},{},{},["This field may not be null."],["This field may not be null."],{},{},{},{}]
[2023-02-07 06:08:46,011] {catalog.py:47} INFO - StatusCode.OK
[2023-02-07 06:08:46,011] {logging_mixin.py:115} INFO - message: "Ok"
[2023-02-07 06:08:46,012] {taskinstance.py:1889} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/decorators/base.py", line 179, in execute
    return_value = super().execute(context)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 171, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 189, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/opt/airflow/dags/snowflake_catalog.py", line 306, in catalog_snowflake_databases
    raise Exception(f"One or more tables have failed, here is the job_metadata: {str(exception_table_list)}")
Exception: One or more tables have failed, here is the job_metadata: [{'exception_metadata': {'database_name': 'DADOSFERA_PRD_BASEALPHA', 'table_schema': 'PUBLIC', 'table_name': 'TBGUQHFQAMAZON_REVIEWS_METADATA', 'table_owner': 'DADOSFERA_PRD_BASEALPHA'}, 'exception': HTTPError('400 Client Error: Bad Request for url: https://nimbus-basealpha.dadosfera.ai/api/catalog/column-metadata/')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_BASEALPHA', 'table_schema': 'STAGING', 'table_name': 'TB_AMAZON_REVIEWS_ALSO_BOUGHT_DATASET', 'table_owner': 'DADOSFERA_PRD_BASEALPHA'}, 'exception': HTTPError('400 Client Error: Bad Request for url: https://nimbus-basealpha.dadosfera.ai/api/catalog/column-metadata/')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_BASEALPHA', 'table_schema': 'STAGING', 'table_name': 'TB_AMAZON_REVIEWS_ALSO_VIEWED_DATASET', 'table_owner': 'DADOSFERA_PRD_BASEALPHA'}, 'exception': HTTPError('400 Client Error: Bad Request for url: https://nimbus-basealpha.dadosfera.ai/api/catalog/column-metadata/')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_BASEALPHA', 'table_schema': 'STAGING', 'table_name': 'TB_AMAZON_REVIEWS_BASE_DATASET', 'table_owner': 'DADOSFERA_PRD_BASEALPHA'}, 'exception': HTTPError('400 Client Error: Bad Request for url: https://nimbus-basealpha.dadosfera.ai/api/catalog/column-metadata/')}]
[2023-02-07 06:08:46,030] {taskinstance.py:1395} INFO - Marking task as FAILED. dag_id=snowflake_catalog_engine, task_id=catalog_snowflake_databases, map_index=1, execution_date=20230206T050000, start_date=20230207T060345, end_date=20230207T060846
[2023-02-07 06:08:46,056] {standard_task_runner.py:92} ERROR - Failed to execute job 1072360 for task catalog_snowflake_databases (One or more tables have failed, here is the job_metadata: [{'exception_metadata': {'database_name': 'DADOSFERA_PRD_BASEALPHA', 'table_schema': 'PUBLIC', 'table_name': 'TBGUQHFQAMAZON_REVIEWS_METADATA', 'table_owner': 'DADOSFERA_PRD_BASEALPHA'}, 'exception': HTTPError('400 Client Error: Bad Request for url: https://nimbus-basealpha.dadosfera.ai/api/catalog/column-metadata/')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_BASEALPHA', 'table_schema': 'STAGING', 'table_name': 'TB_AMAZON_REVIEWS_ALSO_BOUGHT_DATASET', 'table_owner': 'DADOSFERA_PRD_BASEALPHA'}, 'exception': HTTPError('400 Client Error: Bad Request for url: https://nimbus-basealpha.dadosfera.ai/api/catalog/column-metadata/')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_BASEALPHA', 'table_schema': 'STAGING', 'table_name': 'TB_AMAZON_REVIEWS_ALSO_VIEWED_DATASET', 'table_owner': 'DADOSFERA_PRD_BASEALPHA'}, 'exception': HTTPError('400 Client Error: Bad Request for url: https://nimbus-basealpha.dadosfera.ai/api/catalog/column-metadata/')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_BASEALPHA', 'table_schema': 'STAGING', 'table_name': 'TB_AMAZON_REVIEWS_BASE_DATASET', 'table_owner': 'DADOSFERA_PRD_BASEALPHA'}, 'exception': HTTPError('400 Client Error: Bad Request for url: https://nimbus-basealpha.dadosfera.ai/api/catalog/column-metadata/')}]; 32303)
[2023-02-07 06:08:46,083] {local_task_job.py:156} INFO - Task exited with return code 1
[2023-02-07 06:08:46,127] {local_task_job.py:273} INFO - 0 downstream tasks scheduled from follow-on schedule check
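For reference, the traceback above reflects a collect-then-raise pattern: each table is cataloged independently, failures are accumulated, and a single exception is raised at the end, which is why one bad table marks the whole mapped task as failed. A minimal sketch of that pattern (function names are illustrative, not the actual snowflake_catalog.py code):

```python
# Sketch of the collect-then-raise pattern seen in the log: process every
# table, accumulate per-table failures, raise once at the end so one bad
# table does not hide the others.

def catalog_tables(tables, process_table):
    """Process each table; raise a single summary exception if any failed."""
    failed = []
    for table in tables:
        try:
            process_table(table)
        except Exception as exc:  # collect every per-table failure
            failed.append({"exception_metadata": table, "exception": exc})
    if failed:
        raise Exception(
            f"One or more tables have failed, here is the job_metadata: {failed}"
        )
```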
-- task 5 -- (dadosferafin)
[2023-02-07 06:11:32,871] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("CONSULTING") as varchar) as min,cast(max("CONSULTING") as varchar) as max,cast(avg("CONSULTING") as double) as mean,count(distinct("CONSULTING")) as unique_count, case when count() > 0 then cast(cast(count(distinct("CONSULTING")) as double) / count() as double) else null end as unique_rate ,count_if("CONSULTING" is null) as missing_count, case when count() > 0 then cast(count_if("CONSULTING" is null) / count() as double) else null end as missing_rate ,stddev(cast("CONSULTING" as double)) as stddev,cast(median("CONSULTING") as double) as median from "PUBLIC"."view_dadosfera_fcst_3"
[2023-02-07 06:11:32,872] {logging_mixin.py:115} INFO - Reason: 000904 (42000): 01aa2a53-0602-fee2-0072-be8307064bb2: SQL compilation error: error line 3 at position 8 invalid identifier 'SAAS'
[2023-02-07 06:11:32,872] {cursor.py:700} INFO - query: [select cast(min("MONTH") as varchar) as min,cast(max("MONTH") as varchar) as max...]
[2023-02-07 06:11:32,952] {cursor.py:724} INFO - query execution done
[2023-02-07 06:11:32,953] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("MONTH") as varchar) as min,cast(max("MONTH") as varchar) as max,cast(avg("MONTH") as double) as mean,count(distinct("MONTH")) as unique_count, case when count() > 0 then cast(cast(count(distinct("MONTH")) as double) / count() as double) else null end as unique_rate ,count_if("MONTH" is null) as missing_count, case when count() > 0 then cast(count_if("MONTH" is null) / count() as double) else null end as missing_rate ,stddev(cast("MONTH" as double)) as stddev,cast(median("MONTH") as double) as median from "PUBLIC"."view_dadosfera_fcst_3"
[2023-02-07 06:11:32,953] {logging_mixin.py:115} INFO - Reason: 000904 (42000): 01aa2a53-0602-fed7-0072-be8307063d4e: SQL compilation error: error line 3 at position 8 invalid identifier 'SAAS'
[2023-02-07 06:11:32,953] {cursor.py:700} INFO - query: [select cast(min("SAAS") as varchar) as min,cast(max("SAAS") as varchar) as max,c...]
[2023-02-07 06:11:33,020] {cursor.py:724} INFO - query execution done
[2023-02-07 06:11:33,021] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("SAAS") as varchar) as min,cast(max("SAAS") as varchar) as max,cast(avg("SAAS") as double) as mean,count(distinct("SAAS")) as unique_count, case when count() > 0 then cast(cast(count(distinct("SAAS")) as double) / count() as double) else null end as unique_rate ,count_if("SAAS" is null) as missing_count, case when count() > 0 then cast(count_if("SAAS" is null) / count() as double) else null end as missing_rate ,stddev(cast("SAAS" as double)) as stddev,cast(median("SAAS") as double) as median from "PUBLIC"."view_dadosfera_fcst_3"
[2023-02-07 06:11:33,021] {logging_mixin.py:115} INFO - Reason: 000904 (42000): 01aa2a53-0602-fed7-0072-be8307063d52: SQL compilation error: error line 3 at position 8 invalid identifier 'SAAS'
[2023-02-07 06:11:33,021] {cursor.py:700} INFO - query: [select cast(min("TRAINING") as varchar) as min,cast(max("TRAINING") as varchar) ...]
[2023-02-07 06:11:33,077] {cursor.py:724} INFO - query execution done
[2023-02-07 06:11:33,078] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("TRAINING") as varchar) as min,cast(max("TRAINING") as varchar) as max,cast(avg("TRAINING") as double) as mean,count(distinct("TRAINING")) as unique_count, case when count() > 0 then cast(cast(count(distinct("TRAINING")) as double) / count() as double) else null end as unique_rate ,count_if("TRAINING" is null) as missing_count, case when count() > 0 then cast(count_if("TRAINING" is null) / count() as double) else null end as missing_rate ,stddev(cast("TRAINING" as double)) as stddev,cast(median("TRAINING") as double) as median from "PUBLIC"."view_dadosfera_fcst_3"
[2023-02-07 06:11:33,079] {logging_mixin.py:115} INFO - Reason: 000904 (42000): 01aa2a53-0602-fee2-0072-be8307064bb6: SQL compilation error: error line 3 at position 8 invalid identifier 'SAAS'
[2023-02-07 06:11:57,499] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("BUDGET") as varchar) as min,cast(max("BUDGET") as varchar) as max,cast(avg("BUDGET") as double) as mean,count(distinct("BUDGET")) as unique_count, case when count() > 0 then cast(cast(count(distinct("BUDGET")) as double) / count() as double) else null end as unique_rate ,count_if("BUDGET" is null) as missing_count, case when count() > 0 then cast(count_if("BUDGET" is null) / count() as double) else null end as missing_rate ,stddev(cast("BUDGET" as double)) as stddev,cast(median("BUDGET") as double) as median from "PUBLIC"."view_gross_aux1"
[2023-02-07 06:11:57,499] {logging_mixin.py:115} INFO - Reason: 100038 (22018): 01aa2a53-0602-fee2-0072-be8307064c36: Numeric value '#VALUE! (Function ADD parameter 1 expects number values. But 'SaaS' is a text and cannot be coerced to a number.)' is not recognized
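The `#VALUE! (...)` reason above is a spreadsheet error string that ended up inside a numeric column, so Snowflake cannot coerce it to a number. A hedged cleaning sketch that coerces such values to null before numeric profiling (illustrative, not the actual pipeline code):

```python
# Hypothetical pre-load cleaning step: turn spreadsheet error strings such
# as '#VALUE!' or '#DIV/0!' (and anything else non-numeric) into None so
# numeric aggregates like avg/stddev do not fail downstream.

def coerce_numeric(value):
    """Return a float, or None for spreadsheet errors and non-numeric text."""
    if isinstance(value, str) and value.startswith("#"):
        return None  # Excel/Sheets error marker, e.g. '#VALUE!'
    try:
        return float(value)
    except (TypeError, ValueError):
        return None
```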
-- task 7 -- (dadosferatech)
[2023-02-07 06:33:38,149] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("PRIMEIRA_DATA_CEO") as varchar) as min,cast(max("PRIMEIRA_DATA_CEO") as varchar) as max,count(distinct("PRIMEIRA_DATA_CEO")) as unique_count, case when count() > 0 then cast(cast(count(distinct("PRIMEIRA_DATA_CEO")) as double) / count() as double) else null end as unique_rate ,count_if("PRIMEIRA_DATA_CEO" is null) as missing_count, case when count() > 0 then cast(count_if("PRIMEIRA_DATA_CEO" is null) / count() as double) else null end as missing_rate from "PUBLIC"."COMPRAS"
[2023-02-07 06:33:38,150] {logging_mixin.py:115} INFO - Reason: 100097 (22007): 01aa2a69-0602-fee2-0072-be830706a772: Can't parse '44643.52529212963' as date with format 'YYYY-MM-DD'
[2023-02-07 06:33:41,288] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("ANO") as varchar) as min,cast(max("ANO") as varchar) as max,cast(avg("ANO") as double) as mean,count(distinct("ANO")) as unique_count, case when count() > 0 then cast(cast(count(distinct("ANO")) as double) / count() as double) else null end as unique_rate ,count_if("ANO" is null) as missing_count, case when count() > 0 then cast(count_if("ANO" is null) / count() as double) else null end as missing_rate ,stddev(cast("ANO" as double)) as stddev,cast(median("ANO") as double) as median from "PUBLIC"."CONSULTORIA_VIEW_HORAS"
[2023-02-07 06:33:41,288] {logging_mixin.py:115} INFO - Reason: 002057 (42601): 01aa2a69-0602-fed7-0072-be8307069a06: SQL compilation error: View definition for 'DADOSFERA_PRD_DADOSFERATECH.PUBLIC.CONSULTORIA_VIEW_HORAS' declared 10 column(s), but view query produces 11 column(s).
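The value `44643.52529212963` in the COMPRAS error looks like an Excel serial day number rather than a `YYYY-MM-DD` string. If that is the case, converting it at ingestion would fix the parse error; a sketch assuming the 1900 date system, where serial day 0 is 1899-12-30:

```python
from datetime import datetime, timedelta

# Excel's 1900 date system counts days from 1899-12-30 (day 0), with the
# fractional part being the time of day.
EXCEL_EPOCH = datetime(1899, 12, 30)

def excel_serial_to_datetime(serial: float) -> datetime:
    """Convert an Excel serial day number to a Python datetime."""
    return EXCEL_EPOCH + timedelta(days=serial)
```

Under that assumption, `44643.52529212963` corresponds to 2022-03-23, around 12:36.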
-- task 9 -- (dadosfera)
[2023-02-07 06:46:49,023] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("AGE") as varchar) as min,cast(max("AGE") as varchar) as max,cast(avg("AGE") as double) as mean,count(distinct("AGE")) as unique_count, case when count() > 0 then cast(cast(count(distinct("AGE")) as double) / count() as double) else null end as unique_rate ,count_if("AGE" is null) as missing_count, case when count() > 0 then cast(count_if("AGE" is null) / count() as double) else null end as missing_rate ,stddev(cast("AGE" as double)) as stddev,cast(median("AGE") as double) as median from "PUBLIC"."HR_EMPLOYEE_ATTRITION"
[2023-02-07 06:46:49,023] {logging_mixin.py:115} INFO - Reason: 002037 (42601): 01aa2a76-0602-fed7-0072-be830706feea: SQL compilation error: Failure during expansion of view 'HR_EMPLOYEE_ATTRITION': SQL compilation error: Object 'DADOSFERA_PRD_DADOSFERA.PUBLIC.TBBSG3HKHR_EMPLOYEE_ATTRITION' does not exist or not authorized.
-- task 10 -- (cashu)
[2023-02-07 07:04:34,979] {catalog.py:21} INFO - Getting user_id for email: DADOSFERA_PRD_CASHU
[2023-02-07 07:04:35,012] {catalog.py:27} INFO - This e-mail is invalid, returning null
[2023-02-07 07:04:35,040] {logging_mixin.py:115} INFO - Running process for table: TB_AMC_RECEIVABLE
[2023-02-07 07:04:35,040] {cursor.py:700} INFO - query: [with prep AS ( select table_catalog as table_catalog, table_schema as table_sche...]
[2023-02-07 07:04:36,253] {cursor.py:724} INFO - query execution done
[2023-02-07 07:05:25,262] {taskinstance.py:1889} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/decorators/base.py", line 179, in execute
    return_value = super().execute(context)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 171, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 189, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/opt/airflow/dags/snowflake_catalog.py", line 306, in catalog_snowflake_databases
    raise Exception(f"One or more tables have failed, here is the job_metadata: {str(exception_table_list)}")
Exception: One or more tables have failed, here is the job_metadata: [{'exception_metadata': {'database_name': 'DADOSFERA_PRD_CASHU', 'table_schema': 'PUBLIC', 'table_name': 'BNPL_TESTE', 'table_owner': 'DADOSFERA_PRD_CASHU'}, 'exception': HTTPError('503 Server Error: Service Temporarily Unavailable for url: https://nimbus-cashu.dadosfera.ai/api/catalog/table-metadata/?table_name=BNPL_TESTE')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_CASHU', 'table_schema': 'PUBLIC', 'table_name': 'TB_AMC_RECEIVABLE', 'table_owner': 'DADOSFERA_PRD_CASHU'}, 'exception': HTTPError('503 Server Error: Service Temporarily Unavailable for url: https://nimbus-cashu.dadosfera.ai/api/catalog/table-metadata/?table_name=TB_AMC_RECEIVABLE')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_CASHU', 'table_schema': 'PUBLIC', 'table_name': 'TB1818C6PARANA_MODA_PARKRECEIVABLECAMPOS_ADICIONAIS', 'table_owner': 'DADOSFERA_PRD_CASHU'}, 'exception': HTTPError('503 Server Error: Service Temporarily Unavailable for url: https://nimbus-cashu.dadosfera.ai/api/catalog/table-metadata/?table_name=TB__1818C6__PARANA_MODA_PARK__RECEIVABLE__CAMPOS_ADICIONAIS')}, {'exception_metadata': {'database_name': 'DADOSFERA_PRD_CASHU', 'table_schema': 'PUBLIC', 'table_name': 'TB1WH87QSKYONE__RECEIVABLE', 'table_owner': 'DADOSFERA_PRD_CASHU'}
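The cashu failures are all `503 Service Temporarily Unavailable`, i.e. transient server-side errors. Retrying the metadata calls with exponential backoff could make the task resilient to these; a minimal generic sketch (the retry policy and call site are assumptions, not the DAG's actual behavior):

```python
import time

def retry_with_backoff(func, retries=3, base_delay=1.0, retryable=(Exception,)):
    """Call func(), retrying on retryable exceptions with exponential backoff.

    Re-raises the last exception once all attempts are exhausted, so a
    persistent 503 still surfaces as a failure.
    """
    for attempt in range(retries):
        try:
            return func()
        except retryable:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

For example, the `table-metadata` request could be wrapped as `retry_with_backoff(lambda: session.get(url), retryable=(HTTPError,))`, retrying only on the transient statuses.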
-- task 11 -- (elvenworks)
[2023-02-07 07:50:50,780] {catalog.py:21} INFO - Getting user_id for email: DADOSFERA_PRD_ELVENWORKS
[2023-02-07 07:50:50,867] {catalog.py:27} INFO - This e-mail is invalid, returning null
[2023-02-07 08:51:02,590] {logging_mixin.py:115} INFO - Unable to process the following query: select cast(min("USER_NAME") as varchar) as min,cast(max("USER_NAME") as varchar) as max,count(distinct("USER_NAME")) as unique_count, case when count() > 0 then cast(cast(count(distinct("USER_NAME")) as double) / count() as double) else null end as unique_rate ,count_if("USER_NAME" is null) as missing_count, case when count() > 0 then cast(count_if("USER_NAME" is null) / count() as double) else null end as missing_rate from "PUBLIC"."VW_USER_ACTION_ZEUS"
[2023-02-07 08:51:02,590] {logging_mixin.py:115} INFO - Reason: 002037 (42601): 01aa2af3-0602-fed7-0072-be83070812e6: SQL compilation error: Failure during expansion of view 'VW_USER_ACTION_ZEUS': SQL compilation error: Object 'DADOSFERA_PRD_ELVENWORKS.PUBLIC.TBOQF1SNUSER_ACTION_ZEUS' does not exist or not authorized.
-- task 12 -- (dadosferatrial)
[2023-02-07 08:53:49,779] {catalog.py:21} INFO - Getting user_id for email: DADOSFERA_PRD_DADOSFERATRIAL
[2023-02-07 08:53:49,804] {catalog.py:27} INFO - This e-mail is invalid, returning null
[2023-02-07 08:53:49,816] {catalog.py:47} INFO - StatusCode.OK
[2023-02-07 08:53:49,817] {logging_mixin.py:115} INFO - message: "data.data_assets not found"
[2023-02-07 08:53:49,818] {taskinstance.py:1889} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/decorators/base.py", line 179, in execute
    return_value = super().execute(context)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 171, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 189, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/opt/airflow/dags/snowflake_catalog.py", line 306, in catalog_snowflake_databases
    raise Exception(f"One or more tables have failed, here is the job_metadata: {str(exception_table_list)}")
Exception: One or more tables have failed, here is the job_metadata: [{'exception_metadata': {'database_name': 'DADOSFERA_PRD_DADOSFERATRIAL', 'table_schema': 'PUBLIC', 'table_name': 'TB8UEUEUCOMPANIES', 'table_owner': 'DADOSFERA_PRD_DADOSFERATRIAL'}, 'exception': ConnectionError(MaxRetryError("HTTPSConnectionPool(host='nimbus-dadosferatrial.dadosfera.ai', port=443): Max retries exceeded with url: /api/catalog/table-metadata/?table_name=TB8UEUEUCOMPANIES (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f04473cc400>: Failed to establish a new connection: [Errno -2] Name or service not known'))"))}
In MAPA's case, we identified that the DADOSFERA_PRD_MAPA user, which handles the cataloging of tables and views, did not have access to the new schemas refined_mapa and staging_mapa created by the customer's user, so the views in those schemas would not be cataloged. As a workaround, the necessary permissions were granted to MAPA's default user.
-- GRANT for existing and future schemas
grant usage on future schemas in database DADOSFERA_PRD_MAPA to role DADOSFERA_PRD_MAPA;
grant usage on all schemas in database DADOSFERA_PRD_MAPA to role DADOSFERA_PRD_MAPA;
-- GRANT to the role on all existing and future tables and views
GRANT SELECT ON ALL VIEWS IN SCHEMA dadosfera_prd_mapa.refined_mapa to ROLE dadosfera_prd_mapa;
GRANT SELECT ON ALL VIEWS IN SCHEMA dadosfera_prd_mapa.staging_mapa to ROLE dadosfera_prd_mapa;
GRANT SELECT ON ALL TABLES IN SCHEMA dadosfera_prd_mapa.refined_mapa to ROLE dadosfera_prd_mapa;
GRANT SELECT ON ALL TABLES IN SCHEMA dadosfera_prd_mapa.staging_mapa to ROLE dadosfera_prd_mapa;
GRANT SELECT ON FUTURE VIEWS IN SCHEMA dadosfera_prd_mapa.refined_mapa to ROLE dadosfera_prd_mapa;
GRANT SELECT ON FUTURE VIEWS IN SCHEMA dadosfera_prd_mapa.staging_mapa to ROLE dadosfera_prd_mapa;
GRANT SELECT ON FUTURE TABLES IN SCHEMA dadosfera_prd_mapa.refined_mapa to ROLE dadosfera_prd_mapa;
GRANT SELECT ON FUTURE TABLES IN SCHEMA dadosfera_prd_mapa.staging_mapa to ROLE dadosfera_prd_mapa;
The definitive fix will be carried out by Samir.
@samirleao please document the part you did in Terraform afterwards.
PR updating the version of the customers module, which should supposedly resolve this: https://github.com/dadosfera/terraform-dadosfera/pulls
We had trouble getting terraform-dadosfera to pick up the new version of the module.
Mandatory information:
Are there customers directly impacted by this bug? Which ones?
Yes, we believe all customers are affected.
Bug Category
Describe the bug
None of the newly created views are appearing, because cataloging is failing
Screenshots
Does this bug impact any demo or sale?
Yes
Customer Datasphere:
all customers
What environment of software are you using?