Closed jukorv closed 8 months ago
Hi. Same problem with a fresh env, after importing a zip file from another Superset instance.
The fix number 2 worked for me too.
Hi there, and sorry this has gone quiet for so long. We're no longer supporting Superset 2.x or prior, and since it's been a while since this thread saw any activity, I'll close this as stale. If it is still an issue in Superset 3.x or newer, we can re-open this, or feel free to open a new issue with updated context. Thanks!
When a user opens a dashboard, they get a notification that filters may not be working correctly. However, upon testing, all filters appear to work as expected. This problem arose after upgrading Superset from 2.0.0 to 2.1.1. I found another issue that sounds somewhat similar: https://github.com/apache/superset/issues/24261
How to reproduce the bug
Expected results
If filters are working correctly, no notification should be shown to users
Actual results
Users get a notification
Logs from the backend
docker-compose-superset-1 | Datasource not found datasource_type: DatasourceType.TABLE, datasource_id: 2
docker-compose-superset-1 | 2023-09-12 10:14:14,810:WARNING:superset.datasource.dao:Datasource not found datasource_type: DatasourceType.TABLE, datasource_id: 2
docker-compose-superset-1 | 2023-09-12 10:14:14,810:ERROR:flask_appbuilder.api:Datasource does not exist
docker-compose-superset-1 | Traceback (most recent call last):
docker-compose-superset-1 |   File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/api/__init__.py", line 110, in wraps
docker-compose-superset-1 |     return f(self, *args, **kwargs)
docker-compose-superset-1 |   File "/app/superset/views/base_api.py", line 122, in wraps
docker-compose-superset-1 |     raise ex
docker-compose-superset-1 |   File "/app/superset/views/base_api.py", line 113, in wraps
docker-compose-superset-1 |     duration, response = time_function(f, self, *args, **kwargs)
docker-compose-superset-1 |   File "/app/superset/utils/core.py", line 1594, in time_function
docker-compose-superset-1 |     response = func(*args, **kwargs)
docker-compose-superset-1 |   File "/app/superset/utils/log.py", line 266, in wrapper
docker-compose-superset-1 |     value = f(*args, **kwargs)
docker-compose-superset-1 |   File "/app/superset/dashboards/api.py", line 394, in get_datasets
docker-compose-superset-1 |     datasets = DashboardDAO.get_datasets_for_dashboard(id_or_slug)
docker-compose-superset-1 |   File "/app/superset/dashboards/dao.py", line 64, in get_datasets_for_dashboard
docker-compose-superset-1 |     return dashboard.datasets_trimmed_for_slices()
docker-compose-superset-1 |   File "/usr/local/lib/python3.8/site-packages/flask_caching/__init__.py", line 905, in decorated_function
docker-compose-superset-1 |     return f(*args, **kwargs)
docker-compose-superset-1 |   File "/app/superset/models/dashboard.py", line 318, in datasets_trimmed_for_slices
docker-compose-superset-1 |     result.append(datasource.data_for_slices(slices))
docker-compose-superset-1 |   File "/app/superset/connectors/base/models.py", line 336, in data_for_slices
docker-compose-superset-1 |     query_context = slc.get_query_context()
docker-compose-superset-1 |   File "/app/superset/models/slice.py", line 280, in get_query_context
docker-compose-superset-1 |     return self.get_query_context_factory().create(
docker-compose-superset-1 |   File "/app/superset/common/query_context_factory.py", line 60, in create
docker-compose-superset-1 |     datasource_model_instance = self._convert_to_model(datasource)
docker-compose-superset-1 |   File "/app/superset/common/query_context_factory.py", line 98, in _convert_to_model
docker-compose-superset-1 |     return DatasourceDAO.get_datasource(
docker-compose-superset-1 |   File "/app/superset/datasource/dao.py", line 68, in get_datasource
docker-compose-superset-1 |     raise DatasourceNotFound()
docker-compose-superset-1 | superset.dao.exceptions.DatasourceNotFound: Datasource does not exist
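The traceback boils down to a stale query_context: a chart's saved JSON still points at datasource_id 2, which no longer exists after the upgrade/import, so DatasourceDAO.get_datasource raises DatasourceNotFound. A quick way to list affected charts before changing anything is to compare each slice's stored query_context with its slices.datasource_id column. A minimal sketch, assuming you fetch the rows yourself (find_mismatched_slices is a hypothetical helper, not part of Superset):

```python
import json

def find_mismatched_slices(rows):
    """Given (slice_id, datasource_id, query_context) tuples, return
    (slice_id, stored_id, actual_id) for every slice whose saved
    query_context points at a different datasource than the column."""
    mismatched = []
    for slice_id, datasource_id, query_context in rows:
        if not query_context:
            continue  # charts without a saved query_context are unaffected
        stored = json.loads(query_context).get("datasource", {}).get("id")
        if stored is not None and int(stored) != datasource_id:
            mismatched.append((slice_id, int(stored), datasource_id))
    return mismatched

# Example rows mirroring (slices.id, slices.datasource_id, slices.query_context)
rows = [
    (1, 7, json.dumps({"datasource": {"id": 7, "type": "table"}})),
    (2, 7, json.dumps({"datasource": {"id": 2, "type": "table"}})),  # stale
    (3, 7, None),
]
print(find_mismatched_slices(rows))  # → [(2, 2, 7)]
```

Running the equivalent check against the real slices table tells you how many charts the warning actually concerns.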
Another possibly related error from the logs:
docker-compose-supersetinit-1 | /usr/local/lib/python3.8/site-packages/flask_appbuilder/models/sqla/interface.py:64: SAWarning: relationship 'SqlaTable.slices' will copy column tables.id to column slices.datasource_id, which conflicts with relationship(s): 'Slice.table' (copies tables.id to slices.datasource_id). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards. To silence this warning, add the parameter 'overlaps="table"' to the 'SqlaTable.slices' relationship. (Background on this error at: https://sqlalche.me/e/14/qzyx)
Environment
superset version: 2.1.1
python --version: 3.8.18
node -v: 14

FEATURE_FLAGS = {
    "DYNAMIC_PLUGINS": False,
    "ENABLE_TEMPLATE_PROCESSING": True,
    "DASHBOARD_RBAC": True,
    "DASHBOARD_CROSS_FILTERS": True,
    "DASHBOARD_FILTERS_EXPERIMENTAL": True,
    "ENABLE_TEMPLATE_REMOVE_FILTERS": True,
    "DASHBOARD_NATIVE_FILTERS": True,
    "VERSIONED_EXPORT": False,
    "ALLOW_FULL_CSV_EXPORT": True,
    "HORIZONTAL_FILTER_BAR": True,
    "DRILL_TO_DETAIL": True,
}
Additional context
I have investigated this issue a bit, and I have found two possible fixes for this.
UPDATE slices
SET query_context = jsonb_set(
        query_context::jsonb,
        '{datasource,id}',
        ('"' || datasource_id || '"')::jsonb
    )
WHERE (query_context::json->'datasource'->>'id')::int != datasource_id;
After either of the fixes is applied, users no longer get the notification and everything seems to work fine. Fix number 2 feels quite tempting as we have almost 1000 charts in production, but I am a bit concerned about direct DB modifications. What do you think? Is the SQL fix appropriate, or should I just stick with the manual work? What are the possible issues and complications with direct DB modifications?
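If direct SQL feels risky, the same transformation can be dry-run row by row in application code and inspected before anything is committed. A minimal sketch of the per-row rewrite, assuming the same JSON shape as above (rewrite_query_context is a hypothetical helper; note the SQL UPDATE writes the id as a quoted JSON string, while this sketch writes a plain integer, so match whichever form your existing charts store):

```python
import json

def rewrite_query_context(query_context: str, datasource_id: int) -> str:
    """Return query_context with datasource.id forced to match the
    slices.datasource_id column (the same effect as the SQL UPDATE)."""
    ctx = json.loads(query_context)
    ctx.setdefault("datasource", {})["id"] = datasource_id
    return json.dumps(ctx)

# Dry-run on one stale chart: stored id 2, actual datasource_id 7
stale = json.dumps({"datasource": {"id": 2, "type": "table"}, "queries": []})
fixed = rewrite_query_context(stale, 7)
print(json.loads(fixed)["datasource"]["id"])  # → 7
```

Either way, taking a database backup first and testing on a staging copy of the metadata DB are the usual safeguards before bulk-editing the slices table.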