apache / superset

Apache Superset is a Data Visualization and Data Exploration Platform
https://superset.apache.org/
Apache License 2.0

Unable to login to new server? #7637

Closed SamuelMarks closed 5 years ago

SamuelMarks commented 5 years ago

I've removed the ~/.superset directory multiple times, and also tried configuring Postgres and Redis.

Expected results

Should be able to login.

Actual results

No errors in the console or network tab; the server log just says:

[2019-06-02 13:50:29 +0000] [20401] [INFO] Starting gunicorn 19.8.0
[2019-06-02 13:50:29 +0000] [20401] [INFO] Listening at: http://0.0.0.0:8080 (20401)
[2019-06-02 13:50:29 +0000] [20401] [INFO] Using worker: sync
[2019-06-02 13:50:29 +0000] [20406] [INFO] Booting worker with pid: 20406

Warning in console though:

Source map error: request failed with status 404
Resource URL: https://omitted/static/appbuilder/css/bootstrap.min.css
Source Map URL: bootstrap.min.css.map

Navigated to https://omitted/login/
Using //@ to indicate sourceMappingURL pragmas is deprecated. Use //# instead
jquery-latest.js:2:21
Source map error: request failed with status 404
Resource URL: https://omitted/static/appbuilder/js/jquery-latest.js
Source Map URL: jquery-1.10.2.min.map

Source map error: request failed with status 404
Resource URL: https://omitted/static/appbuilder/css/bootstrap.min.css
Source Map URL: bootstrap.min.css.map

How to reproduce the bug

  1. Follow this guide https://superset.incubator.apache.org/installation.html#superset-installation-and-initialization
  2. Setup postgres
  3. Setup redis
  4. Setup config

Additional context

/etc/nginx/sites-enabled/omitted from code example

server {
    server_name omitted from code example;
    listen 80;
    return 301 https://$server_name$request_uri;
}
server {
    server_name omitted from code example;
    listen 443;
    ssl on;
    ssl_certificate /etc/letsencrypt/live/omitted from code example/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/omitted from code example/privkey.pem;
    fastcgi_param HTTPS               on;
    fastcgi_param HTTP_SCHEME         https;

    location / {
        proxy_buffers 16 4k;
        proxy_buffer_size 2k;
        proxy_pass http://0.0.0.0:8080;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_http_version 1.1;
        proxy_connect_timeout 600;
        proxy_read_timeout 600;
        send_timeout 600;
    }
}

superset_config.py

import os

from werkzeug.contrib.cache import RedisCache

MAPBOX_API_KEY = os.getenv('MAPBOX_API_KEY', '')
CACHE_CONFIG = {
    'CACHE_TYPE': 'redis',
    'CACHE_DEFAULT_TIMEOUT': 300,
    'CACHE_KEY_PREFIX': 'superset_',
    'CACHE_REDIS_HOST': 'redis',
    'CACHE_REDIS_PORT': 6379,
    'CACHE_REDIS_DB': 1,
    'CACHE_REDIS_URL': 'redis://localhost:6379/1'}
SQLALCHEMY_DATABASE_URI = \
    'postgresql+psycopg2://omitted from code example:omitted from code example@localhost:5433/superset_db'
SQLALCHEMY_TRACK_MODIFICATIONS = True
SECRET_KEY = 'omitted from code example'

class CeleryConfig(object):
    BROKER_URL = 'redis://localhost:6379/0'
    CELERY_IMPORTS = ('superset.sql_lab', )
    CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
    CELERY_ANNOTATIONS = {'tasks.add': {'rate_limit': '10/s'}}

CELERY_CONFIG = CeleryConfig
RESULTS_BACKEND = RedisCache(
    host='redis',
    port=6379,
    key_prefix='superset_results'
)

/etc/circus/conf.d/circus.ini

[circus]
check_delay = 5
endpoint = tcp://127.0.0.1:5555
pubsub_endpoint = tcp://127.0.0.1:5556
statsd = true

[watcher:supserset]
working_dir = /home/ubuntu/repos/incubator-superset
cmd = gunicorn
args = superset:app
uid = ubuntu
numprocesses = 1
autostart = true
send_hup = true
stdout_stream.class = FileStream
stdout_stream.filename = /home/ubuntu/repos/logs/supserset.stdout.log
stdout_stream.max_bytes = 10485760
stdout_stream.backup_count = 4
stderr_stream.class = FileStream
stderr_stream.filename = /home/ubuntu/repos/logs/supserset.stderr.log
stderr_stream.max_bytes = 10485760
stderr_stream.backup_count = 4
virtualenv = /opt/venvs/superset
virtualenv_py_ver = 3.6
copy_env = true

[env:supserset]
TERM=rxvt-256color
SHELL=/bin/bash
USER=ubuntu
LANG=en_US.UTF-8
HOME=/home/ubuntu/repos
PORT=8080
SERVER=gunicorn

dpgaspar commented 5 years ago

Hi @SamuelMarks,

Can you be more specific than "Should be able to login."?

By the way, you probably have a typo in your circus config: [env:supserset]

SamuelMarks commented 5 years ago

Good find on the typo, fixed that. It didn't resolve any of the issues, though. Checking the Postgres database, it's fairly barren:

=> SELECT relname,n_live_tup 
      FROM pg_stat_user_tables 
      ORDER BY n_live_tup DESC;
         relname         | n_live_tup 
-------------------------+------------
 ab_role                 |          2
 ab_user_role            |          0
 ab_permission_view_role |          0
 ab_user                 |          0
 ab_permission           |          0
 ab_register_user        |          0
 ab_view_menu            |          0
 ab_permission_view      |          0
=> SELECT name FROM ab_role;
  name  
--------
 Admin
 Public
(2 rows)

To get more specific: everything seems to succeed, with no errors in stderr/stdout or the web console, and no bad HTTP responses. But login always fails:

Screen Shot 2019-06-04 at 12 05 54 am

BTW: I've also tried reverting to the superset_config.py from your master branch, and setting the relevant environment variables (FLASK_ENV=development, POSTGRES_USER, POSTGRES_PASSWORD, POSTGRES_HOST, POSTGRES_PORT, POSTGRES_DB, REDIS_HOST, REDIS_PORT).

dpgaspar commented 5 years ago

Yes, you need to create an admin user first, as well as bootstrap all permissions.

Run once on each deploy:

$ flask fab create-admin
....
$ superset init

SamuelMarks commented 5 years ago

Tried it your way, and have also followed this whole guide: https://superset.incubator.apache.org/installation.html#superset-installation-and-initialization

mistercrunch commented 5 years ago

Sounds like we need versioned docs...

mistercrunch commented 5 years ago

Latest: https://apache-superset.readthedocs.io/en/latest/installation.html

Previous versions: https://readthedocs.org/projects/apache-superset/

Actually looks like older versions of the docs won't build for reasons I don't have time to investigate at the moment.

dpgaspar commented 5 years ago

Hi @SamuelMarks,

You have tried it and it did not work?

If not, send me:

SamuelMarks commented 5 years ago

Okay, so I've removed the previous virtualenv, added the relevant env to Circus, and updated to f99ae1ad247298c8b8df3721768f69b13152aef0.

Wrote this little shell script to parse out the Circus environment:

#!/usr/bin/env bash

declare desired_env="${DESIRED_ENV:-'superset'}";
declare -i in_desired_env=0;

while read -r line; do
  if [[ "${line:0:5}" =~ '[env:'  ]]; then
    [[ "$line" =~ "$desired_env"  ]]; in_desired_env=$?;
  else
    if [ "$in_desired_env" = 1 ]; then
      echo "$line";
    fi
  fi
done</etc/circus/conf.d/circus.ini

Then exported it to my current shell:

$ while read -r line; do
  [[ "${line:0:5}" =~ 'SHELL' || "${line:0:4}" =~ 'TERM' || "${line:0:4}" =~ 'HOME' ]] || export "$line";
done< <( ~/repos/a.bash)
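
For reference, roughly the same extraction can be done with a one-liner; a sketch that assumes the section header is [env:superset] (after fixing the typo) and that it is the last section in the file:

# print the KEY=VALUE lines that follow the [env:superset] header
awk '/^\[env:superset\]/{found=1; next} found && NF' /etc/circus/conf.d/circus.ini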

Went through all the steps of https://apache-superset.readthedocs.io/en/latest/installation.html#superset-installation-and-initialization

Still left with the same login screen as before, no progress.


$ pip freeze
alembic==1.0.0
amqp==2.3.2
apache-superset==0.999.0.dev0
apispec==1.2.0
asn1crypto==0.24.0
attrs==19.1.0
Babel==2.6.0
billiard==3.5.0.4
bleach==3.0.2
celery==4.2.0
certifi==2018.8.24
cffi==1.11.5
chardet==3.0.4
click==6.7
colorama==0.3.9
contextlib2==0.5.5
croniter==0.3.29
cryptography==2.4.2
decorator==4.3.0
defusedxml==0.5.0
Flask==1.0.2
Flask-AppBuilder==2.0.0
Flask-Babel==0.11.1
Flask-Caching==1.4.0
Flask-Compress==1.4.0
Flask-JWT-Extended==3.18.1
Flask-Login==0.4.1
Flask-Migrate==2.1.1
Flask-OpenID==1.2.5
Flask-SQLAlchemy==2.3.2
flask-talisman==0.6.0
Flask-WTF==0.14.2
geopy==1.11.0
gunicorn==19.8.0
humanize==0.5.1
idna==2.6
isodate==0.6.0
itsdangerous==0.24
Jinja2==2.10.1
jsonschema==3.0.1
kombu==4.2.1
Mako==1.0.7
Markdown==3.0
MarkupSafe==1.0
marshmallow==2.19.2
marshmallow-enum==1.4.1
marshmallow-sqlalchemy==0.16.2
numpy==1.15.2
pandas==0.23.4
parsedatetime==2.0
pathlib2==2.3.0
pkg-resources==0.0.0
polyline==1.3.2
prison==0.1.0
py==1.7.0
pycparser==2.19
pydruid==0.5.3
PyJWT==1.7.1
pyrsistent==0.14.11
python-dateutil==2.6.1
python-dotenv==0.10.1
python-editor==1.0.3
python-geohash==0.8.5
python3-openid==3.1.0
pytz==2018.5
PyYAML==5.1
requests==2.22.0
retry==0.9.2
selenium==3.141.0
simplejson==3.15.0
six==1.11.0
SQLAlchemy==1.3.1
SQLAlchemy-Utils==0.33.11
sqlparse==0.2.4
urllib3==1.24.3
vine==1.1.4
webencodings==0.5.1
Werkzeug==0.14.1
WTForms==2.2.1
WTForms-JSON==0.3.
$ superset db upgrade
INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade  -> 4e6a06bad7a8, Init
INFO  [alembic.runtime.migration] Running upgrade 4e6a06bad7a8 -> 5a7bad26f2a7, empty message
INFO  [alembic.runtime.migration] Running upgrade 5a7bad26f2a7 -> 1e2841a4128, empty message
INFO  [alembic.runtime.migration] Running upgrade 1e2841a4128 -> 2929af7925ed, TZ offsets in data sources
INFO  [alembic.runtime.migration] Running upgrade 2929af7925ed -> 289ce07647b, Add encrypted password field
INFO  [alembic.runtime.migration] Running upgrade 289ce07647b -> 1a48a5411020, adding slug to dash
INFO  [alembic.runtime.migration] Running upgrade 1a48a5411020 -> 315b3f4da9b0, adding log model
INFO  [alembic.runtime.migration] Running upgrade 315b3f4da9b0 -> 55179c7f25c7, sqla_descr
INFO  [alembic.runtime.migration] Running upgrade 55179c7f25c7 -> 12d55656cbca, is_featured
/opt/venvs/superset/lib/python3.6/site-packages/alembic/util/messaging.py:69: UserWarning: Skipping unsupported ALTER for creation of implicit constraint
  warnings.warn(msg)
INFO  [alembic.runtime.migration] Running upgrade 12d55656cbca -> 2591d77e9831, user_id
INFO  [alembic.runtime.migration] Running upgrade 2591d77e9831 -> 8e80a26a31db, empty message
INFO  [alembic.runtime.migration] Running upgrade 8e80a26a31db -> 7dbf98566af7, empty message
INFO  [alembic.runtime.migration] Running upgrade 7dbf98566af7 -> 43df8de3a5f4, empty message
INFO  [alembic.runtime.migration] Running upgrade 43df8de3a5f4 -> d827694c7555, css templates
INFO  [alembic.runtime.migration] Running upgrade d827694c7555 -> 430039611635, log more
INFO  [alembic.runtime.migration] Running upgrade 430039611635 -> 18e88e1cc004, making audit nullable
INFO  [alembic.runtime.migration] Running upgrade 18e88e1cc004 -> 836c0bf75904, cache_timeouts
INFO  [alembic.runtime.migration] Running upgrade 18e88e1cc004 -> a2d606a761d9, adding favstar model
INFO  [alembic.runtime.migration] Running upgrade a2d606a761d9, 836c0bf75904 -> d2424a248d63, empty message
INFO  [alembic.runtime.migration] Running upgrade d2424a248d63 -> 763d4b211ec9, fixing audit fk
INFO  [alembic.runtime.migration] Running upgrade d2424a248d63 -> 1d2ddd543133, log dt
INFO  [alembic.runtime.migration] Running upgrade 1d2ddd543133, 763d4b211ec9 -> fee7b758c130, empty message
INFO  [alembic.runtime.migration] Running upgrade fee7b758c130 -> 867bf4f117f9, Adding extra field to Database model
INFO  [alembic.runtime.migration] Running upgrade 867bf4f117f9 -> bb51420eaf83, add schema to table model
INFO  [alembic.runtime.migration] Running upgrade bb51420eaf83 -> b4456560d4f3, change_table_unique_constraint
INFO  [alembic.runtime.migration] Running upgrade b4456560d4f3 -> 4fa88fe24e94, owners_many_to_many
INFO  [alembic.runtime.migration] Running upgrade 4fa88fe24e94 -> c3a8f8611885, Materializing permission
INFO  [alembic.runtime.migration] Running upgrade c3a8f8611885 -> f0fbf6129e13, Adding verbose_name to tablecolumn
INFO  [alembic.runtime.migration] Running upgrade f0fbf6129e13 -> 956a063c52b3, adjusting key length
INFO  [alembic.runtime.migration] Running upgrade 956a063c52b3 -> 1226819ee0e3, Fix wrong constraint on table columns
INFO  [alembic.runtime.migration] Running upgrade 1226819ee0e3 -> d8bc074f7aad, Add new field 'is_restricted' to SqlMetric and DruidMetric
INFO  [alembic.runtime.migration] Running upgrade d8bc074f7aad -> 27ae655e4247, Make creator owners
INFO  [alembic.runtime.migration] Running upgrade 27ae655e4247 -> 960c69cb1f5b, add dttm_format related fields in table_columns
INFO  [alembic.runtime.migration] Running upgrade 960c69cb1f5b -> f162a1dea4c4, d3format_by_metric
INFO  [alembic.runtime.migration] Running upgrade f162a1dea4c4 -> ad82a75afd82, Update models to support storing the queries.
INFO  [alembic.runtime.migration] Running upgrade ad82a75afd82 -> 3c3ffe173e4f, add_sql_string_to_table
INFO  [alembic.runtime.migration] Running upgrade 3c3ffe173e4f -> 41f6a59a61f2, database options for sql lab
INFO  [alembic.runtime.migration] Running upgrade 41f6a59a61f2 -> 4500485bde7d, allow_run_sync_async
INFO  [alembic.runtime.migration] Running upgrade 4500485bde7d -> 65903709c321, allow_dml
INFO  [alembic.runtime.migration] Running upgrade 41f6a59a61f2 -> 33d996bcc382
INFO  [alembic.runtime.migration] Running upgrade 33d996bcc382, 65903709c321 -> b347b202819b, empty message
INFO  [alembic.runtime.migration] Running upgrade b347b202819b -> 5e4a03ef0bf0, Add access_request table to manage requests to access datastores.
INFO  [alembic.runtime.migration] Running upgrade 5e4a03ef0bf0 -> eca4694defa7, sqllab_setting_defaults
INFO  [alembic.runtime.migration] Running upgrade eca4694defa7 -> ab3d66c4246e, add_cache_timeout_to_druid_cluster
INFO  [alembic.runtime.migration] Running upgrade eca4694defa7 -> 3b626e2a6783, Sync DB with the models.py.
WARNI [root] Constraint must have a name
INFO  [alembic.runtime.migration] Running upgrade 3b626e2a6783, ab3d66c4246e -> ef8843b41dac, empty message
INFO  [alembic.runtime.migration] Running upgrade ef8843b41dac -> b46fa1b0b39e, Add json_metadata to the tables table.
INFO  [alembic.runtime.migration] Running upgrade b46fa1b0b39e -> 7e3ddad2a00b, results_key to query
INFO  [alembic.runtime.migration] Running upgrade 7e3ddad2a00b -> ad4d656d92bc, Add avg() to default metrics
INFO  [alembic.runtime.migration] Running upgrade ad4d656d92bc -> c611f2b591b8, dim_spec
INFO  [alembic.runtime.migration] Running upgrade c611f2b591b8 -> e46f2d27a08e, materialize perms
INFO  [alembic.runtime.migration] Running upgrade e46f2d27a08e -> f1f2d4af5b90, Enable Filter Select
INFO  [alembic.runtime.migration] Running upgrade e46f2d27a08e -> 525c854f0005, log_this_plus
INFO  [alembic.runtime.migration] Running upgrade 525c854f0005, f1f2d4af5b90 -> 6414e83d82b7, empty message
INFO  [alembic.runtime.migration] Running upgrade 6414e83d82b7 -> 1296d28ec131, Adds params to the datasource (druid) table
INFO  [alembic.runtime.migration] Running upgrade 1296d28ec131 -> f18570e03440, Add index on the result key to the query table.
INFO  [alembic.runtime.migration] Running upgrade f18570e03440 -> bcf3126872fc, Add keyvalue table
INFO  [alembic.runtime.migration] Running upgrade f18570e03440 -> db0c65b146bd, update_slice_model_json
INFO  [alembic.runtime.migration] Running upgrade db0c65b146bd -> a99f2f7c195a, rewriting url from shortner with new format
INFO  [alembic.runtime.migration] Running upgrade a99f2f7c195a, bcf3126872fc -> d6db5a5cdb5d, empty message
INFO  [alembic.runtime.migration] Running upgrade d6db5a5cdb5d -> b318dfe5fb6c, adding verbose_name to druid column
INFO  [alembic.runtime.migration] Running upgrade d6db5a5cdb5d -> 732f1c06bcbf, add fetch values predicate
INFO  [alembic.runtime.migration] Running upgrade 732f1c06bcbf, b318dfe5fb6c -> ea033256294a, empty message
INFO  [alembic.runtime.migration] Running upgrade b318dfe5fb6c -> db527d8c4c78, Add verbose name to DruidCluster and Database
INFO  [alembic.runtime.migration] Running upgrade db527d8c4c78, ea033256294a -> 979c03af3341, empty message
INFO  [alembic.runtime.migration] Running upgrade 979c03af3341 -> a6c18f869a4e, query.start_running_time
INFO  [alembic.runtime.migration] Running upgrade a6c18f869a4e -> 2fcdcb35e487, saved_queries
INFO  [alembic.runtime.migration] Running upgrade 2fcdcb35e487 -> a65458420354, add_result_backend_time_logging
INFO  [alembic.runtime.migration] Running upgrade a65458420354 -> ca69c70ec99b, tracking_url
INFO  [alembic.runtime.migration] Running upgrade ca69c70ec99b -> a9c47e2c1547, add impersonate_user to dbs
INFO  [alembic.runtime.migration] Running upgrade ca69c70ec99b -> ddd6ebdd853b, annotations
INFO  [alembic.runtime.migration] Running upgrade a9c47e2c1547, ddd6ebdd853b -> d39b1e37131d, empty message
INFO  [alembic.runtime.migration] Running upgrade ca69c70ec99b -> 19a814813610, Adding metric warning_text
INFO  [alembic.runtime.migration] Running upgrade 19a814813610, a9c47e2c1547 -> 472d2f73dfd4, empty message
INFO  [alembic.runtime.migration] Running upgrade 472d2f73dfd4, d39b1e37131d -> f959a6652acd, empty message
INFO  [alembic.runtime.migration] Running upgrade f959a6652acd -> 4736ec66ce19, empty message
/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/dialects/sqlite/base.py:1838: SAWarning: WARNING: SQL-parsed foreign key constraint '('datasource_name', 'datasources', 'datasource_name')' could not be located in PRAGMA foreign_keys for table metrics
  "foreign_keys for table %s" % (sig, table_name)
INFO  [alembic.runtime.migration] Running upgrade 4736ec66ce19 -> 67a6ac9b727b, update_spatial_params
INFO  [alembic.runtime.migration] Running upgrade 67a6ac9b727b -> 21e88bc06c02
INFO  [alembic.runtime.migration] Running upgrade 21e88bc06c02 -> e866bd2d4976, smaller_grid
Revision ID: e866bd2d4976
Revises: 21e88bc06c02
Create Date: 2018-02-13 08:07:40.766277
INFO  [alembic.runtime.migration] Running upgrade e866bd2d4976 -> e68c4473c581, allow_multi_schema_metadata_fetch
INFO  [alembic.runtime.migration] Running upgrade e68c4473c581 -> f231d82b9b26, empty message
INFO  [alembic.runtime.migration] Running upgrade f231d82b9b26 -> bf706ae5eb46, cal_heatmap_metric_to_metrics
INFO  [alembic.runtime.migration] Running upgrade f231d82b9b26 -> 30bb17c0dc76, empty message
INFO  [alembic.runtime.migration] Running upgrade 30bb17c0dc76, bf706ae5eb46 -> c9495751e314, empty message
INFO  [alembic.runtime.migration] Running upgrade f231d82b9b26 -> 130915240929, is_sqllab_view
INFO  [alembic.runtime.migration] Running upgrade 130915240929, c9495751e314 -> 5ccf602336a0, empty message
INFO  [alembic.runtime.migration] Running upgrade 5ccf602336a0 -> e502db2af7be, add template_params to tables
INFO  [alembic.runtime.migration] Running upgrade e502db2af7be -> c5756bec8b47, Time grain SQLA
INFO  [alembic.runtime.migration] Running upgrade c5756bec8b47 -> afb7730f6a9c, remove empty filters
INFO  [alembic.runtime.migration] Running upgrade afb7730f6a9c -> 80a67c5192fa, single pie chart metric
INFO  [alembic.runtime.migration] Running upgrade 80a67c5192fa -> bddc498dd179, adhoc filters
INFO  [alembic.runtime.migration] Running upgrade bddc498dd179 -> 3dda56f1c4c6, Migrate num_period_compare and period_ratio_type
INFO  [alembic.runtime.migration] Running upgrade 3dda56f1c4c6 -> 1d9e835a84f9, empty message
INFO  [alembic.runtime.migration] Running upgrade bddc498dd179 -> 4451805bbaa1, remove double percents
INFO  [alembic.runtime.migration] Running upgrade 4451805bbaa1, 1d9e835a84f9 -> 705732c70154, empty message
INFO  [alembic.runtime.migration] Running upgrade 4451805bbaa1, 1d9e835a84f9 -> fc480c87706c, empty message
INFO  [alembic.runtime.migration] Running upgrade fc480c87706c -> bebcf3fed1fe, Migrate dashboard position_json data from V1 to V2
INFO  [alembic.runtime.migration] Running upgrade bebcf3fed1fe, 705732c70154 -> ec1f88a35cc6, empty message
INFO  [alembic.runtime.migration] Running upgrade 4451805bbaa1, 1d9e835a84f9 -> e3970889f38e, empty message
INFO  [alembic.runtime.migration] Running upgrade 705732c70154, e3970889f38e -> 46ba6aaaac97, empty message
INFO  [alembic.runtime.migration] Running upgrade 46ba6aaaac97, ec1f88a35cc6 -> c18bd4186f15, empty message
INFO  [alembic.runtime.migration] Running upgrade c18bd4186f15 -> 7fcdcde0761c, Reduce position_json size by remove extra space and component id prefix
INFO  [alembic.runtime.migration] Running upgrade 7fcdcde0761c -> 0c5070e96b57, add user attributes table
INFO  [alembic.runtime.migration] Running upgrade 0c5070e96b57 -> 1a1d627ebd8e, position_json
INFO  [alembic.runtime.migration] Running upgrade 1a1d627ebd8e -> 55e910a74826, add_metadata_column_to_annotation_model.py
INFO  [alembic.runtime.migration] Running upgrade 55e910a74826 -> 4ce8df208545, empty message
INFO  [alembic.runtime.migration] Running upgrade 4ce8df208545 -> 46f444d8b9b7, remove_coordinator_from_druid_cluster_model.py
INFO  [alembic.runtime.migration] Running upgrade 46f444d8b9b7 -> a61b40f9f57f, remove allow_run_sync
INFO  [alembic.runtime.migration] Running upgrade a61b40f9f57f -> 6c7537a6004a, models for email reports
INFO  [alembic.runtime.migration] Running upgrade 6c7537a6004a -> 3e1b21cd94a4, change_owner_to_m2m_relation_on_datasources.py
INFO  [alembic.runtime.migration] Running upgrade 6c7537a6004a -> cefabc8f7d38, Increase size of name column in ab_view_menu
INFO  [alembic.runtime.migration] Running upgrade 55e910a74826 -> 0b1f1ab473c0, Add extra column to Query
INFO  [alembic.runtime.migration] Running upgrade 0b1f1ab473c0, cefabc8f7d38, 3e1b21cd94a4 -> de021a1ca60d, empty message
INFO  [alembic.runtime.migration] Running upgrade de021a1ca60d -> fb13d49b72f9, better_filters
INFO  [alembic.runtime.migration] Running upgrade fb13d49b72f9 -> a33a03f16c4a, Add extra column to SavedQuery
INFO  [alembic.runtime.migration] Running upgrade 4451805bbaa1, 1d9e835a84f9 -> c829ff0b37d0, empty message
INFO  [alembic.runtime.migration] Running upgrade c829ff0b37d0 -> 7467e77870e4, remove_aggs
INFO  [alembic.runtime.migration] Running upgrade 7467e77870e4, de021a1ca60d -> fbd55e0f83eb, empty message
INFO  [alembic.runtime.migration] Running upgrade fbd55e0f83eb, fb13d49b72f9 -> 8b70aa3d0f87, empty message
INFO  [alembic.runtime.migration] Running upgrade 8b70aa3d0f87, a33a03f16c4a -> 18dc26817ad2, empty message
INFO  [alembic.runtime.migration] Running upgrade 18dc26817ad2 -> c617da68de7d, form nullable
INFO  [alembic.runtime.migration] Running upgrade c617da68de7d -> c82ee8a39623, Add implicit tags
INFO  [alembic.runtime.migration] Running upgrade 18dc26817ad2 -> e553e78e90c5, add_druid_auth_py.py
INFO  [alembic.runtime.migration] Running upgrade e553e78e90c5, c82ee8a39623 -> 45e7da7cfeba, empty message
INFO  [alembic.runtime.migration] Running upgrade 45e7da7cfeba -> 80aa3f04bc82, Add Parent ids in dashboard layout metadata
INFO  [alembic.runtime.migration] Running upgrade 80aa3f04bc82 -> d94d33dbe938, form strip
INFO  [alembic.runtime.migration] Running upgrade d94d33dbe938 -> 937d04c16b64, update datasources
INFO  [alembic.runtime.migration] Running upgrade 937d04c16b64 -> 7f2635b51f5d, update base columns
INFO  [alembic.runtime.migration] Running upgrade 7f2635b51f5d -> e9df189e5c7e, update base metrics
INFO  [alembic.runtime.migration] Running upgrade e9df189e5c7e -> afc69274c25a, update the sql, select_sql, and executed_sql columns in the
   query table in mysql dbs to be long text columns
$ flask fab create-admin
Username [admin]: foo
User first name [admin]: bar
User last name [user]: can
Email [admin@fab.org]: haz@bar.com
Password: 
Repeat for confirmation
$ superset load_examples
Loading examples into <SQLA engine=sqlite:////home/ubuntu/.superset/superset.db>
Creating default CSS templates
Loading energy related dataset
2019-06-04 08:24:56,204:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:24:56,688:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/energy.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:24:57,049:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/energy.json.gz HTTP/1.1" 302 151
2019-06-04 08:24:57,051:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:24:57,451:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/energy.json.gz HTTP/1.1" 200 985
Creating table [wb_health_population] reference
2019-06-04 08:24:57,565:INFO:root:Creating database reference
2019-06-04 08:24:57,690:INFO:root:Database.get_sqla_engine(). Masked URL: sqlite:////home/ubuntu/.superset/superset.db
Loading [World Bank's Health Nutrition and Population Stats]
2019-06-04 08:24:57,953:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:24:58,460:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/countries.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:24:58,843:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/countries.json.gz HTTP/1.1" 302 154
2019-06-04 08:24:58,845:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:00,625:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/countries.json.gz HTTP/1.1" 200 14752439
Creating table [wb_health_population] reference
2019-06-04 08:25:08,114:INFO:root:Creating database reference
Creating slices
Creating a World's Health Bank dashboard
Loading [Birth names]
2019-06-04 08:25:09,209:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:09,600:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/birth_names.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:10,007:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/birth_names.json.gz HTTP/1.1" 302 156
2019-06-04 08:25:10,009:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:12,160:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/birth_names.json.gz HTTP/1.1" 200 734913
Done loading table!
--------------------------------------------------------------------------------
Creating table [birth_names] reference
2019-06-04 08:25:13,527:INFO:root:Creating database reference
Creating some slices
Creating a dashboard
Loading [Unicode test data]
2019-06-04 08:25:14,789:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:15,121:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/unicode_utf8_unixnl_test.csv?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:15,494:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/unicode_utf8_unixnl_test.csv HTTP/1.1" 302 165
2019-06-04 08:25:15,496:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:15,868:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/unicode_utf8_unixnl_test.csv HTTP/1.1" 200 3247
Done loading table!
--------------------------------------------------------------------------------
Creating table [unicode_test] reference
2019-06-04 08:25:16,005:INFO:root:Creating database reference
Creating a slice
Creating a dashboard
Loading [Random time series data]
2019-06-04 08:25:16,266:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:17,853:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/random_time_series.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:18,220:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/random_time_series.json.gz HTTP/1.1" 302 163
2019-06-04 08:25:18,222:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:19,510:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/random_time_series.json.gz HTTP/1.1" 200 264250
Done loading table!
--------------------------------------------------------------------------------
Creating table [random_time_series] reference
2019-06-04 08:25:20,006:INFO:root:Creating database reference
Creating a slice
Loading [Random long/lat data]
2019-06-04 08:25:20,203:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:20,520:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/san_francisco.csv.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:20,998:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/san_francisco.csv.gz HTTP/1.1" 302 157
2019-06-04 08:25:21,000:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:21,479:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/san_francisco.csv.gz HTTP/1.1" 200 2103726
Done loading table!
--------------------------------------------------------------------------------
Creating table reference
2019-06-04 08:25:34,007:INFO:root:Creating database reference
Creating a slice
Loading [Country Map data]
2019-06-04 08:25:34,242:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:34,692:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/birth_france_data_for_country_map.csv?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:34,969:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/birth_france_data_for_country_map.csv HTTP/1.1" 302 174
2019-06-04 08:25:34,970:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:35,332:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/birth_france_data_for_country_map.csv HTTP/1.1" 200 3150
Done loading table!
--------------------------------------------------------------------------------
Creating table reference
2019-06-04 08:25:35,475:INFO:root:Creating database reference
Creating a slice
Loading [Multiformat time series]
2019-06-04 08:25:35,707:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:36,113:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/multiformat_time_series.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:36,429:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/multiformat_time_series.json.gz HTTP/1.1" 302 168
2019-06-04 08:25:36,431:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:36,801:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/multiformat_time_series.json.gz HTTP/1.1" 200 38387
Done loading table!
--------------------------------------------------------------------------------
Creating table [multiformat_time_series] reference
2019-06-04 08:25:36,943:INFO:root:Creating database reference
Creating Heatmap charts
Loading [Paris GeoJson]
2019-06-04 08:25:37,594:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:37,928:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/paris_iris.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:38,487:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/paris_iris.json.gz HTTP/1.1" 302 155
2019-06-04 08:25:38,489:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:41,291:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/paris_iris.json.gz HTTP/1.1" 200 4427142
Creating table paris_iris_mapping reference
2019-06-04 08:25:43,056:INFO:root:Creating database reference
Loading [San Francisco population polygons]
2019-06-04 08:25:43,188:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:43,589:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/sf_population.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:43,894:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/sf_population.json.gz HTTP/1.1" 302 158
2019-06-04 08:25:43,896:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:44,706:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/sf_population.json.gz HTTP/1.1" 200 47217
Creating table sf_population_polygons reference
2019-06-04 08:25:44,881:INFO:root:Creating database reference
Loading [Flights data]
2019-06-04 08:25:45,020:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:45,371:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/flight_data.csv.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:45,765:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/flight_data.csv.gz HTTP/1.1" 302 155
2019-06-04 08:25:45,767:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:46,153:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/flight_data.csv.gz HTTP/1.1" 200 1897423
2019-06-04 08:25:46,385:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:46,745:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/airports.csv.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:47,155:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/airports.csv.gz HTTP/1.1" 302 152
2019-06-04 08:25:47,157:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:47,518:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/airports.csv.gz HTTP/1.1" 200 9836
2019-06-04 08:25:49,920:INFO:root:Creating database reference
Done loading table!
Loading [BART lines]
2019-06-04 08:25:50,083:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:50,431:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/bart-lines.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:50,739:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/bart-lines.json.gz HTTP/1.1" 302 155
2019-06-04 08:25:50,741:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:51,109:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/bart-lines.json.gz HTTP/1.1" 200 1267
Creating table bart_lines reference
2019-06-04 08:25:51,236:INFO:root:Creating database reference
Loading [Multi Line]
2019-06-04 08:25:51,359:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:25:51,687:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/countries.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:25:51,695:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/countries.json.gz HTTP/1.1" 302 154
2019-06-04 08:25:51,697:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:25:51,714:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/countries.json.gz HTTP/1.1" 200 14752439
Creating table [wb_health_population] reference
2019-06-04 08:26:00,064:INFO:root:Creating database reference
Creating slices
Creating a World's Health Bank dashboard
2019-06-04 08:26:01,051:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): github.com:443
2019-06-04 08:26:01,416:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/blob/master/birth_names.json.gz?raw=true HTTP/1.1" 302 None
2019-06-04 08:26:01,889:DEBUG:urllib3.connectionpool:https://github.com:443 "GET /apache-superset/examples-data/raw/master/birth_names.json.gz HTTP/1.1" 302 None
2019-06-04 08:26:01,891:DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): raw.githubusercontent.com:443
2019-06-04 08:26:01,938:DEBUG:urllib3.connectionpool:https://raw.githubusercontent.com:443 "GET /apache-superset/examples-data/master/birth_names.json.gz HTTP/1.1" 200 734913
Done loading table!
--------------------------------------------------------------------------------
Creating table [birth_names] reference
2019-06-04 08:26:03,461:INFO:root:Creating database reference
Creating some slices
Creating a dashboard
Loading [Misc Charts] dashboard
Creating the dashboard
Loading DECK.gl demo
Loading deck.gl dashboard
Creating Scatterplot slice
Creating Screen Grid slice
Creating Hex slice
Creating Grid slice
Creating Polygon slice
Creating Arc slice
Creating Path slice
Creating a dashboard
Loading [Tabbed dashboard]
Creating a dashboard with nested tabs
$ superset init
2019-06-04 08:26:32,942:INFO:root:Creating database reference
2019-06-04 08:27:19,598:INFO:root:Syncing role definition
2019-06-04 08:27:20,065:INFO:root:Syncing Admin perms
2019-06-04 08:27:20,205:INFO:root:Syncing Alpha perms
2019-06-04 08:27:20,725:INFO:root:Syncing Gamma perms
2019-06-04 08:27:21,193:INFO:root:Syncing granter perms
2019-06-04 08:27:21,595:INFO:root:Syncing sql_lab perms
2019-06-04 08:27:22,033:INFO:root:Fetching a set of all perms to lookup which ones are missing
2019-06-04 08:27:22,112:INFO:root:Creating missing datasource permissions.
2019-06-04 08:27:22,118:INFO:root:Creating missing database permissions.
2019-06-04 08:27:22,222:INFO:root:Creating missing metrics permissions
2019-06-04 08:27:22,226:INFO:root:Cleaning faulty perms
=> SELECT relname,n_live_tup FROM pg_stat_user_tables ORDER BY n_live_tup DESC;
         relname         | n_live_tup 
-------------------------+------------
 ab_role                 |          2
 ab_user_role            |          0
 ab_permission_view_role |          0
 ab_user                 |          0
 ab_permission           |          0
 ab_register_user        |          0
 ab_view_menu            |          0
 ab_permission_view      |          0
(8 rows)

Actually, taking a closer look at this, it seems to be storing everything in SQLite rather than Postgres. How do I switch? - I've set all the relevant environment variables.

dpgaspar commented 5 years ago

Yes, that seems to be the problem: you still have no users on PG. Your superset db upgrade output shows it is using SQLite.

Some more questions:

SamuelMarks commented 5 years ago

~/repos/incubator-superset/superset/superset_config.py

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
import os

def get_env_variable(var_name, default=None):
    """Get the environment variable or raise exception."""
    try:
        return os.environ[var_name]
    except KeyError:
        if default is not None:
            return default
        else:
            error_msg = 'The environment variable {} was missing, abort...'\
                        .format(var_name)
            raise EnvironmentError(error_msg)

POSTGRES_USER = get_env_variable('POSTGRES_USER')
POSTGRES_PASSWORD = get_env_variable('POSTGRES_PASSWORD')
POSTGRES_HOST = get_env_variable('POSTGRES_HOST')
POSTGRES_PORT = get_env_variable('POSTGRES_PORT')
POSTGRES_DB = get_env_variable('POSTGRES_DB')

# The SQLAlchemy connection string.
SQLALCHEMY_DATABASE_URI = 'postgresql://%s:%s@%s:%s/%s' % (POSTGRES_USER,
                                                           POSTGRES_PASSWORD,
                                                           POSTGRES_HOST,
                                                           POSTGRES_PORT,
                                                           POSTGRES_DB)

REDIS_HOST = get_env_variable('REDIS_HOST')
REDIS_PORT = get_env_variable('REDIS_PORT')

class CeleryConfig(object):
    BROKER_URL = 'redis://%s:%s/0' % (REDIS_HOST, REDIS_PORT)
    CELERY_IMPORTS = ('superset.sql_lab', )
    CELERY_RESULT_BACKEND = 'redis://%s:%s/1' % (REDIS_HOST, REDIS_PORT)
    CELERY_ANNOTATIONS = {'tasks.add': {'rate_limit': '10/s'}}
    CELERY_TASK_PROTOCOL = 1

CELERY_CONFIG = CeleryConfig
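
In other words, this file expects at least the following variables to be present in the environment (placeholder values; the names are the ones used above, and the host/port values match the connection string posted earlier):

export POSTGRES_USER=superset        # placeholder
export POSTGRES_PASSWORD='omitted'   # placeholder
export POSTGRES_HOST=localhost
export POSTGRES_PORT=5433
export POSTGRES_DB=superset_db
export REDIS_HOST=localhost
export REDIS_PORT=6379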

Thanks to the Bash script I wrote above, I am exporting the Circus environment into my current shell before running these commands.


Hmm, is it in the right location? - I do pip uninstall -y apache-superset; pip install . each time


EDIT: Hmm, adding the superset_config.py path explicitly to my PYTHONPATH is making progress.
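
Concretely, the workaround is along these lines (a sketch; the path is the directory holding superset_config.py, taken from above, since Superset only picks the file up if that directory is importable):

# make the directory containing superset_config.py importable
export PYTHONPATH="/home/ubuntu/repos/incubator-superset/superset:${PYTHONPATH}"
# re-run the upgrade; the Alembic banner should now read "Context impl PostgresqlImpl." rather than SQLiteImpl
superset db upgrade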

SamuelMarks commented 5 years ago

Great news! - made progress. Now a 500 server error:

[2019-06-04 10:10:19 +0000] [118228] [ERROR] Error handling request /superset/welcome
Traceback (most recent call last):
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1244, in _execute_context
    cursor, statement, parameters, context
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 552, in do_execute
    cursor.execute(statement, parameters)
psycopg2.errors.UndefinedTable: relation "user_attribute" does not exist
LINE 2: FROM user_attribute 
             ^

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/venvs/superset/lib/python3.6/site-packages/gunicorn/workers/sync.py", line 135, in handle
    self.handle_request(listener, req, client, addr)
  File "/opt/venvs/superset/lib/python3.6/site-packages/gunicorn/workers/sync.py", line 176, in handle_request
    respiter = self.wsgi(environ, resp.start_response)
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/app.py", line 2309, in __call__
    return self.wsgi_app(environ, start_response)
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/app.py", line 2295, in wsgi_app
    response = self.handle_exception(e)
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/app.py", line 1741, in handle_exception
    reraise(exc_type, exc_value, tb)
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/_compat.py", line 35, in reraise
    raise value
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/app.py", line 2292, in wsgi_app
    response = self.full_dispatch_request()
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/app.py", line 1815, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/app.py", line 1718, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/_compat.py", line 35, in reraise
    raise value
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/app.py", line 1813, in full_dispatch_request
    rv = self.dispatch_request()
  File "/opt/venvs/superset/lib/python3.6/site-packages/flask/app.py", line 1799, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/ubuntu/repos/incubator-superset/superset/views/core.py", line 2892, in welcome
    .filter_by(user_id=g.user.get_id())
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3305, in scalar
    ret = self.one()
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3275, in one
    ret = self.one_or_none()
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3244, in one_or_none
    ret = list(self)
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3317, in __iter__
    return self._execute_and_instances(context)
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3342, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 988, in execute
    return meth(self, multiparams, params)
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 287, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1107, in _execute_clauseelement
    distilled_params,
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1248, in _execute_context
    e, statement, parameters, cursor, context
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1466, in _handle_dbapi_exception
    util.raise_from_cause(sqlalchemy_exception, exc_info)
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 383, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb, cause=cause)
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 128, in reraise
    raise value.with_traceback(tb)
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1244, in _execute_context
    cursor, statement, parameters, context
  File "/opt/venvs/superset/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 552, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.UndefinedTable) relation "user_attribute" does not exist
LINE 2: FROM user_attribute 
             ^

[SQL: SELECT user_attribute.welcome_dashboard_id AS user_attribute_welcome_dashboard_id 
FROM user_attribute 
WHERE user_attribute.user_id = %(user_id_1)s]
[parameters: {'user_id_1': '1'}]
(Background on this error at: http://sqlalche.me/e/f405)

EDIT: Going to remove the database in postgres and go through this whole run-around again

dpgaspar commented 5 years ago

That happens when superset db upgrade did not run (on the correct DB).
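
A quick way to confirm which database the migrations actually ran against is to check Alembic's bookkeeping table from psql while connected to the Postgres metadata DB (alembic_version is the standard table Alembic maintains):

-- run against the Postgres metadata DB, not the SQLite file
SELECT version_num FROM alembic_version;
SELECT count(*) FROM user_attribute;  -- the table the 500 above complained about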

SamuelMarks commented 5 years ago

Confirmed! - After DROP DATABASE and CREATE DATABASE, then doing that whole run-around, it's now working.

🍾

The only weirdness is that I have a .superset folder being created in two places: $HOME/.superset and $HOME/repos/.superset.

Thanks for the help @dpgaspar @mistercrunch

SamuelMarks commented 5 years ago

Actually maybe I spoke a tad too early:

Screen Shot 2019-06-04 at 8 45 04 pm
{
  "cache_key": null,
  "cached_dttm": null,
  "cache_timeout": 86400,
  "error": "column \"sp_rur_totl_zs\" does not exist\nLINE 2:        sum(SP_RUR_TOTL_ZS) AS \"sum__SP_RUR_TOTL_ZS\",\n                   ^\n",
  "form_data": {
    "datasource": "2__table",
    "viz_type": "world_map",
    "slice_id": 47,
    "granularity_sqla": "year",
    "time_grain_sqla": "P1D",
    "time_range": "2014-01-01 : 2014-01-02",
    "entity": "country_code",
    "country_fieldtype": "cca3",
    "metric": "sum__SP_RUR_TOTL_ZS",
    "adhoc_filters": [],
    "row_limit": 50000,
    "show_bubbles": true,
    "secondary_metric": "sum__SP_POP_TOTL",
    "max_bubble_size": "25",
    "label_colors": {},
    "where": "",
    "having": "",
    "having_filters": [],
    "filters": []
  },
  "is_cached": false,
  "query": "SELECT country_code AS country_code,\n       sum(SP_RUR_TOTL_ZS) AS \"sum__SP_RUR_TOTL_ZS\",\n       sum(SP_POP_TOTL) AS \"sum__SP_POP_TOTL\"\nFROM wb_health_population\nWHERE year >= '2014-01-01 00:00:00'\n  AND year <= '2014-01-02 00:00:00'\nGROUP BY country_code\nORDER BY \"sum__SP_RUR_TOTL_ZS\" DESC\nLIMIT 50000;",
  "status": "failed",
  "stacktrace": null,
  "rowcount": 0
}
2019-06-04 10:45:52,659:INFO:root:Database.get_sqla_engine(). Masked URL: postgresql://omitted:XXXXXXXXXX@localhost:5433/omitted
2019-06-04 10:45:52,667:ERROR:root:Query SELECT region AS region,
       sum(SP_POP_TOTL) AS "sum__SP_POP_TOTL"
FROM wb_health_population
WHERE year >= '1960-01-01 00:00:00'
  AND year <= '2019-06-04 10:45:52'
GROUP BY region
ORDER BY "sum__SP_POP_TOTL" DESC
LIMIT 50000 on schema None failed
Traceback (most recent call last):
  File "/home/ubuntu/repos/incubator-superset/superset/connectors/sqla/models.py", line 855, in query
    df = self.database.get_df(sql, self.schema, mutator)
  File "/home/ubuntu/repos/incubator-superset/superset/models/core.py", line 874, in get_df
    self.db_engine_spec.execute(cursor, sqls[-1])
  File "/home/ubuntu/repos/incubator-superset/superset/db_engine_specs.py", line 478, in execute
    cursor.execute(query)
psycopg2.errors.UndefinedColumn: column "sp_pop_totl" does not exist
LINE 2:        sum(SP_POP_TOTL) AS "sum__SP_POP_TOTL"
                   ^

2019-06-04 10:45:53,775:DEBUG:root:[stats_logger] (incr) log
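
Looks like the usual Postgres identifier case-folding: unquoted column names are folded to lower case, so a column created with a quoted mixed-case name can only be referenced with quotes. A minimal illustration (hypothetical table, just to show the pattern):

CREATE TABLE t ("SP_POP_TOTL" integer);
SELECT sum(SP_POP_TOTL) FROM t;    -- fails: column "sp_pop_totl" does not exist
SELECT sum("SP_POP_TOTL") FROM t;  -- works
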
dpgaspar commented 5 years ago

This is a different problem; can you please close this issue? For the new question, check whether it has already been answered and, if not, open a new one.