getsentry / self-hosted

Sentry, feature-complete and packaged up for low-volume deployments and proofs-of-concept
https://develop.sentry.dev/self-hosted/

When running docker-compose up -d, relay reports an error #989

Closed FeiBa0125 closed 3 years ago

FeiBa0125 commented 3 years ago

Version Information

Version: Sentry 21.6.0.dev0, Ubuntu 20.04.2 LTS, RAM: 8 GB, Cores: 4

Steps to Reproduce

  1. Repeated reinstall attempts fail in the same way, roughly following the commands sketched just below. This issue is similar to https://github.com/getsentry/onpremise/issues/963, but the suggestions there did not work for me.
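
For context, a minimal sketch of the reproduction commands, assuming the standard onpremise install flow that produced the log below (the repository URL and checkout step are assumptions, not taken verbatim from the report):

  git clone https://github.com/getsentry/onpremise.git
  cd onpremise
  ./install.sh          # produces the output shown under "Logs"
  docker-compose up -d  # relay then fails to start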

Expected Result

The installation completes and all services, including relay, start and run normally.

Actual Result

The relay container reports an error and fails to run properly.

Logs

(screenshot of the relay error attached)
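
The text below is the output of the install script followed by the docker-compose startup. To capture just the relay container's state and output as text (standard docker-compose commands, not taken from the original report), something like this should work:

  docker-compose ps relay                # check the relay container's status
  docker-compose logs --tail=200 relay   # dump the most recent relay log lines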


▶ Setting up error handling ...

▶ Checking minimum requirements ...

▶ Creating volumes for persistent storage ...
Created sentry-clickhouse.
Created sentry-data.
Created sentry-kafka.
Created sentry-postgres.
Created sentry-redis.
Created sentry-symbolicator.
Created sentry-zookeeper.

▶ Ensuring files from examples ...
../sentry/sentry.conf.py already exists, skipped creation.
../sentry/config.yml already exists, skipped creation.
../symbolicator/config.yml already exists, skipped creation.
../sentry/requirements.txt already exists, skipped creation.

▶ Generating secret key ...

▶ Replacing TSDB ...

▶ Fetching and updating Docker images ...
nightly: Pulling from getsentry/sentry
Digest: sha256:e20b2b4eddac9d357f533063d5319cfa00ca0e66d071758b91c46e57145b8418
Status: Image is up to date for getsentry/sentry:nightly
docker.io/getsentry/sentry:nightly

▶ Building and tagging Docker images ...

smtp uses an image, skipping
memcached uses an image, skipping
redis uses an image, skipping
postgres uses an image, skipping
zookeeper uses an image, skipping
kafka uses an image, skipping
clickhouse uses an image, skipping
geoipupdate uses an image, skipping
snuba-api uses an image, skipping
snuba-consumer uses an image, skipping
snuba-outcomes-consumer uses an image, skipping
snuba-sessions-consumer uses an image, skipping
snuba-transactions-consumer uses an image, skipping
snuba-replacer uses an image, skipping
snuba-subscription-consumer-events uses an image, skipping
snuba-subscription-consumer-transactions uses an image, skipping
symbolicator uses an image, skipping
web uses an image, skipping
cron uses an image, skipping
worker uses an image, skipping
ingest-consumer uses an image, skipping
post-process-forwarder uses an image, skipping
subscription-consumer-events uses an image, skipping
subscription-consumer-transactions uses an image, skipping
relay uses an image, skipping
nginx uses an image, skipping
Building snuba-cleanup
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
 ---> 6a53f9b490bb
Step 3/5 : RUN apt-get update && apt-get install -y --no-install-recommends cron &&     rm -r /var/lib/apt/lists/*
 ---> Using cache
 ---> 92d756dc17ee
Step 4/5 : COPY entrypoint.sh /entrypoint.sh
 ---> Using cache
 ---> 625b907acb32
Step 5/5 : ENTRYPOINT ["/entrypoint.sh"]
 ---> Using cache
 ---> c839ede3f5b7

Successfully built c839ede3f5b7
Successfully tagged snuba-cleanup-onpremise-local:latest
Building snuba-transactions-cleanup
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
 ---> 6a53f9b490bb
Step 3/5 : RUN apt-get update && apt-get install -y --no-install-recommends cron &&     rm -r /var/lib/apt/lists/*
 ---> Using cache
 ---> 92d756dc17ee
Step 4/5 : COPY entrypoint.sh /entrypoint.sh
 ---> Using cache
 ---> 625b907acb32
Step 5/5 : ENTRYPOINT ["/entrypoint.sh"]
 ---> Using cache
 ---> c839ede3f5b7

Successfully built c839ede3f5b7
Successfully tagged snuba-cleanup-onpremise-local:latest
Building symbolicator-cleanup
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
 ---> b3722686db05
Step 3/5 : RUN apt-get update && apt-get install -y --no-install-recommends cron &&     rm -r /var/lib/apt/lists/*
 ---> Using cache
 ---> aa59f84c7d78
Step 4/5 : COPY entrypoint.sh /entrypoint.sh
 ---> Using cache
 ---> e2a5eb307174
Step 5/5 : ENTRYPOINT ["/entrypoint.sh"]
 ---> Using cache
 ---> 305f0b1bd183

Successfully built 305f0b1bd183
Successfully tagged symbolicator-cleanup-onpremise-local:latest
Building sentry-cleanup
Step 1/5 : ARG BASE_IMAGE
Step 2/5 : FROM ${BASE_IMAGE}
 ---> addf5f722be8
Step 3/5 : RUN apt-get update && apt-get install -y --no-install-recommends cron &&     rm -r /var/lib/apt/lists/*
 ---> Using cache
 ---> 8bd16899bb81
Step 4/5 : COPY entrypoint.sh /entrypoint.sh
 ---> Using cache
 ---> db7ef361897c
Step 5/5 : ENTRYPOINT ["/entrypoint.sh"]
 ---> Using cache
 ---> 01f8467c7223

Successfully built 01f8467c7223
Successfully tagged sentry-cleanup-onpremise-local:latest

Docker images built.

▶ Turning things off ...
Removing network onpremise_default
Network onpremise_default not found.
Removing network sentry_onpremise_default

▶ Setting up Zookeeper ...
Creating network "sentry_onpremise_default" with the default driver
Creating volume "sentry_onpremise_sentry-secrets" with default driver
Creating volume "sentry_onpremise_sentry-smtp" with default driver
Creating volume "sentry_onpremise_sentry-zookeeper-log" with default driver
Creating volume "sentry_onpremise_sentry-kafka-log" with default driver
Creating volume "sentry_onpremise_sentry-smtp-log" with default driver
Creating volume "sentry_onpremise_sentry-clickhouse-log" with default driver

▶ Downloading and installing wal2json ...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  5416  100  5416    0     0   5958      0 --:--:-- --:--:-- --:--:--  5958

▶ Bootstrapping and migrating Snuba ...
Creating sentry_onpremise_clickhouse_1 ... 
Creating sentry_onpremise_redis_1      ... 
Creating sentry_onpremise_zookeeper_1  ... 
Creating sentry_onpremise_zookeeper_1  ... done
Creating sentry_onpremise_kafka_1      ... 
Creating sentry_onpremise_clickhouse_1 ... done
Creating sentry_onpremise_redis_1      ... done
Creating sentry_onpremise_kafka_1      ... done
2021-06-11 06:31:47,649 Attempting to connect to Kafka (attempt 0)...
2021-06-11 06:31:49,652 Attempting to connect to Kafka (attempt 1)...
2021-06-11 06:31:51,655 Attempting to connect to Kafka (attempt 2)...
2021-06-11 06:31:51,661 Connected to Kafka on attempt 2
2021-06-11 06:31:51,662 Creating Kafka topics...
2021-06-11 06:31:52,047 Topic events created
2021-06-11 06:31:52,047 Topic event-replacements created
2021-06-11 06:31:52,047 Topic event-replacements-legacy created
2021-06-11 06:31:52,047 Topic snuba-commit-log created
2021-06-11 06:31:52,047 Topic cdc created
2021-06-11 06:31:52,047 Topic ingest-metrics created
2021-06-11 06:31:52,047 Topic outcomes created
2021-06-11 06:31:52,047 Topic ingest-sessions created
2021-06-11 06:31:52,047 Topic events-subscription-results created
2021-06-11 06:31:52,047 Topic transactions-subscription-results created
2021-06-11 06:31:52,048 Topic snuba-queries created
Starting sentry_onpremise_redis_1 ... 
Starting sentry_onpremise_redis_1 ... done
Starting sentry_onpremise_clickhouse_1 ... 
Starting sentry_onpremise_clickhouse_1 ... done
Starting sentry_onpremise_zookeeper_1  ... 
Starting sentry_onpremise_zookeeper_1  ... done
Starting sentry_onpremise_kafka_1      ... 
Starting sentry_onpremise_kafka_1      ... done
Starting migration from 2021-06-07
Migrated 2021-06-07. (1 of 13 partitions done)
Migrated 2021-05-31. (2 of 13 partitions done)
Migrated 2021-05-24. (3 of 13 partitions done)
Migrated 2021-05-17. (4 of 13 partitions done)
Migrated 2021-05-10. (5 of 13 partitions done)
Migrated 2021-05-03. (6 of 13 partitions done)
Migrated 2021-04-26. (7 of 13 partitions done)
Migrated 2021-04-19. (8 of 13 partitions done)
Migrated 2021-04-12. (9 of 13 partitions done)
Migrated 2021-04-05. (10 of 13 partitions done)
Migrated 2021-03-29. (11 of 13 partitions done)
Migrated 2021-03-22. (12 of 13 partitions done)
Migrated 2021-03-15. (13 of 13 partitions done)
Done. Optimizing.
Finished running migrations

▶ Creating additional Kafka topics ...
Starting sentry_onpremise_zookeeper_1 ... 
Starting sentry_onpremise_zookeeper_1 ... done
Created topic ingest-attachments.

Starting sentry_onpremise_zookeeper_1 ... 
Starting sentry_onpremise_zookeeper_1 ... done
Created topic ingest-transactions.

Starting sentry_onpremise_zookeeper_1 ... 
Starting sentry_onpremise_zookeeper_1 ... done
Created topic ingest-events.

▶ Ensuring proper PostgreSQL version ...

▶ Setting up / migrating database ...
Creating sentry_onpremise_memcached_1 ... 
Starting sentry_onpremise_zookeeper_1 ... 
Starting sentry_onpremise_zookeeper_1 ... done
Creating sentry_onpremise_symbolicator_1 ... 
Starting sentry_onpremise_redis_1        ... 
Creating sentry_onpremise_smtp_1         ... 
Starting sentry_onpremise_redis_1        ... done
Creating sentry_onpremise_postgres_1     ... 
Starting sentry_onpremise_clickhouse_1   ... 
Starting sentry_onpremise_clickhouse_1   ... done
Starting sentry_onpremise_kafka_1        ... 
Starting sentry_onpremise_kafka_1        ... done
Creating sentry_onpremise_snuba-subscription-consumer-events_1 ... 
Creating sentry_onpremise_snuba-sessions-consumer_1            ... 
Creating sentry_onpremise_snuba-api_1                          ... 
Creating sentry_onpremise_snuba-outcomes-consumer_1            ... 
Creating sentry_onpremise_snuba-subscription-consumer-transactions_1 ... 
Creating sentry_onpremise_snuba-consumer_1                           ... 
Creating sentry_onpremise_snuba-replacer_1                           ... 
Creating sentry_onpremise_snuba-transactions-consumer_1              ... 
Creating sentry_onpremise_symbolicator_1                             ... done
Creating sentry_onpremise_smtp_1                                     ... done
Creating sentry_onpremise_memcached_1                                ... done
Creating sentry_onpremise_postgres_1                                 ... done
Creating sentry_onpremise_snuba-outcomes-consumer_1                  ... done
Creating sentry_onpremise_snuba-subscription-consumer-transactions_1 ... done
Creating sentry_onpremise_snuba-replacer_1                           ... done
Creating sentry_onpremise_snuba-consumer_1                           ... done
Creating sentry_onpremise_snuba-subscription-consumer-events_1       ... done
Creating sentry_onpremise_snuba-sessions-consumer_1                  ... done
Creating sentry_onpremise_snuba-api_1                                ... done
Creating sentry_onpremise_snuba-transactions-consumer_1              ... done
Installing additional dependencies...
WARNING: Running pip as root will break packages and permissions. You should install packages reliably by using venv: https://pip.pypa.io/warnings/venv

06:32:18 [INFO] sentry.plugins.github: apps-not-configured
Operations to perform:
  Apply all migrations: admin, auth, contenttypes, jira_ac, nodestore, sentry, sessions, sites, social_auth
Running migrations:
  Applying sentry.0001_initial... OK
  Applying contenttypes.0001_initial... OK
  Applying admin.0001_initial... OK
  Applying admin.0002_logentry_remove_auto_add... OK
  Applying contenttypes.0002_remove_content_type_name... OK
  Applying auth.0001_initial... OK
  Applying auth.0002_alter_permission_name_max_length... OK
  Applying auth.0003_alter_user_email_max_length... OK
  Applying auth.0004_alter_user_username_opts... OK
  Applying auth.0005_alter_user_last_login_null... OK
  Applying auth.0006_require_contenttypes_0002... OK
  Applying auth.0007_alter_validators_add_error_messages... OK
  Applying auth.0008_alter_user_username_max_length... OK
  Applying jira_ac.0001_initial... OK
  Applying nodestore.0001_initial... OK
  Applying nodestore.0002_nodestore_no_dictfield... OK
  Applying sentry.0002_912_to_recent... OK
  Applying sentry.0003_auto_20191022_0122... OK
  Applying sentry.0004_bitfieldtestmodel_blankjsonfieldtestmodel_callabledefaultmodel_jsonfieldtestmodel_jsonfieldwithdefau... OK
  Applying sentry.0005_fix_content_types... OK
  Applying sentry.0006_sentryapp_date_published... OK
  Applying sentry.0007_auto_20191029_0131... OK
  Applying sentry.0008_auto_20191030_0016... OK
  Applying sentry.0009_auto_20191101_1608... OK
  Applying sentry.0010_auto_20191104_1641... OK
  Applying sentry.0011_remove_pagerdutyservice_service_id_from_state... OK
  Applying sentry.0012_remove_pagerdutyservice_service_id... OK
  Applying sentry.0013_auto_20191111_1829... OK
  Applying sentry.0014_delete_sentryappwebhookerror... OK
  Applying sentry.0015_delete_sentryappwebhookerror_db... OK
  Applying sentry.0016_delete_alert_rule_deprecated_fields... OK
  Applying sentry.0017_incident_aggregation... OK
  Applying sentry.0018_discoversavedquery_version... OK
  Applying sentry.0019_auto_20191114_2040... OK
  Applying sentry.0020_auto_20191125_1420... OK
  Applying sentry.0021_auto_20191203_1803... OK
  Applying sentry.0021_auto_20191202_1716... OK
  Applying sentry.0022_merge... OK
  Applying sentry.0023_hide_environment_none_20191126... OK
  Applying sentry.0024_auto_20191230_2052...Nothing to do, skipping migration.

 OK
  Applying sentry.0025_organizationaccessrequest_requester... OK
  Applying sentry.0026_delete_event... OK
  Applying sentry.0027_exporteddata... OK
  Applying sentry.0028_user_reports... OK
  Applying sentry.0029_discover_query_upgrade... OK
  Applying sentry.0030_auto_20200201_0039... OK
  Applying sentry.0031_delete_alert_rules_and_incidents... OK
  Applying sentry.0032_delete_alert_email... OK
  Applying sentry.0033_auto_20200210_2137... OK
  Applying sentry.0034_auto_20200210_2311... OK
  Applying sentry.0035_auto_20200127_1711... OK
  Applying sentry.0036_auto_20200213_0106... OK
  Applying sentry.0037_auto_20200213_0140... OK
  Applying sentry.0038_auto_20200213_1904... OK
  Applying sentry.0039_delete_incidentsuspectcommit... OK
  Applying sentry.0040_remove_incidentsuspectcommittable... OK
  Applying sentry.0041_incidenttrigger_date_modified... OK
  Applying sentry.0042_auto_20200214_1607... OK
  Applying sentry.0043_auto_20200218_1903... OK
  Applying sentry.0044_auto_20200219_0018... OK
  Applying sentry.0045_remove_incidentactivity_event_stats_snapshot... OK
  Applying sentry.0046_auto_20200221_1735... OK
  Applying sentry.0047_auto_20200224_2319... OK
  Applying sentry.0048_auto_20200302_1825... OK
  Applying sentry.0049_auto_20200304_0254... OK
  Applying sentry.0050_auto_20200306_2346... OK
  Applying sentry.0051_fix_auditlog_pickled_data... OK
  Applying sentry.0052_organizationonboardingtask_completion_seen... OK
  Applying sentry.0053_migrate_alert_task_onboarding... OK
  Applying sentry.0054_create_key_transaction... OK
  Applying sentry.0055_query_subscription_status... OK
  Applying sentry.0056_remove_old_functions... OK
  Applying sentry.0057_remove_unused_project_flag... OK
  Applying sentry.0058_project_issue_alerts_targeting... OK
  Applying sentry.0059_add_new_sentry_app_features... OK
  Applying sentry.0060_add_file_eventattachment_index... OK
  Applying sentry.0061_alertrule_partial_index... OK
  Applying sentry.0062_key_transactions_unique_with_owner... OK
  Applying sentry.0063_drop_alertrule_constraint... OK
  Applying sentry.0064_project_has_transactions... OK
  Applying sentry.0065_add_incident_status_method... OK
  Applying sentry.0066_alertrule_manager... OK
  Applying sentry.0067_migrate_rules_alert_targeting... OK
  Applying sentry.0068_project_default_flags... OK
  Applying sentry.0069_remove_tracked_superusers... OK
  Applying sentry.0070_incident_snapshot_support... OK
  Applying sentry.0071_add_default_fields_model_subclass... OK
  Applying sentry.0072_alert_rules_query_changes... OK
  Applying sentry.0073_migrate_alert_query_model... OK
  Applying sentry.0074_add_metric_alert_feature... OK
  Applying sentry.0075_metric_alerts_fix_releases... OK
  Applying sentry.0076_alert_rules_disable_constraints... OK
  Applying sentry.0077_alert_query_col_drop_state... OK
  Applying sentry.0078_incident_field_updates... OK
  Applying sentry.0079_incidents_remove_query_field_state... OK
  Applying sentry.0080_alert_rules_drop_unused_tables_cols... OK
  Applying sentry.0081_add_integraiton_upgrade_audit_log... OK
  Applying sentry.0082_alert_rules_threshold_float... OK
  Applying sentry.0083_add_max_length_webhook_url... OK
  Applying sentry.0084_exported_data_blobs... OK
  Applying sentry.0085_fix_error_rate_snuba_query... OK
  Applying sentry.0086_sentry_app_installation_for_provider... OK
  Applying sentry.0087_fix_time_series_data_type... OK
  Applying sentry.0088_rule_level_resolve_threshold_type... OK
  Applying sentry.0089_rule_level_fields_backfill... OK
  Applying sentry.0090_fix_auditlog_pickled_data_take_2... OK
  Applying sentry.0091_alertruleactivity... OK
  Applying sentry.0092_remove_trigger_threshold_type_nullable... OK
  Applying sentry.0093_make_identity_user_id_textfield... OK
  Applying sentry.0094_cleanup_unreferenced_event_files... OK
  Applying sentry.0095_ruleactivity... OK
  Applying sentry.0096_sentry_app_component_skip_load_on_open... OK
  Applying sentry.0097_add_sentry_app_id_to_sentry_alertruletriggeraction... OK
  Applying sentry.0098_add-performance-onboarding... OK
  Applying sentry.0099_fix_project_platforms... OK
  Applying sentry.0100_file_type_on_event_attachment... OK
  Applying sentry.0101_backfill_file_type_on_event_attachment... OK
  Applying sentry.0102_collect_relay_analytics... OK
  Applying sentry.0103_project_has_alert_filters... OK
  Applying sentry.0104_collect_relay_public_key_usage... OK
  Applying sentry.0105_remove_nullability_of_event_attachment_type... OK
  Applying sentry.0106_service_hook_project_id_nullable... OK
  Applying sentry.0107_remove_spaces_from_slugs... OK
  Applying sentry.0108_update_fileblob_action... OK
  Applying sentry.0109_sentry_app_creator... OK
  Applying sentry.0110_sentry_app_creator_backill... OK
  Applying sentry.0111_snuba_query_event_type... OK
  Applying sentry.0112_groupinboxmodel... OK
  Applying sentry.0113_add_repositoryprojectpathconfig... OK
  Applying sentry.0114_add_unhandled_savedsearch... OK
  Applying sentry.0115_add_checksum_to_debug_file... OK
  Applying sentry.0116_backfill_debug_file_checksum... OK
  Applying sentry.0117_dummy-activityupdate... OK
  Applying sentry.0118_backfill_snuba_query_event_types... OK
  Applying sentry.0119_fix_set_none... OK
  Applying sentry.0120_commit_author_charfield... OK
GroupInbox: 100% |#                                             | ETA:  --:--:--
 OK
  Applying sentry.0122_add_release_status... OK
  Applying sentry.0123_groupinbox_addprojandorg... OK
  Applying sentry.0124_add_release_status_model... OK
  Applying sentry.0125_add_platformexternalissue_project_id... OK
  Applying sentry.0126_make_platformexternalissue_group_id_flexfk... OK
  Applying sentry.0127_backfill_platformexternalissue_project_id... OK
  Applying sentry.0128_change_dashboards... OK
  Applying sentry.0129_remove_dashboard_keys... OK
  Applying sentry.0130_remove_old_widget_models... OK
  Applying sentry.0131_drop_widget_tables... OK
  Applying sentry.0132_groupownermodel... OK
  Applying sentry.0133_dashboard_delete_object_status... OK
  Applying sentry.0134_dashboard_drop_object_status_column... OK
  Applying sentry.0135_removinguniquegroupownerconstraint... OK
  Applying sentry.0136_issue_alert_filter_all_orgs... OK
  Applying sentry.0137_dashboard_widget_interval... OK
  Applying sentry.0138_widget_query_remove_interval... OK
  Applying sentry.0139_remove_widgetquery_interval... OK
  Applying sentry.0140_subscription_checker... OK
  Applying sentry.0141_remove_widget_constraints... OK
  Applying sentry.0142_add_dashboard_tombstone... OK
  Applying sentry.0143_add_alerts_integrationfeature... OK
  Applying sentry.0144_add_publish_request_inprogress_status... OK
  Applying sentry.0145_rename_alert_rule_feature... OK
  Applying sentry.0146_backfill_members_alert_write... OK
  Applying sentry.0147_add_groupinbox_date_added_index... OK
  Applying sentry.0148_group_id_bigint... OK
  Applying sentry.0149_bigint... OK
  Applying sentry.0150_remove_userreport_eventattachment_constraints... OK
  Applying sentry.0151_add_world_map_dashboard_widget_type... OK
  Applying sentry.0152_remove_slack_workspace_orgintegrations... OK
  Applying sentry.0153_add_big_number_dashboard_widget_type... OK
  Applying sentry.0154_groupedmessage_inbox_sort... OK
  Applying sentry.0155_add_dashboard_query_orderby... OK
  Applying sentry.0156_add_mark_reviewed_activity... OK
  Applying sentry.0157_make_repositoryprojectpathconfig_organization_integration_nullable... OK
  Applying sentry.0158_create_externalteam_table... OK
  Applying sentry.0159_create_externaluser_table... OK
  Applying sentry.0160_create_projectcodeowners_table... OK
  Applying sentry.0161_add_saved_search_sort... OK
  Applying sentry.0162_backfill_saved_search_sort... OK
  Applying sentry.0163_add_organizationmember_and_external_name... OK
  Applying sentry.0164_add_protect_on_delete_codeowners... OK
  Applying sentry.0165_metric_alerts_fix_group_ids... OK
  Applying sentry.0166_create_notificationsetting_table... OK
  Applying sentry.0167_rm_organization_integration_from_projectcodeowners... OK
  Applying sentry.0168_demo_orgs_users... OK
  Applying sentry.0169_delete_organization_integration_from_projectcodeowners... OK
  Applying sentry.0170_actor_introduction... OK
  Applying sentry.0171_backfill_actors... OK
  Applying sentry.0172_rule_owner_fields... OK
  Applying sentry.0173_remove_demo_flag... OK
  Applying sentry.0174_my_issues_saved_search... OK
  Applying sentry.0175_make_targets_nullable... OK
  Applying sentry.0176_remove_targets... OK
  Applying sentry.0177_drop_targets... OK
  Applying sentry.0178_add_new_target_column... OK
  Applying sentry.0179_update_legacy_discover_saved_query_timestamps... OK
  Applying sentry.0180_add_saved_search_sorts... OK
  Applying sentry.0181_copy_useroptions_to_notificationsettings... OK
  Applying sentry.0182_update_user_misery_on_saved_queries... OK
  Applying sentry.0183_make_codemapping_unique_on_projectcodeowners... OK
  Applying sentry.0184_copy_useroptions_to_notificationsettings_2... OK
  Applying sentry.0185_rm_copied_useroptions... OK
  Applying sentry.0186_add_externalactor... OK
Saved Searchs: 100% |############################################| Time: 0:00:00
 OK
  Applying sentry.0188_remove_externalteam_externaluser_fk_constraints... OK
  Applying sentry.0189_remove_externaluser_externalteam_models... OK
  Applying sentry.0190_drop_external_user_table... OK
  Applying sentry.0191_make_externalactor_integration_id_not_null... OK
  Applying sentry.0192_remove_fileblobowner_org_fk... OK
  Applying sentry.0193_grouprelease_indexes... OK
  Applying sentry.0194_add_custom_scm_provider... OK
  Applying sentry.0195_add_team_key_transactions... OK
  Applying sentry.0196_add_restricted_member_limit... OK
  Applying sentry.0197_add_scim_enabled_boolean... OK
  Applying sentry.0198_add_project_transaction_threshold... OK
  Applying sentry.0199_release_semver... OK
  Applying sentry.0200_release_indices... OK
  Applying sentry.0201_semver_package... OK
  Applying sentry.0202_org_slug_upper_idx... OK
  Applying sentry.0203_groupedmessage_status_index... OK
  Applying sentry.0204_use_project_team_for_team_key_transactions... OK
  Applying sentry.0205_semver_backfill... OK
  Applying sentry.0206_organization_require_email_verification_flag... OK
  Applying sessions.0001_initial... OK
  Applying sites.0001_initial... OK
  Applying sites.0002_alter_domain_unique... OK
  Applying social_auth.0001_initial... OK
06:33:07 [WARNING] sentry: Cannot initiate onboarding for organization (1) due to missing owners
Created internal Sentry project (slug=internal, id=1)

Would you like to create a user account now? [Y/n]: y
Email: wp_byy@163.com
Password: 
Repeat for confirmation: 
Added to organization: sentry
User created: wp_byy@163.com
Creating missing DSNs
Correcting Group.num_comments counter

▶ Migrating file storage ...

▶ Generating Relay credentials ...
../relay/config.yml already exists, skipped creation.

▶ Setting up GeoIP integration ...
Setting up IP address geolocation ...
IP address geolocation database already exists.
IP address geolocation is not configured for updates.
See https://develop.sentry.dev/self-hosted/geolocation/ for instructions.
Error setting up IP address geolocation.

-----------------------------------------------------------------

You're all done! Run the following command to get Sentry running:

  docker-compose up -d

-----------------------------------------------------------------
Attaching to sentry_onpremise_nginx_1, sentry_onpremise_relay_1, sentry_onpremise_sentry-cleanup_1, sentry_onpremise_post-process-forwarder_1, sentry_onpremise_subscription-consumer-events_1, sentry_onpremise_subscription-consumer-transactions_1, sentry_onpremise_worker_1, sentry_onpremise_ingest-consumer_1, sentry_onpremise_cron_1, sentry_onpremise_web_1, sentry_onpremise_snuba-cleanup_1, sentry_onpremise_snuba-transactions-cleanup_1, sentry_onpremise_symbolicator-cleanup_1, sentry_onpremise_geoipupdate_1, sentry_onpremise_snuba-transactions-consumer_1, sentry_onpremise_snuba-replacer_1, sentry_onpremise_snuba-subscription-consumer-transactions_1, sentry_onpremise_snuba-consumer_1, sentry_onpremise_snuba-subscription-consumer-events_1, sentry_onpremise_snuba-outcomes-consumer_1, sentry_onpremise_snuba-api_1, sentry_onpremise_snuba-sessions-consumer_1, sentry_onpremise_smtp_1, sentry_onpremise_symbolicator_1, sentry_onpremise_postgres_1, sentry_onpremise_memcached_1, sentry_onpremise_kafka_1, sentry_onpremise_zookeeper_1, sentry_onpremise_clickhouse_1, sentry_onpremise_redis_1
cron_1                                      | 06:34:44 [INFO] sentry.plugins.github: apps-not-configured
clickhouse_1                                | Processing configuration file '/etc/clickhouse-server/config.xml'.
clickhouse_1                                | Merging configuration file '/etc/clickhouse-server/config.d/docker_related_config.xml'.
clickhouse_1                                | Merging configuration file '/etc/clickhouse-server/config.d/sentry.xml'.
clickhouse_1                                | Include not found: clickhouse_remote_servers
clickhouse_1                                | Include not found: clickhouse_compression
clickhouse_1                                | Logging information to /var/log/clickhouse-server/clickhouse-server.log
clickhouse_1                                | Logging errors to /var/log/clickhouse-server/clickhouse-server.err.log
clickhouse_1                                | Logging information to console
clickhouse_1                                | 2021.06.11 06:31:45.013136 [ 1 ] {} <Information> : Starting ClickHouse 20.3.9.70 with revision 54433
clickhouse_1                                | 2021.06.11 06:31:45.015405 [ 1 ] {} <Information> Application: starting up
clickhouse_1                                | Include not found: networks
clickhouse_1                                | 2021.06.11 06:31:45.021866 [ 1 ] {} <Information> Application: Uncompressed cache size was lowered to 4.13 GiB because the system has low amount of memory
clickhouse_1                                | 2021.06.11 06:31:45.022233 [ 1 ] {} <Information> Application: Mark cache size was lowered to 4.13 GiB because the system has low amount of memory
clickhouse_1                                | 2021.06.11 06:31:45.022276 [ 1 ] {} <Information> Application: Loading metadata from /var/lib/clickhouse/
clickhouse_1                                | 2021.06.11 06:31:45.023534 [ 1 ] {} <Information> DatabaseOrdinary (default): Total 0 tables and 0 dictionaries.
clickhouse_1                                | 2021.06.11 06:31:45.024028 [ 1 ] {} <Information> DatabaseOrdinary (default): Starting up tables.
clickhouse_1                                | 2021.06.11 06:31:45.024398 [ 1 ] {} <Information> BackgroundSchedulePool: Create BackgroundSchedulePool with 16 threads
clickhouse_1                                | 2021.06.11 06:31:45.026536 [ 1 ] {} <Information> Application: It looks like the process has no CAP_NET_ADMIN capability, 'taskstats' performance statistics will be disabled. It could happen due to incorrect ClickHouse package installation. You could resolve the problem manually with 'sudo setcap cap_net_admin=+ep /usr/bin/clickhouse'. Note that it will not work on 'nosuid' mounted filesystems. It also doesn't work if you run clickhouse-server inside network namespace as it happens in some containers.
clickhouse_1                                | 2021.06.11 06:31:45.026574 [ 1 ] {} <Information> Application: It looks like the process has no CAP_SYS_NICE capability, the setting 'os_thread_nice' will have no effect. It could happen due to incorrect ClickHouse package installation. You could resolve the problem manually with 'sudo setcap cap_sys_nice=+ep /usr/bin/clickhouse'. Note that it will not work on 'nosuid' mounted filesystems.
clickhouse_1                                | 2021.06.11 06:31:45.028110 [ 1 ] {} <Error> Application: Listen [::]:8123 failed: Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = DNS error: EAI: -9 (version 20.3.9.70 (official build)). If it is an IPv6 or IPv4 address and your host has disabled IPv6 or IPv4, then consider to specify not disabled IPv4 or IPv6 address to listen in <listen_host> element of configuration file. Example for disabled IPv6: <listen_host>0.0.0.0</listen_host> . Example for disabled IPv4: <listen_host>::</listen_host>
clickhouse_1                                | 2021.06.11 06:31:45.028207 [ 1 ] {} <Error> Application: Listen [::]:9000 failed: Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = DNS error: EAI: -9 (version 20.3.9.70 (official build)). If it is an IPv6 or IPv4 address and your host has disabled IPv6 or IPv4, then consider to specify not disabled IPv4 or IPv6 address to listen in <listen_host> element of configuration file. Example for disabled IPv6: <listen_host>0.0.0.0</listen_host> . Example for disabled IPv4: <listen_host>::</listen_host>
clickhouse_1                                | 2021.06.11 06:31:45.028302 [ 1 ] {} <Error> Application: Listen [::]:9009 failed: Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = DNS error: EAI: -9 (version 20.3.9.70 (official build)). If it is an IPv6 or IPv4 address and your host has disabled IPv6 or IPv4, then consider to specify not disabled IPv4 or IPv6 address to listen in <listen_host> element of configuration file. Example for disabled IPv6: <listen_host>0.0.0.0</listen_host> . Example for disabled IPv4: <listen_host>::</listen_host>
clickhouse_1                                | 2021.06.11 06:31:45.028397 [ 1 ] {} <Error> Application: Listen [::]:9004 failed: Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = DNS error: EAI: -9 (version 20.3.9.70 (official build)). If it is an IPv6 or IPv4 address and your host has disabled IPv6 or IPv4, then consider to specify not disabled IPv4 or IPv6 address to listen in <listen_host> element of configuration file. Example for disabled IPv6: <listen_host>0.0.0.0</listen_host> . Example for disabled IPv4: <listen_host>::</listen_host>
clickhouse_1                                | 2021.06.11 06:31:45.029466 [ 1 ] {} <Information> Application: Listening for http://0.0.0.0:8123
clickhouse_1                                | 2021.06.11 06:31:45.029510 [ 1 ] {} <Information> Application: Listening for connections with native protocol (tcp): 0.0.0.0:9000
clickhouse_1                                | 2021.06.11 06:31:45.029540 [ 1 ] {} <Information> Application: Listening for replica communication (interserver): http://0.0.0.0:9009
clickhouse_1                                | 2021.06.11 06:31:45.131830 [ 1 ] {} <Information> Application: Listening for MySQL compatibility protocol: 0.0.0.0:9004
clickhouse_1                                | 2021.06.11 06:31:45.132421 [ 1 ] {} <Information> Application: Available RAM: 8.26 GiB; physical cores: 4; logical cores: 4.
clickhouse_1                                | 2021.06.11 06:31:45.132438 [ 1 ] {} <Information> Application: Ready for connections.
clickhouse_1                                | Include not found: clickhouse_remote_servers
clickhouse_1                                | Include not found: clickhouse_compression
clickhouse_1                                | 2021.06.11 06:31:52.527209 [ 49 ] {} <Information> BackgroundProcessingPool: Create BackgroundProcessingPool with 16 threads
clickhouse_1                                | 2021.06.11 06:31:54.949957 [ 70 ] {b51ac865-f8b4-4825-bf6d-b5bac1a7156a} <Information> executeQuery: Read 1 rows, 1.00 B in 0.001 sec., 1607 rows/sec., 1.57 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:54.950351 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:54.979737 [ 70 ] {dd31b666-00c2-4567-a14d-983450d25df0} <Error> executeQuery: Code: 60, e.displayText() = DB::Exception: Table default.migrations_local doesn't exist. (version 20.3.9.70 (official build)) (from 172.22.0.6:41602) (in query: SELECT group, migration_id, status FROM migrations_local FINAL WHERE group IN ('system', 'events', 'transactions', 'discover', 'outcomes', 'sessions')), Stack trace (when copying this message, always include the lines below):
clickhouse_1                                | 
clickhouse_1                                | 0. Poco::Exception::Exception(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, int) @ 0x105351b0 in /usr/bin/clickhouse
clickhouse_1                                | 1. DB::Exception::Exception(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, int) @ 0x8f4172d in /usr/bin/clickhouse
clickhouse_1                                | 2. DB::Context::getTableImpl(DB::StorageID const&, std::__1::optional<DB::Exception>*) const @ 0xcfe2a24 in /usr/bin/clickhouse
clickhouse_1                                | 3. DB::Context::getTable(DB::StorageID const&) const @ 0xcfe2bbb in /usr/bin/clickhouse
clickhouse_1                                | 4. DB::Context::getTable(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) const @ 0xcfe2c7d in /usr/bin/clickhouse
clickhouse_1                                | 5. DB::JoinedTables::getLeftTableStorage() @ 0xd454892 in /usr/bin/clickhouse
clickhouse_1                                | 6. DB::InterpreterSelectQuery::InterpreterSelectQuery(std::__1::shared_ptr<DB::IAST> const&, DB::Context const&, std::__1::shared_ptr<DB::IBlockInputStream> const&, std::__1::optional<DB::Pipe>, std::__1::shared_ptr<DB::IStorage> const&, DB::SelectQueryOptions const&, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > > const&) @ 0xd13b6d1 in /usr/bin/clickhouse
clickhouse_1                                | 7. DB::InterpreterSelectQuery::InterpreterSelectQuery(std::__1::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > > const&) @ 0xd13c619 in /usr/bin/clickhouse
clickhouse_1                                | 8. DB::InterpreterSelectWithUnionQuery::InterpreterSelectWithUnionQuery(std::__1::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > > const&) @ 0xd341686 in /usr/bin/clickhouse
clickhouse_1                                | 9. DB::InterpreterFactory::get(std::__1::shared_ptr<DB::IAST>&, DB::Context&, DB::QueryProcessingStage::Enum) @ 0xd0909b4 in /usr/bin/clickhouse
clickhouse_1                                | 10. ? @ 0xd550655 in /usr/bin/clickhouse
clickhouse_1                                | 11. DB::executeQuery(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, DB::Context&, bool, DB::QueryProcessingStage::Enum, bool, bool) @ 0xd553441 in /usr/bin/clickhouse
clickhouse_1                                | 12. DB::TCPHandler::runImpl() @ 0x9024489 in /usr/bin/clickhouse
clickhouse_1                                | 13. DB::TCPHandler::run() @ 0x9025470 in /usr/bin/clickhouse
clickhouse_1                                | 14. Poco::Net::TCPServerConnection::start() @ 0xe3ac69b in /usr/bin/clickhouse
clickhouse_1                                | 15. Poco::Net::TCPServerDispatcher::run() @ 0xe3acb1d in /usr/bin/clickhouse
clickhouse_1                                | 16. Poco::PooledThread::run() @ 0x105c3317 in /usr/bin/clickhouse
clickhouse_1                                | 17. Poco::ThreadImpl::runnableEntry(void*) @ 0x105bf11c in /usr/bin/clickhouse
clickhouse_1                                | 18. ? @ 0x105c0abd in /usr/bin/clickhouse
clickhouse_1                                | 19. start_thread @ 0x76db in /lib/x86_64-linux-gnu/libpthread-2.27.so
clickhouse_1                                | 20. __clone @ 0x12188f in /lib/x86_64-linux-gnu/libc-2.27.so
clickhouse_1                                | 
clickhouse_1                                | 2021.06.11 06:31:54.980267 [ 70 ] {} <Information> TCPHandler: Processed in 0.029 sec.
clickhouse_1                                | 2021.06.11 06:31:54.980492 [ 70 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:31:54.988028 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:54.990280 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.008304 [ 70 ] {} <Information> TCPHandler: Processed in 0.018 sec.
clickhouse_1                                | 2021.06.11 06:31:55.012051 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.013690 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.022559 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:55.024298 [ 70 ] {e03c622f-6755-4883-a215-7217ddf04994} <Information> executeQuery: Read 1 rows, 51.00 B in 0.001 sec., 1017 rows/sec., 50.65 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.024475 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.026627 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.028655 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.030290 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.038705 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:55.049593 [ 70 ] {} <Information> TCPHandler: Processed in 0.010 sec.
clickhouse_1                                | 2021.06.11 06:31:55.057300 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.063607 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.070266 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.078225 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.087183 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:55.096112 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:55.103261 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.111098 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.113092 [ 70 ] {a4e08acf-1798-4591-a94f-12cddfee3d2f} <Information> executeQuery: Read 1 rows, 67.00 B in 0.001 sec., 889 rows/sec., 58.18 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.113255 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.115398 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.118657 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.120415 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.134116 [ 70 ] {} <Information> TCPHandler: Processed in 0.013 sec.
clickhouse_1                                | 2021.06.11 06:31:55.136109 [ 70 ] {ea22c5f4-3e97-4a65-8728-82fe40742733} <Information> executeQuery: Read 4 rows, 208.00 B in 0.001 sec., 4066 rows/sec., 206.50 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.136239 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.138612 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.140820 [ 70 ] {c4639e67-f4a4-43b1-8946-7c2b2cf89fe1} <Information> executeQuery: Read 4 rows, 208.00 B in 0.001 sec., 4267 rows/sec., 216.70 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.140984 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.142927 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.151011 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.153032 [ 70 ] {4586bb8b-69eb-4978-a043-44bd4e2143fe} <Information> executeQuery: Read 5 rows, 275.00 B in 0.001 sec., 4976 rows/sec., 267.27 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.153169 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.155249 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.158041 [ 70 ] {dc6fa303-bd6d-4ef6-a060-eb6cd2a5f872} <Information> executeQuery: Read 4 rows, 208.00 B in 0.001 sec., 3783 rows/sec., 192.12 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.158246 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.160548 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.170389 [ 70 ] {} <Information> TCPHandler: Processed in 0.009 sec.
clickhouse_1                                | 2021.06.11 06:31:55.172327 [ 70 ] {a0fd4772-7770-468d-b045-dfd816ef4997} <Information> executeQuery: Read 5 rows, 265.00 B in 0.001 sec., 4913 rows/sec., 254.32 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.172481 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.174705 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.177533 [ 70 ] {fd0bbc05-fdfa-4888-9b2d-c944bdc085e6} <Information> executeQuery: Read 4 rows, 208.00 B in 0.001 sec., 2942 rows/sec., 149.40 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.177947 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.179975 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.188142 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:55.190178 [ 70 ] {09e09b16-3b7c-4cfe-8284-2c59b3b83711} <Information> executeQuery: Read 7 rows, 389.00 B in 0.001 sec., 4906 rows/sec., 266.27 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.190558 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.193359 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.195247 [ 70 ] {e1543305-12f9-4ed9-825d-c3cdeede7e5b} <Information> executeQuery: Read 6 rows, 332.00 B in 0.001 sec., 4572 rows/sec., 247.09 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.195669 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.197660 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.203330 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.205673 [ 70 ] {054607cc-a316-4966-aa7a-a8ca79ec589c} <Information> executeQuery: Read 7 rows, 384.00 B in 0.002 sec., 4252 rows/sec., 227.81 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.206063 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.208635 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.211042 [ 70 ] {78217a64-92ed-4259-a70b-5b7bdb0531fe} <Information> executeQuery: Read 6 rows, 332.00 B in 0.001 sec., 5659 rows/sec., 305.81 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.211376 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.214237 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.224107 [ 70 ] {} <Information> TCPHandler: Processed in 0.009 sec.
clickhouse_1                                | 2021.06.11 06:31:55.225748 [ 70 ] {9f255c2c-85f7-4900-9fbb-aada14e2d03f} <Information> executeQuery: Read 9 rows, 492.00 B in 0.001 sec., 11260 rows/sec., 601.12 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.225871 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.227584 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.229611 [ 70 ] {ad59117c-5763-475f-8df6-22e80bf8fca1} <Information> executeQuery: Read 9 rows, 492.00 B in 0.001 sec., 10980 rows/sec., 586.20 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.229771 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.231336 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.238865 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.246257 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.247784 [ 70 ] {9cf03376-deb5-43aa-8ee0-d8ce4538987b} <Information> executeQuery: Read 10 rows, 551.00 B in 0.001 sec., 11988 rows/sec., 645.10 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.247904 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.249683 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.251964 [ 70 ] {b43dbaab-a29f-44d3-a5bd-27c69705c275} <Information> executeQuery: Read 9 rows, 492.00 B in 0.001 sec., 10450 rows/sec., 557.93 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.252108 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.253875 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.255156 [ 70 ] {cbaf8801-ae4f-47fa-9a77-c6826b63d2bd} <Information> executeQuery: Read 5 rows, 442.00 B in 0.001 sec., 8578 rows/sec., 740.57 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.255413 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.257051 [ 70 ] {90d36411-b27e-4246-9e69-ec7020a28fe3} <Information> executeQuery: Read 10 rows, 568.00 B in 0.001 sec., 9933 rows/sec., 551.00 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.257196 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.258933 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.261733 [ 70 ] {1de23d12-4b1b-414b-9f41-f75f81957fd6} <Information> executeQuery: Read 11 rows, 627.00 B in 0.001 sec., 16469 rows/sec., 916.77 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.261856 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.263553 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.272544 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:55.279713 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.280793 [ 70 ] {} <Information> TCPHandler: Processed in 0.000 sec.
clickhouse_1                                | 2021.06.11 06:31:55.286686 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.287393 [ 70 ] {f2397ff1-5fe5-43cb-adab-00c8637fef4b} <Information> executeQuery: Read 12 rows, 678.00 B in 0.000 sec., 19261637 rows/sec., 1.01 GiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.287393 [ 70 ] {} <Information> TCPHandler: Processed in 0.000 sec.
clickhouse_1                                | 2021.06.11 06:31:55.293763 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.296330 [ 70 ] {fa596e0f-6987-4935-b79d-fd8c082cc830} <Information> executeQuery: Read 11 rows, 627.00 B in 0.002 sec., 6411 rows/sec., 356.87 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.296501 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.298792 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.310739 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.313685 [ 70 ] {80ccc00b-5430-4248-8564-726c8eb9ec16} <Information> executeQuery: Read 12 rows, 690.00 B in 0.002 sec., 6562 rows/sec., 368.51 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.314206 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.317115 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.319191 [ 70 ] {fd3c5be0-d6d3-4da4-97f0-e7ae833764af} <Information> executeQuery: Read 11 rows, 627.00 B in 0.001 sec., 9036 rows/sec., 502.98 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.319350 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.320964 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.330832 [ 70 ] {} <Information> TCPHandler: Processed in 0.009 sec.
clickhouse_1                                | 2021.06.11 06:31:55.345168 [ 70 ] {} <Information> TCPHandler: Processed in 0.013 sec.
clickhouse_1                                | 2021.06.11 06:31:55.348699 [ 70 ] {ab2bfa0e-1fb7-4558-8591-12b29bca30f4} <Information> executeQuery: Read 14 rows, 808.00 B in 0.002 sec., 7216 rows/sec., 406.75 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.348932 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.351699 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.358359 [ 70 ] {216d396d-9d85-405d-9ba8-352bd7dbdeef} <Information> executeQuery: Read 14 rows, 808.00 B in 0.004 sec., 3900 rows/sec., 219.83 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.358543 [ 70 ] {} <Information> TCPHandler: Processed in 0.004 sec.
clickhouse_1                                | 2021.06.11 06:31:55.361795 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.363680 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.381906 [ 70 ] {} <Information> TCPHandler: Processed in 0.016 sec.
clickhouse_1                                | 2021.06.11 06:31:55.394879 [ 70 ] {} <Information> TCPHandler: Processed in 0.013 sec.
clickhouse_1                                | 2021.06.11 06:31:55.406663 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.418277 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.431211 [ 70 ] {} <Information> TCPHandler: Processed in 0.012 sec.
clickhouse_1                                | 2021.06.11 06:31:55.443685 [ 70 ] {} <Information> TCPHandler: Processed in 0.012 sec.
clickhouse_1                                | 2021.06.11 06:31:55.463579 [ 70 ] {} <Information> TCPHandler: Processed in 0.019 sec.
clickhouse_1                                | 2021.06.11 06:31:55.474730 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.486847 [ 70 ] {} <Information> TCPHandler: Processed in 0.012 sec.
clickhouse_1                                | 2021.06.11 06:31:55.497954 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.509294 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.520449 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.532002 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.534442 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.536198 [ 70 ] {66eab1e8-f484-4c11-91e9-007413454228} <Information> executeQuery: Read 15 rows, 860.00 B in 0.001 sec., 14265 rows/sec., 798.72 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.536477 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.538699 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.540694 [ 70 ] {701dd527-33ba-4df3-9883-6579a87de99c} <Information> executeQuery: Read 14 rows, 808.00 B in 0.001 sec., 15544 rows/sec., 876.13 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.540906 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.542691 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.543365 [ 70 ] {fe8633d6-4296-4fd8-8915-24b01bfe594d} <Information> default.sentry_local: Removed 0 parts.
clickhouse_1                                | 2021.06.11 06:31:55.543565 [ 70 ] {} <Information> TCPHandler: Processed in 0.000 sec.
clickhouse_1                                | 2021.06.11 06:31:55.545204 [ 70 ] {61fb255b-842d-4d6d-a31a-52050b075f69} <Information> executeQuery: Read 15 rows, 860.00 B in 0.001 sec., 15842 rows/sec., 886.99 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.545514 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.548627 [ 70 ] {} <Information> TCPHandler: Processed in 0.003 sec.
clickhouse_1                                | 2021.06.11 06:31:55.551213 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.552866 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.559807 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.561506 [ 70 ] {a565e33c-f6c1-4a91-8046-9c8c33af2a94} <Information> executeQuery: Read 1 rows, 55.00 B in 0.001 sec., 1040 rows/sec., 55.88 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.561736 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.563489 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.565566 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.567361 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.568763 [ 70 ] {b2cda0be-5e94-4adb-b2cb-de50d0324578} <Information> executeQuery: Read 6 rows, 873.00 B in 0.001 sec., 7765 rows/sec., 1.08 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.568980 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.570473 [ 70 ] {25214d63-8af7-411a-bd09-b84cfd4209c8} <Information> executeQuery: Read 1 rows, 93.00 B in 0.001 sec., 1084 rows/sec., 98.48 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.570664 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.572189 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.574264 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.574973 [ 70 ] {} <Information> TCPHandler: Processed in 0.000 sec.
clickhouse_1                                | 2021.06.11 06:31:55.586085 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.592095 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.598816 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.605253 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.615427 [ 70 ] {} <Information> TCPHandler: Processed in 0.010 sec.
clickhouse_1                                | 2021.06.11 06:31:55.623142 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.629464 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.635072 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.641587 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.647422 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.653530 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.658995 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.667516 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:55.673571 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.678968 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.684122 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.685701 [ 70 ] {1beac459-de21-4acc-82ec-48b205264ef4} <Information> executeQuery: Read 19 rows, 1.11 KiB in 0.001 sec., 21217 rows/sec., 1.21 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.685867 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
geoipupdate_1                               | error loading configuration file /sentry/GeoIP.conf: error opening file: open /sentry/GeoIP.conf: no such file or directory
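Note: the geoipupdate container exits here because /sentry/GeoIP.conf does not exist in the container. GeoIP enrichment is optional and this error does not appear related to the relay failure, but if you want to silence it, a minimal sketch of creating the file on the host is shown below. It assumes the compose file mounts a ./geoip directory into the container at /sentry and that you have a free MaxMind account ID and license key (both values are placeholders):

    # hypothetical host-side setup for the geoipupdate container; adjust the path
    # if your docker-compose.yml mounts a different host directory to /sentry
    mkdir -p geoip
    cat > geoip/GeoIP.conf <<'EOF'
    AccountID YOUR_MAXMIND_ACCOUNT_ID
    LicenseKey YOUR_MAXMIND_LICENSE_KEY
    EditionIDs GeoLite2-City
    EOF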
kafka_1                                     | ===> ENV Variables ...
kafka_1                                     | ALLOW_UNSIGNED=false
kafka_1                                     | COMPONENT=kafka
kafka_1                                     | CONFLUENT_DEB_VERSION=1
kafka_1                                     | CONFLUENT_PLATFORM_LABEL=
kafka_1                                     | CONFLUENT_SUPPORT_METRICS_ENABLE=false
kafka_1                                     | CONFLUENT_VERSION=5.5.0
kafka_1                                     | CUB_CLASSPATH=/etc/confluent/docker/docker-utils.jar
kafka_1                                     | HOME=/root
kafka_1                                     | HOSTNAME=5c4dce1153e5
kafka_1                                     | KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092
kafka_1                                     | KAFKA_LOG4J_LOGGERS=kafka.cluster=WARN,kafka.controller=WARN,kafka.coordinator=WARN,kafka.log=WARN,kafka.server=WARN,kafka.zookeeper=WARN,state.change.logger=WARN
kafka_1                                     | KAFKA_LOG4J_ROOT_LOGLEVEL=WARN
kafka_1                                     | KAFKA_LOG_RETENTION_HOURS=24
kafka_1                                     | KAFKA_MAX_REQUEST_SIZE=50000000
kafka_1                                     | KAFKA_MESSAGE_MAX_BYTES=50000000
kafka_1                                     | KAFKA_OFFSETS_TOPIC_NUM_PARTITIONS=1
kafka_1                                     | KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1
kafka_1                                     | KAFKA_TOOLS_LOG4J_LOGLEVEL=WARN
kafka_1                                     | KAFKA_VERSION=
kafka_1                                     | KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
kafka_1                                     | LANG=C.UTF-8
kafka_1                                     | PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
kafka_1                                     | PWD=/
kafka_1                                     | PYTHON_PIP_VERSION=8.1.2
kafka_1                                     | PYTHON_VERSION=2.7.9-1
kafka_1                                     | SCALA_VERSION=2.12
kafka_1                                     | SHLVL=1
kafka_1                                     | ZULU_OPENJDK_VERSION=8=8.38.0.13
kafka_1                                     | _=/usr/bin/env
kafka_1                                     | ===> User
kafka_1                                     | uid=0(root) gid=0(root) groups=0(root)
kafka_1                                     | ===> Configuring ...
kafka_1                                     | ===> Running preflight checks ... 
kafka_1                                     | ===> Check if /var/lib/kafka/data is writable ...
kafka_1                                     | ===> Check if Zookeeper is healthy ...
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.5.7-f0fdd52973d373ffd9c86b81d99842dc2c7f660e, built on 02/10/2020 11:30 GMT
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:host.name=5c4dce1153e5
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_212
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Azul Systems, Inc.
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/zulu-8-amd64/jre
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=/etc/confluent/docker/docker-utils.jar
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=/tmp
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.version=5.4.0-74-generic
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.name=root
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.home=/root
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.memory.free=125MB
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.memory.max=1879MB
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.memory.total=128MB
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=zookeeper:2181 sessionTimeout=40000 watcher=io.confluent.admin.utils.ZookeeperConnectionWatcher@cc34f4d
kafka_1                                     | [main] INFO org.apache.zookeeper.common.X509Util - Setting -D jdk.tls.rejectClientInitiatedRenegotiation=true to disable client-initiated TLS renegotiation
kafka_1                                     | [main] INFO org.apache.zookeeper.ClientCnxnSocket - jute.maxbuffer value is 4194304 Bytes
kafka_1                                     | [main] INFO org.apache.zookeeper.ClientCnxn - zookeeper.request.timeout value is 0. feature enabled=
kafka_1                                     | [main-SendThread(zookeeper:2181)] INFO org.apache.zookeeper.ClientCnxn - Opening socket connection to server zookeeper/172.22.0.2:2181. Will not attempt to authenticate using SASL (unknown error)
kafka_1                                     | [main-SendThread(zookeeper:2181)] INFO org.apache.zookeeper.ClientCnxn - Socket connection established, initiating session, client: /172.22.0.5:44136, server: zookeeper/172.22.0.2:2181
kafka_1                                     | [main-SendThread(zookeeper:2181)] INFO org.apache.zookeeper.ClientCnxn - Session establishment complete on server zookeeper/172.22.0.2:2181, sessionid = 0x10008fa361c0000, negotiated timeout = 40000
kafka_1                                     | [main-EventThread] INFO org.apache.zookeeper.ClientCnxn - EventThread shut down for session: 0x10008fa361c0000
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Session: 0x10008fa361c0000 closed
kafka_1                                     | ===> Launching ... 
kafka_1                                     | ===> Launching kafka ... 
kafka_1                                     | [2021-06-11 06:31:49,000] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
kafka_1                                     | [2021-06-11 06:31:49,372] WARN The package io.confluent.support.metrics.collectors.FullCollector for collecting the full set of support metrics could not be loaded, so we are reverting to anonymous, basic metric collection. If you are a Confluent customer, please refer to the Confluent Platform documentation, section Proactive Support, on how to activate full metrics collection. (io.confluent.support.metrics.KafkaSupportConfig)
kafka_1                                     | [2021-06-11 06:31:49,372] WARN The support metrics collection feature ("Metrics") of Proactive Support is disabled. (io.confluent.support.metrics.SupportedServerStartable)
kafka_1                                     | [2021-06-11 06:31:49,717] WARN No meta.properties file under dir /var/lib/kafka/data/meta.properties (kafka.server.BrokerMetadataCheckpoint)
kafka_1                                     | [2021-06-11 06:31:49,847] INFO Starting the log cleaner (kafka.log.LogCleaner)
kafka_1                                     | [2021-06-11 06:31:49,893] INFO [kafka-log-cleaner-thread-0]: Starting (kafka.log.LogCleaner)
kafka_1                                     | [2021-06-11 06:31:50,143] INFO Awaiting socket connections on 0.0.0.0:9092. (kafka.network.Acceptor)
kafka_1                                     | [2021-06-11 06:31:50,171] INFO [SocketServer brokerId=1001] Created data-plane acceptor and processors for endpoint : EndPoint(0.0.0.0,9092,ListenerName(PLAINTEXT),PLAINTEXT) (kafka.network.SocketServer)
kafka_1                                     | [2021-06-11 06:31:50,172] INFO [SocketServer brokerId=1001] Started 1 acceptor threads for data-plane (kafka.network.SocketServer)
kafka_1                                     | [2021-06-11 06:31:50,225] INFO Creating /brokers/ids/1001 (is it secure? false) (kafka.zk.KafkaZkClient)
kafka_1                                     | [2021-06-11 06:31:50,242] INFO Stat of the created znode at /brokers/ids/1001 is: 27,27,1623393110233,1623393110233,1,0,0,72067464780578817,180,0,27
kafka_1                                     |  (kafka.zk.KafkaZkClient)
kafka_1                                     | [2021-06-11 06:31:50,243] INFO Registered broker 1001 at path /brokers/ids/1001 with addresses: ArrayBuffer(EndPoint(kafka,9092,ListenerName(PLAINTEXT),PLAINTEXT)), czxid (broker epoch): 27 (kafka.zk.KafkaZkClient)
kafka_1                                     | [2021-06-11 06:31:50,321] INFO Successfully created /controller_epoch with initial epoch 0 (kafka.zk.KafkaZkClient)
kafka_1                                     | [2021-06-11 06:31:50,413] INFO [/config/changes-event-process-thread]: Starting (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
kafka_1                                     | [2021-06-11 06:31:50,428] INFO [SocketServer brokerId=1001] Started data-plane processors for 1 acceptors (kafka.network.SocketServer)
kafka_1                                     | [2021-06-11 06:31:51,678] INFO Creating topic transactions-subscription-results with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,722] INFO Creating topic ingest-sessions with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,754] INFO Creating topic events with configuration {message.timestamp.type=LogAppendTime} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,777] INFO Creating topic snuba-queries with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,797] INFO Creating topic events-subscription-results with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,822] INFO Creating topic outcomes with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,836] INFO Creating topic snuba-commit-log with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,867] INFO Creating topic ingest-metrics with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,891] INFO Creating topic cdc with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,904] INFO Creating topic event-replacements-legacy with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:31:51,923] INFO Creating topic event-replacements with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:32:01,293] INFO Creating topic ingest-attachments with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:32:03,956] INFO Creating topic ingest-transactions with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:32:06,688] INFO Creating topic ingest-events with configuration {} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:32:14,099] INFO Creating topic __consumer_offsets with configuration {segment.bytes=104857600, compression.type=producer, cleanup.policy=compact} and initial partition assignment Map(0 -> ArrayBuffer(1001)) (kafka.zk.AdminZkClient)
kafka_1                                     | [2021-06-11 06:33:34,211] INFO [/config/changes-event-process-thread]: Shutting down (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
kafka_1                                     | [2021-06-11 06:33:34,211] INFO [/config/changes-event-process-thread]: Stopped (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
kafka_1                                     | [2021-06-11 06:33:34,213] INFO [/config/changes-event-process-thread]: Shutdown completed (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
kafka_1                                     | [2021-06-11 06:33:34,213] INFO [SocketServer brokerId=1001] Stopping socket server request processors (kafka.network.SocketServer)
kafka_1                                     | [2021-06-11 06:33:34,224] INFO [SocketServer brokerId=1001] Stopped socket server request processors (kafka.network.SocketServer)
kafka_1                                     | [2021-06-11 06:33:35,232] INFO Shutting down the log cleaner. (kafka.log.LogCleaner)
kafka_1                                     | [2021-06-11 06:33:35,232] INFO [kafka-log-cleaner-thread-0]: Shutting down (kafka.log.LogCleaner)
kafka_1                                     | [2021-06-11 06:33:35,232] INFO [kafka-log-cleaner-thread-0]: Stopped (kafka.log.LogCleaner)
kafka_1                                     | [2021-06-11 06:33:35,232] INFO [kafka-log-cleaner-thread-0]: Shutdown completed (kafka.log.LogCleaner)
kafka_1                                     | [2021-06-11 06:33:37,871] INFO [SocketServer brokerId=1001] Shutting down socket server (kafka.network.SocketServer)
kafka_1                                     | [2021-06-11 06:33:37,885] INFO [SocketServer brokerId=1001] Shutdown completed (kafka.network.SocketServer)
kafka_1                                     | ===> ENV Variables ...
kafka_1                                     | ALLOW_UNSIGNED=false
kafka_1                                     | COMPONENT=kafka
kafka_1                                     | CONFLUENT_DEB_VERSION=1
kafka_1                                     | CONFLUENT_PLATFORM_LABEL=
kafka_1                                     | CONFLUENT_SUPPORT_METRICS_ENABLE=false
kafka_1                                     | CONFLUENT_VERSION=5.5.0
kafka_1                                     | CUB_CLASSPATH=/etc/confluent/docker/docker-utils.jar
kafka_1                                     | HOME=/root
kafka_1                                     | HOSTNAME=5c4dce1153e5
kafka_1                                     | KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092
kafka_1                                     | KAFKA_LOG4J_LOGGERS=kafka.cluster=WARN,kafka.controller=WARN,kafka.coordinator=WARN,kafka.log=WARN,kafka.server=WARN,kafka.zookeeper=WARN,state.change.logger=WARN
kafka_1                                     | KAFKA_LOG4J_ROOT_LOGLEVEL=WARN
kafka_1                                     | KAFKA_LOG_RETENTION_HOURS=24
kafka_1                                     | KAFKA_MAX_REQUEST_SIZE=50000000
kafka_1                                     | KAFKA_MESSAGE_MAX_BYTES=50000000
kafka_1                                     | KAFKA_OFFSETS_TOPIC_NUM_PARTITIONS=1
kafka_1                                     | KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1
kafka_1                                     | KAFKA_TOOLS_LOG4J_LOGLEVEL=WARN
kafka_1                                     | KAFKA_VERSION=
kafka_1                                     | KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
kafka_1                                     | LANG=C.UTF-8
kafka_1                                     | PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
kafka_1                                     | PWD=/
kafka_1                                     | PYTHON_PIP_VERSION=8.1.2
kafka_1                                     | PYTHON_VERSION=2.7.9-1
kafka_1                                     | SCALA_VERSION=2.12
kafka_1                                     | SHLVL=1
kafka_1                                     | ZULU_OPENJDK_VERSION=8=8.38.0.13
kafka_1                                     | _=/usr/bin/env
kafka_1                                     | ===> User
kafka_1                                     | uid=0(root) gid=0(root) groups=0(root)
kafka_1                                     | ===> Configuring ...
kafka_1                                     | ===> Running preflight checks ... 
kafka_1                                     | ===> Check if /var/lib/kafka/data is writable ...
kafka_1                                     | ===> Check if Zookeeper is healthy ...
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.5.7-f0fdd52973d373ffd9c86b81d99842dc2c7f660e, built on 02/10/2020 11:30 GMT
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:host.name=5c4dce1153e5
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_212
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Azul Systems, Inc.
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/zulu-8-amd64/jre
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=/etc/confluent/docker/docker-utils.jar
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=/tmp
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.version=5.4.0-74-generic
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.name=root
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.home=/root
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.memory.free=125MB
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.memory.max=1879MB
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.memory.total=128MB
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=zookeeper:2181 sessionTimeout=40000 watcher=io.confluent.admin.utils.ZookeeperConnectionWatcher@cc34f4d
kafka_1                                     | [main] INFO org.apache.zookeeper.common.X509Util - Setting -D jdk.tls.rejectClientInitiatedRenegotiation=true to disable client-initiated TLS renegotiation
kafka_1                                     | [main] INFO org.apache.zookeeper.ClientCnxnSocket - jute.maxbuffer value is 4194304 Bytes
kafka_1                                     | [main] INFO org.apache.zookeeper.ClientCnxn - zookeeper.request.timeout value is 0. feature enabled=
kafka_1                                     | [main-SendThread(zookeeper:2181)] INFO org.apache.zookeeper.ClientCnxn - Opening socket connection to server zookeeper/172.22.0.5:2181. Will not attempt to authenticate using SASL (unknown error)
kafka_1                                     | [main-SendThread(zookeeper:2181)] INFO org.apache.zookeeper.ClientCnxn - Socket connection established, initiating session, client: /172.22.0.11:41284, server: zookeeper/172.22.0.5:2181
kafka_1                                     | [main-SendThread(zookeeper:2181)] INFO org.apache.zookeeper.ClientCnxn - Session establishment complete on server zookeeper/172.22.0.5:2181, sessionid = 0x10008fcd4320000, negotiated timeout = 40000
kafka_1                                     | [main] INFO org.apache.zookeeper.ZooKeeper - Session: 0x10008fcd4320000 closed
kafka_1                                     | [main-EventThread] INFO org.apache.zookeeper.ClientCnxn - EventThread shut down for session: 0x10008fcd4320000
kafka_1                                     | ===> Launching ... 
kafka_1                                     | ===> Launching kafka ... 
kafka_1                                     | [2021-06-11 06:34:43,121] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
kafka_1                                     | [2021-06-11 06:34:44,245] WARN The package io.confluent.support.metrics.collectors.FullCollector for collecting the full set of support metrics could not be loaded, so we are reverting to anonymous, basic metric collection. If you are a Confluent customer, please refer to the Confluent Platform documentation, section Proactive Support, on how to activate full metrics collection. (io.confluent.support.metrics.KafkaSupportConfig)
kafka_1                                     | [2021-06-11 06:34:44,258] WARN The support metrics collection feature ("Metrics") of Proactive Support is disabled. (io.confluent.support.metrics.SupportedServerStartable)
kafka_1                                     | [2021-06-11 06:34:45,919] INFO Starting the log cleaner (kafka.log.LogCleaner)
kafka_1                                     | [2021-06-11 06:34:46,045] INFO [kafka-log-cleaner-thread-0]: Starting (kafka.log.LogCleaner)
kafka_1                                     | [2021-06-11 06:34:46,505] INFO Awaiting socket connections on 0.0.0.0:9092. (kafka.network.Acceptor)
kafka_1                                     | [2021-06-11 06:34:46,549] INFO [SocketServer brokerId=1001] Created data-plane acceptor and processors for endpoint : EndPoint(0.0.0.0,9092,ListenerName(PLAINTEXT),PLAINTEXT) (kafka.network.SocketServer)
kafka_1                                     | [2021-06-11 06:34:46,556] INFO [SocketServer brokerId=1001] Started 1 acceptor threads for data-plane (kafka.network.SocketServer)
kafka_1                                     | [2021-06-11 06:34:46,639] INFO Creating /brokers/ids/1001 (is it secure? false) (kafka.zk.KafkaZkClient)
kafka_1                                     | [2021-06-11 06:34:46,680] INFO Stat of the created znode at /brokers/ids/1001 is: 139,139,1623393286666,1623393286666,1,0,0,72067476022755329,180,0,139
kafka_1                                     |  (kafka.zk.KafkaZkClient)
kafka_1                                     | [2021-06-11 06:34:46,681] INFO Registered broker 1001 at path /brokers/ids/1001 with addresses: ArrayBuffer(EndPoint(kafka,9092,ListenerName(PLAINTEXT),PLAINTEXT)), czxid (broker epoch): 139 (kafka.zk.KafkaZkClient)
kafka_1                                     | [2021-06-11 06:34:47,291] INFO [/config/changes-event-process-thread]: Starting (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
kafka_1                                     | [2021-06-11 06:34:47,411] INFO [SocketServer brokerId=1001] Started data-plane processors for 1 acceptors (kafka.network.SocketServer)
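Note: the broker above was shut down around 06:33 and relaunched at 06:34, which is expected during an install run but means dependent services (relay, ingest-consumer, the snuba consumers) briefly lose their connection. If relay stays unhealthy after this point, checking the current container state and the tail of the kafka log is a reasonable first step; these are standard docker-compose commands:

    # confirm kafka and zookeeper are up and see whether the broker logged errors after restarting
    docker-compose ps kafka zookeeper relay
    docker-compose logs --tail=100 kafka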
clickhouse_1                                | 2021.06.11 06:31:55.687794 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.689814 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.691592 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.698248 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.699852 [ 70 ] {2f5500bf-30b9-47d8-bc3e-9f92f6a5a94c} <Information> executeQuery: Read 1 rows, 73.00 B in 0.001 sec., 1103 rows/sec., 78.69 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.699997 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.701638 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.703502 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.705150 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.711238 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.712817 [ 70 ] {ea8e09db-a12c-40f7-87d1-1ef94a2093f5} <Information> executeQuery: Read 1 rows, 72.00 B in 0.001 sec., 1141 rows/sec., 80.26 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.712963 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.714541 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.716582 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.718077 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.723800 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.738067 [ 70 ] {} <Information> TCPHandler: Processed in 0.014 sec.
clickhouse_1                                | 2021.06.11 06:31:55.746544 [ 70 ] {700cd75a-11f7-404f-a713-11655bea3671} <Information> executeQuery: Read 1 rows, 71.00 B in 0.005 sec., 185 rows/sec., 12.86 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.747945 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.755177 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.763697 [ 70 ] {} <Information> TCPHandler: Processed in 0.004 sec.
clickhouse_1                                | 2021.06.11 06:31:55.771304 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.788882 [ 70 ] {} <Information> TCPHandler: Processed in 0.015 sec.
clickhouse_1                                | 2021.06.11 06:31:55.802265 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:55.811781 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:55.821814 [ 70 ] {} <Information> TCPHandler: Processed in 0.009 sec.
clickhouse_1                                | 2021.06.11 06:31:55.824722 [ 70 ] {f0bd448f-e19e-4ab5-b69d-9ea7116bae2e} <Information> executeQuery: Read 1 rows, 73.00 B in 0.002 sec., 611 rows/sec., 43.63 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.825157 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.827817 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.830887 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.833705 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.843281 [ 70 ] {} <Information> TCPHandler: Processed in 0.009 sec.
clickhouse_1                                | 2021.06.11 06:31:55.845768 [ 70 ] {cdc90dd9-3920-4c4c-9ee7-c60d99af4440} <Information> executeQuery: Read 24 rows, 1.47 KiB in 0.001 sec., 17676 rows/sec., 1.05 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.846108 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.848588 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.851698 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.853883 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.862179 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.873455 [ 70 ] {} <Information> TCPHandler: Processed in 0.010 sec.
clickhouse_1                                | 2021.06.11 06:31:55.875726 [ 70 ] {a400e8ac-e0cb-4eca-8112-97e1d9e351d9} <Information> executeQuery: Read 1 rows, 77.00 B in 0.001 sec., 778 rows/sec., 58.52 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.876056 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.878278 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.880593 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.883692 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.891402 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.893536 [ 70 ] {81e5c8a2-bdec-4d6f-a115-091896f0720b} <Information> executeQuery: Read 1 rows, 73.00 B in 0.001 sec., 794 rows/sec., 56.62 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.893782 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.896283 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.899099 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.901371 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.909698 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:55.912401 [ 70 ] {27264aa0-21a4-4025-8c49-d63f32d5f670} <Information> executeQuery: Read 1 rows, 78.00 B in 0.001 sec., 714 rows/sec., 54.41 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.912646 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.915010 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.918000 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.920420 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.926353 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.928692 [ 70 ] {af49cd5d-eecb-4370-a696-9b7d989c6966} <Information> executeQuery: Read 1 rows, 59.00 B in 0.001 sec., 767 rows/sec., 44.21 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.928936 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.931121 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.934635 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.936690 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:55.942744 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.948967 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:55.951346 [ 70 ] {75893045-b47c-42a8-b8d0-1e52f7b229ba} <Information> executeQuery: Read 29 rows, 1.82 KiB in 0.001 sec., 20982 rows/sec., 1.28 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.951591 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.953860 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.956532 [ 70 ] {56e16a5e-b19f-4581-9e3d-96acc3f2e918} <Information> executeQuery: Read 29 rows, 1.82 KiB in 0.001 sec., 24160 rows/sec., 1.48 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.956784 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.959997 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.970389 [ 70 ] {} <Information> TCPHandler: Processed in 0.010 sec.
clickhouse_1                                | 2021.06.11 06:31:55.972802 [ 70 ] {2db6ae7e-ab74-4a65-9a11-4b473af22b29} <Information> executeQuery: Read 30 rows, 1.88 KiB in 0.001 sec., 21450 rows/sec., 1.31 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.973103 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.975688 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.978057 [ 70 ] {e8767029-6d32-44bc-8aee-82a62033b344} <Information> executeQuery: Read 29 rows, 1.82 KiB in 0.001 sec., 25083 rows/sec., 1.54 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:55.978448 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.980727 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:55.987189 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:55.995487 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:56.003773 [ 70 ] {1edfa2b6-9066-4be9-b045-49d0a57d9021} <Information> executeQuery: Read 30 rows, 1.88 KiB in 0.005 sec., 6204 rows/sec., 389.79 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.004838 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.012043 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.022548 [ 70 ] {7b10ad24-69de-471d-ae7e-a19faf5e562d} <Information> executeQuery: Read 31 rows, 1.95 KiB in 0.005 sec., 6380 rows/sec., 400.59 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.023729 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:56.031824 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.046090 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:56.053834 [ 70 ] {64cefaf9-c58f-49de-a7fe-970c3954d04f} <Information> executeQuery: Read 32 rows, 2.01 KiB in 0.004 sec., 7429 rows/sec., 467.31 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.055925 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:56.063486 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.072418 [ 70 ] {3137cee7-cb1a-41c6-b942-5b547ead4ddb} <Information> executeQuery: Read 31 rows, 1.95 KiB in 0.004 sec., 7967 rows/sec., 500.24 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.073357 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.080521 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.094623 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:56.102426 [ 70 ] {332ae25e-0089-44c6-a1d9-fc9100c032c1} <Information> executeQuery: Read 32 rows, 2.00 KiB in 0.004 sec., 7240 rows/sec., 453.61 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.103556 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.110715 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.119095 [ 70 ] {22347f4d-6f61-4638-bb3b-4508a7eafdf5} <Information> executeQuery: Read 31 rows, 1.95 KiB in 0.002 sec., 12907 rows/sec., 810.37 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.119905 [ 70 ] {} <Information> TCPHandler: Processed in 0.004 sec.
clickhouse_1                                | 2021.06.11 06:31:56.129080 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:56.138558 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:56.147696 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:56.151004 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.157047 [ 70 ] {361158d2-6937-4a61-b418-23e62521f58f} <Information> executeQuery: Read 34 rows, 2.12 KiB in 0.001 sec., 25099 rows/sec., 1.53 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.157348 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.159730 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.162332 [ 70 ] {76b73cd2-d23d-4684-ab5c-a2e92337c48c} <Information> executeQuery: Read 34 rows, 2.12 KiB in 0.001 sec., 29492 rows/sec., 1.79 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.162577 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.164932 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.170736 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.176009 [ 70 ] {} <Information> TCPHandler: Processed in 0.004 sec.
clickhouse_1                                | 2021.06.11 06:31:56.178362 [ 70 ] {4689d58f-7c06-4a32-8802-5b57866125b5} <Information> executeQuery: Read 35 rows, 2.18 KiB in 0.001 sec., 23421 rows/sec., 1.43 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.178688 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.181151 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.182723 [ 70 ] {557e87e3-ce89-4432-960d-079598ec9068} <Information> executeQuery: Read 34 rows, 2.12 KiB in 0.000 sec., 115312 rows/sec., 7.01 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.182723 [ 70 ] {} <Information> TCPHandler: Processed in 0.000 sec.
clickhouse_1                                | 2021.06.11 06:31:56.188649 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.195076 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.201450 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.206659 [ 70 ] {} <Information> TCPHandler: Processed in 0.004 sec.
clickhouse_1                                | 2021.06.11 06:31:56.212430 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.214954 [ 70 ] {ffdad8d2-eb37-4d5c-9d97-f44e38b64b63} <Information> executeQuery: Read 35 rows, 2.19 KiB in 0.001 sec., 26522 rows/sec., 1.62 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.215440 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.221704 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.225072 [ 70 ] {8f041b10-ab36-40a1-8ade-48ba2da5aa66} <Information> executeQuery: Read 36 rows, 2.26 KiB in 0.001 sec., 26877 rows/sec., 1.64 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.225573 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.228149 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.229647 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:56.238845 [ 70 ] {} <Information> TCPHandler: Processed in 0.008 sec.
clickhouse_1                                | 2021.06.11 06:31:56.241221 [ 70 ] {c2cc88cf-11e6-46b7-b215-aaca0b50de7e} <Information> executeQuery: Read 37 rows, 2.32 KiB in 0.001 sec., 28733 rows/sec., 1.76 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.241766 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.244477 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.247460 [ 70 ] {bfd8effb-7d6c-4e87-bb24-e17683410b4f} <Information> executeQuery: Read 36 rows, 2.26 KiB in 0.000 sec., 56603773 rows/sec., 3.38 GiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.251662 [ 70 ] {} <Information> TCPHandler: Processed in 0.004 sec.
clickhouse_1                                | 2021.06.11 06:31:56.254066 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.261654 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:56.270024 [ 70 ] {} <Information> TCPHandler: Processed in 0.007 sec.
clickhouse_1                                | 2021.06.11 06:31:56.276951 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.279529 [ 70 ] {858aa844-a69d-47a0-ac18-1088beb53865} <Information> executeQuery: Read 37 rows, 2.30 KiB in 0.001 sec., 26192 rows/sec., 1.59 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.280046 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.282717 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.286525 [ 70 ] {75c987b1-00ff-4d37-9301-0e66942ced4f} <Information> executeQuery: Read 36 rows, 2.26 KiB in 0.001 sec., 24956 rows/sec., 1.53 MiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.287087 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.289785 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.296130 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.302801 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.313058 [ 70 ] {} <Information> TCPHandler: Processed in 0.009 sec.
clickhouse_1                                | 2021.06.11 06:31:56.319432 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.326393 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.338618 [ 70 ] {} <Information> TCPHandler: Processed in 0.011 sec.
clickhouse_1                                | 2021.06.11 06:31:56.353707 [ 70 ] {} <Information> TCPHandler: Processed in 0.013 sec.
clickhouse_1                                | 2021.06.11 06:31:56.368950 [ 70 ] {} <Information> TCPHandler: Processed in 0.012 sec.
clickhouse_1                                | 2021.06.11 06:31:56.386748 [ 70 ] {} <Information> TCPHandler: Processed in 0.015 sec.
clickhouse_1                                | 2021.06.11 06:31:56.403899 [ 70 ] {} <Information> TCPHandler: Processed in 0.014 sec.
clickhouse_1                                | 2021.06.11 06:31:56.412382 [ 70 ] {e7ac2ee5-8596-486c-946f-e8598db3369f} <Information> executeQuery: Read 39 rows, 2.42 KiB in 0.004 sec., 10011 rows/sec., 621.68 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.413282 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.421689 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.431778 [ 70 ] {b7caea0c-b069-4767-a148-404a2801b19f} <Information> executeQuery: Read 39 rows, 2.42 KiB in 0.003 sec., 11151 rows/sec., 692.52 KiB/sec.
clickhouse_1                                | 2021.06.11 06:31:56.432747 [ 70 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 06:31:56.440983 [ 70 ] {} <Information> TCPHandler: Processed in 0.006 sec.
clickhouse_1                                | 2021.06.11 06:31:56.444321 [ 70 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:31:56.456342 [ 70 ] {} <Information> TCPHandler: Processed in 0.009 sec.
clickhouse_1                                | 2021.06.11 06:31:56.459520 [ 70 ] {4d20d5d8-ea43-4d7c-a7fd-02520ea22cd7} <Information> executeQuery: Read 40 rows, 2.48 KiB in 0.002 sec., 20963 rows/sec., 1.27 MiB/sec.
memcached_1                                 | Signal handled: Terminated.
ingest-consumer_1                           | 06:34:44 [INFO] sentry.plugins.github: apps-not-configured
ingest-consumer_1                           | %3|1623393284.947|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
ingest-consumer_1                           | %3|1623393285.947|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
ingest-consumer_1                           | 06:34:47 [WARNING] batching-kafka-consumer: Topic 'ingest-transactions' or its partitions are not ready, retrying...
ingest-consumer_1                           | 06:34:47 [WARNING] batching-kafka-consumer: Topic 'ingest-transactions' or its partitions are not ready, retrying...
ingest-consumer_1                           | 06:34:48 [WARNING] batching-kafka-consumer: Topic 'ingest-transactions' or its partitions are not ready, retrying...
ingest-consumer_1                           | 06:34:51 [INFO] batching-kafka-consumer: New partitions assigned: [TopicPartition{topic=ingest-attachments,partition=0,offset=-1001,error=None}, TopicPartition{topic=ingest-events,partition=0,offset=-1001,error=None}, TopicPartition{topic=ingest-transactions,partition=0,offset=-1001,error=None}]
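The `Connection refused` lines above are the consumers racing the Kafka broker during startup; the final `New partitions assigned` line shows they recovered once the broker was listening. A quick way to confirm Kafka is actually reachable inside the compose network, as a sketch assuming the stock service names from `docker-compose.yml`:

```sh
# Check that zookeeper and kafka came up and stayed up
docker-compose ps kafka zookeeper
# Tail the broker log for its startup line
docker-compose logs --tail=50 kafka | grep -i started
# List topics from inside the broker container to confirm it answers on 9092
docker-compose exec kafka kafka-topics --bootstrap-server localhost:9092 --list
```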
clickhouse_1                                | 2021.06.11 06:31:56.460013 [ 70 ] {} <Information> TCPHandler: Processed in 0.003 sec.
clickhouse_1                                | 2021.06.11 06:31:56.462829 [ 70 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:31:56.492417 [ 70 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:33:34.176202 [ 45 ] {} <Information> Application: Received termination signal (Terminated)
clickhouse_1                                | 2021.06.11 06:33:34.513184 [ 1 ] {} <Information> Application: Closed all listening sockets.
clickhouse_1                                | 2021.06.11 06:33:34.513305 [ 1 ] {} <Information> Application: Closed connections.
clickhouse_1                                | 2021.06.11 06:33:34.515559 [ 1 ] {} <Information> Application: Shutting down storages.
clickhouse_1                                | 2021.06.11 06:33:35.112052 [ 1 ] {} <Information> Application: shutting down
clickhouse_1                                | 2021.06.11 06:33:35.112187 [ 45 ] {} <Information> BaseDaemon: Stop SignalListener thread
clickhouse_1                                | Processing configuration file '/etc/clickhouse-server/config.xml'.
clickhouse_1                                | Merging configuration file '/etc/clickhouse-server/config.d/docker_related_config.xml'.
clickhouse_1                                | Merging configuration file '/etc/clickhouse-server/config.d/sentry.xml'.
clickhouse_1                                | Include not found: clickhouse_remote_servers
clickhouse_1                                | Include not found: clickhouse_compression
clickhouse_1                                | Logging information to /var/log/clickhouse-server/clickhouse-server.log
clickhouse_1                                | Logging errors to /var/log/clickhouse-server/clickhouse-server.err.log
clickhouse_1                                | Logging information to console
clickhouse_1                                | 2021.06.11 06:34:28.644520 [ 1 ] {} <Information> : Starting ClickHouse 20.3.9.70 with revision 54433
clickhouse_1                                | 2021.06.11 06:34:28.648763 [ 1 ] {} <Information> Application: starting up
clickhouse_1                                | Include not found: networks
clickhouse_1                                | 2021.06.11 06:34:28.676855 [ 1 ] {} <Information> Application: Uncompressed cache size was lowered to 4.13 GiB because the system has low amount of memory
clickhouse_1                                | 2021.06.11 06:34:28.676985 [ 1 ] {} <Information> Application: Mark cache size was lowered to 4.13 GiB because the system has low amount of memory
clickhouse_1                                | 2021.06.11 06:34:28.677024 [ 1 ] {} <Information> Application: Loading metadata from /var/lib/clickhouse/
clickhouse_1                                | 2021.06.11 06:34:28.678097 [ 1 ] {} <Information> DatabaseOrdinary (system): Total 2 tables and 0 dictionaries.
clickhouse_1                                | 2021.06.11 06:34:28.680226 [ 47 ] {} <Information> BackgroundProcessingPool: Create BackgroundProcessingPool with 16 threads
clickhouse_1                                | 2021.06.11 06:34:28.710207 [ 1 ] {} <Information> DatabaseOrdinary (system): Starting up tables.
clickhouse_1                                | 2021.06.11 06:34:28.774058 [ 1 ] {} <Information> DatabaseOrdinary (default): Total 13 tables and 0 dictionaries.
clickhouse_1                                | 2021.06.11 06:34:28.820373 [ 1 ] {} <Information> DatabaseOrdinary (default): Starting up tables.
clickhouse_1                                | 2021.06.11 06:34:28.838229 [ 1 ] {} <Information> BackgroundSchedulePool: Create BackgroundSchedulePool with 16 threads
clickhouse_1                                | 2021.06.11 06:34:28.844850 [ 1 ] {} <Information> Application: It looks like the process has no CAP_NET_ADMIN capability, 'taskstats' performance statistics will be disabled. It could happen due to incorrect ClickHouse package installation. You could resolve the problem manually with 'sudo setcap cap_net_admin=+ep /usr/bin/clickhouse'. Note that it will not work on 'nosuid' mounted filesystems. It also doesn't work if you run clickhouse-server inside network namespace as it happens in some containers.
clickhouse_1                                | 2021.06.11 06:34:28.844878 [ 1 ] {} <Information> Application: It looks like the process has no CAP_SYS_NICE capability, the setting 'os_thread_nice' will have no effect. It could happen due to incorrect ClickHouse package installation. You could resolve the problem manually with 'sudo setcap cap_sys_nice=+ep /usr/bin/clickhouse'. Note that it will not work on 'nosuid' mounted filesystems.
clickhouse_1                                | 2021.06.11 06:34:28.870316 [ 1 ] {} <Error> Application: Listen [::]:8123 failed: Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = DNS error: EAI: -9 (version 20.3.9.70 (official build)). If it is an IPv6 or IPv4 address and your host has disabled IPv6 or IPv4, then consider to specify not disabled IPv4 or IPv6 address to listen in <listen_host> element of configuration file. Example for disabled IPv6: <listen_host>0.0.0.0</listen_host> . Example for disabled IPv4: <listen_host>::</listen_host>
clickhouse_1                                | 2021.06.11 06:34:28.870525 [ 1 ] {} <Error> Application: Listen [::]:9000 failed: Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = DNS error: EAI: -9 (version 20.3.9.70 (official build)). If it is an IPv6 or IPv4 address and your host has disabled IPv6 or IPv4, then consider to specify not disabled IPv4 or IPv6 address to listen in <listen_host> element of configuration file. Example for disabled IPv6: <listen_host>0.0.0.0</listen_host> . Example for disabled IPv4: <listen_host>::</listen_host>
clickhouse_1                                | 2021.06.11 06:34:28.870666 [ 1 ] {} <Error> Application: Listen [::]:9009 failed: Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = DNS error: EAI: -9 (version 20.3.9.70 (official build)). If it is an IPv6 or IPv4 address and your host has disabled IPv6 or IPv4, then consider to specify not disabled IPv4 or IPv6 address to listen in <listen_host> element of configuration file. Example for disabled IPv6: <listen_host>0.0.0.0</listen_host> . Example for disabled IPv4: <listen_host>::</listen_host>
clickhouse_1                                | 2021.06.11 06:34:28.870801 [ 1 ] {} <Error> Application: Listen [::]:9004 failed: Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = DNS error: EAI: -9 (version 20.3.9.70 (official build)). If it is an IPv6 or IPv4 address and your host has disabled IPv6 or IPv4, then consider to specify not disabled IPv4 or IPv6 address to listen in <listen_host> element of configuration file. Example for disabled IPv6: <listen_host>0.0.0.0</listen_host> . Example for disabled IPv4: <listen_host>::</listen_host>
clickhouse_1                                | 2021.06.11 06:34:28.871193 [ 1 ] {} <Information> Application: Listening for http://0.0.0.0:8123
clickhouse_1                                | 2021.06.11 06:34:28.871236 [ 1 ] {} <Information> Application: Listening for connections with native protocol (tcp): 0.0.0.0:9000
clickhouse_1                                | 2021.06.11 06:34:28.871265 [ 1 ] {} <Information> Application: Listening for replica communication (interserver): http://0.0.0.0:9009
clickhouse_1                                | 2021.06.11 06:34:29.021813 [ 1 ] {} <Information> Application: Listening for MySQL compatibility protocol: 0.0.0.0:9004
clickhouse_1                                | 2021.06.11 06:34:29.022299 [ 1 ] {} <Information> Application: Available RAM: 8.26 GiB; physical cores: 4; logical cores: 4.
clickhouse_1                                | 2021.06.11 06:34:29.022317 [ 1 ] {} <Information> Application: Ready for connections.
clickhouse_1                                | Include not found: clickhouse_remote_servers
clickhouse_1                                | Include not found: clickhouse_compression
clickhouse_1                                | 2021.06.11 06:35:02.789256 [ 87 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:35:02.804531 [ 88 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:35:02.820465 [ 87 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:35:02.835746 [ 88 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:40:02.723882 [ 87 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:40:02.738467 [ 88 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:40:02.753859 [ 87 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:40:02.766957 [ 88 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:45:02.655145 [ 87 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:45:02.666679 [ 88 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:45:02.682849 [ 87 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:45:02.699938 [ 88 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:50:02.656841 [ 87 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 06:50:02.661446 [ 88 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:50:02.685964 [ 87 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:50:02.694703 [ 88 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:55:02.607248 [ 87 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:55:02.616827 [ 88 ] {} <Information> TCPHandler: Processed in 0.002 sec.
clickhouse_1                                | 2021.06.11 06:55:02.641017 [ 87 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 06:55:02.653397 [ 88 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 07:00:02.562407 [ 88 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 07:00:02.569665 [ 87 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 07:00:02.591478 [ 88 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 07:00:02.597877 [ 87 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 07:01:07.354113 [ 88 ] {} <Information> TCPHandler: Processed in 0.005 sec.
clickhouse_1                                | 2021.06.11 07:05:02.564055 [ 87 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 07:05:02.587552 [ 89 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 07:05:02.589576 [ 87 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 07:05:02.616792 [ 89 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 07:10:02.509647 [ 87 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 07:10:02.532442 [ 89 ] {} <Information> TCPHandler: Processed in 0.001 sec.
clickhouse_1                                | 2021.06.11 07:10:02.537867 [ 87 ] {} <Information> TCPHandler: Done processing connection.
clickhouse_1                                | 2021.06.11 07:10:02.563318 [ 89 ] {} <Information> TCPHandler: Done processing connection.
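The four `Listen [::]:...` errors in the ClickHouse startup above come from IPv6 being disabled on the host; ClickHouse falls back to `0.0.0.0` and reports `Ready for connections`, so they look benign here. If ClickHouse ever failed to bind entirely, the error message itself suggests pinning `<listen_host>`; a minimal sketch of such an override (the file name and mount path are assumptions, not part of the stock compose setup):

```sh
# Hypothetical override file; adjust the path to wherever your compose file
# mounts the ClickHouse config directory (config.d/) into the container.
cat > clickhouse/listen.xml <<'EOF'
<yandex>
    <!-- bind IPv4 only, as the ClickHouse error message itself suggests -->
    <listen_host>0.0.0.0</listen_host>
</yandex>
EOF
docker-compose restart clickhouse
```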
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:44 +0000] "GET / HTTP/1.1" 302 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:44 +0000] "GET /auth/login/ HTTP/1.1" 302 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:45 +0000] "GET / HTTP/1.1" 302 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:45 +0000] "GET /auth/login/ HTTP/1.1" 302 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:45 +0000] "GET /auth/login/sentry/ HTTP/1.1" 200 11476 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:45 +0000] "GET /auth/login/sentry/ HTTP/1.1" 200 11476 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:46 +0000] "GET /_static/dist/sentry/runtime.87c905319c5fd35ae57b.js HTTP/1.1" 200 8944 "http://192.168.1.102:9000/auth/login/sentry/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:46 +0000] "GET /_static/1623393285/sentry/js/ads.js HTTP/1.1" 200 33 "http://192.168.1.102:9000/auth/login/sentry/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:46 +0000] "GET /_static/1623393285/sentry/images/logos/default-organization-logo.png HTTP/1.1" 200 1666 "http://192.168.1.102:9000/auth/login/sentry/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:46 +0000] "GET /_static/dist/sentry/app.28ffc76bac9fa23cad81.js HTTP/1.1" 200 365140 "http://192.168.1.102:9000/auth/login/sentry/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:00:47 +0000] "GET /_static/1623393285/sentry/images/favicon.png HTTP/1.1" 200 998 "http://192.168.1.102:9000/auth/login/sentry/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "POST /auth/login/sentry/ HTTP/1.1" 302 0 "http://192.168.1.102:9000/auth/login/sentry/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /auth/login/ HTTP/1.1" 302 0 "http://192.168.1.102:9000/auth/login/sentry/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /organizations/sentry/issues/ HTTP/1.1" 200 5128 "http://192.168.1.102:9000/auth/login/sentry/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /api/0/assistant/?v2 HTTP/1.1" 200 512 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /api/0/internal/health/ HTTP/1.1" 200 106 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /api/0/organizations/sentry/?detailed=0 HTTP/1.1" 200 2048 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /api/0/internal/options/?query=is:required HTTP/1.1" 200 1616 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /api/0/organizations/?member=1 HTTP/1.1" 200 1003 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /api/0/organizations/sentry/projects/?all_projects=1&collapse=latestDeploys HTTP/1.1" 200 571 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:03 +0000] "GET /api/0/organizations/sentry/teams/ HTTP/1.1" 200 764 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:04 +0000] "GET /api/0/internal/options/?query=is:required HTTP/1.1" 200 1616 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:06 +0000] "PUT /api/0/internal/options/?query=is:required HTTP/1.1" 200 0 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:06 +0000] "GET /api/0/organizations/sentry/?detailed=0 HTTP/1.1" 200 2048 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:06 +0000] "GET /api/0/organizations/sentry/teams/ HTTP/1.1" 200 764 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:06 +0000] "GET /api/0/organizations/sentry/projects/?all_projects=1&collapse=latestDeploys HTTP/1.1" 200 571 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:07 +0000] "GET /api/0/organizations/sentry/searches/ HTTP/1.1" 200 275 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:07 +0000] "GET /api/0/organizations/sentry/users/ HTTP/1.1" 200 1047 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:07 +0000] "GET /api/0/organizations/sentry/broadcasts/ HTTP/1.1" 200 2 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:07 +0000] "GET /api/0/organizations/sentry/projects/?collapse=latestDeploys&per_page=50 HTTP/1.1" 200 571 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:07 +0000] "GET /api/0/organizations/sentry/tags/?statsPeriod=14d&use_cache=1 HTTP/1.1" 200 2 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:07 +0000] "GET /api/0/organizations/sentry/recent-searches/?query=&type=0&limit=3 HTTP/1.1" 200 2 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:07 +0000] "GET /api/0/organizations/sentry/issues/?collapse=stats&expand=owners&expand=inbox&limit=25&query=is%3Aunresolved&shortIdLookup=1&statsPeriod=14d HTTP/1.1" 200 2 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:07 +0000] "GET /api/0/organizations/sentry/processingissues/ HTTP/1.1" 200 166 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:08 +0000] "GET /api/0/organizations/sentry/sent-first-event/?is_member=true HTTP/1.1" 200 24 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:08 +0000] "GET /api/0/organizations/sentry/issues-count/?query=is%3Aunresolved%20is%3Afor_review%20assigned_or_suggested%3A%5Bme%2C%20none%5D&query=is%3Aignored&statsPeriod=14d HTTP/1.1" 200 87 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:08 +0000] "GET /api/0/organizations/sentry/projects/?per_page=1 HTTP/1.1" 200 592 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:08 +0000] "GET /_static/dist/sentry/spot/sentry-robot.a95c7b.png HTTP/1.1" 200 84441 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
nginx_1                                     | 192.168.1.133 - - [11/Jun/2021:07:01:08 +0000] "GET /api/0/projects/sentry/internal/issues/?limit=1 HTTP/1.1" 200 2 "http://192.168.1.102:9000/organizations/sentry/issues/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36" "-"
post-process-forwarder_1                    | 06:34:44 [INFO] sentry.plugins.github: apps-not-configured
post-process-forwarder_1                    | %3|1623393284.954|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
post-process-forwarder_1                    | %3|1623393284.962|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 7ms in state CONNECT)
post-process-forwarder_1                    | %3|1623393285.954|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
post-process-forwarder_1                    | %3|1623393285.954|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
post-process-forwarder_1                    | 06:34:51 [INFO] sentry.eventstream.kafka.backend: Received partition assignment: [TopicPartition{topic=events,partition=0,offset=-1001,error=None}]
postgres_1                                  | Setting up Change Data Capture
postgres_1                                  | DB not initialized. Postgres will take care of pg_hba
postgres_1                                  | ********************************************************************************
postgres_1                                  | WARNING: POSTGRES_HOST_AUTH_METHOD has been set to "trust". This will allow
postgres_1                                  |          anyone with access to the Postgres port to access your database without
postgres_1                                  |          a password, even if POSTGRES_PASSWORD is set. See PostgreSQL
postgres_1                                  |          documentation about "trust":
postgres_1                                  |          https://www.postgresql.org/docs/current/auth-trust.html
postgres_1                                  |          In Docker's default configuration, this is effectively any other
postgres_1                                  |          container on the same system.
postgres_1                                  | 
postgres_1                                  |          It is not recommended to use POSTGRES_HOST_AUTH_METHOD=trust. Replace
postgres_1                                  |          it with "-e POSTGRES_PASSWORD=password" instead to set a password in
postgres_1                                  |          "docker run".
postgres_1                                  | ********************************************************************************
postgres_1                                  | The files belonging to this database system will be owned by user "postgres".
postgres_1                                  | This user must also own the server process.
postgres_1                                  | 
postgres_1                                  | The database cluster will be initialized with locale "en_US.utf8".
postgres_1                                  | The default database encoding has accordingly been set to "UTF8".
postgres_1                                  | The default text search configuration will be set to "english".
postgres_1                                  | 
postgres_1                                  | Data page checksums are disabled.
postgres_1                                  | 
postgres_1                                  | fixing permissions on existing directory /var/lib/postgresql/data ... ok
postgres_1                                  | creating subdirectories ... ok
postgres_1                                  | selecting default max_connections ... 100
postgres_1                                  | selecting default shared_buffers ... 128MB
postgres_1                                  | selecting default timezone ... Etc/UTC
postgres_1                                  | selecting dynamic shared memory implementation ... posix
postgres_1                                  | creating configuration files ... ok
postgres_1                                  | running bootstrap script ... ok
postgres_1                                  | performing post-bootstrap initialization ... ok
postgres_1                                  | syncing data to disk ... ok
postgres_1                                  | 
postgres_1                                  | Success. You can now start the database server using:
postgres_1                                  | 
postgres_1                                  |     pg_ctl -D /var/lib/postgresql/data -l logfile start
postgres_1                                  | 
postgres_1                                  | 
postgres_1                                  | WARNING: enabling "trust" authentication for local connections
postgres_1                                  | You can change this by editing pg_hba.conf or using the option -A, or
postgres_1                                  | --auth-local and --auth-host, the next time you run initdb.
postgres_1                                  | waiting for server to start....LOG:  database system was shut down at 2021-06-11 06:32:11 UTC
postgres_1                                  | LOG:  MultiXact member wraparound protections are now enabled
postgres_1                                  | LOG:  autovacuum launcher started
postgres_1                                  | LOG:  database system is ready to accept connections
postgres_1                                  |  done
postgres_1                                  | server started
postgres_1                                  | 
postgres_1                                  | /docker-entrypoint.sh: running /docker-entrypoint-initdb.d/init_hba.sh
postgres_1                                  | 
postgres_1                                  | waiting for server to shut down....LOG:  received fast shutdown request
postgres_1                                  | LOG:  aborting any active transactions
postgres_1                                  | LOG:  autovacuum launcher shutting down
postgres_1                                  | LOG:  shutting down
postgres_1                                  | LOG:  database system is shut down
postgres_1                                  |  done
postgres_1                                  | server stopped
postgres_1                                  | 
postgres_1                                  | PostgreSQL init process complete; ready for start up.
postgres_1                                  | 
postgres_1                                  | LOG:  database system was shut down at 2021-06-11 06:32:14 UTC
postgres_1                                  | LOG:  MultiXact member wraparound protections are now enabled
postgres_1                                  | LOG:  database system is ready to accept connections
postgres_1                                  | LOG:  autovacuum launcher started
postgres_1                                  | ERROR:  relation "sentry_option" does not exist at character 114
postgres_1                                  | STATEMENT:  SELECT "sentry_option"."id", "sentry_option"."key", "sentry_option"."value", "sentry_option"."last_updated" FROM "sentry_option" WHERE "sentry_option"."key" = 'system.url-prefix'
postgres_1                                  | ERROR:  relation "sentry_option" does not exist at character 114
postgres_1                                  | STATEMENT:  SELECT "sentry_option"."id", "sentry_option"."key", "sentry_option"."value", "sentry_option"."last_updated" FROM "sentry_option" WHERE "sentry_option"."key" = 'github.apps-install-url'
postgres_1                                  | ERROR:  relation "sentry_option" does not exist at character 114
postgres_1                                  | STATEMENT:  SELECT "sentry_option"."id", "sentry_option"."key", "sentry_option"."value", "sentry_option"."last_updated" FROM "sentry_option" WHERE "sentry_option"."key" = 'system.url-prefix'
postgres_1                                  | ERROR:  relation "sentry_option" does not exist at character 114
postgres_1                                  | STATEMENT:  SELECT "sentry_option"."id", "sentry_option"."key", "sentry_option"."value", "sentry_option"."last_updated" FROM "sentry_option" WHERE "sentry_option"."key" = 'system.url-prefix'
postgres_1                                  | ERROR:  relation "sentry_option" does not exist at character 114
postgres_1                                  | STATEMENT:  SELECT "sentry_option"."id", "sentry_option"."key", "sentry_option"."value", "sentry_option"."last_updated" FROM "sentry_option" WHERE "sentry_option"."key" = 'vercel.integration-slug'
postgres_1                                  | ERROR:  relation "sentry_option" does not exist at character 114
postgres_1                                  | STATEMENT:  SELECT "sentry_option"."id", "sentry_option"."key", "sentry_option"."value", "sentry_option"."last_updated" FROM "sentry_option" WHERE "sentry_option"."key" = 'system.url-prefix'
postgres_1                                  | ERROR:  relation "sentry_option" does not exist at character 114
postgres_1                                  | STATEMENT:  SELECT "sentry_option"."id", "sentry_option"."key", "sentry_option"."value", "sentry_option"."last_updated" FROM "sentry_option" WHERE "sentry_option"."key" = 'msteams.app-id'
postgres_1                                  | ERROR:  relation "sentry_projectkey" does not exist at character 371
postgres_1                                  | STATEMENT:  SELECT "sentry_projectkey"."id", "sentry_projectkey"."project_id", "sentry_projectkey"."label", "sentry_projectkey"."public_key", "sentry_projectkey"."secret_key", "sentry_projectkey"."roles", "sentry_projectkey"."status", "sentry_projectkey"."date_added", "sentry_projectkey"."rate_limit_count", "sentry_projectkey"."rate_limit_window", "sentry_projectkey"."data" FROM "sentry_projectkey" WHERE ("sentry_projectkey"."project_id" = 1 AND "sentry_projectkey"."roles" = (("sentry_projectkey"."roles" | 1)) AND "sentry_projectkey"."status" = 0) ORDER BY "sentry_projectkey"."id" ASC LIMIT 1
postgres_1                                  | ERROR:  relation "south_migrationhistory" does not exist at character 15
postgres_1                                  | STATEMENT:  SELECT 1 FROM south_migrationhistory LIMIT 1
postgres_1                                  | LOG:  received fast shutdown request
postgres_1                                  | LOG:  aborting any active transactions
postgres_1                                  | LOG:  autovacuum launcher shutting down
postgres_1                                  | LOG:  shutting down
postgres_1                                  | FATAL:  the database system is shutting down
postgres_1                                  | LOG:  database system is shut down
postgres_1                                  | Setting up Change Data Capture
postgres_1                                  | Replication config already present in pg_hba. Not changing anything.
postgres_1                                  | 
postgres_1                                  | PostgreSQL Database directory appears to contain a database; Skipping initialization
postgres_1                                  | 
postgres_1                                  | LOG:  database system was shut down at 2021-06-11 06:33:33 UTC
postgres_1                                  | LOG:  MultiXact member wraparound protections are now enabled
postgres_1                                  | LOG:  database system is ready to accept connections
postgres_1                                  | LOG:  autovacuum launcher started
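The `relation "sentry_option" does not exist` errors above were emitted while the web container queried the database before the initial migrations had run, so they normally disappear once `./install.sh` finishes its upgrade step. If they keep appearing after install, re-running that step manually is a reasonable next check; a sketch, assuming the stock `web` service whose entrypoint is the `sentry` CLI:

```sh
# Re-run Sentry's schema migrations (the same step install.sh performs)
docker-compose run --rm web upgrade --noinput
# Confirm the table the errors complain about now exists
docker-compose exec postgres psql -U postgres -c '\d sentry_option'
```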
redis_1                                     | 1:C 11 Jun 2021 06:31:44.936 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis_1                                     | 1:C 11 Jun 2021 06:31:44.936 # Redis version=5.0.12, bits=64, commit=00000000, modified=0, pid=1, just started
redis_1                                     | 1:C 11 Jun 2021 06:31:44.936 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis_1                                     | 1:M 11 Jun 2021 06:31:44.939 * Running mode=standalone, port=6379.
redis_1                                     | 1:M 11 Jun 2021 06:31:44.939 # Server initialized
redis_1                                     | 1:M 11 Jun 2021 06:31:44.939 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1                                     | 1:M 11 Jun 2021 06:31:44.939 # WARNING you have Transparent Huge Pages (THP) support enabled in your kernel. This will create latency and memory usage issues with Redis. To fix this issue run the command 'echo never > /sys/kernel/mm/transparent_hugepage/enabled' as root, and add it to your /etc/rc.local in order to retain the setting after a reboot. Redis must be restarted after THP is disabled.
redis_1                                     | 1:M 11 Jun 2021 06:31:44.939 * Ready to accept connections
redis_1                                     | 1:signal-handler (1623393214) Received SIGTERM scheduling shutdown...
redis_1                                     | 1:M 11 Jun 2021 06:33:34.229 # User requested shutdown...
redis_1                                     | 1:M 11 Jun 2021 06:33:34.229 * Saving the final RDB snapshot before exiting.
redis_1                                     | 1:M 11 Jun 2021 06:33:34.233 * DB saved on disk
redis_1                                     | 1:M 11 Jun 2021 06:33:34.233 # Redis is now ready to exit, bye bye...
redis_1                                     | 1:C 11 Jun 2021 06:34:28.245 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis_1                                     | 1:C 11 Jun 2021 06:34:28.245 # Redis version=5.0.12, bits=64, commit=00000000, modified=0, pid=1, just started
redis_1                                     | 1:C 11 Jun 2021 06:34:28.245 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis_1                                     | 1:M 11 Jun 2021 06:34:28.247 * Running mode=standalone, port=6379.
redis_1                                     | 1:M 11 Jun 2021 06:34:28.247 # Server initialized
redis_1                                     | 1:M 11 Jun 2021 06:34:28.247 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1                                     | 1:M 11 Jun 2021 06:34:28.247 # WARNING you have Transparent Huge Pages (THP) support enabled in your kernel. This will create latency and memory usage issues with Redis. To fix this issue run the command 'echo never > /sys/kernel/mm/transparent_hugepage/enabled' as root, and add it to your /etc/rc.local in order to retain the setting after a reboot. Redis must be restarted after THP is disabled.
redis_1                                     | 1:M 11 Jun 2021 06:34:28.247 * DB loaded from disk: 0.000 seconds
redis_1                                     | 1:M 11 Jun 2021 06:34:28.247 * Ready to accept connections
redis_1                                     | 1:M 11 Jun 2021 06:39:29.022 * 100 changes in 300 seconds. Saving...
redis_1                                     | 1:M 11 Jun 2021 06:39:29.022 * Background saving started by pid 998
redis_1                                     | 998:C 11 Jun 2021 06:39:29.035 * DB saved on disk
redis_1                                     | 998:C 11 Jun 2021 06:39:29.037 * RDB: 0 MB of memory used by copy-on-write
redis_1                                     | 1:M 11 Jun 2021 06:39:29.123 * Background saving terminated with success
redis_1                                     | 1:M 11 Jun 2021 06:44:30.041 * 100 changes in 300 seconds. Saving...
redis_1                                     | 1:M 11 Jun 2021 06:44:30.042 * Background saving started by pid 2001
redis_1                                     | 2001:C 11 Jun 2021 06:44:30.052 * DB saved on disk
redis_1                                     | 2001:C 11 Jun 2021 06:44:30.052 * RDB: 0 MB of memory used by copy-on-write
redis_1                                     | 1:M 11 Jun 2021 06:44:30.144 * Background saving terminated with success
redis_1                                     | 1:M 11 Jun 2021 06:49:31.071 * 100 changes in 300 seconds. Saving...
redis_1                                     | 1:M 11 Jun 2021 06:49:31.073 * Background saving started by pid 3023
redis_1                                     | 3023:C 11 Jun 2021 06:49:31.098 * DB saved on disk
redis_1                                     | 3023:C 11 Jun 2021 06:49:31.099 * RDB: 0 MB of memory used by copy-on-write
redis_1                                     | 1:M 11 Jun 2021 06:49:31.175 * Background saving terminated with success
redis_1                                     | 1:M 11 Jun 2021 06:54:32.013 * 100 changes in 300 seconds. Saving...
redis_1                                     | 1:M 11 Jun 2021 06:54:32.015 * Background saving started by pid 4034
redis_1                                     | 4034:C 11 Jun 2021 06:54:32.035 * DB saved on disk
redis_1                                     | 4034:C 11 Jun 2021 06:54:32.037 * RDB: 0 MB of memory used by copy-on-write
redis_1                                     | 1:M 11 Jun 2021 06:54:32.117 * Background saving terminated with success
redis_1                                     | 1:M 11 Jun 2021 06:59:33.092 * 100 changes in 300 seconds. Saving...
redis_1                                     | 1:M 11 Jun 2021 06:59:33.094 * Background saving started by pid 5073
redis_1                                     | 5073:C 11 Jun 2021 06:59:33.114 * DB saved on disk
redis_1                                     | 5073:C 11 Jun 2021 06:59:33.115 * RDB: 0 MB of memory used by copy-on-write
redis_1                                     | 1:M 11 Jun 2021 06:59:33.194 * Background saving terminated with success
redis_1                                     | 1:M 11 Jun 2021 07:04:34.024 * 100 changes in 300 seconds. Saving...
redis_1                                     | 1:M 11 Jun 2021 07:04:34.025 * Background saving started by pid 6105
redis_1                                     | 6105:C 11 Jun 2021 07:04:34.076 * DB saved on disk
redis_1                                     | 6105:C 11 Jun 2021 07:04:34.078 * RDB: 0 MB of memory used by copy-on-write
redis_1                                     | 1:M 11 Jun 2021 07:04:34.125 * Background saving terminated with success
redis_1                                     | 1:M 11 Jun 2021 07:09:35.006 * 100 changes in 300 seconds. Saving...
redis_1                                     | 1:M 11 Jun 2021 07:09:35.007 * Background saving started by pid 7125
redis_1                                     | 7125:C 11 Jun 2021 07:09:35.031 * DB saved on disk
redis_1                                     | 7125:C 11 Jun 2021 07:09:35.031 * RDB: 0 MB of memory used by copy-on-write
redis_1                                     | 1:M 11 Jun 2021 07:09:35.107 * Background saving terminated with success
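Redis prints the same two host-level warnings at every start; they are unrelated to the relay failure, and the fixes it recommends are sysctl changes on the Docker host, not container settings. Consolidated as a sketch, taken directly from the warnings above:

```sh
# Run on the Docker host, not inside a container
sudo sysctl vm.overcommit_memory=1
echo 'vm.overcommit_memory = 1' | sudo tee -a /etc/sysctl.conf
# Disable Transparent Huge Pages, then restart the redis container
echo never | sudo tee /sys/kernel/mm/transparent_hugepage/enabled
docker-compose restart redis
```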
relay_1                                     | 2021-06-11T06:34:34Z [rdkafka::client] ERROR: librdkafka: FAIL [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 15ms in state CONNECT)
relay_1                                     | 2021-06-11T06:34:34Z [rdkafka::client] ERROR: librdkafka: Global error: BrokerTransportFailure (Local: Broker transport failure): kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 15ms in state CONNECT)
relay_1                                     | 2021-06-11T06:34:34Z [rdkafka::client] ERROR: librdkafka: Global error: AllBrokersDown (Local: All broker connections are down): 1/1 brokers are down
relay_1                                     | 2021-06-11T06:34:34Z [rdkafka::client] ERROR: librdkafka: FAIL [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 3ms in state CONNECT)
relay_1                                     | 2021-06-11T06:34:34Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-11T06:34:34Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-11T06:34:34Z [rdkafka::client] ERROR: librdkafka: Global error: BrokerTransportFailure (Local: Broker transport failure): kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 3ms in state CONNECT)
relay_1                                     | 2021-06-11T06:34:34Z [rdkafka::client] ERROR: librdkafka: Global error: AllBrokersDown (Local: All broker connections are down): 1/1 brokers are down
relay_1                                     | 2021-06-11T06:34:34Z [actix::actors::resolver] WARN: Can not create system dns resolver: io error
relay_1                                     | 2021-06-11T06:34:35Z [rdkafka::client] ERROR: librdkafka: FAIL [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
relay_1                                     | 2021-06-11T06:34:35Z [rdkafka::client] ERROR: librdkafka: Global error: BrokerTransportFailure (Local: Broker transport failure): kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
relay_1                                     | 2021-06-11T06:34:35Z [rdkafka::client] ERROR: librdkafka: FAIL [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
relay_1                                     | 2021-06-11T06:34:35Z [rdkafka::client] ERROR: librdkafka: Global error: BrokerTransportFailure (Local: Broker transport failure): kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
relay_1                                     | 2021-06-11T06:34:35Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-11T06:34:37Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-11T06:34:39Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-11T06:34:43Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
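The relay errors above all have the same shape: relay starts before `web` and `kafka` accept connections, so both its Kafka producer and its upstream registration call to `/api/0/relays/register/challenge/` get `Connection refused`. Relay keeps retrying, so the real question is whether it eventually authenticates once `web` is healthy. A minimal triage sequence, assuming the stock service names, host port 9000 via nginx, and the `relay/` directory that `install.sh` populates next to `docker-compose.yml`:

```sh
# 1. Are web and kafka actually up before judging relay?
docker-compose ps web kafka
# Any HTTP status here means web is reachable (connection refused is the problem relay reports)
curl -sS -o /dev/null -w '%{http_code}\n' http://127.0.0.1:9000/api/0/internal/health/
# 2. Relay needs the credentials install.sh generated
ls -l relay/credentials.json relay/config.yml
# 3. Once web answers, restart relay so it retries registration from scratch
docker-compose restart relay
docker-compose logs --tail=30 relay
```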
smtp_1                                      | + sed -ri '
smtp_1                                      |   s/^#?(dc_local_interfaces)=.*/\1='\''0.0.0.0 ; ::0'\''/;
smtp_1                                      |   s/^#?(dc_other_hostnames)=.*/\1='\'''\''/;
smtp_1                                      |   s/^#?(dc_relay_nets)=.*/\1='\''0.0.0.0\/0'\''/;
smtp_1                                      |   s/^#?(dc_eximconfig_configtype)=.*/\1='\''internet'\''/;
smtp_1                                      | ' /etc/exim4/update-exim4.conf.conf
smtp_1                                      | + update-exim4.conf -v
smtp_1                                      | using non-split configuration scheme from /etc/exim4/exim4.conf.template
smtp_1                                      |   272 LOG: MAIN
smtp_1                                      |   272   exim 4.92 daemon started: pid=272, no queue runs, listening for SMTP on port 25 (IPv6 and IPv4)
smtp_1                                      | + sed -ri '
smtp_1                                      |   s/^#?(dc_local_interfaces)=.*/\1='\''0.0.0.0 ; ::0'\''/;
smtp_1                                      |   s/^#?(dc_other_hostnames)=.*/\1='\'''\''/;
smtp_1                                      |   s/^#?(dc_relay_nets)=.*/\1='\''0.0.0.0\/0'\''/;
smtp_1                                      |   s/^#?(dc_eximconfig_configtype)=.*/\1='\''internet'\''/;
smtp_1                                      | ' /etc/exim4/update-exim4.conf.conf
smtp_1                                      | + update-exim4.conf -v
smtp_1                                      | using non-split configuration scheme from /etc/exim4/exim4.conf.template
smtp_1                                      |   271 LOG: MAIN
smtp_1                                      |   271   exim 4.92 daemon started: pid=271, no queue runs, listening for SMTP on port 25 (IPv6 and IPv4)
snuba-api_1                                 | *** Starting uWSGI 2.0.18 (64bit) on [Fri Jun 11 06:32:15 2021] ***
snuba-api_1                                 | compiled with version: 8.3.0 on 08 June 2021 14:13:44
snuba-api_1                                 | os: Linux-5.4.0-74-generic #83-Ubuntu SMP Sat May 8 02:35:39 UTC 2021
snuba-api_1                                 | nodename: d3aab8a75232
snuba-api_1                                 | machine: x86_64
snuba-api_1                                 | clock source: unix
snuba-api_1                                 | pcre jit disabled
snuba-api_1                                 | detected number of CPU cores: 4
snuba-api_1                                 | current working directory: /usr/src/snuba
snuba-api_1                                 | detected binary path: /usr/local/bin/uwsgi
snuba-api_1                                 | your memory page size is 4096 bytes
snuba-api_1                                 | detected max file descriptor number: 1048576
snuba-api_1                                 | lock engine: pthread robust mutexes
snuba-api_1                                 | thunder lock: enabled
snuba-api_1                                 | uwsgi socket 0 bound to TCP address 0.0.0.0:1218 fd 3
snuba-api_1                                 | Python version: 3.8.10 (default, May 12 2021, 15:56:47)  [GCC 8.3.0]
snuba-api_1                                 | Set PythonHome to /usr/local
snuba-api_1                                 | Python main interpreter initialized at 0x564fcef74bf0
snuba-api_1                                 | python threads support enabled
snuba-api_1                                 | your server socket listen backlog is limited to 100 connections
snuba-api_1                                 | your mercy for graceful operations on workers is 60 seconds
snuba-api_1                                 | mapped 145808 bytes (142 KB) for 1 cores
snuba-api_1                                 | *** Operational MODE: single process ***
snuba-api_1                                 | initialized 38 metrics
snuba-api_1                                 | spawned uWSGI master process (pid: 1)
snuba-api_1                                 | spawned uWSGI worker 1 (pid: 15, cores: 1)
snuba-api_1                                 | metrics collector thread started
snuba-api_1                                 | WSGI app 0 (mountpoint='') ready in 1 seconds on interpreter 0x564fcef74bf0 pid: 15 (default app)
snuba-api_1                                 | SIGINT/SIGQUIT received...killing workers...
snuba-api_1                                 | worker 1 buried after 1 seconds
snuba-api_1                                 | goodbye to uWSGI.
snuba-api_1                                 | *** Starting uWSGI 2.0.18 (64bit) on [Fri Jun 11 06:34:39 2021] ***
snuba-api_1                                 | compiled with version: 8.3.0 on 08 June 2021 14:13:44
snuba-api_1                                 | os: Linux-5.4.0-74-generic #83-Ubuntu SMP Sat May 8 02:35:39 UTC 2021
snuba-api_1                                 | nodename: d3aab8a75232
snuba-api_1                                 | machine: x86_64
snuba-api_1                                 | clock source: unix
snuba-api_1                                 | pcre jit disabled
snuba-api_1                                 | detected number of CPU cores: 4
snuba-api_1                                 | current working directory: /usr/src/snuba
snuba-api_1                                 | detected binary path: /usr/local/bin/uwsgi
snuba-api_1                                 | your memory page size is 4096 bytes
snuba-api_1                                 | detected max file descriptor number: 1048576
snuba-api_1                                 | lock engine: pthread robust mutexes
snuba-api_1                                 | thunder lock: enabled
snuba-api_1                                 | uwsgi socket 0 bound to TCP address 0.0.0.0:1218 fd 3
snuba-api_1                                 | Python version: 3.8.10 (default, May 12 2021, 15:56:47)  [GCC 8.3.0]
snuba-api_1                                 | Set PythonHome to /usr/local
snuba-api_1                                 | Python main interpreter initialized at 0x55de2e971bf0
snuba-api_1                                 | python threads support enabled
snuba-api_1                                 | your server socket listen backlog is limited to 100 connections
snuba-api_1                                 | your mercy for graceful operations on workers is 60 seconds
snuba-api_1                                 | mapped 145808 bytes (142 KB) for 1 cores
snuba-api_1                                 | *** Operational MODE: single process ***
snuba-api_1                                 | initialized 38 metrics
snuba-api_1                                 | spawned uWSGI master process (pid: 1)
snuba-api_1                                 | spawned uWSGI worker 1 (pid: 16, cores: 1)
snuba-api_1                                 | metrics collector thread started
snuba-api_1                                 | WSGI app 0 (mountpoint='') ready in 2 seconds on interpreter 0x55de2e971bf0 pid: 16 (default app)
sentry-cleanup_1                            | SHELL=/bin/bash
sentry-cleanup_1                            | BASH_ENV=/container.env
sentry-cleanup_1                            | 0 0 * * * gosu sentry sentry cleanup --days 90 > /proc/1/fd/1 2>/proc/1/fd/2
snuba-cleanup_1                             | SHELL=/bin/bash
snuba-cleanup_1                             | BASH_ENV=/container.env
snuba-cleanup_1                             | */5 * * * * gosu snuba snuba cleanup --storage errors --dry-run False > /proc/1/fd/1 2>/proc/1/fd/2
snuba-cleanup_1                             | 2021-06-11 06:35:02,789 Dropped 0 partitions on clickhouse:9000
snuba-cleanup_1                             | 2021-06-11 06:40:02,723 Dropped 0 partitions on clickhouse:9000
snuba-cleanup_1                             | 2021-06-11 06:45:02,655 Dropped 0 partitions on clickhouse:9000
snuba-cleanup_1                             | 2021-06-11 06:50:02,661 Dropped 0 partitions on clickhouse:9000
snuba-cleanup_1                             | 2021-06-11 06:55:02,607 Dropped 0 partitions on clickhouse:9000
snuba-cleanup_1                             | 2021-06-11 07:00:02,562 Dropped 0 partitions on clickhouse:9000
snuba-cleanup_1                             | 2021-06-11 07:05:02,587 Dropped 0 partitions on clickhouse:9000
snuba-cleanup_1                             | 2021-06-11 07:10:02,532 Dropped 0 partitions on clickhouse:9000
snuba-sessions-consumer_1                   | 2021-06-11 06:32:21,120 New partitions assigned: {Partition(topic=Topic(name='ingest-sessions'), index=0): 0}
snuba-sessions-consumer_1                   | 2021-06-11 06:33:33,166 Partitions revoked: [Partition(topic=Topic(name='ingest-sessions'), index=0)]
snuba-sessions-consumer_1                   | 2021-06-11 06:33:33,621 New partitions assigned: {Partition(topic=Topic(name='ingest-sessions'), index=0): 0}
snuba-sessions-consumer_1                   | 2021-06-11 06:33:33,622 Partitions revoked: [Partition(topic=Topic(name='ingest-sessions'), index=0)]
snuba-sessions-consumer_1                   | %3|1623393279.704|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 6ms in state CONNECT)
snuba-sessions-consumer_1                   | %3|1623393279.724|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
snuba-sessions-consumer_1                   | %3|1623393280.704|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-sessions-consumer_1                   | %3|1623393280.721|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-sessions-consumer_1                   | 2021-06-11 06:34:51,763 New partitions assigned: {Partition(topic=Topic(name='ingest-sessions'), index=0): 0}
snuba-replacer_1                            | 2021-06-11 06:32:18,296 New partitions assigned: {Partition(topic=Topic(name='event-replacements'), index=0): 0}
snuba-replacer_1                            | 2021-06-11 06:33:33,323 Partitions revoked: [Partition(topic=Topic(name='event-replacements'), index=0)]
snuba-replacer_1                            | %3|1623393279.723|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 3ms in state CONNECT)
snuba-replacer_1                            | %3|1623393280.719|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-replacer_1                            | 2021-06-11 06:34:51,768 New partitions assigned: {Partition(topic=Topic(name='event-replacements'), index=0): 0}
snuba-outcomes-consumer_1                   | 2021-06-11 06:32:21,121 New partitions assigned: {Partition(topic=Topic(name='outcomes'), index=0): 0}
snuba-outcomes-consumer_1                   | 2021-06-11 06:33:33,141 Partitions revoked: [Partition(topic=Topic(name='outcomes'), index=0)]
snuba-outcomes-consumer_1                   | %3|1623393279.068|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 1ms in state CONNECT)
snuba-outcomes-consumer_1                   | %3|1623393279.069|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
snuba-outcomes-consumer_1                   | %3|1623393280.067|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-outcomes-consumer_1                   | %3|1623393280.068|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-outcomes-consumer_1                   | 2021-06-11 06:34:51,765 New partitions assigned: {Partition(topic=Topic(name='outcomes'), index=0): 0}
snuba-consumer_1                            | 2021-06-11 06:32:21,116 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
snuba-consumer_1                            | 2021-06-11 06:33:33,159 Partitions revoked: [Partition(topic=Topic(name='events'), index=0)]
snuba-consumer_1                            | %3|1623393279.964|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 1ms in state CONNECT)
snuba-consumer_1                            | %3|1623393279.990|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 4ms in state CONNECT)
snuba-consumer_1                            | %3|1623393280.963|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-consumer_1                            | %3|1623393280.963|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-consumer_1                            | 2021-06-11 06:34:51,764 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
snuba-subscription-consumer-events_1        | 2021-06-11 06:32:21,634 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
snuba-subscription-consumer-events_1        | 2021-06-11 06:33:32,672 Partitions revoked: [Partition(topic=Topic(name='events'), index=0)]
snuba-subscription-consumer-events_1        | %3|1623393278.743|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 15ms in state CONNECT)
snuba-subscription-consumer-events_1        | %3|1623393279.717|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
snuba-subscription-consumer-events_1        | %3|1623393279.727|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-subscription-consumer-events_1        | %3|1623393280.717|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-subscription-consumer-events_1        | 2021-06-11 06:34:54,804 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
snuba-transactions-cleanup_1                | SHELL=/bin/bash
snuba-transactions-cleanup_1                | BASH_ENV=/container.env
snuba-transactions-cleanup_1                | */5 * * * * gosu snuba snuba cleanup --storage transactions --dry-run False > /proc/1/fd/1 2>/proc/1/fd/2
snuba-transactions-cleanup_1                | 2021-06-11 06:35:02,804 Dropped 0 partitions on clickhouse:9000
snuba-transactions-cleanup_1                | 2021-06-11 06:40:02,738 Dropped 0 partitions on clickhouse:9000
snuba-transactions-cleanup_1                | 2021-06-11 06:45:02,666 Dropped 0 partitions on clickhouse:9000
snuba-transactions-cleanup_1                | 2021-06-11 06:50:02,657 Dropped 0 partitions on clickhouse:9000
snuba-transactions-cleanup_1                | 2021-06-11 06:55:02,616 Dropped 0 partitions on clickhouse:9000
snuba-transactions-cleanup_1                | 2021-06-11 07:00:02,569 Dropped 0 partitions on clickhouse:9000
snuba-transactions-cleanup_1                | 2021-06-11 07:05:02,564 Dropped 0 partitions on clickhouse:9000
snuba-transactions-cleanup_1                | 2021-06-11 07:10:02,509 Dropped 0 partitions on clickhouse:9000
snuba-transactions-consumer_1               | 2021-06-11 06:32:18,403 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
snuba-transactions-consumer_1               | 2021-06-11 06:33:33,436 Partitions revoked: [Partition(topic=Topic(name='events'), index=0)]
snuba-transactions-consumer_1               | %3|1623393279.412|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 1ms in state CONNECT)
snuba-transactions-consumer_1               | %3|1623393279.413|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
snuba-transactions-consumer_1               | %3|1623393280.415|FAIL|rdkafka#producer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-transactions-consumer_1               | %3|1623393280.415|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-transactions-consumer_1               | 2021-06-11 06:34:51,430 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
snuba-subscription-consumer-transactions_1  | 2021-06-11 06:32:20,619 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
snuba-subscription-consumer-transactions_1  | 2021-06-11 06:33:32,652 Partitions revoked: [Partition(topic=Topic(name='events'), index=0)]
snuba-subscription-consumer-transactions_1  | %3|1623393280.988|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
snuba-subscription-consumer-transactions_1  | %3|1623393281.968|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
snuba-subscription-consumer-transactions_1  | %3|1623393281.982|FAIL|rdkafka#consumer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-subscription-consumer-transactions_1  | %3|1623393282.967|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
snuba-subscription-consumer-transactions_1  | 2021-06-11 06:34:55,038 New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 0}
subscription-consumer-events_1              | 06:34:43 [INFO] sentry.plugins.github: apps-not-configured
subscription-consumer-events_1              | %3|1623393284.447|FAIL|rdkafka#producer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
subscription-consumer-events_1              | %3|1623393285.447|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
subscription-consumer-events_1              | %3|1623393285.448|FAIL|rdkafka#producer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
subscription-consumer-events_1              | %3|1623393286.447|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
subscription-consumer-events_1              | 06:34:47 [WARNING] batching-kafka-consumer: Topic 'events-subscription-results' or its partitions are not ready, retrying...
subscription-consumer-events_1              | 06:34:47 [WARNING] batching-kafka-consumer: Topic 'events-subscription-results' or its partitions are not ready, retrying...
subscription-consumer-events_1              | 06:34:48 [WARNING] batching-kafka-consumer: Topic 'events-subscription-results' or its partitions are not ready, retrying...
subscription-consumer-events_1              | 06:34:54 [INFO] sentry.snuba.query_subscription_consumer: query-subscription-consumer.on_assign (offsets='{0: None}' partitions='[TopicPartition{topic=events-subscription-results,partition=0,offset=-1001,error=None}]')
subscription-consumer-transactions_1        | 06:34:44 [INFO] sentry.plugins.github: apps-not-configured
subscription-consumer-transactions_1        | %3|1623393284.788|FAIL|rdkafka#producer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 1ms in state CONNECT)
subscription-consumer-transactions_1        | %3|1623393285.785|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
subscription-consumer-transactions_1        | %3|1623393285.788|FAIL|rdkafka#producer-2| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.22.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
subscription-consumer-transactions_1        | 06:34:47 [WARNING] batching-kafka-consumer: Topic 'transactions-subscription-results' or its partitions are not ready, retrying...
subscription-consumer-transactions_1        | 06:34:47 [WARNING] batching-kafka-consumer: Topic 'transactions-subscription-results' or its partitions are not ready, retrying...
subscription-consumer-transactions_1        | 06:34:48 [WARNING] batching-kafka-consumer: Topic 'transactions-subscription-results' or its partitions are not ready, retrying...
subscription-consumer-transactions_1        | 06:34:54 [INFO] sentry.snuba.query_subscription_consumer: query-subscription-consumer.on_assign (offsets='{0: None}' partitions='[TopicPartition{topic=transactions-subscription-results,partition=0,offset=-1001,error=None}]')
symbolicator-cleanup_1                      | SHELL=/bin/bash
symbolicator-cleanup_1                      | BASH_ENV=/container.env
symbolicator-cleanup_1                      | 55 23 * * * gosu symbolicator symbolicator cleanup > /proc/1/fd/1 2>/proc/1/fd/2
worker_1                                    | 06:34:44 [INFO] sentry.plugins.github: apps-not-configured
worker_1                                    | 06:34:44 [INFO] sentry.bgtasks: bgtask.spawn (task_name='sentry.bgtasks.clean_dsymcache:clean_dsymcache')
worker_1                                    | 06:34:44 [INFO] sentry.bgtasks: bgtask.spawn (task_name='sentry.bgtasks.clean_releasefilecache:clean_releasefilecache')
worker_1                                    |  
worker_1                                    |  -------------- celery@7c4065cb0b91 v4.4.7 (cliffs)
worker_1                                    | --- ***** ----- 
worker_1                                    | -- ******* ---- Linux-5.4.0-74-generic-x86_64-with-debian-10.9 2021-06-11 06:34:45
worker_1                                    | - *** --- * --- 
worker_1                                    | - ** ---------- [config]
worker_1                                    | - ** ---------- .> app:         sentry:0x7f395ae26940
worker_1                                    | - ** ---------- .> transport:   redis://redis:6379/0
worker_1                                    | - ** ---------- .> results:     disabled://
worker_1                                    | - *** --- * --- .> concurrency: 4 (prefork)
worker_1                                    | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
worker_1                                    | --- ***** ----- 
worker_1                                    |  -------------- [queues]
worker_1                                    |                 .> activity.notify  exchange=default(direct) key=activity.notify
worker_1                                    |                 .> alerts           exchange=default(direct) key=alerts
worker_1                                    |                 .> app_platform     exchange=default(direct) key=app_platform
worker_1                                    |                 .> assemble         exchange=default(direct) key=assemble
worker_1                                    |                 .> auth             exchange=default(direct) key=auth
worker_1                                    |                 .> buffers.process_pending exchange=default(direct) key=buffers.process_pending
worker_1                                    |                 .> cleanup          exchange=default(direct) key=cleanup
worker_1                                    |                 .> commits          exchange=default(direct) key=commits
worker_1                                    |                 .> counters-0       exchange=counters(direct) key=default
worker_1                                    |                 .> data_export      exchange=default(direct) key=data_export
worker_1                                    |                 .> default          exchange=default(direct) key=default
worker_1                                    |                 .> digests.delivery exchange=default(direct) key=digests.delivery
worker_1                                    |                 .> digests.scheduling exchange=default(direct) key=digests.scheduling
worker_1                                    |                 .> email            exchange=default(direct) key=email
worker_1                                    |                 .> events.preprocess_event exchange=default(direct) key=events.preprocess_event
worker_1                                    |                 .> events.process_event exchange=default(direct) key=events.process_event
worker_1                                    |                 .> events.reprocess_events exchange=default(direct) key=events.reprocess_events
worker_1                                    |                 .> events.reprocessing.preprocess_event exchange=default(direct) key=events.reprocessing.preprocess_event
worker_1                                    |                 .> events.reprocessing.process_event exchange=default(direct) key=events.reprocessing.process_event
worker_1                                    |                 .> events.reprocessing.symbolicate_event exchange=default(direct) key=events.reprocessing.symbolicate_event
worker_1                                    |                 .> events.save_event exchange=default(direct) key=events.save_event
worker_1                                    |                 .> events.symbolicate_event exchange=default(direct) key=events.symbolicate_event
worker_1                                    |                 .> files.delete     exchange=default(direct) key=files.delete
worker_1                                    |                 .> group_owners.process_suspect_commits exchange=default(direct) key=group_owners.process_suspect_commits
worker_1                                    |                 .> incident_snapshots exchange=default(direct) key=incident_snapshots
worker_1                                    |                 .> incidents        exchange=default(direct) key=incidents
worker_1                                    |                 .> integrations     exchange=default(direct) key=integrations
worker_1                                    |                 .> merge            exchange=default(direct) key=merge
worker_1                                    |                 .> options          exchange=default(direct) key=options
worker_1                                    |                 .> relay_config     exchange=default(direct) key=relay_config
worker_1                                    |                 .> reports.deliver  exchange=default(direct) key=reports.deliver
worker_1                                    |                 .> reports.prepare  exchange=default(direct) key=reports.prepare
worker_1                                    |                 .> search           exchange=default(direct) key=search
worker_1                                    |                 .> sleep            exchange=default(direct) key=sleep
worker_1                                    |                 .> stats            exchange=default(direct) key=stats
worker_1                                    |                 .> subscriptions    exchange=default(direct) key=subscriptions
worker_1                                    |                 .> triggers-0       exchange=triggers(direct) key=default
worker_1                                    |                 .> unmerge          exchange=default(direct) key=unmerge
worker_1                                    |                 .> update           exchange=default(direct) key=update
worker_1                                    | 
worker_1                                    | 06:39:46 [WARNING] sentry.tasks.release_registry: Release registry URL is not specified, skipping the task.
worker_1                                    | 06:44:46 [WARNING] sentry.tasks.release_registry: Release registry URL is not specified, skipping the task.
worker_1                                    | 06:49:46 [WARNING] sentry.tasks.release_registry: Release registry URL is not specified, skipping the task.
worker_1                                    | 06:49:46 [INFO] sentry.tasks.update_user_reports: update_user_reports.records_updated (reports_to_update=0 reports_with_event=0 updated_reports=0)
worker_1                                    | 06:54:46 [WARNING] sentry.tasks.release_registry: Release registry URL is not specified, skipping the task.
worker_1                                    | 06:59:46 [WARNING] sentry.tasks.release_registry: Release registry URL is not specified, skipping the task.
worker_1                                    | 07:04:46 [INFO] sentry.tasks.update_user_reports: update_user_reports.records_updated (reports_to_update=0 reports_with_event=0 updated_reports=0)
worker_1                                    | 07:04:46 [WARNING] sentry.tasks.release_registry: Release registry URL is not specified, skipping the task.
worker_1                                    | 07:09:46 [WARNING] sentry.tasks.release_registry: Release registry URL is not specified, skipping the task.
web_1                                       | 06:34:43 [INFO] sentry.plugins.github: apps-not-configured
web_1                                       | *** Starting uWSGI 2.0.19.1 (64bit) on [Fri Jun 11 06:34:44 2021] ***
web_1                                       | compiled with version: 8.3.0 on 10 June 2021 13:23:55
web_1                                       | os: Linux-5.4.0-74-generic #83-Ubuntu SMP Sat May 8 02:35:39 UTC 2021
web_1                                       | nodename: 940d3c672498
web_1                                       | machine: x86_64
web_1                                       | clock source: unix
web_1                                       | detected number of CPU cores: 4
web_1                                       | current working directory: /
web_1                                       | detected binary path: /usr/local/bin/uwsgi
web_1                                       | !!! no internal routing support, rebuild with pcre support !!!
web_1                                       | your memory page size is 4096 bytes
web_1                                       | detected max file descriptor number: 1048576
web_1                                       | lock engine: pthread robust mutexes
web_1                                       | thunder lock: enabled
web_1                                       | uWSGI http bound on 0.0.0.0:9000 fd 4
web_1                                       | uwsgi socket 0 bound to TCP address 127.0.0.1:35381 (port auto-assigned) fd 3
web_1                                       | Python version: 3.6.13 (default, May 12 2021, 16:48:24)  [GCC 8.3.0]
web_1                                       | Set PythonHome to /usr/local
web_1                                       | Python main interpreter initialized at 0x560f34a10fa0
web_1                                       | python threads support enabled
web_1                                       | your server socket listen backlog is limited to 100 connections
web_1                                       | your mercy for graceful operations on workers is 60 seconds
web_1                                       | setting request body buffering size to 65536 bytes
web_1                                       | mapped 1924224 bytes (1879 KB) for 12 cores
web_1                                       | *** Operational MODE: preforking+threaded ***
web_1                                       | spawned uWSGI master process (pid: 22)
web_1                                       | spawned uWSGI worker 1 (pid: 26, cores: 4)
web_1                                       | spawned uWSGI worker 2 (pid: 27, cores: 4)
web_1                                       | spawned uWSGI worker 3 (pid: 28, cores: 4)
web_1                                       | spawned uWSGI http 1 (pid: 29)
web_1                                       | 06:34:49 [INFO] sentry.plugins.github: apps-not-configured
web_1                                       | 06:34:49 [INFO] sentry.plugins.github: apps-not-configured
web_1                                       | WSGI app 0 (mountpoint='') ready in 5 seconds on interpreter 0x560f34a10fa0 pid: 28 (default app)
web_1                                       | WSGI app 0 (mountpoint='') ready in 5 seconds on interpreter 0x560f34a10fa0 pid: 27 (default app)
web_1                                       | 06:34:49 [INFO] sentry.plugins.github: apps-not-configured
web_1                                       | WSGI app 0 (mountpoint='') ready in 5 seconds on interpreter 0x560f34a10fa0 pid: 26 (default app)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.logged-in (ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.auth: user.auth.success (ip_address='192.168.1.133' username='wp_byy@163.com' organization_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/auth/login/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/organizations/sentry/issues/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/?detailed=0' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/projects/?all_projects=1&collapse=latestDeploys' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/teams/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/assistant/?v2' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/?member=1' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/internal/health/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:03 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/internal/options/?query=is:required' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:04 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/internal/options/?query=is:required' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:06 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/internal/options/?query=is:required' method='PUT' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:06 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/?detailed=0' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:06 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/teams/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:06 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/projects/?all_projects=1&collapse=latestDeploys' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:07 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/users/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:07 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/searches/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:07 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/projects/?collapse=latestDeploys&per_page=50' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:07 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/tags/?statsPeriod=14d&use_cache=1' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:07 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/broadcasts/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:07 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/processingissues/' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:07 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/issues/?collapse=stats&expand=owners&expand=inbox&limit=25&query=is%3Aunresolved&shortIdLookup=1&statsPeriod=14d' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:07 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/recent-searches/?query=&type=0&limit=3' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:08 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/projects/?per_page=1' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:08 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/sent-first-event/?is_member=true' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:08 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/organizations/sentry/issues-count/?query=is%3Aunresolved%20is%3Afor_review%20assigned_or_suggested%3A%5Bme%2C%20none%5D&query=is%3Aignored&statsPeriod=14d' method='GET' ip_address='192.168.1.133' user_id=1)
web_1                                       | 07:01:08 [INFO] sentry.superuser: superuser.request (url='http://192.168.1.102/api/0/projects/sentry/internal/issues/?limit=1' method='GET' ip_address='192.168.1.133' user_id=1)
zookeeper_1                                 | ===> ENV Variables ...
zookeeper_1                                 | ALLOW_UNSIGNED=false
zookeeper_1                                 | COMPONENT=zookeeper
zookeeper_1                                 | CONFLUENT_DEB_VERSION=1
zookeeper_1                                 | CONFLUENT_PLATFORM_LABEL=
zookeeper_1                                 | CONFLUENT_SUPPORT_METRICS_ENABLE=false
zookeeper_1                                 | CONFLUENT_VERSION=5.5.0
zookeeper_1                                 | CUB_CLASSPATH=/etc/confluent/docker/docker-utils.jar
zookeeper_1                                 | HOME=/root
zookeeper_1                                 | HOSTNAME=d815b94b68d0
zookeeper_1                                 | KAFKA_OPTS=-Dzookeeper.4lw.commands.whitelist=ruok
zookeeper_1                                 | KAFKA_VERSION=
zookeeper_1                                 | LANG=C.UTF-8
zookeeper_1                                 | PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
zookeeper_1                                 | PWD=/
zookeeper_1                                 | PYTHON_PIP_VERSION=8.1.2
zookeeper_1                                 | PYTHON_VERSION=2.7.9-1
zookeeper_1                                 | SCALA_VERSION=2.12
zookeeper_1                                 | SHLVL=1
zookeeper_1                                 | ZOOKEEPER_CLIENT_PORT=2181
zookeeper_1                                 | ZOOKEEPER_LOG4J_ROOT_LOGLEVEL=WARN
zookeeper_1                                 | ZOOKEEPER_TOOLS_LOG4J_LOGLEVEL=WARN
zookeeper_1                                 | ZULU_OPENJDK_VERSION=8=8.38.0.13
zookeeper_1                                 | _=/usr/bin/env
zookeeper_1                                 | ===> User
zookeeper_1                                 | uid=0(root) gid=0(root) groups=0(root)
zookeeper_1                                 | ===> Configuring ...
zookeeper_1                                 | ===> Running preflight checks ... 
zookeeper_1                                 | ===> Check if /var/lib/zookeeper/data is writable ...
zookeeper_1                                 | ===> Check if /var/lib/zookeeper/log is writable ...
zookeeper_1                                 | ===> Launching ... 
zookeeper_1                                 | ===> Launching zookeeper ... 
zookeeper_1                                 | [2021-06-11 06:31:47,694] WARN Either no config or no quorum defined in config, running  in standalone mode (org.apache.zookeeper.server.quorum.QuorumPeerMain)
zookeeper_1                                 | [2021-06-11 06:31:47,813] WARN o.e.j.s.ServletContextHandler@167fdd33{/,null,UNAVAILABLE} contextPath ends with /* (org.eclipse.jetty.server.handler.ContextHandler)
zookeeper_1                                 | [2021-06-11 06:31:47,813] WARN Empty contextPath (org.eclipse.jetty.server.handler.ContextHandler)
zookeeper_1                                 | ===> ENV Variables ...
zookeeper_1                                 | ALLOW_UNSIGNED=false
zookeeper_1                                 | COMPONENT=zookeeper
zookeeper_1                                 | CONFLUENT_DEB_VERSION=1
zookeeper_1                                 | CONFLUENT_PLATFORM_LABEL=
zookeeper_1                                 | CONFLUENT_SUPPORT_METRICS_ENABLE=false
zookeeper_1                                 | CONFLUENT_VERSION=5.5.0
zookeeper_1                                 | CUB_CLASSPATH=/etc/confluent/docker/docker-utils.jar
zookeeper_1                                 | HOME=/root
zookeeper_1                                 | HOSTNAME=d815b94b68d0
zookeeper_1                                 | KAFKA_OPTS=-Dzookeeper.4lw.commands.whitelist=ruok
zookeeper_1                                 | KAFKA_VERSION=
zookeeper_1                                 | LANG=C.UTF-8
zookeeper_1                                 | PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
zookeeper_1                                 | PWD=/
zookeeper_1                                 | PYTHON_PIP_VERSION=8.1.2
zookeeper_1                                 | PYTHON_VERSION=2.7.9-1
zookeeper_1                                 | SCALA_VERSION=2.12
zookeeper_1                                 | SHLVL=1
zookeeper_1                                 | ZOOKEEPER_CLIENT_PORT=2181
zookeeper_1                                 | ZOOKEEPER_LOG4J_ROOT_LOGLEVEL=WARN
zookeeper_1                                 | ZOOKEEPER_TOOLS_LOG4J_LOGLEVEL=WARN
zookeeper_1                                 | ZULU_OPENJDK_VERSION=8=8.38.0.13
zookeeper_1                                 | _=/usr/bin/env
zookeeper_1                                 | ===> User
zookeeper_1                                 | uid=0(root) gid=0(root) groups=0(root)
zookeeper_1                                 | ===> Configuring ...
zookeeper_1                                 | ===> Running preflight checks ... 
zookeeper_1                                 | ===> Check if /var/lib/zookeeper/data is writable ...
zookeeper_1                                 | ===> Check if /var/lib/zookeeper/log is writable ...
zookeeper_1                                 | ===> Launching ... 
zookeeper_1                                 | ===> Launching zookeeper ... 
zookeeper_1                                 | [2021-06-11 06:34:37,420] WARN Either no config or no quorum defined in config, running  in standalone mode (org.apache.zookeeper.server.quorum.QuorumPeerMain)
zookeeper_1                                 | [2021-06-11 06:34:38,440] WARN o.e.j.s.ServletContextHandler@4d95d2a2{/,null,UNAVAILABLE} contextPath ends with /* (org.eclipse.jetty.server.handler.ContextHandler)
zookeeper_1                                 | [2021-06-11 06:34:38,440] WARN Empty contextPath (org.eclipse.jetty.server.handler.ContextHandler)
BYK commented 3 years ago

Have you tried running `docker-compose restart relay`?
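
For reference, a minimal sketch of what that looks like (assuming the stock `docker-compose.yml` from this repo, where the service is named `relay`):

```sh
# Restart only the relay container, then follow its logs to see whether
# the upstream / Kafka errors clear once the other services are up.
docker-compose restart relay
docker-compose logs -f --tail=100 relay
```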

FeiBa0125 commented 3 years ago

I tried to run this command several times and the result was the same.

git pull
./install.sh
docker-compose up

Do I need any other configuration before running it?

BYK commented 3 years ago

That's not the command I shared in my previous comment.

Have you tried restarting relay?

FeiBa0125 commented 3 years ago

> That's not the command I shared in my previous comment.
>
> Have you tried restarting relay?

Yes, I did. This is the latest relay log:

relay_1                                     | 2021-06-15T09:50:56Z [rdkafka::client] ERROR: librdkafka: FAIL [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.26.0.11:9092 failed: Connection refused (after 33ms in state CONNECT)
relay_1                                     | 2021-06-15T09:50:56Z [rdkafka::client] ERROR: librdkafka: Global error: BrokerTransportFailure (Local: Broker transport failure): kafka:9092/bootstrap: Connect to ipv4#172.26.0.11:9092 failed: Connection refused (after 33ms in state CONNECT)
relay_1                                     | 2021-06-15T09:50:56Z [rdkafka::client] ERROR: librdkafka: Global error: AllBrokersDown (Local: All broker connections are down): 1/1 brokers are down
relay_1                                     | 2021-06-15T09:50:56Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: could not send request using reqwest
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: dns error: failed to lookup address information: Temporary failure in name resolution
relay_1                                     | 2021-06-15T09:50:56Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: could not send request using reqwest
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: dns error: failed to lookup address information: Temporary failure in name resolution
relay_1                                     | 2021-06-15T09:50:56Z [rdkafka::client] ERROR: librdkafka: FAIL [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.26.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
relay_1                                     | 2021-06-15T09:50:56Z [rdkafka::client] ERROR: librdkafka: Global error: BrokerTransportFailure (Local: Broker transport failure): kafka:9092/bootstrap: Connect to ipv4#172.26.0.11:9092 failed: Connection refused (after 0ms in state CONNECT)
relay_1                                     | 2021-06-15T09:50:56Z [rdkafka::client] ERROR: librdkafka: Global error: AllBrokersDown (Local: All broker connections are down): 1/1 brokers are down
relay_1                                     | 2021-06-15T09:50:56Z [actix::actors::resolver] WARN: Can not create system dns resolver: io error
relay_1                                     | 2021-06-15T09:50:57Z [rdkafka::client] ERROR: librdkafka: FAIL [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.26.0.11:9092 failed: Connection refused (after 5ms in state CONNECT, 1 identical error(s) suppressed)
relay_1                                     | 2021-06-15T09:50:57Z [rdkafka::client] ERROR: librdkafka: Global error: BrokerTransportFailure (Local: Broker transport failure): kafka:9092/bootstrap: Connect to ipv4#172.26.0.11:9092 failed: Connection refused (after 5ms in state CONNECT, 1 identical error(s) suppressed)
relay_1                                     | 2021-06-15T09:50:57Z [rdkafka::client] ERROR: librdkafka: FAIL [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.26.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
relay_1                                     | 2021-06-15T09:50:57Z [rdkafka::client] ERROR: librdkafka: Global error: BrokerTransportFailure (Local: Broker transport failure): kafka:9092/bootstrap: Connect to ipv4#172.26.0.11:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
relay_1                                     | 2021-06-15T09:50:59Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: could not send request using reqwest
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-15T09:51:00Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: could not send request using reqwest
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-15T09:51:03Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: could not send request using reqwest
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-15T09:51:06Z [relay_server::actors::upstream] WARN: Network outage, scheduling another check in 0ns
relay_1                                     | 2021-06-15T09:51:06Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: could not send request using reqwest
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-15T09:51:06Z [relay_server::actors::upstream] WARN: Network outage, scheduling another check in 1s
relay_1                                     | 2021-06-15T09:51:07Z [relay_server::actors::upstream] WARN: Network outage, scheduling another check in 1.5s
relay_1                                     | 2021-06-15T09:51:08Z [relay_server::actors::upstream] WARN: Network outage, scheduling another check in 2.25s
relay_1                                     | 2021-06-15T09:51:11Z [relay_server::actors::upstream] WARN: Network outage, scheduling another check in 3.375s
relay_1                                     | 2021-06-15T09:51:11Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: could not send request using reqwest
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): error trying to connect: tcp connect error: Connection refused (os error 111)
relay_1                                     | 2021-06-15T09:51:14Z [relay_server::actors::upstream] WARN: Network outage, scheduling another check in 5.0625s
relay_1                                     | 2021-06-15T09:51:24Z [relay_server::actors::upstream] ERROR: authentication encountered error: could not send request to upstream
relay_1                                     |   caused by: could not send request using reqwest
relay_1                                     |   caused by: error sending request for url (http://web:9000/api/0/relays/register/challenge/): operation timed out
relay_1                                     | 2021-06-15T09:51:24Z [relay_server::actors::upstream] WARN: Network outage, scheduling another check in 7.59375s
BYK commented 3 years ago

Seems like you have a networking issue there, as Relay can connect neither to Sentry web nor to Kafka. This may also be related to getsentry/relay#997, so you may wanna check the suggestions in that thread.
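
As a rough starting point (a sketch only, assuming the default service names `kafka`, `web`, and `relay` from this repo's `docker-compose.yml`), you could check whether Kafka and web are actually up before relay tries to authenticate:

```sh
# List service state; kafka and web must be running and healthy
# before relay can register with web and produce to Kafka.
docker-compose ps

# Kafka's own logs usually explain why the broker isn't listening on 9092
# (e.g. zookeeper not ready yet, or a corrupted data volume).
docker-compose logs --tail=100 kafka

# If kafka eventually comes up, restarting it and then relay sometimes
# clears the stale "Connection refused" state.
docker-compose restart kafka relay
```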

github-actions[bot] commented 3 years ago

This issue has gone three weeks without activity. In another week, I will close it.

But! If you comment or otherwise update it, I will reset the clock, and if you label it Status: Backlog or Status: In Progress, I will leave it alone ... forever!


"A weed is but an unloved flower." ― Ella Wheeler Wilcox 🥀