Opened by def- 4 months ago

Docker logs from running it locally. It stopped in kafka-recreate-topic.td:
--- kafka-recreate-topic.td
Connecting to PostgreSQL server at postgres://mz_system:materialize@materialized:6877...
> DROP DATABASE materialize
Creating Kafka topic testdrive-topic0-1458174224 with partition count of 4
Ingesting data into Kafka topic testdrive-topic0-1458174224 with start_iteration = 0, repeat = 1
> CREATE CONNECTION IF NOT EXISTS csr_conn TO CONFLUENT SCHEMA REGISTRY (
URL 'http://schema-registry:8081/'
);
rows match; continuing at ts 1709658045.0080962
> CREATE CONNECTION kafka_conn
TO KAFKA (BROKER 'kafka:9092', SECURITY PROTOCOL PLAINTEXT);
rows match; continuing at ts 1709658045.1533785
> CREATE CLUSTER to_recreate SIZE '1', REPLICATION FACTOR 1;
rows match; continuing at ts 1709658045.4746995
> CREATE SOURCE source0
IN CLUSTER to_recreate
FROM KAFKA CONNECTION kafka_conn (TOPIC 'testdrive-topic0-1458174224')
KEY FORMAT TEXT
VALUE FORMAT TEXT
ENVELOPE UPSERT
rows match; continuing at ts 1709658045.6735914
> SELECT * FROM source0
rows match; continuing at ts 1709658047.2475429
Deleting Kafka topic testdrive-topic0-1458174224
Sleeping for 2s
Creating Kafka topic testdrive-topic0-1458174224 with partition count of 2
> SELECT * FROM source0
query error didn't match; sleeping to see if dataflow produces error shortly 50ms 75ms 113ms 169ms 253ms 380ms 570ms 854ms 1s 2s 3s 4s 6s 10s 15s
query error matches; continuing
Creating Kafka topic testdrive-topic1-1458174224 with partition count of 1
Ingesting data into Kafka topic testdrive-topic1-1458174224 with start_iteration = 0, repeat = 1
> CREATE SOURCE source1
IN CLUSTER to_recreate
FROM KAFKA CONNECTION kafka_conn (TOPIC 'testdrive-topic1-1458174224')
FORMAT TEXT
ENVELOPE NONE
rows match; continuing at ts 1709658094.0188534
> SELECT * FROM source1
rows match; continuing at ts 1709658095.0843124
Deleting Kafka topic testdrive-topic1-1458174224
Sleeping for 2s
Creating Kafka topic testdrive-topic1-1458174224 with partition count of 1
> SELECT * FROM source1
query error didn't match; sleeping to see if dataflow produces error shortly 50ms 75ms 113ms 169ms 253ms 380ms 570ms 854ms 1s 2s 3s 4s 6s 10s 15s 22s 33s 22s
environmentd: 2024-03-05T17:00:45.536832Z INFO sequence_create_source:coord::catalog_transact_with_side_effects:coord::catalog_transact_inner:catalog::transact:catalog::transact_inner: mz_adapter::catalog::state: create source materialize.public.source0_progress (u979)
environmentd: 2024-03-05T17:00:45.536915Z INFO sequence_create_source:coord::catalog_transact_with_side_effects:coord::catalog_transact_inner:catalog::transact:catalog::transact_inner: mz_adapter::catalog::state: create source materialize.public.source0 (u980)
cluster-u189-replica-u198: 2024-03-05T17:00:45.676702Z INFO tokio_postgres::connection: NOTICE: relation "consensus" already exists, skipping
cluster-u189-replica-u198: 2024-03-05T17:00:45.697424Z INFO mz_storage::storage_state: worker 0/1 trying to (re-)start ingestion u980 as_of=Antichain { elements: [0] } resume_uppers={User(980): Antichain { elements: [0] }}
cluster-u189-replica-u198: 2024-03-05T17:00:45.697704Z INFO mz_storage::source::source_reader_pipeline: timely-0 building source pipeline id=u980 as_of={0}
cluster-u189-replica-u198: 2024-03-05T17:00:45.699020Z INFO mz_storage::render::upsert: timely-0 rendering u980 with rocksdb-backed upsert state tuning=RocksDBConfig { compaction_style: Level, optimize_compaction_memtable_budget: 178956970, level_compaction_dynamic_level_bytes: true, universal_compaction_target_ratio: 200, parallelism: None, compression_type: Lz4, bottommost_compression_type: Lz4, retry_max_duration: 1s, stats_log_interval_seconds: 600, stats_persist_interval_seconds: 600, point_lookup_block_cache_size_mb: None, shrink_buffers_by_ratio: 0, dynamic: RocksDBDynamicConfig { batch_size: 20480 }, write_buffer_manager_config: RocksDbWriteBufferManagerConfig { write_buffer_manager_memory_bytes: None, write_buffer_manager_memory_fraction: None, write_buffer_manager_allow_stall: false, cluster_memory_limit: None } } storage_configuration.parameters.upsert_auto_spill_config=UpsertAutoSpillConfig { allow_spilling_to_disk: false, spill_to_disk_threshold_bytes: 89478484 }
cluster-u189-replica-u198: 2024-03-05T17:00:45.699344Z INFO mz_storage::render: timely-0 rendering u980 with multi-worker persist_sink
cluster-u189-replica-u198: 2024-03-05T17:00:45.699915Z INFO mz_storage::source::kafka: instantiating Kafka source reader at offsets {} source_id="u980" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:00:45.700627Z WARN librdkafka: warning: CONFWARN [thrd:app]: Configuration property transaction.timeout.ms is a producer property and will be ignored by this consumer instance
cluster-u189-replica-u198: 2024-03-05T17:00:45.700636Z INFO mz_storage::source::kafka: kafka worker noticed rehydration is finished, starting partition queues... source_id="u980" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:00:45.703664Z INFO mz_storage::source::reclock::compat: remap(u980) 0/1 initializing PersistHandle since=Antichain { elements: [0] } as_of=Antichain { elements: [0] } upper=Antichain { elements: [0] }
cluster-u189-replica-u198: 2024-03-05T17:00:45.710038Z INFO mz_rocksdb: Starting rocksdb at "/scratch/cluster-u189-replica-u198/storage/upsert/u980/0" with write_buffer_manager: None
cluster-u189-replica-u198: 2024-03-05T17:00:45.780089Z INFO mz_storage::render::upsert: timely-0 upsert source u980 finished rehydration
cluster-u189-replica-u198: 2024-03-05T17:00:46.703024Z INFO mz_storage::source::kafka: activating Kafka queue for topic testdrive-topic0-1458174224, partition 0 source_id="u980" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:00:46.703142Z INFO mz_storage::source::kafka: activating Kafka queue for topic testdrive-topic0-1458174224, partition 1 source_id="u980" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:00:46.703206Z INFO mz_storage::source::kafka: activating Kafka queue for topic testdrive-topic0-1458174224, partition 2 source_id="u980" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:00:46.703272Z INFO mz_storage::source::kafka: activating Kafka queue for topic testdrive-topic0-1458174224, partition 3 source_id="u980" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:00:46.703627Z INFO mz_storage::healthcheck: Health transition for source u980: Some(Starting) -> Some(Running)
cluster-u189-replica-u198: 2024-03-05T17:00:47.518577Z INFO librdkafka: PARTCNT [thrd:main]: Topic testdrive-topic0-1458174224 partition count changed from 4 to 0
cluster-u189-replica-u198: 2024-03-05T17:00:47.518911Z WARN librdkafka: error: Global error: UnknownPartition (Local: Unknown partition): testdrive-topic0-1458174224 [0]: desired partition is no longer available (Local: Unknown partition)
cluster-u189-replica-u198: 2024-03-05T17:00:47.519041Z WARN librdkafka: error: Global error: UnknownTopicOrPartition (Broker: Unknown topic or partition): testdrive-topic0-1458174224 [0]: topic does not exist (Broker: Unknown topic or partition)
cluster-u189-replica-u198: 2024-03-05T17:00:47.519123Z WARN librdkafka: error: Global error: UnknownPartition (Local: Unknown partition): testdrive-topic0-1458174224 [1]: desired partition is no longer available (Local: Unknown partition)
cluster-u189-replica-u198: 2024-03-05T17:00:47.519199Z WARN librdkafka: error: Global error: UnknownTopicOrPartition (Broker: Unknown topic or partition): testdrive-topic0-1458174224 [1]: topic does not exist (Broker: Unknown topic or partition)
cluster-u189-replica-u198: 2024-03-05T17:00:47.519274Z WARN librdkafka: error: Global error: UnknownPartition (Local: Unknown partition): testdrive-topic0-1458174224 [2]: desired partition is no longer available (Local: Unknown partition)
cluster-u189-replica-u198: 2024-03-05T17:00:47.519349Z WARN librdkafka: error: Global error: UnknownTopicOrPartition (Broker: Unknown topic or partition): testdrive-topic0-1458174224 [2]: topic does not exist (Broker: Unknown topic or partition)
cluster-u189-replica-u198: 2024-03-05T17:00:47.519425Z WARN librdkafka: error: Global error: UnknownPartition (Local: Unknown partition): testdrive-topic0-1458174224 [3]: desired partition is no longer available (Local: Unknown partition)
cluster-u189-replica-u198: 2024-03-05T17:00:47.519501Z WARN librdkafka: error: Global error: UnknownTopicOrPartition (Broker: Unknown topic or partition): testdrive-topic0-1458174224 [3]: topic does not exist (Broker: Unknown topic or partition)
cluster-u189-replica-u198: 2024-03-05T17:00:47.922726Z WARN mz_storage::source::kafka: timely-u980 source(0) failed to commit offsets: resume_upper={([0, 0], 0), ([1, 1], 0), ([2, 2], 0), ([3, 3], 1), ([3>, +inf], 0)} e=Consumer commit error: UnknownTopicOrPartition (Broker: Unknown topic or partition)
[... the same "failed to commit offsets: ... UnknownTopicOrPartition" warning repeats roughly once per second through 17:01:16 ...]
cluster-u189-replica-u198: 2024-03-05T17:00:49.518785Z WARN librdkafka: error: Global error: UnknownPartition (Local: Unknown partition): testdrive-topic0-1458174224 [2]: desired partition is not available (Local: Unknown partition)
cluster-u189-replica-u198: 2024-03-05T17:00:49.518882Z WARN librdkafka: error: Global error: UnknownPartition (Local: Unknown partition): testdrive-topic0-1458174224 [3]: desired partition is not available (Local: Unknown partition)
cluster-u189-replica-u198: 2024-03-05T17:01:17.552017Z INFO mz_storage::source::kafka: kafka metadata thread: partition info has been dropped; shutting down. source_id="u980" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:01:17.552227Z INFO mz_storage::healthcheck: Health transition for source u980: Some(Running) -> Some(Ceased { error: "topic was recreated: partition count regressed from 4 to 2" })
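This health transition is the expected part of the test: the source notices that the topic's partition count went backwards (4 before deletion, 2 after recreation) and moves to a terminal Ceased state. A minimal sketch of that invariant, which is an illustration and not Materialize's actual code (function name and signature are made up here):

```python
def check_topic_metadata(known_partition_count: int, fetched_partition_count: int) -> None:
    """Sketch of the check the log describes: a Kafka source remembers the
    highest partition count it has seen, and treats a lower count on a
    metadata refresh as evidence the topic was deleted and recreated --
    a definite error, since offsets committed against the old topic are
    meaningless in the new one."""
    if fetched_partition_count < known_partition_count:
        raise RuntimeError(
            f"topic was recreated: partition count regressed from "
            f"{known_partition_count} to {fetched_partition_count}"
        )
```

In the run above, source0 had seen 4 partitions, and the recreated testdrive-topic0-1458174224 came back with 2, producing exactly this error string.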
cluster-u189-replica-u198: 2024-03-05T17:01:17.754665Z WARN mz_rocksdb: failed to cleanup legacy rocksdb dir at /scratch/cluster-u189-replica-u198/u980/0: IO error: No such file or directory: while open a file for lock: /scratch/cluster-u189-replica-u198/u980/0/LOCK: No such file or directory
environmentd: 2024-03-05T17:01:33.810839Z WARN librdkafka: warning: CONFWARN [thrd:app]: Configuration property transaction.timeout.ms is a producer property and will be ignored by this consumer instance
environmentd: 2024-03-05T17:01:33.867084Z INFO sequence_create_source:coord::catalog_transact_with_side_effects:coord::catalog_transact_inner:catalog::transact:catalog::transact_inner: mz_adapter::catalog::state: create source materialize.public.source1_progress (u981)
environmentd: 2024-03-05T17:01:33.867235Z INFO sequence_create_source:coord::catalog_transact_with_side_effects:coord::catalog_transact_inner:catalog::transact:catalog::transact_inner: mz_adapter::catalog::state: create source materialize.public.source1 (u982)
cluster-u189-replica-u198: 2024-03-05T17:01:34.014243Z INFO mz_storage::storage_state: worker 0/1 trying to (re-)start ingestion u982 as_of=Antichain { elements: [0] } resume_uppers={User(982): Antichain { elements: [0] }}
cluster-u189-replica-u198: 2024-03-05T17:01:34.015039Z INFO mz_storage::source::source_reader_pipeline: timely-0 building source pipeline id=u982 as_of={0}
cluster-u189-replica-u198: 2024-03-05T17:01:34.017509Z INFO mz_storage::render: timely-0 rendering u982 with multi-worker persist_sink
cluster-u189-replica-u198: 2024-03-05T17:01:34.018167Z INFO mz_storage::source::kafka: instantiating Kafka source reader at offsets {} source_id="u982" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:01:34.018493Z WARN librdkafka: warning: CONFWARN [thrd:app]: Configuration property transaction.timeout.ms is a producer property and will be ignored by this consumer instance
cluster-u189-replica-u198: 2024-03-05T17:01:34.018506Z INFO mz_storage::source::kafka: kafka worker noticed rehydration is finished, starting partition queues... source_id="u982" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:01:34.021007Z INFO mz_storage::source::reclock::compat: remap(u982) 0/1 initializing PersistHandle since=Antichain { elements: [0] } as_of=Antichain { elements: [0] } upper=Antichain { elements: [0] }
cluster-u189-replica-u198: 2024-03-05T17:01:35.020240Z INFO mz_storage::source::kafka: activating Kafka queue for topic testdrive-topic1-1458174224, partition 0 source_id="u982" worker_id=0 num_workers=1
cluster-u189-replica-u198: 2024-03-05T17:01:35.020588Z INFO mz_storage::healthcheck: Health transition for source u982: Some(Starting) -> Some(Running)
cluster-u189-replica-u198: 2024-03-05T17:01:35.345228Z INFO librdkafka: PARTCNT [thrd:main]: Topic testdrive-topic1-1458174224 partition count changed from 1 to 0
cluster-u189-replica-u198: 2024-03-05T17:01:35.345481Z WARN librdkafka: error: Global error: UnknownPartition (Local: Unknown partition): testdrive-topic1-1458174224 [0]: desired partition is no longer available (Local: Unknown partition)
cluster-u189-replica-u198: 2024-03-05T17:01:35.345574Z WARN librdkafka: error: Global error: UnknownTopicOrPartition (Broker: Unknown topic or partition): testdrive-topic1-1458174224 [0]: topic does not exist (Broker: Unknown topic or partition)
cluster-u189-replica-u198: 2024-03-05T17:01:36.229415Z WARN mz_storage::source::kafka: timely-u982 source(0) failed to commit offsets: resume_upper={([0, 0], 1), ([0>, +inf], 0)} e=Consumer commit error: UnknownTopicOrPartition (Broker: Unknown topic or partition)
cluster-u189-replica-u198: 2024-03-05T17:01:38.019495Z WARN librdkafka: warning: OFFSET [thrd:main]: testdrive-topic1-1458174224 [0]: offset reset (at offset 1, broker 1) to BEGINNING: fetch failed due to requested offset not available on the broker: Broker: Offset out of range
environmentd: 2024-03-05T17:24:28.720538Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (rb37697fd-d5fe-49fb-a775-41fc101d576d) of shard (s53853bee-ffe5-451d-807c-f07ee1c289aa) due to inactivity
environmentd: 2024-03-05T17:24:28.720557Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (rc3c977f8-78b4-4e03-9e1e-8a1a54c0a935) of shard (s53853bee-ffe5-451d-807c-f07ee1c289aa) due to inactivity
cluster-u189-replica-u198: 2024-03-05T17:24:28.723918Z INFO mz_persist_client::internal::state: Force expiring reader (r46fec27a-7d41-4556-8bf3-c20d6bf1c2f0) of shard (sfda88b68-c27a-4278-8a2a-103d9dcf9d29) due to inactivity
cluster-u189-replica-u198: 2024-03-05T17:24:28.723933Z INFO mz_persist_client::internal::state: Force expiring reader (r6617fa06-31de-4827-b3ff-3c34f1d292c6) of shard (sfda88b68-c27a-4278-8a2a-103d9dcf9d29) due to inactivity
thread 'timely:work-0' panicked at /var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/state.rs:867:17:
LeasedReaderId(r46fec27a-7d41-4556-8bf3-c20d6bf1c2f0) was expired due to inactivity. Did the machine go to sleep?
stack backtrace:
environmentd: 2024-03-05T17:24:28.765772Z INFO PersistTableWriteWorkerInner::send: mz_persist_client::internal::state: Force expiring reader (r987f98f6-dc36-4d5c-ad98-a7edb7ea10a6) of shard (s40791ab4-5f1a-48a6-ad07-b3dd8039fd9c) due to inactivity
thread 'tokio:work-0' panicked at /var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/state.rs:867:17:
LeasedReaderId(r987f98f6-dc36-4d5c-ad98-a7edb7ea10a6) was expired due to inactivity. Did the machine go to sleep?
stack backtrace:
environmentd: 2024-03-05T17:24:28.770955Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (r8870adbe-6cd9-4a45-a79f-5a3adbb2f69f) of shard (sef7fb659-2238-4610-bfa3-a9a0d26a7ab2) due to inactivity
environmentd: 2024-03-05T17:24:28.771064Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (r6128ed80-443b-4afe-b613-48afaaa31faf) of shard (s93ccd536-44d3-4f2d-8f4e-224920bca179) due to inactivity
thread 'timely:work-0' panicked at /var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/state.rs:867:17:
LeasedReaderId(rc3c977f8-78b4-4e03-9e1e-8a1a54c0a935) was expired due to inactivity. Did the machine go to sleep?
stack backtrace:
environmentd: 2024-03-05T17:24:28.771645Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (r51d33d79-3f7d-433a-afa8-2e0605760a17) of shard (sace06f84-8f81-4fb2-807d-100d0ef90762) due to inactivity
environmentd: 2024-03-05T17:24:28.771945Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (r7496e1e1-2b73-424d-a602-df3ddef129ac) of shard (s16d4c8c3-f9c1-4f89-a2e6-4c107da76ace) due to inactivity
environmentd: 2024-03-05T17:24:28.771990Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (r683c1b52-8544-4ffe-a831-9b5a534156ce) of shard (s1ffcadd0-4f60-41e7-96a4-b7bb5b5e785c) due to inactivity
environmentd: 2024-03-05T17:24:28.772026Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (r8b9d0f74-8aa2-4535-94be-491804d05400) of shard (s0e65b780-ff49-4654-b38a-4dc62f71df9e) due to inactivity
environmentd: 2024-03-05T17:24:28.772094Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (raab1d6ab-761b-48c9-92cd-74b663063749) of shard (s9d147bf2-03c0-4ab3-bdd9-aceffac2c775) due to inactivity
environmentd: 2024-03-05T17:24:28.772129Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (r8d43e62a-f2e8-4575-9518-e8496c4cfa93) of shard (s254bbcc7-f7dc-4218-a27c-da58ddfc6a11) due to inactivity
environmentd: 2024-03-05T17:24:28.805201Z INFO PersistMonotonicWriteCmd::send: mz_persist_client::internal::state: Force expiring reader (r269f3762-1f98-40ac-95f7-2b07781a1202) of shard (sb7ccc232-e30b-442c-83ad-117f35df885f) due to inactivity
0: rust_begin_unwind
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:645:5
1: core::panicking::panic_fmt
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/panicking.rs:72:14
2: <mz_persist_client::internal::state::StateCollections<mz_repr::timestamp::Timestamp>>::leased_reader::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/state.rs:867:17
3: <core::option::Option<&mut mz_persist_client::internal::state::LeasedReaderState<mz_repr::timestamp::Timestamp>>>::unwrap_or_else::<<mz_persist_client::internal::state::StateCollections<mz_repr::timestamp::Timestamp>>::leased_reader::{closure#0}>
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/option.rs:976:21
4: <mz_persist_client::internal::state::StateCollections<mz_repr::timestamp::Timestamp>>::leased_reader
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/state.rs:866:14
5: <mz_persist_client::internal::state::StateCollections<mz_repr::timestamp::Timestamp>>::downgrade_since
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/state.rs:662:28
6: <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/machine.rs:556:13
7: <&mut <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0} as core::ops::function::FnMut<(mz_persist::location::SeqNo, &mz_persist_client::cfg::PersistConfig, &mut mz_persist_client::internal::state::StateCollections<mz_repr::timestamp::Timestamp>)>>::call_mut
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/ops/function.rs:294:13
8: <mz_persist_client::internal::state::TypedState<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::clone_apply::<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>, mz_persist_client::internal::state::NoOpStateTransition<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>>, &mut <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}>
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/state.rs:1174:24
9: <mz_persist_client::internal::apply::Applier<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::compute_next_state_locked::<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>, mz_persist_client::internal::state::NoOpStateTransition<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>>, &mut <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}>
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/apply.rs:451:47
10: <mz_persist_client::internal::apply::Applier<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::apply_unbatched_cmd_locked::<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>, mz_persist_client::internal::state::NoOpStateTransition<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>>, &mut <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}>::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/apply.rs:363:17
11: <mz_persist_client::cache::LockingTypedState<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::read_lock::<core::result::Result<mz_persist_client::internal::apply::NextState<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>>, (mz_persist::location::SeqNo, mz_persist_client::internal::state::NoOpStateTransition<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>>)>, <mz_persist_client::internal::apply::Applier<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::apply_unbatched_cmd_locked<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>, mz_persist_client::internal::state::NoOpStateTransition<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>>, &mut <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}>::{closure#0}::{closure#0}>
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/cache.rs:638:9
12: <mz_persist_client::internal::apply::Applier<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::apply_unbatched_cmd_locked::<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>, mz_persist_client::internal::state::NoOpStateTransition<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>>, &mut <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}>::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/apply.rs:361:35
13: <mz_persist_client::internal::apply::Applier<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::apply_unbatched_cmd::<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>, mz_persist_client::internal::state::NoOpStateTransition<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>>, &mut <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}>::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/apply.rs:316:14
14: <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::apply_unbatched_idempotent_cmd::<mz_persist_client::internal::state::Since<mz_repr::timestamp::Timestamp>, <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}>::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/machine.rs:973:71
15: <mz_persist_client::internal::machine::Machine<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/internal/machine.rs:564:10
16: <mz_persist_client::read::ReadHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/read.rs:645:14
17: <mz_persist_client::read::ReadHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::downgrade_since::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/read.rs:629:5
18: <mz_persist_client::read::ReadHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::maybe_downgrade_since::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/read.rs:870:45
19: <mz_persist_client::read::Listen<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::next::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/read.rs:378:56
20: <mz_persist_client::read::Subscribe<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::next::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/read.rs:145:69
21: <mz_persist_client::read::Subscribe<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64>>::next::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-client/src/read.rs:136:5
22: <mz_persist_txn::txn_cache::TxnsCache<mz_repr::timestamp::Timestamp, mz_storage_types::controller::TxnsCodecRow>>::update::<<mz_persist_txn::txn_cache::TxnsCache<mz_repr::timestamp::Timestamp, mz_storage_types::controller::TxnsCodecRow>>::update_gt::{closure#0}::{closure#0}::{closure#0}>::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txn_cache.rs:768:57
23: <mz_persist_txn::txn_cache::TxnsCache<mz_repr::timestamp::Timestamp, mz_storage_types::controller::TxnsCodecRow>>::update_gt::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txn_cache.rs:752:14
24: <mz_persist_txn::txn_cache::TxnsCache<mz_repr::timestamp::Timestamp, mz_storage_types::controller::TxnsCodecRow>>::update_gt::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txn_cache.rs:749:5
25: <mz_persist_txn::txns::TxnsHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>>::apply_le::{closure#0}::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txns.rs:523:43
26: <mz_persist_txn::metrics::InfallibleOpMetrics>::run::<mz_persist_txn::txns::Tidy, <mz_persist_txn::txns::TxnsHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>>::apply_le::{closure#0}::{closure#0}::{closure#0}>::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/metrics.rs:151:22
27: <mz_persist_txn::txns::TxnsHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>>::apply_le::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txns.rs:587:10
28: <mz_persist_txn::txns::TxnsHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>>::apply_le::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txns.rs:518:5
29: <mz_persist_txn::txns::TxnsHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>>::apply_eager_le::{closure#0}::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txns.rs:598:42
30: <mz_persist_txn::metrics::InfallibleOpMetrics>::run::<mz_persist_txn::txns::Tidy, <mz_persist_txn::txns::TxnsHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>>::apply_eager_le::{closure#0}::{closure#0}::{closure#0}>::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/metrics.rs:151:22
31: <mz_persist_txn::txns::TxnsHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>>::apply_eager_le::{closure#0}::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txns.rs:623:10
32: <mz_persist_txn::txns::TxnsHandle<mz_storage_types::sources::SourceData, (), mz_repr::timestamp::Timestamp, i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>>::apply_eager_le::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txns.rs:592:5
33: <mz_persist_txn::txn_write::TxnApply<mz_repr::timestamp::Timestamp>>::apply_eager::<mz_storage_types::sources::SourceData, (), i64, mz_storage_controller::PersistEpoch, mz_storage_types::controller::TxnsCodecRow>::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/persist-txn/src/txn_write.rs:313:48
34: <mz_storage_controller::persist_handles::TxnsTableWorker<mz_repr::timestamp::Timestamp>>::append::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/storage-controller/src/persist_handles.rs:618:86
35: <tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::TxnsTableWorker<mz_repr::timestamp::Timestamp>>::append::{closure#0}> as core::future::future::Future>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tracing-0.1.37/src/instrument.rs:272:9
36: <mz_storage_controller::persist_handles::TxnsTableWorker<mz_repr::timestamp::Timestamp>>::run::{closure#0}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/storage-controller/src/persist_handles.rs:465:26
37: <mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}
at ./var/lib/buildkite-agent/builds/buildkite-builders-d43b1b5-i-05ce813570de37413-1/materialize/tests/src/storage-controller/src/persist_handles.rs:365:28
38: <tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}> as core::future::future::Future>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tracing-0.1.37/src/instrument.rs:272:9
39: <tokio::runtime::task::core::Core<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::current_thread::Handle>>>::poll::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/core.rs:334:17
40: <tokio::loom::std::unsafe_cell::UnsafeCell<tokio::runtime::task::core::Stage<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>>>>::with_mut::<core::task::poll::Poll<()>, <tokio::runtime::task::core::Core<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::current_thread::Handle>>>::poll::{closure#0}>
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/loom/std/unsafe_cell.rs:16:9
41: <tokio::runtime::task::core::Core<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::current_thread::Handle>>>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/core.rs:323:13
42: tokio::runtime::task::harness::poll_future::<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:485:19
43: <core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>::{closure#0}> as core::ops::function::FnOnce<()>>::call_once
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/panic/unwind_safe.rs:272:9
44: std::panicking::try::do_call::<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>::{closure#0}>, core::task::poll::Poll<()>>
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:552:40
45: std::panicking::try::<core::task::poll::Poll<()>, core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>::{closure#0}>>
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:516:19
46: std::panic::catch_unwind::<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>::{closure#0}>, core::task::poll::Poll<()>>
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panic.rs:142:14
47: tokio::runtime::task::harness::poll_future::<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:473:18
48: <tokio::runtime::task::harness::Harness<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>>::poll_inner
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:208:27
49: <tokio::runtime::task::harness::Harness<tracing::instrument::Instrumented<<mz_storage_controller::persist_handles::PersistTableWriteWorker<mz_repr::timestamp::Timestamp>>::new_txns::{closure#1}>, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:153:15
50: <tokio::runtime::task::raw::RawTask>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/raw.rs:200:18
51: <tokio::runtime::task::LocalNotified<alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle>>>::run
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/mod.rs:400:9
52: <tokio::runtime::scheduler::multi_thread::worker::Context>::run_task::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/scheduler/multi_thread/worker.rs:639:22
53: tokio::runtime::coop::with_budget::<core::result::Result<alloc::boxed::Box<tokio::runtime::scheduler::multi_thread::worker::Core>, ()>, <tokio::runtime::scheduler::multi_thread::worker::Context>::run_task::{closure#0}>
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/coop.rs:107:5
54: tokio::runtime::coop::budget::<core::result::Result<alloc::boxed::Box<tokio::runtime::scheduler::multi_thread::worker::Core>, ()>, <tokio::runtime::scheduler::multi_thread::worker::Context>::run_task::{closure#0}>
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/coop.rs:73:5
55: <tokio::runtime::scheduler::multi_thread::worker::Context>::run_task
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/scheduler/multi_thread/worker.rs:575:9
56: <tokio::runtime::scheduler::multi_thread::worker::Context>::run
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/scheduler/multi_thread/worker.rs:526:24
57: tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/scheduler/multi_thread/worker.rs:491:21
58: <tokio::runtime::context::scoped::Scoped<tokio::runtime::scheduler::Context>>::set::<tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}::{closure#0}, ()>
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/context/scoped.rs:40:9
59: tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/scheduler/multi_thread/worker.rs:486:9
60: tokio::runtime::context::runtime::enter_runtime::<tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}, ()>
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/context/runtime.rs:65:16
61: tokio::runtime::scheduler::multi_thread::worker::run
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/scheduler/multi_thread/worker.rs:478:5
62: <tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/scheduler/multi_thread/worker.rs:447:45
63: <tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}> as core::future::future::Future>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/blocking/task.rs:42:21
64: <tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>> as core::future::future::Future>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tracing-0.1.37/src/instrument.rs:272:9
65: <tokio::runtime::task::core::Core<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>>::poll::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/core.rs:334:17
66: <tokio::loom::std::unsafe_cell::UnsafeCell<tokio::runtime::task::core::Stage<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>>>>::with_mut::<core::task::poll::Poll<()>, <tokio::runtime::task::core::Core<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>>::poll::{closure#0}>
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/loom/std/unsafe_cell.rs:16:9
67: <tokio::runtime::task::core::Core<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/core.rs:323:13
68: tokio::runtime::task::harness::poll_future::<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:485:19
69: <core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>::{closure#0}> as core::ops::function::FnOnce<()>>::call_once
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/panic/unwind_safe.rs:272:9
70: std::panicking::try::do_call::<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>::{closure#0}>, core::task::poll::Poll<()>>
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:552:40
71: std::panicking::try::<core::task::poll::Poll<()>, core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>::{closure#0}>>
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:516:19
72: std::panic::catch_unwind::<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>::{closure#0}>, core::task::poll::Poll<()>>
at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panic.rs:142:14
73: tokio::runtime::task::harness::poll_future::<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:473:18
74: <tokio::runtime::task::harness::Harness<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>>::poll_inner
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:208:27
75: <tokio::runtime::task::harness::Harness<tracing::instrument::Instrumented<tokio::runtime::blocking::task::BlockingTask<<tokio::runtime::scheduler::multi_thread::worker::Launch>::launch::{closure#0}>>, tokio::runtime::blocking::schedule::BlockingSchedule>>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:153:15
76: <tokio::runtime::task::raw::RawTask>::poll
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/raw.rs:200:18
77: <tokio::runtime::task::UnownedTask<tokio::runtime::blocking::schedule::BlockingSchedule>>::run
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/mod.rs:437:9
78: <tokio::runtime::blocking::pool::Task>::run
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/blocking/pool.rs:159:9
79: <tokio::runtime::blocking::pool::Inner>::run
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/blocking/pool.rs:513:17
80: <tokio::runtime::blocking::pool::Spawner>::spawn_thread::{closure#0}
at ./cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/blocking/pool.rs:471:13
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
cluster-u189-replica-u198: 2024-03-05T17:24:33.483439Z WARN mz_persist_client::rpc: pubsub client error: Status { code: Unknown, message: "h2 protocol error: error reading a body from connection: stream closed because of a broken pipe", source: Some(hyper::Error(Body, Error { kind: Io(Custom { kind: BrokenPipe, error: "stream closed because of a broken pipe" }) })) }
cluster-u1-replica-u1: 2024-03-05T17:24:33.483526Z WARN mz_persist_client::rpc: pubsub client error: Status { code: Unknown, message: "h2 protocol error: error reading a body from connection: stream closed because of a broken pipe", source: Some(hyper::Error(Body, Error { kind: Io(Custom { kind: BrokenPipe, error: "stream closed because of a broken pipe" }) })) }
cluster-s1-replica-s1: 2024-03-05T17:24:33.483596Z WARN mz_persist_client::rpc: pubsub client error: Status { code: Unknown, message: "h2 protocol error: error reading a body from connection: stream closed because of a broken pipe", source: Some(hyper::Error(Body, Error { kind: Io(Custom { kind: BrokenPipe, error: "stream closed because of a broken pipe" }) })) }
I'll check for plan regressions, but if the plans are identical I don't see how this can be caused by #25502.
It succeeded in the rerun in CI, so maybe it's unrelated.
Un-assigning myself then; ping me if this happens again, although it seems unlikely to be optimizer-related.
What version of Materialize are you using?
ff65497e3a
What is the issue?
The test hangs entirely for 90 minutes.
Seen in https://buildkite.com/materialize/nightlies/builds/6765#018e0e98-e546-4f3b-95ae-bf21bb644b18
Local reproducer:
bin/mzcompose --find testdrive run default --default-size=8
I'll check if this is reliably reproducible. I strongly suspect that this comes from https://github.com/MaterializeInc/materialize/pull/25502; maybe this is the same issue as https://github.com/MaterializeInc/materialize/issues/25769, and the OOM and hang are just two different symptoms.
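To check whether the hang reproduces reliably, a small wrapper could rerun the reproducer under a time limit and count failures. This is a sketch, not part of the repo: the `REPRO_CMD`, `RUNS`, and `LIMIT` variables are assumptions, and it relies on coreutils `timeout` (which exits non-zero when the limit is hit).

```shell
#!/usr/bin/env bash
# Hypothetical reproducibility check: run the reproducer RUNS times,
# treating any run that exceeds LIMIT seconds as a hang.
set -u

REPRO_CMD="${REPRO_CMD:-bin/mzcompose --find testdrive run default --default-size=8}"
RUNS="${RUNS:-5}"
LIMIT="${LIMIT:-1800}"   # seconds before a run is declared hung

failures=0
for i in $(seq 1 "$RUNS"); do
  # `timeout` returns non-zero on failure or when LIMIT is exceeded.
  if ! timeout "$LIMIT" bash -c "$REPRO_CMD" > /dev/null 2>&1; then
    failures=$((failures + 1))
    echo "run $i: failed or hung past ${LIMIT}s"
  fi
done
echo "failures: $failures/$RUNS"
```

A flaky hang would show up as a partial failure count across runs, which helps distinguish a reliable regression from an unrelated one-off.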