pingcap / tidb-docker-compose

Apache License 2.0

Failed after upgrading to latest version #94

Open quaff opened 4 years ago

quaff commented 4 years ago

Latest Docker Desktop for Mac and latest TiDB:

Creating network "tidb-docker-compose_default" with the default driver
Creating tidb-docker-compose_grafana_1     ... done
Creating tidb-docker-compose_prometheus_1  ... done
Creating tidb-docker-compose_tidb-vision_1 ... done
Creating tidb-docker-compose_pd1_1         ... done
Creating tidb-docker-compose_pd0_1         ... done
Creating tidb-docker-compose_pd2_1         ... done
Creating tidb-docker-compose_pushgateway_1 ... done
Creating tidb-docker-compose_tikv0_1       ... done
Creating tidb-docker-compose_tikv2_1       ... done
Creating tidb-docker-compose_tikv1_1       ... done
Creating tidb-docker-compose_tidb_1           ... done
Creating tidb-docker-compose_tispark-master_1 ... done
Creating tidb-docker-compose_tispark-slave0_1 ... done
Attaching to tidb-docker-compose_pd0_1, tidb-docker-compose_pushgateway_1, tidb-docker-compose_grafana_1, tidb-docker-compose_pd2_1, tidb-docker-compose_tidb-vision_1, tidb-docker-compose_pd1_1, tidb-docker-compose_prometheus_1, tidb-docker-compose_tikv2_1, tidb-docker-compose_tikv1_1, tidb-docker-compose_tikv0_1, tidb-docker-compose_tidb_1, tidb-docker-compose_tispark-master_1, tidb-docker-compose_tispark-slave0_1
grafana_1         | t=2020-08-12T02:09:32+0000 lvl=eror msg="Can't read alert notification provisioning files from directory" logger=provisioning.notifiers path=/etc/grafana/provisioning/notifiers error="open /etc/grafana/provisioning/notifiers: no such file or directory"
tidb_1            | config file /tidb.toml contained unknown configuration options: log.file.log-rotate, performance.retry-limit, plan-cache, plan-cache.enabled, plan-cache.capacity, plan-cache.shards
tidb-vision_1     | Activating privacy features... done.
tidb-vision_1     | http://0.0.0.0:8010
tispark-master_1  | starting org.apache.spark.deploy.master.Master, logging to /opt/spark/logs/spark--org.apache.spark.deploy.master.Master-1-67095e36fd36.out
tispark-slave0_1  | starting org.apache.spark.deploy.worker.Worker, logging to /opt/spark/logs/spark--org.apache.spark.deploy.worker.Worker-1-965384e506c3.out
tispark-master_1  | Spark Command: /opt/jdk/bin/java -cp /opt/spark/conf/:/opt/spark/jars/* -Xmx1g org.apache.spark.deploy.master.Master --host 0.0.0.0 --port 7077 --webui-port 8080
tispark-master_1  | ========================================
tispark-slave0_1  | Spark Command: /opt/jdk/bin/java -cp /opt/spark/conf/:/opt/spark/jars/* -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 38081 spark://tispark-master:7077
tispark-slave0_1  | ========================================
tispark-master_1  | 20/08/12 02:09:35 INFO Master: Started daemon with process name: 12@67095e36fd36
tispark-master_1  | 20/08/12 02:09:35 INFO SignalUtils: Registered signal handler for TERM
tispark-slave0_1  | 20/08/12 02:09:35 INFO Worker: Started daemon with process name: 11@965384e506c3
tispark-master_1  | 20/08/12 02:09:35 INFO SignalUtils: Registered signal handler for HUP
tispark-master_1  | 20/08/12 02:09:35 INFO SignalUtils: Registered signal handler for INT
tispark-slave0_1  | 20/08/12 02:09:35 INFO SignalUtils: Registered signal handler for TERM
tispark-slave0_1  | 20/08/12 02:09:35 INFO SignalUtils: Registered signal handler for HUP
tispark-slave0_1  | 20/08/12 02:09:35 INFO SignalUtils: Registered signal handler for INT
tispark-slave0_1  | 20/08/12 02:09:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
tispark-slave0_1  | 20/08/12 02:09:41 INFO SecurityManager: Changing view acls to: root
tispark-slave0_1  | 20/08/12 02:09:41 INFO SecurityManager: Changing modify acls to: root
tispark-slave0_1  | 20/08/12 02:09:41 INFO SecurityManager: Changing view acls groups to: 
tispark-slave0_1  | 20/08/12 02:09:41 INFO SecurityManager: Changing modify acls groups to: 
tispark-slave0_1  | 20/08/12 02:09:41 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
tispark-master_1  | 20/08/12 02:09:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
tispark-master_1  | 20/08/12 02:09:42 INFO SecurityManager: Changing view acls to: root
tispark-master_1  | 20/08/12 02:09:42 INFO SecurityManager: Changing modify acls to: root
tispark-master_1  | 20/08/12 02:09:42 INFO SecurityManager: Changing view acls groups to: 
tispark-master_1  | 20/08/12 02:09:42 INFO SecurityManager: Changing modify acls groups to: 
tispark-master_1  | 20/08/12 02:09:42 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
tispark-slave0_1  | 20/08/12 02:09:46 INFO Utils: Successfully started service 'sparkWorker' on port 39373.
tispark-master_1  | 20/08/12 02:09:53 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
tispark-master_1  | 20/08/12 02:09:53 INFO Master: Starting Spark master at spark://0.0.0.0:7077
tispark-master_1  | 20/08/12 02:09:53 INFO Master: Running Spark version 2.4.3
tispark-slave0_1  | 20/08/12 02:09:56 INFO Worker: Starting Spark worker 172.20.0.14:39373 with 6 cores, 1024.0 MB RAM
tispark-slave0_1  | 20/08/12 02:09:56 INFO Worker: Running Spark version 2.4.3
tispark-slave0_1  | 20/08/12 02:09:56 INFO Worker: Spark home: /opt/spark
tispark-master_1  | 20/08/12 02:09:57 INFO Utils: Successfully started service 'MasterUI' on port 8080.
tispark-slave0_1  | 20/08/12 02:09:59 INFO Utils: Successfully started service 'WorkerUI' on port 38081.
tispark-slave0_1  | 20/08/12 02:10:00 INFO WorkerWebUI: Bound WorkerWebUI to 0.0.0.0, and started at http://965384e506c3:38081
tispark-slave0_1  | 20/08/12 02:10:00 INFO Worker: Connecting to master tispark-master:7077...
tispark-slave0_1  | 20/08/12 02:10:01 INFO TransportClientFactory: Successfully created connection to tispark-master/172.20.0.13:7077 after 362 ms (0 ms spent in bootstraps)
tispark-master_1  | 20/08/12 02:10:04 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://67095e36fd36:8080
pd0_1             | [2020/08/12 02:10:17.688 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd0_1             | {"level":"warn","ts":"2020-08-12T02:10:19.380Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-f120b896-1e89-4a73-acd4-6b5cf7aad564/pd0:2379","attempt":0,"error":"rpc error: code = Unavailable desc = etcdserver: leader changed"}
pd1_1             | {"level":"warn","ts":"2020-08-12T02:10:19.380Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = Unavailable desc = etcdserver: leader changed"}
pd2_1             | {"level":"warn","ts":"2020-08-12T02:10:25.504Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
tidb_1            | {"level":"warn","ts":"2020-08-12T02:10:26.859Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:10:27.102 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
tispark-slave0_1  | 20/08/12 02:10:27 INFO Worker: Retrying connection to master (attempt # 1)
tispark-slave0_1  | 20/08/12 02:10:28 INFO Worker: Retrying connection to master (attempt # 2)
tispark-slave0_1  | 20/08/12 02:10:28 INFO Worker: Connecting to master tispark-master:7077...
pd2_1             | {"level":"warn","ts":"2020-08-12T02:10:29.601Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = NotFound desc = etcdserver: requested lease not found"}
tispark-slave0_1  | 20/08/12 02:10:30 INFO Worker: Retrying connection to master (attempt # 3)
tispark-slave0_1  | 20/08/12 02:10:30 INFO Worker: Connecting to master tispark-master:7077...
tispark-slave0_1  | 20/08/12 02:10:40 INFO Worker: Retrying connection to master (attempt # 4)
tispark-slave0_1  | 20/08/12 02:10:41 INFO Worker: Connecting to master tispark-master:7077...
tispark-slave0_1  | 20/08/12 02:10:50 INFO Worker: Retrying connection to master (attempt # 5)
tispark-slave0_1  | 20/08/12 02:10:50 INFO Worker: Connecting to master tispark-master:7077...
pd0_1             | [2020/08/12 02:10:52.574 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd0_1             | {"level":"warn","ts":"2020-08-12T02:10:55.096Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-f120b896-1e89-4a73-acd4-6b5cf7aad564/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd1_1             | {"level":"warn","ts":"2020-08-12T02:10:56.473Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
tispark-slave0_1  | 20/08/12 02:11:01 INFO Worker: Retrying connection to master (attempt # 6)
tispark-slave0_1  | 20/08/12 02:11:01 INFO Worker: Connecting to master tispark-master:7077...
pd1_1             | {"level":"warn","ts":"2020-08-12T02:10:56.312Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:11:02.005 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
tidb_1            | {"level":"warn","ts":"2020-08-12T02:10:55.282Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd2_1             | {"level":"warn","ts":"2020-08-12T02:11:01.639Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | {"level":"warn","ts":"2020-08-12T02:11:10.616Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-f120b896-1e89-4a73-acd4-6b5cf7aad564/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:11:14.225 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
tidb_1            | {"level":"warn","ts":"2020-08-12T02:11:15.293Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:11:16.485 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd1_1             | {"level":"warn","ts":"2020-08-12T02:11:25.373Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:11:32.281 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd2_1             | {"level":"warn","ts":"2020-08-12T02:11:32.744Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
tidb_1            | {"level":"warn","ts":"2020-08-12T02:11:38.340Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:11:37.318 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd0_1             | {"level":"warn","ts":"2020-08-12T02:11:40.165Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-f120b896-1e89-4a73-acd4-6b5cf7aad564/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd2_1             | {"level":"warn","ts":"2020-08-12T02:11:53.629Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | {"level":"warn","ts":"2020-08-12T02:11:59.156Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-f120b896-1e89-4a73-acd4-6b5cf7aad564/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd2_1             | {"level":"warn","ts":"2020-08-12T02:11:59.674Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd1_1             | {"level":"warn","ts":"2020-08-12T02:12:00.156Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:12:01.879 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
tidb_1            | {"level":"warn","ts":"2020-08-12T02:12:10.447Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
tidb_1            | {"level":"warn","ts":"2020-08-12T02:12:10.185Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:11:59.230 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = latest connection error: connection error: desc = \"transport: Error while dialing context deadline exceeded\""]
pd1_1             | {"level":"warn","ts":"2020-08-12T02:12:13.184Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
tispark-slave0_1  | 20/08/12 02:12:25 INFO Worker: Retrying connection to master (attempt # 7)
tispark-slave0_1  | 20/08/12 02:12:26 INFO Worker: Connecting to master tispark-master:7077...
grafana_1         | t=2020-08-12T02:12:26+0000 lvl=eror msg="Alert Rule Result Error" logger=alerting.evalContext ruleId=4 name="TiKV channel full alert" error="tsdb.HandleRequest() error Get http://prometheus:9090/api/v1/query_range?end=2020-08-12T02%3A11%3A47.40193117Z&query=sum%28rate%28tikv_channel_full_total%7Binstance%3D~%22%24instance%22%7D%5B1m%5D%29%29+by+%28instance%2C+type%29&start=2020-08-12T02%3A11%3A37.40193117Z&step=30.000: context deadline exceeded" changing state to=alerting
pd1_1             | {"level":"warn","ts":"2020-08-12T02:12:30.617Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd1_1             | {"level":"warn","ts":"2020-08-12T02:12:28.671Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | {"level":"warn","ts":"2020-08-12T02:12:35.352Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-f120b896-1e89-4a73-acd4-6b5cf7aad564/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
grafana_1         | t=2020-08-12T02:12:41+0000 lvl=eror msg="Alert Rule Result Error" logger=alerting.evalContext ruleId=3 name="server report failures alert" error="tsdb.HandleRequest() error Get http://prometheus:9090/api/v1/query_range?end=2020-08-12T02%3A12%3A11.554027303Z&query=sum%28rate%28tikv_server_report_failure_msg_total%7Binstance%3D~%22%24instance%22%7D%5B1m%5D%29%29+by+%28type%2Cinstance%2Cstore_id%29&start=2020-08-12T02%3A12%3A01.554027303Z&step=30.000: context deadline exceeded" changing state to=alerting
pd0_1             | [2020/08/12 02:12:37.635 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd2_1             | {"level":"warn","ts":"2020-08-12T02:12:37.316Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd2_1             | {"level":"warn","ts":"2020-08-12T02:12:36.851Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:12:41.754 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd1_1             | {"level":"warn","ts":"2020-08-12T02:12:46.968Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd1_1             | {"level":"warn","ts":"2020-08-12T02:12:50.266Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
tidb_1            | {"level":"warn","ts":"2020-08-12T02:12:40.390Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:12:55.156 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = latest connection error: connection error: desc = \"transport: Error while dialing context deadline exceeded\""]
pd2_1             | {"level":"warn","ts":"2020-08-12T02:12:59.546Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:12:58.221 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd2_1             | {"level":"warn","ts":"2020-08-12T02:13:05.711Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | {"level":"warn","ts":"2020-08-12T02:13:11.037Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-f120b896-1e89-4a73-acd4-6b5cf7aad564/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
grafana_1         | t=2020-08-12T02:13:14+0000 lvl=eror msg="Failed to save state" logger=alerting.resultHandler error="database is locked"
pd1_1             | {"level":"warn","ts":"2020-08-12T02:13:13.094Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd1_1             | {"level":"warn","ts":"2020-08-12T02:13:15.811Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
tidb_1            | {"level":"warn","ts":"2020-08-12T02:13:06.939Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/12 02:13:22.709 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-f7486119-cc9a-4582-89a7-fbcb2860c0d9/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = latest connection error: connection error: desc = \"transport: Error while dialing dial tcp: i/o timeout\""]
tispark-slave0_1  | 20/08/12 02:13:24 INFO Worker: Retrying connection to master (attempt # 8)
tispark-slave0_1  | 20/08/12 02:13:24 INFO Worker: Connecting to master tispark-master:7077...
pd2_1             | {"level":"warn","ts":"2020-08-12T02:13:32.314Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd2_1             | {"level":"warn","ts":"2020-08-12T02:13:35.699Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
tidb_1            | {"level":"warn","ts":"2020-08-12T02:13:31.612Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-cd73f566-1459-4fa7-b4fe-8839bf86a71b/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd1_1             | {"level":"warn","ts":"2020-08-12T02:13:39.234Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-af77e97c-be35-490e-9cea-3831cf3fa66b/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd2_1             | {"level":"warn","ts":"2020-08-12T02:13:37.964Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-42d3c676-61b5-4f47-8216-5c7700c11bfa/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
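Aside from the PD retry noise, the `tidb_1` warning near the top of the log (`config file /tidb.toml contained unknown configuration options`) indicates the bundled config still carries options removed in newer TiDB releases. A hedged sketch of the cleanup, assuming the keys live in the repo's `config/tidb.toml` (the exact file layout and the placeholder values are assumptions, not from this thread):

```toml
# Options reported as unknown by the newer tidb-server binary; deleting
# them (or commenting them out, as shown) clears the startup warning.
# The values below are illustrative placeholders only.

# [log.file]
# log-rotate = true

# [performance]
# retry-limit = 10

# [plan-cache]
# enabled = false
# capacity = 2560
# shards = 256
```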
DanielZhangQD commented 4 years ago

@quaff Would you mind trying the deployment with kind instead?

quaff commented 4 years ago

I'm not using k8s, only Docker for development.

quaff commented 4 years ago

Please fix the `retrying of unary invoker failed` errors.

DanielZhangQD commented 4 years ago

Which image version are you using?

quaff commented 4 years ago

docker image ls | grep pingcap
pingcap/pd                           latest                                           5e850bfca7b5        4 weeks ago         135MB
pingcap/tidb                         latest                                           07a17362b8e0        4 weeks ago         126MB
pingcap/tikv                         latest                                           d02317436da1        4 weeks ago         295MB
pingcap/tispark                      latest                                           f97259e33f7d        5 months ago        636MB
pingcap/tidb-vision                  latest                                           e9b25d9f7bdb        2 years ago         47.6MB
DanielZhangQD commented 4 years ago

Could you please retry with v4.0.4?

DanielZhangQD commented 4 years ago

https://hub.docker.com/layers/pingcap/tidb/v4.0.4/images/sha256-431e8e71d3a02134297b4370abbb40b0bd2bc5aec0c42f12e4d4e03943b50910?context=explore
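To avoid the `latest` tag entirely, the images can be pinned in the compose file. A minimal sketch, assuming the service names used by the upstream `docker-compose.yml` (only a few services shown; the same change applies to every `pingcap/*` image):

```yaml
# Hypothetical excerpt of docker-compose.yml: pin each image to an
# explicit release tag instead of "latest" so upgrades are deliberate.
services:
  pd0:
    image: pingcap/pd:v4.0.4
  tikv0:
    image: pingcap/tikv:v4.0.4
  tidb:
    image: pingcap/tidb:v4.0.4
```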

quaff commented 4 years ago

v4.0.4

pd0_1             | [2020/08/31 02:06:27.809 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-3e057590-2163-497c-a1a4-79259b8085ee/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
tispark-slave0_1  | 20/08/31 02:06:29 INFO TransportClientFactory: Successfully created connection to tispark-master/172.19.0.13:7077 after 7353 ms (0 ms spent in bootstraps)
pd2_1             | {"level":"warn","ts":"2020-08-31T02:06:32.242Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-2a1a6e2e-0871-4d68-9a6c-3b3a83427fc9/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/31 02:06:38.238 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-3e057590-2163-497c-a1a4-79259b8085ee/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd0_1             | [2020/08/31 02:06:41.252 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-3e057590-2163-497c-a1a4-79259b8085ee/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd0_1             | [2020/08/31 02:06:45.501 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-3e057590-2163-497c-a1a4-79259b8085ee/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd0_1             | [2020/08/31 02:06:48.067 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-3e057590-2163-497c-a1a4-79259b8085ee/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd0_1             | [2020/08/31 02:06:51.529 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-3e057590-2163-497c-a1a4-79259b8085ee/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd2_1             | {"level":"warn","ts":"2020-08-31T02:06:52.267Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-2a1a6e2e-0871-4d68-9a6c-3b3a83427fc9/pd2:2379","attempt":0,"error":"rpc error: code = NotFound desc = etcdserver: requested lease not found"}
pd1_1             | {"level":"warn","ts":"2020-08-31T02:06:52.133Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-447f391b-3fa4-4352-89d5-d7a873657132/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd1_1             | {"level":"warn","ts":"2020-08-31T02:06:54.064Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-447f391b-3fa4-4352-89d5-d7a873657132/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd1_1             | {"level":"warn","ts":"2020-08-31T02:06:55.172Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-447f391b-3fa4-4352-89d5-d7a873657132/pd1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | [2020/08/31 02:06:55.815 +00:00] [WARN] [retry_interceptor.go:61] ["retrying of unary invoker failed"] [target=endpoint://client-3e057590-2163-497c-a1a4-79259b8085ee/pd0:2379] [attempt=0] [error="rpc error: code = DeadlineExceeded desc = context deadline exceeded"]
pd2_1             | {"level":"warn","ts":"2020-08-31T02:06:56.202Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-2a1a6e2e-0871-4d68-9a6c-3b3a83427fc9/pd2:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
pd0_1             | {"level":"warn","ts":"2020-08-31T02:06:53.668Z","caller":"clientv3/retry_interceptor.go:61","msg":"retrying of unary invoker failed","target":"endpoint://client-4a19fd34-67c2-4430-a537-9bcc5dee681f/pd0:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
DanielZhangQD commented 4 years ago

OK, sorry for the issue. I would suggest trying the kind-based setup I mentioned before; we have not maintained the docker-compose installation for a long time, and installation with kind requires no additional steps beyond downloading the kind binary. I assume you are just trying out TiDB rather than deploying a production environment; a trial with kind is similar to docker-compose, but the procedure is simpler.

quaff commented 4 years ago

Is there an instruction to deploy tidb using kind?

DanielZhangQD commented 4 years ago

Sure. English version: https://docs.pingcap.com/tidb-in-kubernetes/stable/get-started#create-a-kubernetes-cluster-using-kind Chinese version: https://docs.pingcap.com/zh/tidb-in-kubernetes/stable/get-started#%E4%BD%BF%E7%94%A8-kind-%E5%88%9B%E5%BB%BA-kubernetes-%E9%9B%86%E7%BE%A4

If anything in the doc is unclear or broken, just create an issue here and you can assign it to me directly. For TiDB Operator issues, create the issue in the TiDB Operator repo instead.
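For reference, the kind route from the linked guide starts with a couple of commands; the cluster name `tidb` below is just an example, and TiDB Operator and the TiDB cluster are then installed by following the rest of that guide:

```shell
# Create a local Kubernetes cluster with kind (binary from https://kind.sigs.k8s.io)
kind create cluster --name tidb

# kind registers a kubectl context named kind-<cluster-name>; verify the cluster is up
kubectl cluster-info --context kind-tidb
```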

quaff commented 4 years ago

As an ordinary developer, I just want to quickly spin up a usable database instance; I don't care whether it's docker-compose or k8s. With docker-compose, a single command used to bring up a working environment. Now with kind there's kubectl and helm on top of it, which is bewildering, and I suspect many people give up halfway.

quaff commented 4 years ago

I suggest you keep maintaining the docker-compose deployment for developers; k8s and the like are for professional ops people in production environments.

DanielZhangQD commented 4 years ago

OK, understood. I will look into this issue.

lonng commented 4 years ago

> As an ordinary developer, I just want to quickly spin up a usable database instance; I don't care whether it's docker-compose or k8s. With docker-compose, a single command used to bring up a working environment. Now with kind there's kubectl and helm on top of it, which is bewildering, and I suspect many people give up halfway.

@quaff I recommend using tiup playground in the development environment; it can start up a TiDB cluster with a single command.

tennix commented 4 years ago

@quaff Sorry for the inconvenience. As @lonng said, it's recommended to use tiup, which is easier than docker-compose and provides full lifecycle management of the cluster. You can use tiup playground to easily spin up a local testing cluster, or use tiup to deploy a production cluster.

In the future, tidb-docker-compose will no longer be maintained, in favor of tiup.
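For comparison, the tiup flow recommended above can be sketched as follows; the install script URL is the one from the official docs, and port 4000 is TiDB's default MySQL port:

```shell
# Install tiup via the official bootstrap script
curl --proto '=https' --tlsv1.2 -sSf https://tiup-mirrors.pingcap.com/install.sh | sh

# Start a throwaway local cluster (defaults to one PD, one TiKV, one TiDB)
tiup playground

# In another terminal, connect with any MySQL client
mysql --host 127.0.0.1 --port 4000 -u root
```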

quaff commented 4 years ago

> @quaff Sorry for the inconvenience. As @lonng said, it's recommended to use tiup, which is easier than docker-compose and provides full lifecycle management of the cluster. You can use tiup playground to easily spin up a local testing cluster, or use tiup to deploy a production cluster.
>
> In the future, tidb-docker-compose will no longer be maintained, in favor of tiup.

First, I don't agree that tiup is easier than docker-compose. Second, I believe the vast majority of developers prefer an industry-standard approach over a tool-specific one, and Docker is arguably the de facto standard: I can deploy YugabyteDB, Citus, SQL Server, Oracle, and so on inside Docker. As a developer, I don't have much energy to spend figuring out how to deploy a particular piece of software; that's the ops team's job. If docker-compose had never been supported, I could understand. If docker-compose had made incompatible changes that make this path a dead end, I could also understand. But abandoning docker-compose to promote your own tiup does not strike me as a responsible approach. I suggest you spend a little effort investigating this problem; it may well be lurking in other deployment environments and just hasn't surfaced yet. I get satisfaction from solving problems, but this isn't my area of expertise, otherwise I would try it myself.

lonng commented 4 years ago

@quaff I am very sorry for the inconvenience. The reason docker-compose is not being maintained for the time being is not, as you said, to promote our new platform tiup; it's because maintaining any product requires more manpower from the community.

With limited manpower, we tend to focus on solving the more important issues so we can provide users with better products. We offer the following solutions to help users run TiDB in both production and development environments.

For a production environment:

For a development environment:

quaff commented 4 years ago

Limited manpower is understandable, but when you have time, please still find the energy to look at this problem; it may well take less brainpower than replying here.

heathjay commented 3 years ago

I hit the same problem on my Mac while using docker-compose. But I think it may be a system resource bottleneck, since it disappeared when I deployed on my university's server.

smallyaohailu commented 2 years ago

pingcap/tidb-operator            v1.3.0-beta.1   19bf953fa60a   6 days ago     268MB
pingcap/tidb-monitor-initializer v5.3.0          7f05e711b50a   7 weeks ago    4.52MB
pingcap/tidb                     latest          778bf9e1e051   9 months ago   145MB
pingcap/tikv                     latest          6e34b1d95950   9 months ago   355MB
pingcap/pd                       latest          d55858ba1d82   9 months ago   151MB
pingcap/tidb-vision              latest          e9b25d9f7bdb   3 years ago    47.6MB

I roughly understand now that this problem comes from it no longer being officially maintained, effectively forcing us onto k8s. With k8s the commands are more complex; then we're pushed to learn their KubeSphere, then the combined TiDB + KubeSphere usage, and the installation still fails. Big companies that need TiDB can just pay the vendor. The product really is great; please help fix this. (sob)

tennix commented 2 years ago

> pingcap/tidb-operator            v1.3.0-beta.1   19bf953fa60a   6 days ago     268MB
> pingcap/tidb-monitor-initializer v5.3.0          7f05e711b50a   7 weeks ago    4.52MB
> pingcap/tidb                     latest          778bf9e1e051   9 months ago   145MB
> pingcap/tikv                     latest          6e34b1d95950   9 months ago   355MB
> pingcap/pd                       latest          d55858ba1d82   9 months ago   151MB
> pingcap/tidb-vision              latest          e9b25d9f7bdb   3 years ago    47.6MB

The latest tag is misleading; we now use the nightly tag instead. So the Docker images you listed above were last updated 9 months ago.

> I roughly understand now that this problem comes from it no longer being officially maintained, effectively forcing us onto k8s. With k8s the commands are more complex; then we're pushed to learn their KubeSphere, then the combined TiDB + KubeSphere usage, and the installation still fails. Big companies that need TiDB can just pay the vendor. The product really is great; please help fix this. (sob)

We never claimed tidb-docker-compose to be a production solution; it's for testing only, and we don't guarantee data durability with it. Previously, a user ran tidb-docker-compose in production and lost data after a system reboot; we want to avoid that happening again. So tidb-docker-compose is by no means comparable to k8s + operator or to tiup. For testing purposes, tiup playground is the recommended way to experiment, and with tiup there is a clear path to production.