crawlab-team / crawlab

Distributed web crawler admin platform for spider management, regardless of language or framework.
https://www.crawlab.cn
BSD 3-Clause "New" or "Revised" License

All images on Docker Hub fail with a grpc client connect error #1026

Status: Closed. Schr0dingerCat closed this issue 2 years ago.

Schr0dingerCat commented 3 years ago

System environment: in a VM, Linux a 5.11.0-40-generic #44-Ubuntu SMP Wed Oct 20 16:16:42 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux

Bug description: using the default docker-compose.yml from the repository and running `docker-compose up`, the output reports an error:

crawlab_master | 2021/11/22 11:45:19 error grpc client connect error: grpc error: client failed to start. reattempt in 1.1 seconds...

Steps to reproduce: as described above.

Expected behavior: the Docker images should work.

Output:

Attaching to crlb_mongo_1, crawlab_master
crawlab_master | Using config file: /app/backend/conf/config.yml
mongo_1   |
mongo_1   | WARNING: MongoDB 5.0+ requires a CPU with AVX support, and your current system does not appear to have that!
mongo_1   |   see https://jira.mongodb.org/browse/SERVER-54407
mongo_1   |   see also https://www.mongodb.com/community/forums/t/mongodb-5-0-cpu-intel-g4650-compatibility/116610/2
mongo_1   |   see also https://github.com/docker-library/mongo/issues/485#issuecomment-891991814
mongo_1   |
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.929+00:00"},"s":"I",  "c":"CONTROL",  "id":23285,   "ctx":"main","msg":"Automatically disabling TLS 1.0, to force-enable TLS 1.0 specify --sslDisabledProtocols 'none'"}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.956+00:00"},"s":"W",  "c":"ASIO",     "id":22601,   "ctx":"main","msg":"No TransportLayer configured during NetworkInterface startup"}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.956+00:00"},"s":"I",  "c":"NETWORK",  "id":4648601, "ctx":"main","msg":"Implicit TCP FastOpen unavailable. If TCP FastOpen is required, set tcpFastOpenServer, tcpFastOpenClient, and tcpFastOpenQueueSize."}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.959+00:00"},"s":"I",  "c":"STORAGE",  "id":4615611, "ctx":"initandlisten","msg":"MongoDB starting","attr":{"pid":1,"port":27017,"dbPath":"/data/db","architecture":"64-bit","host":"8637beb66237"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.959+00:00"},"s":"I",  "c":"CONTROL",  "id":23403,   "ctx":"initandlisten","msg":"Build Info","attr":{"buildInfo":{"version":"4.4.10","gitVersion":"58971da1ef93435a9f62bf4708a81713def6e88c","openSSLVersion":"OpenSSL 1.1.1f  31 Mar 2020","modules":[],"allocator":"tcmalloc","environment":{"distmod":"ubuntu2004","distarch":"x86_64","target_arch":"x86_64"}}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.960+00:00"},"s":"I",  "c":"CONTROL",  "id":51765,   "ctx":"initandlisten","msg":"Operating System","attr":{"os":{"name":"Ubuntu","version":"20.04"}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.960+00:00"},"s":"I",  "c":"CONTROL",  "id":21951,   "ctx":"initandlisten","msg":"Options set by command line","attr":{"options":{"net":{"bindIp":"*"}}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.969+00:00"},"s":"I",  "c":"STORAGE",  "id":22297,   "ctx":"initandlisten","msg":"Using the XFS filesystem is strongly recommended with the WiredTiger storage engine. See http://dochub.mongodb.org/core/prodnotes-filesystem","tags":["startupWarnings"]}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:52.969+00:00"},"s":"I",  "c":"STORAGE",  "id":22315,   "ctx":"initandlisten","msg":"Opening WiredTiger","attr":{"config":"create,cache_size=477M,session_max=33000,eviction=(threads_min=4,threads_max=4),config_base=false,statistics=(fast),log=(enabled=true,archive=true,path=journal,compressor=snappy),file_manager=(close_idle_time=100000,close_scan_interval=10,close_handle_minimum=250),statistics_log=(wait=0),verbose=[recovery_progress,checkpoint_progress,compact_progress],"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.432+00:00"},"s":"I",  "c":"STORAGE",  "id":22430,   "ctx":"initandlisten","msg":"WiredTiger message","attr":{"message":"[1637552697:432351][1:0x7fbf2321ccc0], txn-recover: [WT_VERB_RECOVERY | WT_VERB_RECOVERY_PROGRESS] Set global recovery timestamp: (0, 0)"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.432+00:00"},"s":"I",  "c":"STORAGE",  "id":22430,   "ctx":"initandlisten","msg":"WiredTiger message","attr":{"message":"[1637552697:432444][1:0x7fbf2321ccc0], txn-recover: [WT_VERB_RECOVERY | WT_VERB_RECOVERY_PROGRESS] Set global oldest timestamp: (0, 0)"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.459+00:00"},"s":"I",  "c":"STORAGE",  "id":4795906, "ctx":"initandlisten","msg":"WiredTiger opened","attr":{"durationMillis":4489}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.467+00:00"},"s":"I",  "c":"RECOVERY", "id":23987,   "ctx":"initandlisten","msg":"WiredTiger recoveryTimestamp","attr":{"recoveryTimestamp":{"$timestamp":{"t":0,"i":0}}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.542+00:00"},"s":"I",  "c":"STORAGE",  "id":4366408, "ctx":"initandlisten","msg":"No table logging settings modifications are required for existing WiredTiger tables","attr":{"loggingEnabled":true}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.549+00:00"},"s":"I",  "c":"STORAGE",  "id":22262,   "ctx":"initandlisten","msg":"Timestamp monitor starting"}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.558+00:00"},"s":"W",  "c":"CONTROL",  "id":22120,   "ctx":"initandlisten","msg":"Access control is not enabled for the database. Read and write access to data and configuration is unrestricted","tags":["startupWarnings"]}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.579+00:00"},"s":"I",  "c":"STORAGE",  "id":20320,   "ctx":"initandlisten","msg":"createCollection","attr":{"namespace":"admin.system.version","uuidDisposition":"provided","uuid":{"uuid":{"$uuid":"b0298ee7-3643-44cd-8a24-6079856891b6"}},"options":{"uuid":{"$uuid":"b0298ee7-3643-44cd-8a24-6079856891b6"}}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.620+00:00"},"s":"I",  "c":"INDEX",    "id":20345,   "ctx":"initandlisten","msg":"Index build: done building","attr":{"buildUUID":null,"namespace":"admin.system.version","index":"_id_","commitTimestamp":{"$timestamp":{"t":0,"i":0}}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.621+00:00"},"s":"I",  "c":"COMMAND",  "id":20459,   "ctx":"initandlisten","msg":"Setting featureCompatibilityVersion","attr":{"newVersion":"4.4"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.630+00:00"},"s":"I",  "c":"STORAGE",  "id":20536,   "ctx":"initandlisten","msg":"Flow Control is enabled on this deployment"}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.637+00:00"},"s":"I",  "c":"STORAGE",  "id":20320,   "ctx":"initandlisten","msg":"createCollection","attr":{"namespace":"local.startup_log","uuidDisposition":"generated","uuid":{"uuid":{"$uuid":"3ce33dc0-2f0c-4828-ad49-cc861c9abe8f"}},"options":{"capped":true,"size":10485760}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.702+00:00"},"s":"I",  "c":"INDEX",    "id":20345,   "ctx":"initandlisten","msg":"Index build: done building","attr":{"buildUUID":null,"namespace":"local.startup_log","index":"_id_","commitTimestamp":{"$timestamp":{"t":0,"i":0}}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.703+00:00"},"s":"I",  "c":"FTDC",     "id":20625,   "ctx":"initandlisten","msg":"Initializing full-time diagnostic data capture","attr":{"dataDirectory":"/data/db/diagnostic.data"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.704+00:00"},"s":"I",  "c":"REPL",     "id":6015317, "ctx":"initandlisten","msg":"Setting new configuration state","attr":{"newState":"ConfigReplicationDisabled","oldState":"ConfigPreStart"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.728+00:00"},"s":"I",  "c":"NETWORK",  "id":23015,   "ctx":"listener","msg":"Listening on","attr":{"address":"/tmp/mongodb-27017.sock"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.728+00:00"},"s":"I",  "c":"NETWORK",  "id":23015,   "ctx":"listener","msg":"Listening on","attr":{"address":"0.0.0.0"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.728+00:00"},"s":"I",  "c":"NETWORK",  "id":23016,   "ctx":"listener","msg":"Waiting for connections","attr":{"port":27017,"ssl":"off"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.766+00:00"},"s":"I",  "c":"STORAGE",  "id":20320,   "ctx":"LogicalSessionCacheRefresh","msg":"createCollection","attr":{"namespace":"config.system.sessions","uuidDisposition":"generated","uuid":{"uuid":{"$uuid":"4221cb50-7558-45b0-97f5-141f93a157df"}},"options":{}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.767+00:00"},"s":"I",  "c":"CONTROL",  "id":20712,   "ctx":"LogicalSessionCacheReap","msg":"Sessions collection is not set up; waiting until next sessions reap interval","attr":{"error":"NamespaceNotFound: config.system.sessions does not exist"}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.856+00:00"},"s":"I",  "c":"INDEX",    "id":20345,   "ctx":"LogicalSessionCacheRefresh","msg":"Index build: done building","attr":{"buildUUID":null,"namespace":"config.system.sessions","index":"_id_","commitTimestamp":{"$timestamp":{"t":0,"i":0}}}}
mongo_1   | {"t":{"$date":"2021-11-22T03:44:57.862+00:00"},"s":"I",  "c":"INDEX",    "id":20345,   "ctx":"LogicalSessionCacheRefresh","msg":"Index build: done building","attr":{"buildUUID":null,"namespace":"config.system.sessions","index":"lsidTTLIndex","commitTimestamp":{"$timestamp":{"t":0,"i":0}}}}
crawlab_master | context deadline exceeded
crawlab_master | /go/pkg/mod/github.com/crawlab-team/go-trace@v0.1.0/trace.go:11 github.com/crawlab-team/go-trace.TraceError()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:262 github.com/crawlab-team/crawlab-core/grpc/client.(*Client)._connect()
crawlab_master | /go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:55 github.com/cenkalti/backoff/v4.RetryNotifyWithTimer()
crawlab_master | /go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:34 github.com/cenkalti/backoff/v4.RetryNotify()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:242 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).connect()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:61 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).Start()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/node/service/worker_service.go:45 github.com/crawlab-team/crawlab-core/node/service.(*WorkerService).Start()
crawlab_master | /usr/local/go/src/runtime/asm_amd64.s:1371 runtime.goexit()
crawlab_master | grpc error: client failed to start
crawlab_master | /go/pkg/mod/github.com/crawlab-team/go-trace@v0.1.0/trace.go:6 github.com/crawlab-team/go-trace.PrintError()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/utils/backoff.go:13 github.com/crawlab-team/crawlab-core/utils.BackoffErrorNotify.func1()
crawlab_master | /go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:69 github.com/cenkalti/backoff/v4.RetryNotifyWithTimer()
crawlab_master | /go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:34 github.com/cenkalti/backoff/v4.RetryNotify()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:242 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).connect()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:61 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).Start()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/node/service/worker_service.go:45 github.com/crawlab-team/crawlab-core/node/service.(*WorkerService).Start()
crawlab_master | /usr/local/go/src/runtime/asm_amd64.s:1371 runtime.goexit()
crawlab_master | 2021/11/22 11:45:08 error grpc client connect error: grpc error: client failed to start. reattempt in 0.6 seconds...
crawlab_master | context deadline exceeded
crawlab_master | /go/pkg/mod/github.com/crawlab-team/go-trace@v0.1.0/trace.go:11 github.com/crawlab-team/go-trace.TraceError()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:262 github.com/crawlab-team/crawlab-core/grpc/client.(*Client)._connect()
crawlab_master | /go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:55 github.com/cenkalti/backoff/v4.RetryNotifyWithTimer()
crawlab_master | /go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:34 github.com/cenkalti/backoff/v4.RetryNotify()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:242 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).connect()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:61 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).Start()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/node/service/worker_service.go:45 github.com/crawlab-team/crawlab-core/node/service.(*WorkerService).Start()
crawlab_master | /usr/local/go/src/runtime/asm_amd64.s:1371 runtime.goexit()
crawlab_master | grpc error: client failed to start
crawlab_master | /go/pkg/mod/github.com/crawlab-team/go-trace@v0.1.0/trace.go:6 github.com/crawlab-team/go-trace.PrintError()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/utils/backoff.go:13 github.com/crawlab-team/crawlab-core/utils.BackoffErrorNotify.func1()
crawlab_master | /go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:69 github.com/cenkalti/backoff/v4.RetryNotifyWithTimer()
crawlab_master | /go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:34 github.com/cenkalti/backoff/v4.RetryNotify()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:242 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).connect()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/grpc/client/client.go:61 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).Start()
crawlab_master | /go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20211120.1848/node/service/worker_service.go:45 github.com/crawlab-team/crawlab-core/node/service.(*WorkerService).Start()
crawlab_master | /usr/local/go/src/runtime/asm_amd64.s:1371 runtime.goexit()
crawlab_master | 2021/11/22 11:45:19 error grpc client connect error: grpc error: client failed to start. reattempt in 1.1 seconds...
ding112 commented 2 years ago

Same problem here; neither the master image nor the develop image works.

context deadline exceeded
/go/pkg/mod/github.com/crawlab-team/go-trace@v0.1.0/trace.go:11 github.com/crawlab-team/go-trace.TraceError()
/go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20210802.1344/grpc/client/client.go:240 github.com/crawlab-team/crawlab-core/grpc/client.(*Client)._connect()
/go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:55 github.com/cenkalti/backoff/v4.RetryNotifyWithTimer()
/go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:34 github.com/cenkalti/backoff/v4.RetryNotify()
/go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20210802.1344/grpc/client/client.go:220 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).connect()
/go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20210802.1344/grpc/client/client.go:57 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).Start()
/go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20210802.1344/node/service/worker_service.go:43 github.com/crawlab-team/crawlab-core/node/service.(*WorkerService).Start()
/usr/local/go/src/runtime/asm_amd64.s:1374 runtime.goexit()
grpc error: client failed to start
/go/pkg/mod/github.com/crawlab-team/go-trace@v0.1.0/trace.go:6 github.com/crawlab-team/go-trace.PrintError()
/go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20210802.1344/utils/backoff.go:13 github.com/crawlab-team/crawlab-core/utils.BackoffErrorNotify.func1()
/go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:69 github.com/cenkalti/backoff/v4.RetryNotifyWithTimer()
/go/pkg/mod/github.com/cenkalti/backoff/v4@v4.1.0/retry.go:34 github.com/cenkalti/backoff/v4.RetryNotify()
/go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20210802.1344/grpc/client/client.go:220 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).connect()
/go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20210802.1344/grpc/client/client.go:57 github.com/crawlab-team/crawlab-core/grpc/client.(*Client).Start()
/go/pkg/mod/github.com/crawlab-team/crawlab-core@v0.6.0-beta.20210802.1344/node/service/worker_service.go:43 github.com/crawlab-team/crawlab-core/node/service.(*WorkerService).Start()
/usr/local/go/src/runtime/asm_amd64.s:1374 runtime.goexit()
2021/12/02 17:53:53 error grpc client connect error: grpc error: client failed to start. reattempt in 29.3 seconds...
ding112 commented 2 years ago

The 0.6 beta docker-compose setup still has this problem; going back to the 0.5 version works fine.

version: '3.3'
services:
  master:
    image: tikazyq/crawlab:0.5.1
    container_name: master
    environment:
      # CRAWLAB_API_ADDRESS: "https://<your_api_ip>:<your_api_port>"  # backend API address; for HTTPS or source-code deployments
      CRAWLAB_SERVER_MASTER: "Y"  # whether this is the master node: "Y" for master, "N" for worker
      CRAWLAB_MONGO_HOST: "mongo"  # MongoDB host; inside the docker compose network, reference the service name directly
      # CRAWLAB_MONGO_PORT: "27017"  # MongoDB port
      # CRAWLAB_MONGO_DB: "crawlab_test"  # MongoDB database
      # CRAWLAB_MONGO_USERNAME: "username"  # MongoDB username
      # CRAWLAB_MONGO_PASSWORD: "password"  # MongoDB password
      # CRAWLAB_MONGO_AUTHSOURCE: "admin"  # MongoDB auth source
      CRAWLAB_REDIS_ADDRESS: "redis"  # Redis host; inside the docker compose network, reference the service name directly
      # CRAWLAB_REDIS_PORT: "6379"  # Redis port
      # CRAWLAB_REDIS_DATABASE: "1"  # Redis database
      # CRAWLAB_REDIS_PASSWORD: "password"  # Redis password
      # CRAWLAB_LOG_LEVEL: "info"  # log level; defaults to info
      # CRAWLAB_LOG_ISDELETEPERIODICALLY: "N"  # whether to periodically delete log files; off by default
      # CRAWLAB_LOG_DELETEFREQUENCY: "@hourly"  # frequency of deleting log files; defaults to hourly
      # CRAWLAB_TASK_WORKERS: 8  # number of task executors (tasks run in parallel)
      # CRAWLAB_SERVER_REGISTER_TYPE: "mac"  # node registration type; defaults to MAC address, can be set to "ip" to avoid MAC address conflicts
      # CRAWLAB_SERVER_REGISTER_IP: "127.0.0.1"  # node registration IP (unique node identifier); only effective when CRAWLAB_SERVER_REGISTER_TYPE is "ip"
      # CRAWLAB_SERVER_LANG_NODE: "Y"  # whether to pre-install Node.js
      CRAWLAB_SERVER_LANG_JAVA: "Y"  # whether to pre-install Java
      # CRAWLAB_SERVER_LANG_DOTNET: "Y"  # whether to pre-install .NET Core
      # CRAWLAB_SERVER_LANG_PHP: "Y"  # whether to pre-install PHP
      # CRAWLAB_SERVER_LANG_GO: "Y"  # whether to pre-install Golang
      # CRAWLAB_SETTING_ALLOWREGISTER: "N"  # whether to allow user registration
      # CRAWLAB_SETTING_ENABLETUTORIAL: "N"  # whether to enable the tutorial
      CRAWLAB_SETTING_RUNONMASTER: "N"  # whether to run tasks on the master node
      # CRAWLAB_SETTING_DEMOSPIDERS: "Y"  # whether to initialize demo spiders
      # CRAWLAB_SETTING_CHECKSCRAPY: "Y"  # whether to automatically detect whether a spider is a Scrapy project
      # CRAWLAB_NOTIFICATION_MAIL_SERVER: smtp.example.com  # SMTP server address
      # CRAWLAB_NOTIFICATION_MAIL_PORT: 465  # SMTP server port
      # CRAWLAB_NOTIFICATION_MAIL_SENDEREMAIL: admin@example.com  # sender email
      # CRAWLAB_NOTIFICATION_MAIL_SENDERIDENTITY: admin@example.com  # sender identity
      # CRAWLAB_NOTIFICATION_MAIL_SMTP_USER: username  # SMTP username
      # CRAWLAB_NOTIFICATION_MAIL_SMTP_PASSWORD: password  # SMTP password
    ports:
      - "8080:8080"  # frontend port mapping
    depends_on:
      - mongo
      - redis
    # volumes:
    #   - "/var/crawlab/log:/var/logs/crawlab"  # log persistence
  worker:
    image: tikazyq/crawlab:0.5.1
    container_name: worker
    environment:
      CRAWLAB_SERVER_MASTER: "N"
      CRAWLAB_MONGO_HOST: "mongo"
      CRAWLAB_REDIS_ADDRESS: "redis"
      CRAWLAB_SERVER_LANG_JAVA: "Y"  # whether to pre-install Java
    depends_on:
      - mongo
      - redis
    # volumes:
    #   - "/var/crawlab/log:/var/logs/crawlab"  # log persistence
  mongo:
    image: mongo:latest
    # environment:
    #   MONGO_INITDB_ROOT_USERNAME: username
    #   MONGO_INITDB_ROOT_PASSWORD: password
    # volumes:
    #   - "/opt/crawlab/mongo/data/db:/data/db"  # make data persistent
    # ports:
    #   - "27017:27017"  # expose port to the host machine
  redis:
    image: redis:latest
    # command: redis-server --requirepass "password"  # set the Redis password
    # volumes:
    #   - "/opt/crawlab/redis/data:/data"  # make data persistent
    # ports:
    #   - "6379:6379"  # expose port to the host machine
  # splash:  # use Splash to run spiders on dynamic pages
  #   image: scrapinghub/splash
  #   container_name: splash
  #   ports:
  #     - "8050:8050"
voctenzuk commented 2 years ago

@ding112 I solved this error by replacing CRAWLAB_SERVER_MASTER: Y with CRAWLAB_NODE_MASTER: "Y"
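Applied to a 0.6-style compose file, that fix would look roughly like the excerpt below. This is a minimal sketch showing only the renamed variable; the image tags and remaining settings are whatever your existing 0.6 compose file uses, and the worker value of "N" is assumed by analogy with the 0.5 file above.

```yaml
# Excerpt only: the master/worker role variable renamed for 0.6 images
services:
  master:
    environment:
      CRAWLAB_NODE_MASTER: "Y"  # 0.6+: replaces CRAWLAB_SERVER_MASTER
  worker:
    environment:
      CRAWLAB_NODE_MASTER: "N"  # assumed worker value, mirroring CRAWLAB_SERVER_MASTER: "N" in 0.5
```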

tikazyq commented 2 years ago

CRAWLAB_NODE_MASTER

OMG!! You just discovered an ultimate secret in the universe.