Closed: heiheiheibj closed this issue 4 years ago
Sometimes the network is just flaky; you can ignore that for now.
Please strip any sensitive information from your configuration and post it here so we can take a look.
I'm testing on a single machine first. The configuration follows the documentation, as below (the paste is cut off):

```yaml
version: '3.3'
services:
  master:
    image: tikazyq/crawlab:latest
    container_name: master
    environment:
      CRAWLAB_API_ADDRESS: "localhost:8000"
      CRAWLAB_SERVER_MASTER: "Y"
      CRAWLAB_MONGO_HOST: "mongo"
      CRAWLAB_REDIS_ADDRESS: "redis"
    ports:
```
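For reference, the paste above is truncated after `ports:`. A single-node compose file following the documentation of that period typically continues roughly as sketched below; the port mapping and the mongo/redis service definitions are assumptions for illustration, not the poster's actual file.

```yaml
# Sketch of a documented single-node setup; the port mapping and the
# mongo/redis service definitions are assumptions, not the poster's file.
version: '3.3'
services:
  master:
    image: tikazyq/crawlab:latest
    container_name: master
    environment:
      CRAWLAB_API_ADDRESS: "localhost:8000"
      CRAWLAB_SERVER_MASTER: "Y"
      CRAWLAB_MONGO_HOST: "mongo"
      CRAWLAB_REDIS_ADDRESS: "redis"
    ports:
      - "8080:8080"   # assumed web UI port mapping
    depends_on:
      - mongo
      - redis
  mongo:
    image: mongo:latest
    restart: always
  redis:
    image: redis:latest
    restart: always
```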
Also, the log output contains this line:

```
master | 2020/03/03 17:40:10 error Get https://pypi.tuna.tsinghua.edu.cn/simple: dial tcp: lookup pypi.tuna.tsinghua.edu.cn on 127.0.0.11:53: read udp 127.0.0.1:34949->127.0.0.11:53: i/o timeout
```

Does this have any impact? Thanks.
No, it has no impact.
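For context, that line comes from the dependency fetcher periodically querying the PyPI mirror; the lookup through Docker's embedded DNS resolver (127.0.0.11) timed out. If the timeouts recur constantly, one optional mitigation is to pin explicit upstream DNS servers on the master service, roughly as sketched below; the resolver addresses are placeholders, not part of the documented setup.

```yaml
# Optional sketch: pin upstream DNS servers for the master service if
# lookups through Docker's embedded resolver keep timing out.
# The resolver addresses below are placeholders.
services:
  master:
    image: tikazyq/crawlab:latest
    dns:
      - 223.5.5.5
      - 8.8.8.8
```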
Remove this line:

```yaml
CRAWLAB_API_ADDRESS: "localhost:8000"
```
Removing it fixed it. It can also be set to the master machine's IP. Thanks!
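Summarizing the fix, the master service's environment block ends up roughly like the sketch below: either drop `CRAWLAB_API_ADDRESS` entirely, or point it at an address the browser can actually reach (the IP shown is a placeholder, not from this thread).

```yaml
# Sketch of the corrected environment block: either omit CRAWLAB_API_ADDRESS,
# or set it to the master host's reachable IP. 192.168.1.10 is a placeholder.
services:
  master:
    image: tikazyq/crawlab:latest
    environment:
      # CRAWLAB_API_ADDRESS: "192.168.1.10:8000"   # optional: master host IP and API port
      CRAWLAB_SERVER_MASTER: "Y"
      CRAWLAB_MONGO_HOST: "mongo"
      CRAWLAB_REDIS_ADDRESS: "redis"
```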
On Windows 10 with Docker Desktop 2.2.0.3, I installed Crawlab via Docker. The login page comes up, but after entering the password I get the error "Error while logging in (please check the Q&A in the documentation)". The output of `docker logs master` is:
```
[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
2020/03/03 02:40:12 info initialized config successfully
2020/03/03 02:40:12 info initialized log config successfully
2020/03/03 02:40:12 info periodically cleaning log is switched off
2020/03/03 02:40:14 info initialized MongoDB successfully
2020/03/03 02:40:14 info initialized Redis successfully
2020/03/03 02:40:14 info initialized schedule successfully
2020/03/03 02:40:14 info initialized user service successfully
2020/03/03 02:40:14 info initialized dependency fetcher successfully
2020/03/03 02:40:14 info initialized task executor successfully
2020/03/03 02:40:14 info register type is :*register.MacRegister
{subscribe nodes:master 1}
2020/03/03 02:40:14 info initialized node service successfully
{subscribe nodes:public 1}
2020/03/03 02:40:14 error read yaml error: open /app/spiders/csdn/Spiderfile: no such file or directory
2020/03/03 02:40:14 error read yaml error: open /app/spiders/juejin_node/Spiderfile: no such file or directory
2020/03/03 02:40:14 error read yaml error: open /app/spiders/segmentfault/Spiderfile: no such file or directory
2020/03/03 02:40:14 error read yaml error: open /app/spiders/sites_inspector/Spiderfile: no such file or directory
2020/03/03 02:40:14 info start sync spider to local, total: 14
2020/03/03 02:40:14 info initialized spider service successfully
2020/03/03 02:40:14 info initialized rpc service successfully
[GIN-debug] POST /login --> crawlab/routes.Login (4 handlers)
[GIN-debug] PUT /users --> crawlab/routes.PutUser (4 handlers)
[GIN-debug] GET /setting --> crawlab/routes.GetSetting (4 handlers)
[GIN-debug] GET /version --> crawlab/routes.GetVersion (4 handlers)
[GIN-debug] GET /releases/latest --> crawlab/routes.GetLatestRelease (4 handlers)
[GIN-debug] GET /docs --> crawlab/routes.GetDocs (4 handlers)
[GIN-debug] GET /nodes --> crawlab/routes.GetNodeList (5 handlers)
[GIN-debug] GET /nodes/:id --> crawlab/routes.GetNode (5 handlers)
[GIN-debug] POST /nodes/:id --> crawlab/routes.PostNode (5 handlers)
[GIN-debug] GET /nodes/:id/tasks --> crawlab/routes.GetNodeTaskList (5 handlers)
[GIN-debug] GET /nodes/:id/system --> crawlab/routes.GetSystemInfo (5 handlers)
[GIN-debug] DELETE /nodes/:id --> crawlab/routes.DeleteNode (5 handlers)
[GIN-debug] GET /nodes/:id/langs --> crawlab/routes.GetLangList (5 handlers)
[GIN-debug] GET /nodes/:id/deps --> crawlab/routes.GetDepList (5 handlers)
[GIN-debug] GET /nodes/:id/deps/installed --> crawlab/routes.GetInstalledDepList (5 handlers)
[GIN-debug] POST /nodes/:id/deps/install --> crawlab/routes.InstallDep (5 handlers)
[GIN-debug] POST /nodes/:id/deps/uninstall --> crawlab/routes.UninstallDep (5 handlers)
[GIN-debug] POST /nodes/:id/langs/install --> crawlab/routes.InstallLang (5 handlers)
[GIN-debug] GET /spiders --> crawlab/routes.GetSpiderList (5 handlers)
[GIN-debug] GET /spiders/:id --> crawlab/routes.GetSpider (5 handlers)
[GIN-debug] PUT /spiders --> crawlab/routes.PutSpider (5 handlers)
[GIN-debug] POST /spiders --> crawlab/routes.UploadSpider (5 handlers)
[GIN-debug] POST /spiders/:id --> crawlab/routes.PostSpider (5 handlers)
[GIN-debug] POST /spiders/:id/publish --> crawlab/routes.PublishSpider (5 handlers)
[GIN-debug] POST /spiders/:id/upload --> crawlab/routes.UploadSpiderFromId (5 handlers)
[GIN-debug] DELETE /spiders --> crawlab/routes.DeleteSelectedSpider (5 handlers)
[GIN-debug] DELETE /spiders/:id --> crawlab/routes.DeleteSpider (5 handlers)
[GIN-debug] POST /spiders/:id/copy --> crawlab/routes.CopySpider (5 handlers)
[GIN-debug] GET /spiders/:id/tasks --> crawlab/routes.GetSpiderTasks (5 handlers)
[GIN-debug] GET /spiders/:id/file/tree --> crawlab/routes.GetSpiderFileTree (5 handlers)
[GIN-debug] GET /spiders/:id/file --> crawlab/routes.GetSpiderFile (5 handlers)
[GIN-debug] POST /spiders/:id/file --> crawlab/routes.PostSpiderFile (5 handlers)
[GIN-debug] PUT /spiders/:id/file --> crawlab/routes.PutSpiderFile (5 handlers)
[GIN-debug] PUT /spiders/:id/dir --> crawlab/routes.PutSpiderDir (5 handlers)
[GIN-debug] DELETE /spiders/:id/file --> crawlab/routes.DeleteSpiderFile (5 handlers)
[GIN-debug] POST /spiders/:id/file/rename --> crawlab/routes.RenameSpiderFile (5 handlers)
[GIN-debug] GET /spiders/:id/dir --> crawlab/routes.GetSpiderDir (5 handlers)
[GIN-debug] GET /spiders/:id/stats --> crawlab/routes.GetSpiderStats (5 handlers)
[GIN-debug] GET /spiders/:id/schedules --> crawlab/routes.GetSpiderSchedules (5 handlers)
[GIN-debug] GET /spiders/:id/scrapy/spiders --> crawlab/routes.GetSpiderScrapySpiders (5 handlers)
[GIN-debug] PUT /spiders/:id/scrapy/spiders --> crawlab/routes.PutSpiderScrapySpiders (5 handlers)
[GIN-debug] GET /spiders/:id/scrapy/settings --> crawlab/routes.GetSpiderScrapySettings (5 handlers)
[GIN-debug] POST /spiders/:id/scrapy/settings --> crawlab/routes.PostSpiderScrapySettings (5 handlers)
[GIN-debug] GET /spiders/:id/scrapy/items --> crawlab/routes.GetSpiderScrapyItems (5 handlers)
[GIN-debug] POST /spiders/:id/scrapy/items --> crawlab/routes.PostSpiderScrapyItems (5 handlers)
[GIN-debug] GET /spiders/:id/scrapy/pipelines --> crawlab/routes.GetSpiderScrapyPipelines (5 handlers)
[GIN-debug] GET /spiders/:id/scrapy/spider/filepath --> crawlab/routes.GetSpiderScrapySpiderFilepath (5 handlers)
[GIN-debug] POST /spiders/:id/git/sync --> crawlab/routes.PostSpiderSyncGit (5 handlers)
[GIN-debug] POST /spiders/:id/git/reset --> crawlab/routes.PostSpiderResetGit (5 handlers)
[GIN-debug] POST /spiders-cancel --> crawlab/routes.CancelSelectedSpider (5 handlers)
[GIN-debug] POST /spiders-run --> crawlab/routes.RunSelectedSpider (5 handlers)
[GIN-debug] GET /config_spiders/:id/config --> crawlab/routes.GetConfigSpiderConfig (5 handlers)
[GIN-debug] POST /config_spiders/:id/config --> crawlab/routes.PostConfigSpiderConfig (5 handlers)
[GIN-debug] PUT /config_spiders --> crawlab/routes.PutConfigSpider (5 handlers)
[GIN-debug] POST /config_spiders/:id --> crawlab/routes.PostConfigSpider (5 handlers)
[GIN-debug] POST /config_spiders/:id/upload --> crawlab/routes.UploadConfigSpider (5 handlers)
[GIN-debug] POST /config_spiders/:id/spiderfile --> crawlab/routes.PostConfigSpiderSpiderfile (5 handlers)
[GIN-debug] GET /config_spiders_templates --> crawlab/routes.GetConfigSpiderTemplateList (5 handlers)
[GIN-debug] GET /tasks --> crawlab/routes.GetTaskList (5 handlers)
[GIN-debug] GET /tasks/:id --> crawlab/routes.GetTask (5 handlers)
[GIN-debug] PUT /tasks --> crawlab/routes.PutTask (5 handlers)
[GIN-debug] DELETE /tasks/:id --> crawlab/routes.DeleteTask (5 handlers)
[GIN-debug] DELETE /tasks --> crawlab/routes.DeleteSelectedTask (5 handlers)
[GIN-debug] DELETE /tasks_by_status --> crawlab/routes.DeleteTaskByStatus (5 handlers)
[GIN-debug] POST /tasks/:id/cancel --> crawlab/routes.CancelTask (5 handlers)
[GIN-debug] GET /tasks/:id/log --> crawlab/routes.GetTaskLog (5 handlers)
[GIN-debug] GET /tasks/:id/results --> crawlab/routes.GetTaskResults (5 handlers)
[GIN-debug] GET /tasks/:id/results/download --> crawlab/routes.DownloadTaskResultsCsv (5 handlers)
[GIN-debug] GET /schedules --> crawlab/routes.GetScheduleList (5 handlers)
[GIN-debug] GET /schedules/:id --> crawlab/routes.GetSchedule (5 handlers)
[GIN-debug] PUT /schedules --> crawlab/routes.PutSchedule (5 handlers)
[GIN-debug] POST /schedules/:id --> crawlab/routes.PostSchedule (5 handlers)
[GIN-debug] DELETE /schedules/:id --> crawlab/routes.DeleteSchedule (5 handlers)
[GIN-debug] POST /schedules/:id/disable --> crawlab/routes.DisableSchedule (5 handlers)
[GIN-debug] POST /schedules/:id/enable --> crawlab/routes.EnableSchedule (5 handlers)
[GIN-debug] GET /users --> crawlab/routes.GetUserList (5 handlers)
[GIN-debug] GET /users/:id --> crawlab/routes.GetUser (5 handlers)
[GIN-debug] POST /users/:id --> crawlab/routes.PostUser (5 handlers)
[GIN-debug] DELETE /users/:id --> crawlab/routes.DeleteUser (5 handlers)
[GIN-debug] GET /me --> crawlab/routes.GetMe (5 handlers)
[GIN-debug] POST /me --> crawlab/routes.PostMe (5 handlers)
[GIN-debug] GET /system/deps/:lang --> crawlab/routes.GetAllDepList (5 handlers)
[GIN-debug] GET /system/deps/:lang/:dep_name/json --> crawlab/routes.GetDepJson (5 handlers)
[GIN-debug] GET /variables --> crawlab/routes.GetVariableList (5 handlers)
[GIN-debug] PUT /variable --> crawlab/routes.PutVariable (5 handlers)
[GIN-debug] POST /variable/:id --> crawlab/routes.PostVariable (5 handlers)
[GIN-debug] DELETE /variable/:id --> crawlab/routes.DeleteVariable (5 handlers)
[GIN-debug] GET /projects --> crawlab/routes.GetProjectList (5 handlers)
[GIN-debug] GET /projects/tags --> crawlab/routes.GetProjectTags (5 handlers)
[GIN-debug] PUT /projects --> crawlab/routes.PutProject (5 handlers)
[GIN-debug] POST /projects/:id --> crawlab/routes.PostProject (5 handlers)
[GIN-debug] DELETE /projects/:id --> crawlab/routes.DeleteProject (5 handlers)
[GIN-debug] GET /stats/home --> crawlab/routes.GetHomeStats (5 handlers)
[GIN-debug] GET /file --> crawlab/routes.GetFile (5 handlers)
[GIN-debug] GET /git/branches --> crawlab/routes.GetGitBranches (5 handlers)
[GIN-debug] GET /git/public-key --> crawlab/routes.GetGitSshPublicKey (5 handlers)
[GIN-debug] GET /ping --> crawlab/routes.Ping (4 handlers)
2020/03/03 02:41:00 info start sync spider to local, total: 14
2020/03/03 02:42:00 info start sync spider to local, total: 14
2020/03/03 02:43:00 info start sync spider to local, total: 14
2020/03/03 02:44:00 info start sync spider to local, total: 14
2020/03/03 02:45:00 info start sync spider to local, total: 14
2020/03/03 02:46:00 info start sync spider to local, total: 14
2020/03/03 02:47:00 info start sync spider to local, total: 14
2020/03/03 02:48:00 info start sync spider to local, total: 14
2020/03/03 02:49:00 info start sync spider to local, total: 14
2020/03/03 02:50:00 info start sync spider to local, total: 14
2020/03/03 02:51:00 info start sync spider to local, total: 14
2020/03/03 02:52:00 info start sync spider to local, total: 14
2020/03/03 02:53:00 info start sync spider to local, total: 14
2020/03/03 02:54:00 info start sync spider to local, total: 14
2020/03/03 02:55:00 info start sync spider to local, total: 14
2020/03/03 02:56:00 info start sync spider to local, total: 14
2020/03/03 02:57:00 info start sync spider to local, total: 14
2020/03/03 02:58:00 info start sync spider to local, total: 14
2020/03/03 02:59:00 info start sync spider to local, total: 14
2020/03/03 03:00:00 info start sync spider to local, total: 14
2020/03/03 03:01:00 info start sync spider to local, total: 14
2020/03/03 03:02:00 info start sync spider to local, total: 14
2020/03/03 03:03:00 info start sync spider to local, total: 14
2020/03/03 03:04:00 info start sync spider to local, total: 14
2020/03/03 03:05:00 info start sync spider to local, total: 14
2020/03/03 03:05:10 error Get https://pypi.tuna.tsinghua.edu.cn/simple: dial tcp: lookup pypi.tuna.tsinghua.edu.cn on 127.0.0.11:53: read udp 127.0.0.1:54426->127.0.0.11:53: i/o timeout
goroutine 15080 [running]:
runtime/debug.Stack(0x0, 0x0, 0x0)
	/usr/local/go/src/runtime/debug/stack.go:24 +0x9d
runtime/debug.PrintStack()
	/usr/local/go/src/runtime/debug/stack.go:16 +0x22
crawlab/services.FetchPythonDepList(0xc000416f58, 0x2, 0x0, 0x0, 0x1000000000001)
	/go/src/app/services/system.go:305 +0x3b4
crawlab/services.UpdatePythonDepList()
	/go/src/app/services/system.go:334 +0x26
crawlab/lib/cron.FuncJob.Run(0xffa6c8)
	/go/src/app/lib/cron/cron.go:131 +0x25
crawlab/lib/cron.(*Cron).startJob.func1(0xc000303220, 0x1153ac0, 0xffa6c8)
	/go/src/app/lib/cron/cron.go:307 +0x5f
created by crawlab/lib/cron.(*Cron).startJob
	/go/src/app/lib/cron/cron.go:305 +0x73
2020/03/03 03:06:00 info start sync spider to local, total: 14
2020/03/03 03:07:00 info start sync spider to local, total: 14
2020/03/03 03:08:00 info start sync spider to local, total: 14
2020/03/03 03:09:00 info start sync spider to local, total: 14
```