crawlab-team / crawlab

Distributed web crawler admin platform for spider management, regardless of language or framework.
https://www.crawlab.cn
BSD 3-Clause "New" or "Revised" License
11.38k stars 1.8k forks

Why hasn't the WeChat contact request from the Pro page been accepted? Is Pro no longer maintained? #1353

Closed sqzxcv closed 1 year ago

sqzxcv commented 1 year ago

Why hasn't the WeChat contact request from the Pro page been accepted? Is Pro no longer maintained?

tikazyq commented 1 year ago

Try adding my WeChat again: tikazyq1

sqzxcv commented 1 year ago


I've added you, but no response yet.

Also, a question: after spider A saves the links it crawls into MongoDB, how can spider B read spider A's data to do a second-stage crawl of the page content?
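For that last question, a common pattern (not specific to Crawlab) is for spider A to write each discovered URL into a collection with a `pending` status, and for spider B to atomically claim pending documents before crawling their content, so multiple workers never grab the same URL twice. A minimal sketch, assuming pymongo and made-up collection/field names:

```python
# Hypothetical sketch: spider A stores discovered links in MongoDB with a
# "pending" status; spider B atomically claims them for a second-stage crawl.
# The collection layout and field names here are assumptions for illustration.
from datetime import datetime, timezone


def link_doc(url: str) -> dict:
    # Document shape spider A inserts for each link it discovers.
    return {
        "url": url,
        "status": "pending",
        "created_at": datetime.now(timezone.utc),
    }


def claim_next_link(collection):
    # Spider B atomically flips one pending link to "crawling", so several
    # worker instances never pick the same URL. Works with any object that
    # exposes MongoDB's find_one_and_update (e.g. a pymongo Collection).
    return collection.find_one_and_update(
        {"status": "pending"},
        {"$set": {"status": "crawling",
                  "claimed_at": datetime.now(timezone.utc)}},
    )


# With pymongo installed and MongoDB reachable, usage would look like:
#   from pymongo import MongoClient
#   links = MongoClient("mongodb://localhost:27017")["demo_db"]["links"]
#   links.insert_one(link_doc("https://example.com/page/1"))   # spider A side
#   doc = claim_next_link(links)                               # spider B side
#   if doc:
#       crawl_content(doc["url"])  # your second-stage crawl logic
```

The atomic `find_one_and_update` is the key design choice: a plain `find` followed by a separate `update` would let two spider-B workers claim the same URL between the two calls.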