-
'since_date'
Traceback (most recent call last):
File "C:\Users\cnghw\Desktop\weibo-crawler-masterN\weibo.py", line 1849, in get_pages
if self.get_user_info() != 0:
File "C:\Users\cnghw\Des…
-
On Docker Hub, the images whose author is not shown as dataabc are:
- neversayfuck/weibo-crawler-mql4
- nathan21hz/weibo-crawler
When running it on a NAS, which environment variables need to be configured and which paths need to be mounted into the container?
-
How should I modify the code to fix these two errors? Has my cookie been blocked?
Error: 'Weibo' object has no attribute 'cookie'
name 'traceback' is not defined
Traceback (most recent call last):
File "G:\图片\weibo-crawler-master\weibo.py",…
-
用户信息
Error: 'id'
Traceback (most recent call last):
File "D:/爬虫测试/weibo-crawler/weibo.py", line 983, in get_pages
self.print_user_info()
File "D:/爬虫测试/weibo-crawler/weibo.py", line 510, …
-
Traceback (most recent call last):
File "/Users/bicaihua/PycharmProjects/Weibo/weibo-crawler/Claude_Copilot.py", line 109, in
main()
File "/Users/bicaihua/PycharmProjects/Weibo/weibo-crawl…
-
Traceback (most recent call last):
File "C:/Users/June/Desktop/sina-weibo-crawler-master/spider.py", line 12, in
print(crawler.crawl(url = 'http://weibo.cn/yaochen'))
File "C:\Users\June\D…
-
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Progress: 0%| | 0/156 [00:00
-
weibo-crawler (the cookie-free version) can only download 9 images, while weibo-spider (the cookie version) downloads all of them. (I filled in the cookie for both.)
-
Hello, since I currently only want to quickly fetch the information of every user in my list, I commented out the second half of get_pages that crawls the weibo posts, keeping only the two lines at the top that get the user info.
Then, after the program has crawled a few dozen users, it reports this error and exits:
用户信息
Error: 'id'
Traceback (most recent call last):
File "C:/Users/Lzp/Desktop/weibo-cra…
-
In the weibo-crawler directory I ran python weibo.py, python -m weibo.py, and python -m weibo, and every one of them reports that the module cannot be found.
It shows:
Traceback (most recent call last):
File "weibo.py", line 14, in
import sqlite3
File "D:\anaconda\l…