dataabc / weibo-crawler

A Sina Weibo crawler: crawls Sina Weibo data with Python and downloads Weibo images and videos

Problem installing dependencies: [Errno 2] No such file or directory: 'requirements.txt' #377

Open kadima221 opened 1 year ago

kadima221 commented 1 year ago

The weibo-crawler folder is already there, and it does contain a requirements.txt file, but when I run the pip install command I get ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'. I know the problem seems to be that I'm not running the command from that directory, but I don't know how to fix it. Could someone point me in the right direction? I'm using Jupyter Notebook.
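This error usually just means the notebook's working directory is not the folder that contains requirements.txt. A minimal sketch for a notebook cell, assuming the repository was cloned into ./weibo-crawler next to the notebook (the path is an assumption; adjust it to wherever the folder actually lives):

```python
import os
print(os.getcwd())                 # pip resolves 'requirements.txt' relative to this directory

%cd weibo-crawler                  # change into the cloned repository (path is an assumption)
%pip install -r requirements.txt  # install the dependencies into the kernel's own environment
```

Using the %pip magic (rather than a plain pip command in a terminal) makes sure the packages land in the same environment the notebook kernel is running in.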

kadima221 commented 1 year ago

Sorry to bother you again. I solved that problem, but a new error appeared: ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. conda-repo-cli 1.0.41 requires requests_mock, which is not installed. conda-repo-cli 1.0.41 requires clyent==1.2.1, but you have clyent 1.2.2 which is incompatible. conda-repo-cli 1.0.41 requires nbformat==5.4.0, but you have nbformat 5.7.0 which is incompatible. conda-repo-cli 1.0.41 requires requests==2.28.1, but you have requests 2.31.0 which is incompatible. Does this matter?
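For what it's worth, the conflicts reported here all involve conda-repo-cli, a conda management tool, not anything the crawler itself imports, so they are usually harmless for running weibo.py. A small sanity check in a notebook cell, assuming requests is among the packages installed from requirements.txt (the 2.31.0 version comes from the pip output quoted above):

```python
# Check that the package the crawler actually uses imports cleanly,
# regardless of the conda-repo-cli warnings printed by pip.
import requests
print(requests.__version__)   # expected to print 2.31.0 per the pip message above
```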

kadima221 commented 1 year ago

I didn't modify the code above; I only changed config, and then entering python weibo.py gives SyntaxError: invalid syntax.

kadima221 commented 1 year ago

I switched the command to %run weibo.py and it runs perfectly. Sorry for the trouble.
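For anyone hitting the same SyntaxError: python weibo.py is a shell command, so typing it directly into a Python cell makes the interpreter try to parse it as Python source and fail. A short sketch of the two notebook-friendly ways to launch the script, assuming the notebook's working directory is the weibo-crawler folder:

```python
# Run the crawler script from a Jupyter cell.
%run weibo.py        # IPython magic: executes the script inside the current kernel
# !python weibo.py   # alternative: shell escape, runs it as a separate process
```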