-
## Status
Assigning to @N2ITN
Please use this branch: https://github.com/N2ITN/are-you-fake-news/tree/develop-dockerize
### Issue
There are several webscraping functions that were formerly on AW…
N2ITN updated 5 years ago
-
I haven't checked it, but there is a review comment (https://github.com/scrapy/scrapy/pull/999#discussion_r105122341) by @adiroiban which suggests that the HTTP11DownloadHandler.close implementation is not complete.
kmike updated 3 years ago
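To make the concern concrete without reproducing Scrapy's internals, here is a pure-Python sketch of the shape of the problem: a `close()` that only clears *cached* (idle) connections can leave in-flight ones untouched. The class and attribute names (`ConnectionPool`, `cached`, `active`, `FakeConnection`) are hypothetical stand-ins, not Scrapy's actual API.

```python
class FakeConnection:
    """Hypothetical stand-in for a real transport connection."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


class ConnectionPool:
    """Illustrative pool; models the gap the review comment points at."""

    def __init__(self):
        self.cached = []   # idle keep-alive connections
        self.active = []   # connections with an in-flight request

    def close_cached_connections(self):
        # An incomplete close(): only idle connections are shut down.
        for conn in self.cached:
            conn.close()
        self.cached.clear()

    def close_all(self):
        # A more complete shutdown also aborts in-flight connections.
        self.close_cached_connections()
        for conn in self.active:
            conn.close()
        self.active.clear()


pool = ConnectionPool()
idle, busy = FakeConnection(), FakeConnection()
pool.cached.append(idle)
pool.active.append(busy)

pool.close_cached_connections()       # incomplete: busy stays open
partial = (idle.closed, busy.closed)  # (True, False)

pool.close_all()                      # complete shutdown closes busy too
```

If the real `close` behaves like `close_cached_connections` here, requests still in flight at shutdown would not be torn down, which matches the reviewer's suggestion that the implementation is incomplete.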
-
I found that running the example script with `uvloop` instead of the selector event loop fails to communicate properly with the client. The primary failure is that the buffered text does not get sent to the…
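The original script isn't shown, but one frequent cause of "buffered text never reaches the client" in asyncio stream code, on any event loop, is relying on implicit flushing instead of awaiting `drain()` and closing the writer. Below is a minimal stdlib sketch that flushes explicitly; `uvloop` is deliberately omitted so the example stays dependency-free, so this illustrates the pattern rather than reproducing the reported failure.

```python
import asyncio


async def handle(reader, writer):
    # Echo-style handler: read one line, reply uppercased.
    data = await reader.readline()
    writer.write(data.upper())
    await writer.drain()          # flush the transport buffer explicitly
    writer.close()
    await writer.wait_closed()


async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(b"hello\n")
    await writer.drain()          # make sure the request actually goes out
    reply = await reader.readline()

    writer.close()
    await writer.wait_closed()
    server.close()
    await server.wait_closed()
    return reply


result = asyncio.run(main())  # b"HELLO\n"
```

Awaiting `drain()` after every `write()` and closing the writer when done makes the flush points explicit, which removes one class of loop-dependent buffering differences.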
-
I have a project with two kinds of tests: some inherit from twisted.trial.unittest.TestCase, and some don't inherit from anything, they are just plain Python objects. When running without the asyncio reactor, they all work …
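This doesn't reproduce the asyncio-reactor interaction, but it shows why the two styles of test go through different paths in the first place: the stdlib `unittest` loader only collects subclasses of `unittest.TestCase`, while plain classes are invisible to it (pytest-style collection finds both). The module and class names here are made up for illustration.

```python
import types
import unittest


class TrialStyleTest(unittest.TestCase):
    # Collected by the stdlib loader because it subclasses TestCase
    # (twisted.trial.unittest.TestCase is itself a TestCase subclass).
    def test_something(self):
        self.assertTrue(True)


class PlainTest:
    # Invisible to unittest's loader; only pytest-style collection finds it.
    def test_something(self):
        assert True


# Build a throwaway module holding both classes and ask the loader
# what it would actually run.
mod = types.ModuleType("sample_tests")
mod.TrialStyleTest = TrialStyleTest
mod.PlainTest = PlainTest

suite = unittest.TestLoader().loadTestsFromModule(mod)
collected = suite.countTestCases()   # only the TestCase subclass is found
```

Since the two kinds of tests are discovered and run by different machinery, a reactor that one path installs (or expects) can easily be absent or mismatched on the other.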
-
It seems that Scrapy currently does not provide any way to crawl websites that use WebSockets. Could we support something like this? There are WebSocket implementations based on Twisted: http://autobahn.ws/python…
-
I'm trying to pass requests to the spider externally, via message queues, and keep it running forever.
I found some projects made by others, but none of them work with the current version of Scrapy, …
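Independently of any particular Scrapy integration (in Scrapy itself, a commonly cited approach is to hook the `spider_idle` signal and keep scheduling requests so the crawler never closes), the core of "feed it externally and run forever" is a long-lived consumer loop over a queue. Here is a dependency-free sketch; `consume`, `handled`, and the URLs are illustrative names, and appending to a list stands in for scheduling a real request.

```python
import queue
import threading


def consume(q, handled, stop):
    # Long-running worker: block briefly for the next URL, process it, repeat.
    while not stop.is_set():
        try:
            url = q.get(timeout=0.1)
        except queue.Empty:
            continue            # nothing queued yet; keep the loop alive
        handled.append(url)     # stand-in for scheduling a real request
        q.task_done()


url_queue = queue.Queue()
handled = []
stop = threading.Event()

worker = threading.Thread(target=consume, args=(url_queue, handled, stop))
worker.start()

# External producers (e.g. a message-queue consumer) would call put() here.
for url in ["https://example.com/1", "https://example.com/2"]:
    url_queue.put(url)

url_queue.join()                # wait until both URLs are handled
stop.set()
worker.join()
```

The `timeout`/`continue` pattern keeps the worker responsive to shutdown while still blocking cheaply when the queue is empty, which is the same trade-off any "keep the spider alive" integration has to make.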
-
### First Check
- [X] I added a very descriptive title to this issue.
- [X] I used the GitHub search to find a similar issue and didn't find it.
- [X] I searched the SQLModel documentation, with the …
-
### Summary of problem
IPython produces a RuntimeWarning on each tab-completion and Enter keypress when ddtrace's `asyncio` integration is active.
### Which version of dd-trace-py are you using?
2.8…
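While the underlying integration issue is being diagnosed, one stdlib-only way to keep the REPL usable is to scope a warnings filter around the noisy code path. This only hides the symptom, it does not fix ddtrace; `noisy()` below is a hypothetical stand-in for whatever emits the warning.

```python
import warnings


def noisy():
    # Stand-in for the code path that emits the RuntimeWarning.
    warnings.warn("coroutine ... was never awaited", RuntimeWarning)
    return "completed"


# Temporarily silence RuntimeWarning inside this block only, so other
# warnings elsewhere in the session are unaffected.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", RuntimeWarning)
    result = noisy()
```

Using `catch_warnings()` as a context manager restores the previous filter state on exit, which is safer in an interactive session than a process-wide `warnings.filterwarnings("ignore", ...)`.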
-
**The program is run exactly as-is.
My Scrapy version is 2.5.**
```
2021-12-29 14:10:14 [scrapy.utils.log] INFO: Scrapy 2.5.0 started (bot: example)
2021-12-29 14:10:14 [scrapy.utils.log] INFO: Versions: lxml 4.6.3.0, libxml2 2.9.5,…
```
-
I think we haven't created a ticket for this yet; we discussed integrating https://github.com/darkrho/scrapy-inline-requests a couple of weeks ago with @pablohoffman, @kmike and @nramirezuy.
I'm copy…
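For readers unfamiliar with scrapy-inline-requests: it lets a callback be written as a generator that yields requests and receives the corresponding responses back at the yield point, so multi-request logic reads sequentially. The core mechanism can be modeled in pure Python as a generator trampoline driven with `.send()`; everything below (`fetch`, `inline_callback`, `run_inline`, the URLs) is an illustrative model, not the library's actual API.

```python
def fetch(url):
    # Stand-in for performing a request; returns a fake "response".
    return f"response for {url}"


def inline_callback(first_response):
    # Written sequentially: yield a URL, get its response back in-line.
    second = yield "https://example.com/page2"
    third = yield "https://example.com/page3"
    return [first_response, second, third]


def run_inline(gen_func, first_response):
    # Minimal trampoline: drive the generator, sending each "response"
    # back in at the point of the corresponding yield.
    gen = gen_func(first_response)
    try:
        url = next(gen)
        while True:
            url = gen.send(fetch(url))
    except StopIteration as done:
        return done.value  # the generator's return value


results = run_inline(inline_callback, "response for start")
```

In the real library the trampoline would hand each yielded request to Scrapy's scheduler and resume the generator when the response arrives, but the control flow, suspend at `yield`, resume via `send()`, is the same.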