-
Metadata is saved in distributed mode as long as there is a db worker running without the `--no-incoming` flag. When I switched to single-process mode, metadata was no longer saved, and I did not find any setting that enables it. I…
-
I use the SQLAlchemy backend with a `db.worker` in a distributed-spiders setup.
```
BACKEND = 'frontera.contrib.backends.sqlalchemy.BFS'
SQLALCHEMYBACKEND_ENGINE = 'sqlite:///../state/bfs.db'
```
I e…
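
For comparison, a fuller single-process settings module might look like the following sketch. The first two lines come from the snippet above; the remaining `SQLALCHEMYBACKEND_*` names are assumptions based on the Frontera documentation and may differ between versions:

```python
# Sketch of a Frontera settings module for a single-process run.
# BACKEND and SQLALCHEMYBACKEND_ENGINE are from the original snippet;
# the other SQLALCHEMYBACKEND_* names are assumptions to verify against
# the Frontera docs for your installed version.
BACKEND = 'frontera.contrib.backends.sqlalchemy.BFS'
SQLALCHEMYBACKEND_ENGINE = 'sqlite:///../state/bfs.db'
SQLALCHEMYBACKEND_ENGINE_ECHO = False      # log SQL statements when True
SQLALCHEMYBACKEND_DROP_ALL_TABLES = False  # keep tables between runs
SQLALCHEMYBACKEND_CLEAR_CONTENT = False    # keep stored metadata between runs
```

With `DROP_ALL_TABLES` and `CLEAR_CONTENT` left off, the SQLite file should persist crawl state across restarts.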
-
## Expected Behavior
The fields in the "Spider" tab of the interface should contain only settings that are safe for the user to edit, for example: options that change global functionality of the s…
-
Hello, I followed the quick start tutorial. I started the db worker and the strategy worker, then launched Scrapy with `scrapy crawl general -L INFO -s FRONTERA_SETTINGS=frontier.spider_settings -s SEEDS_SOUR…
-
I did everything as described in the document [running-the-crawl](https://frontera.readthedocs.io/en/latest/topics/scrapy-integration.html#running-the-crawl), and started to run
```
scrapy crawl my-spider
```
…
-
Python 3.6, Scrapy 1.5, Twisted 17.9.0
I'm running multiple spiders in the same process per:
https://doc.scrapy.org/en/latest/topics/practices.html#running-multiple-spiders-in-the-same-process
…
-
Hi,
I do not understand how to set `meta` parameters in a frontier Request generated from a seeder.
It seems that there are two kinds of meta parameters: frontier ones and Scrapy ones. I would like to…
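
As a purely illustrative sketch of that two-level distinction (the key names below, especially `b'scrapy_meta'`, are assumptions based on Frontera's Scrapy converters and should be checked against your version):

```python
# Illustration only: a frontier Request's meta holds frontier-level keys,
# while the original Scrapy request.meta is nested under one of them.
frontier_meta = {
    b'fingerprint': b'0a1b2c',      # frontier-level: used by backends/queues
    b'scrapy_meta': {'depth': 0},   # assumed key holding the Scrapy-level meta
}

# A Scrapy-level meta value set from a seeder would go into the nested dict,
# not at the top level of the frontier meta:
frontier_meta[b'scrapy_meta']['my_param'] = 'value'
```

Setting a key at the top level of the frontier meta would not reach `response.meta` in a Scrapy callback; only the nested Scrapy-level dict round-trips.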
-
This is the Epic User Story for creating a crawler system for iFiltr; it should be broken down into several other user stories, with additional tasks for each.
1) Introduction
iFiltr will index …
-
Can we run Dask Jobqueue outside the SLURM system (e.g. on SRC) and have workers submitted to SLURM? Dask Jobqueue uses `sbatch`/`scancel` to manage jobs; can one provide custom commands that invo…
-
- [ ] [Spider: Yale Semantic Parsing and Text-to-SQL Challenge](https://yale-lily.github.io/spider)
# Spider: Yale Semantic Parsing and Text-to-SQL Challenge
**DESCRIPTION:**
Spider 1.0
Yale Se…