Closed by jarrodnorwell 3 years ago
Forgot bin/console enqueue:setup-broker, sorry, fixed.
Run bin/console enqueue:setup-broker after creating the schema.
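For anyone hitting the same missing-table problem, the full first-time setup would look roughly like this. Only enqueue:setup-broker is confirmed in this thread; the schema commands are the usual Symfony/Doctrine ones and are an assumption about this project's setup:

```shell
# Create the database and schema first (standard Doctrine console commands, assumed)
bin/console doctrine:database:create
bin/console doctrine:schema:create

# Then create the broker tables, including the 'enqueue' table used by the queue
bin/console enqueue:setup-broker
```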
I'll give that a shot tomorrow, thank you!
Running pgrep... only grabs 2-40 items; is there something else I'm missing?
I'm new to PHP, but I've gotten the majority of this working.
@ivan1986
For how long? That's OK, there are many rate limits. Look in the enqueue table; there should be many tasks.
I'm now running into this issue, is there a way to skip over failed getMedia queries? My database is about 90MB in size now.
Hm, did you clean the torrent table? That should be an impossible case. OK, I'll add a check.
I believe I'm doing something wrong, I've been running it for days and only have 250MB and about 120 pages of movies.
The movie loading is pretty slow; it takes weeks. As long as it's adding new movies, give it some time.
@official-antique
Did you run bin/console spider:run --all? Because the cron job only fetches new films from the last 48 hours.
I've now run that. Every couple of days I'm also running the enqueue consumer with no time or memory limit to make it run forever.
I saw your API has thousands of pages, but mine seems to have 120 or so.
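For reference, running the consumer without limits would look roughly like this. This is a sketch based on the standard enqueue-bundle consume command; the exact flags may differ by version and are not confirmed in this thread:

```shell
# Run the queue consumer indefinitely (no --time-limit or --memory-limit given,
# so it keeps processing queued spider tasks until killed)
bin/console enqueue:consume
```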
In addition to the queue consumer, you need to run a start script that re-scans the initial pages of the forums. By default it is configured to start every day and check only for topics newer than 48 hours. For the first launch, you need to run it without the --last parameter.
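Concretely, the first full scan described above would be run like this. The --all and --last flags are taken from this thread; the exact --last value format used by the cron job is an assumption:

```shell
# First launch: full scan of all forum pages, without the --last restriction
bin/console spider:run --all

# Later runs (what the daily cron job effectively does): restrict to recent topics
# via --last; check the project's cron config for the exact value it passes
```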
You can see what is in the parsing queue right now:

```sql
select count(*) c,
       json_extract(properties, '$."enqueue.topic"') topic,
       json_extract(body, '$.spider') spider,
       json_extract(body, '$.type') type
from enqueue
group by topic, spider, type;
```
Sorry, but the admin and statistics pages are not ready yet.
Just wondering what is needed to deploy this on Ubuntu; currently it is only for local testing.
I've gotten to the database creation, but the database doesn't contain an 'enqueue' table, and running 'bin/console spider:run --all' doesn't appear to do anything.