Closed matthewpalmer closed 9 years ago
Roughly, what I want to happen is to load the content of each URL in urls and then have an array of their contents at the end. I can't use a straight for loop because it's async. Maybe we should do it nicely with event emitters and stuff, but that seems like overkill. I think I'm just missing something.
Cheers
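For reference, one way to get "an array of their contents at the end" without a straight for loop is a completion counter that writes each result by index. This is only a sketch: loadUrl below is a stand-in for whatever actually fetches a page, not a function from the repo.

```javascript
// Minimal sketch of the pattern: fire all loads, collect results by index,
// call done once the last one finishes. loadUrl is a hypothetical stand-in
// for the real page fetch.
function loadAll(urls, done) {
  var results = new Array(urls.length);
  var pending = urls.length;
  if (pending === 0) return done(null, results);
  urls.forEach(function (url, i) {
    loadUrl(url, function (err, content) {
      if (err) return done(err);
      results[i] = content;            // write by index so order is preserved
      if (--pending === 0) done(null, results);
    });
  });
}

function loadUrl(url, callback) {
  // stand-in: completes asynchronously, like a real request would
  setImmediate(function () { callback(null, '<html>' + url + '</html>'); });
}
```

async.map(urls, loadUrl, done) does essentially the same thing, which is presumably why the project pulls in async.js.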
Scope issues are the worst. I've given it a quick look - but it does sound like it's failing to bind. I'll read up on the async pitfall tomorrow, and I'll try and see you in person to talk about it
Ok. I'm heading home this weekend but I'll be back Sunday. I'll probably rewrite it so that it uses streams/events anyway
Hitting something strange that I don't quite understand in 9b957cc16bd7415a3fcb62b68f7fef795917102c, specifically here.
First I tried to bind Crawler.prototype.crawl's this to the loadPage method so that we could use this.waitTime (this was being reported as undefined before doing that). Then stuff stopped working, so I tried to take that out, but now I'm getting the error TypeError: 'undefined' is not a function (evaluating 'page.openPage') from Phantom. I'll take another look either tomorrow morning or Sunday night, so don't worry if you can't get to it (I realise I haven't been great about describing the issue here haha).
It's using async.js, whose docs mention a common pitfall that might be related?
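The pitfall being described sounds like the usual lost-receiver problem: passing a prototype method around as a bare function reference strips its this, so this.waitTime comes back undefined. A hypothetical sketch, using only the names from the thread (Crawler, loadPage, waitTime) and guessing at everything else:

```javascript
// Hypothetical reconstruction of the binding pitfall. Crawler, loadPage
// and waitTime are names from the thread; the rest is assumed for
// illustration and is not the repo's actual code.
function Crawler(waitTime) {
  this.waitTime = waitTime;
}

Crawler.prototype.loadPage = function (url) {
  // If this method is handed around as a bare function reference,
  // `this` no longer points at the Crawler and this.waitTime is undefined.
  return url + ':' + this.waitTime;
};

Crawler.prototype.crawl = function (urls) {
  // urls.map(this.loadPage) would lose the receiver; binding it back
  // (here via map's thisArg, equivalently this.loadPage.bind(this)) keeps
  // this.waitTime available inside loadPage.
  return urls.map(this.loadPage, this);
};
```

The same thing applies when loadPage is handed to async.js as an iterator: this.loadPage.bind(this) preserves the instance where a bare this.loadPage would not.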