elixir-crawly / crawly

Crawly, a high-level web crawling & scraping framework for Elixir.
https://hexdocs.pm/crawly
Apache License 2.0

My Spider's code is never invoked, weird behavior with `Crawly.RequestsStorage.pop` in library code #224

Closed azrosen92 closed 1 year ago

azrosen92 commented 1 year ago

Not really sure what the issue is here. I have a pretty basic Spider implementation that looks like this:

defmodule Homebase do
  use Crawly.Spider

  @impl Crawly.Spider
  def base_url(), do: "https://www.backcountry.com"

  @impl Crawly.Spider
  def init() do
    [
      start_urls: [
        "https://www.backcountry.com/search?q=pannier+bag"
      ]
    ]
  end

  @impl Crawly.Spider
  def parse_item(response) do
    # Parse response body to document
    {:ok, document} = Floki.parse_document(response.body)

    search_results =
      document
      |> Floki.find("[data-id='productListingItems']")

    IO.inspect(search_results)

    parsed_results =
      search_results
      |> Enum.map(fn x ->
        %{
          name: Floki.find(x, "[data-id='productListingTitle']") |> Floki.text(),
          brand: Floki.find(x, "[data-id='productListingBrand']") |> Floki.text(),
          full_url: Floki.find(x, "a:first-child") |> Floki.attribute("href")
        }
      end)

    %Crawly.ParsedItem{:items => parsed_results, :requests => []}
  end
end

When I run `iex -S mix run -e "Crawly.Engine.start_spider(Homebase)"`, I get the following errors:

22:23:19.968 [error] GenServer #PID<0.462.0> terminating
** (UndefinedFunctionError) function :ok.url/0 is undefined (module :ok is not available)
    :ok.url()
    (crawly 0.13.0) lib/crawly/fetchers/httpoison_fetcher.ex:11: Crawly.Fetchers.HTTPoisonFetcher.fetch/2
    (crawly 0.13.0) lib/crawly/worker.ex:93: Crawly.Worker.get_response/1
    (epipe 1.0.0) /Users/aaronrosen/code/tutorial/deps/epipe/src/epipe.erl:23: :epipe.run/2
    (crawly 0.13.0) lib/crawly/worker.ex:53: Crawly.Worker.handle_info/2
    (stdlib 4.0.1) gen_server.erl:1120: :gen_server.try_dispatch/4
    (stdlib 4.0.1) gen_server.erl:1197: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3
Last message: :work
State: %Crawly.Worker{backoff: 10000, spider_name: Homebase, crawl_id: "9d2a83d0-3709-11ed-aa52-8c8590b36e7e"}

I tracked it down to the implementation of `Crawly.RequestsStorage.pop`. It seems to be returning `:ok` instead of what I believe it's supposed to return (a `%Crawly.Request{}`). I was able to partially reproduce this in the iex REPL by doing the following:

iex(1)> Crawly.Engine.start_spider(Homebase) 
iex(2)> r = GenServer.call(Crawly.RequestsStorage, {:pop, Elixir.Homebase})
iex(3)> r
:ok
iex(4)> r = GenServer.call(Crawly.RequestsStorage, {:pop, Elixir.Homebase})
%Crawly.Request{ ... }

What happens is: if I invoke `GenServer.call(...)` immediately after starting the spider, it returns `:ok` (which seems to be what's causing the bug); if I call `GenServer.call(...)` again a few seconds later, it returns what it's supposed to. So it would seem that when Crawly invokes `Crawly.RequestsStorage.pop` here, the request that is returned is actually the value `:ok`. That value is eventually passed through to the fetcher, in this case the default `Crawly.Fetchers.HTTPoisonFetcher`, and everything crashes here when it tries to read `request.url`.
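To make the race visible from iex, here is a hypothetical defensive wrapper (`PopHelper` is my own name, not part of Crawly) that retries the pop until a `%Crawly.Request{}` actually comes back:

```elixir
# Hypothetical helper, not part of Crawly: right after start_spider/1 the
# storage worker may not have stored the start_urls yet, so pop can yield
# :ok. Retry a few times with a short sleep before giving up.
defmodule PopHelper do
  def pop_request(spider, attempts \\ 10) do
    case GenServer.call(Crawly.RequestsStorage, {:pop, spider}) do
      %Crawly.Request{} = request ->
        {:ok, request}

      _not_ready when attempts > 0 ->
        Process.sleep(500)
        pop_request(spider, attempts - 1)

      other ->
        {:error, other}
    end
  end
end
```

With this, `PopHelper.pop_request(Homebase)` either returns `{:ok, %Crawly.Request{}}` once the worker has caught up, or surfaces the unexpected value instead of letting it reach the fetcher.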

Anyone know what's going on here?

oltarasenko commented 1 year ago

Just in case, could you update to 0.14.0? For me it looks like this:

Generated quickstart app

11:03:59.519 [debug] Starting data storage

11:03:59.524 [debug] Starting the manager for BooksToScrape

11:03:59.524 [debug] Starting requests storage worker for BooksToScrape...

11:03:59.524 [debug] Started 8 workers for BooksToScrape
Interactive Elixir (1.14.0) - press Ctrl+C to exit (type h() ENTER for help)
iex(1)> []

nil
iex(2)>
11:04:59.525 [info] Current crawl speed is: 0 items/min

11:04:59.525 [info] Current requests count is: 0

11:04:59.525 [info] Stopping BooksToScrape, all requests handled
azrosen92 commented 1 year ago

@oltarasenko thanks for the quick response! I tried upgrading crawly to the latest version and I'm seeing the same error. I'll paste the complete log output below; there are a couple of other debug logs that I didn't include in the original question that look like they might be relevant. In case it matters, I'm using Elixir 1.14.

iex(1)> Crawly.Engine.start_spider(Homebase)

12:11:44.688 [debug] Starting the manager for Homebase

12:11:44.688 [debug] Starting requests storage worker for Homebase...

12:11:44.688 [debug] Started 4 workers for Homebase
:ok
iex(2)> 
12:11:44.761 [warning] Description: 'Authenticity is not established by certificate path validation'
     Reason: 'Option {verify, verify_peer} and cacertfile/cacerts is missing'

12:11:49.690 [debug] ** (exit) exited in: GenServer.call(#PID<0.2046.0>, {:store, [%Crawly.Request{url: "https://www.backcountry.com/search?q=pannier+bag", headers: [], prev_response: nil, options: [], middlewares: [Crawly.Middlewares.DomainFilter, Crawly.Middlewares.RequestOptions, Crawly.Middlewares.UniqueRequest, Crawly.Middlewares.RobotsTxt], retries: 0}]}, 5000)
    ** (EXIT) time out
    (elixir 1.14.0) lib/gen_server.ex:1038: GenServer.call/3
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage_worker.ex:85: Crawly.RequestsStorage.Worker.do_call/2
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage.ex:101: Crawly.RequestsStorage.handle_call/3
    (stdlib 4.0.1) gen_server.erl:1146: :gen_server.try_handle_call/4
    (stdlib 4.0.1) gen_server.erl:1175: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3

12:11:49.690 [error] GenServer #PID<0.2044.0> terminating
** (stop) exited in: GenServer.call(Crawly.RequestsStorage, {:store, {Homebase, [%Crawly.Request{url: "https://www.backcountry.com/search?q=pannier+bag", headers: [], prev_response: nil, options: [], middlewares: [Crawly.Middlewares.DomainFilter, Crawly.Middlewares.RequestOptions, Crawly.Middlewares.UniqueRequest, Crawly.Middlewares.RobotsTxt], retries: 0}]}}, 5000)
    ** (EXIT) time out
    (elixir 1.14.0) lib/gen_server.ex:1038: GenServer.call/3
    (elixir 1.14.0) lib/enum.ex:980: anonymous fn/3 in Enum.each/2
    (elixir 1.14.0) lib/enum.ex:4307: anonymous fn/3 in Enum.each/2
    (elixir 1.14.0) lib/stream.ex:1799: anonymous fn/3 in Enumerable.Stream.reduce/3
    (elixir 1.14.0) lib/stream.ex:288: Stream.after_chunk_while/2
    (elixir 1.14.0) lib/stream.ex:1828: Enumerable.Stream.do_done/2
    (elixir 1.14.0) lib/enum.ex:4307: Enum.each/2
    (crawly 0.14.0) lib/crawly/manager.ex:142: Crawly.Manager.handle_continue/2
Last message: {:continue, {:startup, [crawl_id: "5a7ba068-377d-11ed-9fe0-8c8590b36e7e"]}}
State: %{closespider_timeout_limit: nil, crawl_id: "5a7ba068-377d-11ed-9fe0-8c8590b36e7e", itemcount_limit: nil, name: Homebase, prev_scraped_cnt: 0, tref: #Reference<0.2326695856.574881794.250792>, workers: [ok: #PID<0.2047.0>, ok: #PID<0.2048.0>, ok: #PID<0.2049.0>, ok: #PID<0.2050.0>]}

12:11:49.691 [debug] Starting the manager for Homebase

12:11:49.691 [debug] Starting the manager for Homebase

12:11:49.692 [debug] ** (exit) exited in: GenServer.call(#PID<0.2046.0>, :pop, 5000)
    ** (EXIT) exited in: GenServer.call(Crawly.RequestsStorage, {:store, {Homebase, [%Crawly.Request{url: "https://www.backcountry.com/search?q=pannier+bag", headers: [], prev_response: nil, options: [], middlewares: [Crawly.Middlewares.DomainFilter, Crawly.Middlewares.RequestOptions, Crawly.Middlewares.UniqueRequest, Crawly.Middlewares.RobotsTxt], retries: 0}]}}, 5000)
        ** (EXIT) time out
    (elixir 1.14.0) lib/gen_server.ex:1038: GenServer.call/3
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage_worker.ex:85: Crawly.RequestsStorage.Worker.do_call/2
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage.ex:114: Crawly.RequestsStorage.handle_call/3
    (stdlib 4.0.1) gen_server.erl:1146: :gen_server.try_handle_call/4
    (stdlib 4.0.1) gen_server.erl:1175: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3

12:11:49.692 [debug] ** (exit) exited in: GenServer.call(#PID<0.2046.0>, :pop, 5000)
    ** (EXIT) no process: the process is not alive or there's no process currently associated with the given name, possibly because its application isn't started
    (elixir 1.14.0) lib/gen_server.ex:1038: GenServer.call/3
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage_worker.ex:85: Crawly.RequestsStorage.Worker.do_call/2
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage.ex:114: Crawly.RequestsStorage.handle_call/3
    (stdlib 4.0.1) gen_server.erl:1146: :gen_server.try_handle_call/4
    (stdlib 4.0.1) gen_server.erl:1175: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3

12:11:49.693 [debug] ** (exit) exited in: GenServer.call(#PID<0.2046.0>, :pop, 5000)
    ** (EXIT) no process: the process is not alive or there's no process currently associated with the given name, possibly because its application isn't started
    (elixir 1.14.0) lib/gen_server.ex:1038: GenServer.call/3
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage_worker.ex:85: Crawly.RequestsStorage.Worker.do_call/2
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage.ex:114: Crawly.RequestsStorage.handle_call/3
    (stdlib 4.0.1) gen_server.erl:1146: :gen_server.try_handle_call/4
    (stdlib 4.0.1) gen_server.erl:1175: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3

12:11:49.693 [debug] ** (exit) exited in: GenServer.call(#PID<0.2046.0>, :pop, 5000)
    ** (EXIT) no process: the process is not alive or there's no process currently associated with the given name, possibly because its application isn't started
    (elixir 1.14.0) lib/gen_server.ex:1038: GenServer.call/3
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage_worker.ex:85: Crawly.RequestsStorage.Worker.do_call/2
    (crawly 0.14.0) lib/crawly/requests_storage/requests_storage.ex:114: Crawly.RequestsStorage.handle_call/3
    (stdlib 4.0.1) gen_server.erl:1146: :gen_server.try_handle_call/4
    (stdlib 4.0.1) gen_server.erl:1175: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3

12:11:49.694 [debug] Starting requests storage worker for Homebase...

12:11:49.694 [debug] Started 4 workers for Homebase

12:11:49.695 [error] GenServer #PID<0.2047.0> terminating
** (UndefinedFunctionError) function :ok.url/0 is undefined (module :ok is not available)
    :ok.url()
    (crawly 0.14.0) lib/crawly/fetchers/httpoison_fetcher.ex:11: Crawly.Fetchers.HTTPoisonFetcher.fetch/2
    (crawly 0.14.0) lib/crawly/worker.ex:89: Crawly.Worker.get_response/1
    (crawly 0.14.0) lib/crawly/worker.ex:48: Crawly.Worker.handle_info/2
    (stdlib 4.0.1) gen_server.erl:1120: :gen_server.try_dispatch/4
    (stdlib 4.0.1) gen_server.erl:1197: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3
Last message: :work
State: %Crawly.Worker{backoff: 10000, spider_name: Homebase, crawl_id: "5a7ba068-377d-11ed-9fe0-8c8590b36e7e"}

12:11:49.697 [error] GenServer #PID<0.2048.0> terminating
** (UndefinedFunctionError) function :ok.url/0 is undefined (module :ok is not available)
    :ok.url()
    (crawly 0.14.0) lib/crawly/fetchers/httpoison_fetcher.ex:11: Crawly.Fetchers.HTTPoisonFetcher.fetch/2
    (crawly 0.14.0) lib/crawly/worker.ex:89: Crawly.Worker.get_response/1
    (crawly 0.14.0) lib/crawly/worker.ex:48: Crawly.Worker.handle_info/2
    (stdlib 4.0.1) gen_server.erl:1120: :gen_server.try_dispatch/4
    (stdlib 4.0.1) gen_server.erl:1197: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3
Last message: :work
State: %Crawly.Worker{backoff: 10000, spider_name: Homebase, crawl_id: "5a7ba068-377d-11ed-9fe0-8c8590b36e7e"}

12:11:49.699 [error] GenServer #PID<0.2049.0> terminating
** (UndefinedFunctionError) function :ok.url/0 is undefined (module :ok is not available)
    :ok.url()
    (crawly 0.14.0) lib/crawly/fetchers/httpoison_fetcher.ex:11: Crawly.Fetchers.HTTPoisonFetcher.fetch/2
    (crawly 0.14.0) lib/crawly/worker.ex:89: Crawly.Worker.get_response/1
    (crawly 0.14.0) lib/crawly/worker.ex:48: Crawly.Worker.handle_info/2
    (stdlib 4.0.1) gen_server.erl:1120: :gen_server.try_dispatch/4
    (stdlib 4.0.1) gen_server.erl:1197: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3
Last message: :work
State: %Crawly.Worker{backoff: 10000, spider_name: Homebase, crawl_id: "5a7ba068-377d-11ed-9fe0-8c8590b36e7e"}

12:11:49.701 [error] GenServer #PID<0.2050.0> terminating
** (UndefinedFunctionError) function :ok.url/0 is undefined (module :ok is not available)
    :ok.url()
    (crawly 0.14.0) lib/crawly/fetchers/httpoison_fetcher.ex:11: Crawly.Fetchers.HTTPoisonFetcher.fetch/2
    (crawly 0.14.0) lib/crawly/worker.ex:89: Crawly.Worker.get_response/1
    (crawly 0.14.0) lib/crawly/worker.ex:48: Crawly.Worker.handle_info/2
    (stdlib 4.0.1) gen_server.erl:1120: :gen_server.try_dispatch/4
    (stdlib 4.0.1) gen_server.erl:1197: :gen_server.handle_msg/6
    (stdlib 4.0.1) proc_lib.erl:240: :proc_lib.init_p_do_apply/3
Last message: :work
State: %Crawly.Worker{backoff: 10000, spider_name: Homebase, crawl_id: "5a7ba068-377d-11ed-9fe0-8c8590b36e7e"}
azrosen92 commented 1 year ago

I just opened this PR against crawly. I'm not actually sure this is a bug in the library itself; it almost seems like a GenServer quirk, or something's up with my environment... I really have no idea. But that change does fix the issue I'm having. If you have any insight into what's going on, it would be much appreciated! I'm not super familiar with GenServer just yet.

Ziinc commented 1 year ago

I am assuming your initial request is not being stored by the RequestsStorage worker because the `store` call is timing out, due to an issue with your CA certs.

This warning is out of the ordinary and occurs just above the start of the errors:

12:11:44.761 [warning] Description: 'Authenticity is not established by certificate path validation'
     Reason: 'Option {verify, verify_peer} and cacertfile/cacerts is missing'

This has nothing to do with request popping. The pop returns `:ok` because your initial start urls were never stored in the first place.

There are a few possibilities:

  1. Your initial request is not stored at all due to an error in one of the middlewares. This is unlikely, as the stacktrace does not show it.
  2. The cacerts issue causes the initial request to fail, and the retry of the failed request produces the insertion loop.

I would suggest investigating #2.
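One way to rule out #2 is to pass explicit TLS options through to HTTPoison via the RequestOptions middleware. This is a sketch, not the one fix: the `CAStore` package (providing `CAStore.file_path/0`) is an extra dependency I'm assuming here for a usable CA bundle.

```elixir
# config/config.exs sketch: pass explicit TLS options to HTTPoison via the
# RequestOptions middleware so certificate path validation can succeed.
# CAStore is an assumed extra dependency ({:castore, "~> 1.0"} in mix.exs).
import Config

config :crawly,
  middlewares: [
    Crawly.Middlewares.DomainFilter,
    Crawly.Middlewares.UniqueRequest,
    Crawly.Middlewares.RobotsTxt,
    {Crawly.Middlewares.RequestOptions,
     [ssl: [verify: :verify_peer, cacertfile: CAStore.file_path()]]}
  ]
```

If the warning about `{verify, verify_peer}` disappears and the first request is stored, the cacerts theory is confirmed.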

Ziinc commented 1 year ago

@azrosen92 please reopen if you are still unable to hunt down the issue even after trying the above.