SpeciesFileGroup / taxonworks

Workbench for biodiversity informatics.
http://taxonworks.org
MIT License

Adding objects to Collecting events causes a triggering loop, leading to extremely long load/save times for Collecting events #1630

Closed tmcelrath closed 3 years ago

tmcelrath commented 4 years ago

Describe the bug

Once a collecting event has had objects added, loading that collecting event takes longer and longer, up to a maximum of ~32 seconds.

To Reproduce

Steps to reproduce the behavior:

  1. Create a new collecting event and start adding collection objects to it, e.g. in comprehensive digitization.
  2. Keep adding objects.
  3. Once enough objects have been added, saving that collecting event, or opening a georeference for it, takes a long time (~32 seconds).

Expected behavior

Adding objects should not cause a triggering loop, and the collecting event should save more quickly.

Screenshots

Various console/network logs of the issue (four screenshots attached).

Environment (please identify where you experience this bug):
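
A quick way to quantify the slowdown outside the browser is to time a trivial update from a local rails console. This is only a sketch: the record id is an example taken from the log entries further down, and collection_objects is an assumed association name for the objects attached to a CollectingEvent.

# Sketch only: run in a local TaxonWorks rails console.
# The id is an example; `collection_objects` is an assumed association name.
require 'benchmark'

ce = CollectingEvent.find(298730)        # a collecting event that feels slow
puts "attached collection objects: #{ce.collection_objects.count}"

elapsed = Benchmark.realtime do
  ce.update!(updated_at: Time.current)   # forces an UPDATE and runs the model callbacks
end
puts format('update time: %.1f s', elapsed)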

LocoDelAssembly commented 4 years ago

This time I found log entries compatible with the screenshots:

[taxonworks-5f85f4f7b7-5kk5t] I, [2020-07-29T16:22:00.501632 #2515]  INFO -- :   Parameters: {"collecting_event"=>{"id"=>298730, "verbatim_label"=>"PANAMA:Canal Zone:\nCoco Solo Base\n28MAR-11APR1961\nlight trap  JL Hawkins", "print_label"=>nil, "document_label"=>nil, "verbatim_locality"=>"Canal Zone: Coco Solo Base", "verbatim_longitude"=>nil, "verbatim_latitude"=>nil, "verbatim_geolocation_uncertainty"=>nil, "verbatim_trip_identifier"=>nil, "verbatim_collectors"=>"JL Hawkins", "verbatim_method"=>"light trap", "verbatim_elevation"=>nil, "verbatim_habitat"=>nil, "verbatim_datum"=>nil, "verbatim_date"=>"28MAR-11APR1961", "geographic_area_id"=>33859, "minimum_elevation"=>nil, "maximum_elevation"=>nil, "elevation_precision"=>nil, "start_date_day"=>25, "start_date_month"=>3, "start_date_year"=>1961, "end_date_day"=>12, "end_date_month"=>4, "end_date_year"=>1961, "time_start_hour"=>nil, "time_start_minute"=>nil, "time_start_second"=>nil, "time_end_hour"=>nil, "time_end_minute"=>nil, "time_end_second"=>nil, "field_notes"=>nil, "md5_of_verbatim_label"=>"8a27d8a305a23de55394e12e9efd4674", "min_ma"=>nil, "max_ma"=>nil, "cached"=>"PANAMA:Canal Zone:\nCoco Solo Base\n28MAR-11APR1961\nlight trap  JL Hawkins", "cached_level0_geographic_name"=>nil, "cached_level1_geographic_name"=>nil, "cached_level2_geographic_name"=>nil, "group"=>nil, "formation"=>nil, "member"=>nil, "lithology"=>nil, "identifiers"=>[], "created_by_id"=>44, "updated_by_id"=>44, "project_id"=>1, "created_at"=>"2020-07-20T16:17:44.349Z", "updated_at"=>"2020-07-29T16:16:56.263Z", "object_tag"=>"PANAMA:Canal Zone:\nCoco Solo Base\n28MAR-11APR1961\nlight trap  JL Hawkins", "object_label"=>nil, "global_id"=>"gid://taxon-works/CollectingEvent/298730", "base_class"=>"CollectingEvent", "url_for"=>"https://sfg.taxonworks.org/collecting_events/298730.json", "object_url"=>"/collecting_events/298730", "collector_roles"=>[{"id"=>355640, "position"=>1, "type"=>"Collector", "person"=>{"id"=>92722, "type"=>"Person::Vetted", "last_name"=>"Hawkins", "first_name"=>"J.L.", "suffix"=>nil, "prefix"=>nil, "cached"=>"Hawkins, J.L.", "year_born"=>nil, "year_died"=>nil, "year_active_start"=>nil, "year_active_end"=>nil, "created_by_id"=>44, "updated_by_id"=>44, "created_at"=>"2020-07-20T16:15:44.963Z", "updated_at"=>"2020-07-20T16:15:44.963Z", "label_html"=>"Hawkins, J.L. <span class=\"feedback feedback-secondary feedback-thin\">lived: ?-? active: <i>unknown</i></span> <span class=\"feedback feedback-thin feedback-primary\">2 uses</span> <span class=\"feedback feedback-thin feedback-secondary\">Collector</span> <span class=\"feedback feedback-thin feedback-success\">In&nbsp;Project</span>", "object_tag"=>"Hawkins, J.L.", "object_label"=>nil, "global_id"=>"gid://taxon-works/Person::Vetted/92722", "base_class"=>"Person", "url_for"=>"https://sfg.taxonworks.org/collecting_events/298730.json", "object_url"=>"/people/92722"}}], "roles_attributes"=>[{"id"=>355640, "position"=>1, "type"=>"Collector", "person"=>{"id"=>92722, "type"=>"Person::Vetted", "last_name"=>"Hawkins", "first_name"=>"J.L.", "suffix"=>nil, "prefix"=>nil, "cached"=>"Hawkins, J.L.", "year_born"=>nil, "year_died"=>nil, "year_active_start"=>nil, "year_active_end"=>nil, "created_by_id"=>44, "updated_by_id"=>44, "created_at"=>"2020-07-20T16:15:44.963Z", "updated_at"=>"2020-07-20T16:15:44.963Z", "label_html"=>"Hawkins, J.L. <span class=\"feedback feedback-secondary feedback-thin\">lived: ?-? 
active: <i>unknown</i></span> <span class=\"feedback feedback-thin feedback-primary\">2 uses</span> <span class=\"feedback feedback-thin feedback-secondary\">Collector</span> <span class=\"feedback feedback-thin feedback-success\">In&nbsp;Project</span>", "object_tag"=>"Hawkins, J.L.", "object_label"=>nil, "global_id"=>"gid://taxon-works/Person::Vetted/92722", "base_class"=>"Person", "url_for"=>"https://sfg.taxonworks.org/collecting_events/298730.json", "object_url"=>"/people/92722"}}]}, "id"=>"298730"}
.
.
.
[taxonworks-5f85f4f7b7-5kk5t] I, [2020-07-29T16:22:34.408937 #2515]  INFO -- :   Rendering collecting_events/show.json.jbuilder

Will attempt to reproduce this locally, because the production logs don't show the DB interaction.
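
For local reproduction, echoing every SQL statement to the console makes a callback or triggering loop easy to spot. A minimal sketch, assuming a development checkout of TaxonWorks; the id is an example.

# Sketch: run in a local rails console to surface the DB interaction that the
# production logs omit. The id is an example.
ActiveRecord::Base.logger = Logger.new($stdout)   # echo every SQL statement

ce = CollectingEvent.find(298730)
ce.update!(updated_at: Time.current)              # watch for repeated or looping queries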

tmcelrath commented 4 years ago

Possibly relevant: collecting events (CEs) without photos attached do not seem to be affected.
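
If images are the trigger, comparing a collecting event with photos against one without should show it. A console sketch only: the ids are placeholders, and depictions and collection_objects are assumed association names on CollectingEvent.

# Sketch: compare save times for a collecting event with photos vs. one without.
# The ids are placeholders; `depictions` and `collection_objects` are assumed
# association names.
require 'benchmark'

[298730, 298731].each do |id|            # placeholder ids: one CE with photos, one without
  ce = CollectingEvent.find(id)
  t  = Benchmark.realtime { ce.update!(updated_at: Time.current) }
  puts format('CE %d  depictions: %d  objects: %d  save: %.1f s',
              ce.id, ce.depictions.count, ce.collection_objects.count, t)
end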

LocoDelAssembly commented 4 years ago

I'm now seeing very long times to render taxon_names/select_options.json.jbuilder (Duration: 23219.5ms | Allocations: 7141687).

Are you perceiving delays only when saving, or also when setting determinations, @tmcelrath?
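
To see what dominates a 23 s render, Rails' built-in instrumentation can report per-template and per-query timings. A sketch using standard ActiveSupport::Notifications events (nothing TaxonWorks-specific), e.g. dropped temporarily into a development initializer while exercising the endpoint:

# Sketch: log slow template renders and slow SQL while the endpoint is hit.
# These are standard Rails instrumentation events, not TaxonWorks code.
ActiveSupport::Notifications.subscribe('render_template.action_view') do |_name, started, finished, _id, payload|
  ms = (finished - started) * 1000
  puts format('render %8.1f ms  %s', ms, payload[:identifier])
end

ActiveSupport::Notifications.subscribe('sql.active_record') do |_name, started, finished, _id, payload|
  ms = (finished - started) * 1000
  puts format('sql    %8.1f ms  %s', ms, payload[:sql]) if ms > 100   # only slow queries
end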

tmcelrath commented 4 years ago

Saving. There is a separate, longstanding issue that I can only replicate sporadically: occasionally, searching for taxon names just takes forever. If I modify the name slightly, it is almost instant.

tmcelrath commented 4 years ago

@LocoDelAssembly want me to submit that as a separate issue?

LocoDelAssembly commented 4 years ago

Yes please, and if you remember an input that can trigger the problem, so much the better, since I'm not sure I can recover it from the logs (despite now having extended info compared to last week).

tmcelrath commented 4 years ago

@LocoDelAssembly that issue has been submitted as: #1642

LocoDelAssembly commented 4 years ago

It seems I found log entries for the other issue, but so far I cannot see a single instance of an update (PATCH) that takes double-digit seconds... I wonder if the update could be blocked on the frontend side until all outstanding requests have been responded to, @jlpereira? In that scenario, a smart selector search that was superseded by a subsequent search would still have to finish before the save appears to complete.

This would be a different scenario from the one depicted in the screenshots above, where it can clearly be seen that the update request itself is taking a long time to reply.

LocoDelAssembly commented 3 years ago

@tmcelrath is this issue still popping up during regular use?

tmcelrath commented 3 years ago

Nope.