AquaAuma / FishGlob_data

Database and methods related to the manuscript "An integrated database of fish biodiversity sampled with scientific bottom trawl surveys"
Creative Commons Attribution 4.0 International

surprisingly small CPUE values for DFO-QCS #30

Closed: zoekitchel closed this issue 9 months ago

zoekitchel commented 10 months ago

Noticed surprisingly low CPUE values for DFO-QCS

zoekitchel commented 10 months ago

Found the issue in the cleaning code: wgt/m^2 is divided by 1,000,000 m^2/km^2 instead of multiplied. Fixed in lines 107-115 of get_dfo-qcs.R.
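For context, this is a straight units bug: since 1 km^2 = 1,000,000 m^2, a density in wgt/m^2 must be *multiplied* by 1e6 to express it in wgt/km^2, and dividing instead shrinks every CPUE by a factor of 1e12. A minimal sketch of the fix (variable names are illustrative, not the actual columns in get_dfo-qcs.R):

```r
# 1 km^2 = 1e6 m^2, so converting wgt/m^2 to wgt/km^2 means multiplying by 1e6.
wgt_per_m2 <- 0.0025                    # example CPUE in wgt/m^2

wgt_per_km2_wrong <- wgt_per_m2 / 1e6   # the original bug: 2.5e-09, implausibly small
wgt_per_km2_right <- wgt_per_m2 * 1e6   # corrected: 2500
```

A density of 2.5e-09 wgt/km^2 is the kind of "surprisingly small" value that flagged this issue in the first place.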

zoekitchel commented 10 months ago

I made the same changes to the Hecate Strait, Strait of Georgia, West Coast Haida Gwaii, and West Coast Vancouver Island cleaning scripts. All changes are in, but the code threw a few errors:

- `get_dfo-sog.R` on line 369:

```r
######### Apply trimming per survey_unit method 2
dat_new_method2 <- apply_trimming_per_survey_unit_method2(clean_sog)
#> `summarise()` has grouped output by 'survey_unit'. You can override using the `.groups` argument.
#> Error in if (ext@xmin > -360.01 & ext@xmax < 360.01 & ext@ymin > -90.01 & :
#>   missing value where TRUE/FALSE needed
```
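For what it's worth, `missing value where TRUE/FALSE needed` is R's standard complaint when an `if()` condition evaluates to `NA` rather than `TRUE`/`FALSE`; here that presumably means one of the extent bounds (`ext@xmin`, etc.) came out `NA`, e.g. from hauls with missing coordinates. A minimal reproduction of that error class (toy values, not the project's actual extent code):

```r
# An extent bound computed from all-NA coordinates is itself NA...
xmin <- NA_real_

# ...and comparing NA yields NA, which if() cannot branch on:
result <- tryCatch(
  if (xmin > -360.01) "extent looks valid" else "extent invalid",
  error = function(e) conditionMessage(e)
)
result
# "missing value where TRUE/FALSE needed"
```

So the fix is usually upstream: track down why the coordinates feeding the extent are `NA` before the `if()` is reached.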

- `get_wchg.R` on line 369:

```r
######### Apply trimming per survey_unit method 2
dat_new_method2 <- apply_trimming_per_survey_unit_method2(clean_wchg)
#> Adding missing grouping variables: survey, haul_id, source, timestamp, country,
#>   sub_area, continent, stat_rec, station, stratum, month, day, quarter, season,
#>   haul_dur, area_swept, gear, depth, sbt, sst, verbatim_name
#> `summarise()` has grouped output by 'survey_unit'. You can override using the `.groups` argument.
#> Joining with `by = join_by(survey, haul_id, source, timestamp, country, sub_area,
#>   continent, stat_rec, station, stratum, month, day, quarter, season, haul_dur,
#>   area_swept, gear, depth, sbt, sst, verbatim_name, survey_unit, year, location,
#>   latitude, longitude, cell)`
#> Error in `dplyr::group_by()`:
#> ! Must group by variables found in `.data`.
#> ✖ Column `haul_id` is not found.
#> Run `rlang::last_trace()` to see where the error occurred.
```
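The second error is consistent with `haul_id` being dropped upstream: `dplyr::summarise()` keeps only the grouping variables and the summary columns, so a later `group_by(haul_id)` fails if `haul_id` was not among the groups. A minimal reproduction with toy data (not the actual survey tables):

```r
library(dplyr)

hauls <- tibble(
  survey_unit = c("A", "A", "B"),
  haul_id     = c("h1", "h2", "h3"),
  wgt         = c(10, 20, 30)
)

# After summarise(), only survey_unit and total_wgt remain; haul_id is gone...
summarised <- hauls |>
  group_by(survey_unit) |>
  summarise(total_wgt = sum(wgt))

# ...so grouping by it again raises the same error seen in get_wchg.R:
res <- tryCatch(group_by(summarised, haul_id), error = function(e) "haul_id not found")
#> Error in `group_by()`: ! Must group by variables found in `.data`.
#> ✖ Column `haul_id` is not found.
```

If `haul_id` is needed downstream, the usual remedies are to include it in the `group_by()` before summarising or to join it back in afterwards.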

I went ahead and pushed these 'cleaned' files to Google Drive, but they do NOT have the flagging columns filled in.

AquaAuma commented 9 months ago

I'll re-run these cleaning scripts and troubleshoot where needed; thanks for modifying the code! I see where the problems are, and there also seem to be more issues with the trimming method outputs. I'll look into it.

AquaAuma commented 9 months ago

@zoekitchel, I found out why you encountered issues: