NewGraphEnvironment closed this 5 months ago
Actually - I think we are going to need to leave h/p as 2 for Thompson second day to avoid other complexity. We can just put a note in Step 2 that fish were returned to the site after the first pass.
A couple sites that were a bit funky and have missing data:
- `8478_ds_ef1` - Trib to McDonell Lake
- `8478_ds_ef3` - Trib to McDonell Lake
Next time, update/add data in the `step_4_stream_site_data` spreadsheet from the fiss and fish cards, then when you run the scripts in `extract_inputs.R` to populate `step_2_fish_coll_data` you will have all the correct data. I edited that data once it was already in `step_2_fish_coll_data`, which meant I also had to edit the data in `step_4_stream_site_data`... still learning!
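The intended flow is one-directional: hand edits go into step 4, and step 2 is rebuilt from it so nothing has to be edited twice. A minimal sketch of that idea in R — the column names and the `rebuild_step_2()` helper are made up for illustration, not the actual `extract_inputs.R` logic:

```r
# Hypothetical sketch: rebuild step 2 fish collection rows from the step 4
# stream site data, so hand edits only ever live in one place (step 4).
rebuild_step_2 <- function(step_4) {
  # keep only the columns the fish collection form needs (names assumed)
  cols <- c("local_name", "temperature_c", "conductivity_us_cm", "turbidity")
  unique(step_4[, cols])
}

# toy step 4 data with a duplicated site row
step_4 <- data.frame(
  local_name = c("8478_ds_ef1", "8478_ds_ef1", "123377_us"),
  temperature_c = c(8.5, 8.5, 9.1),
  conductivity_us_cm = c(44, 44, 52),
  turbidity = c("clear", "clear", "turbid")
)

step_2 <- rebuild_step_2(step_4)  # one row per site, ready for step 2
```

Editing downstream in step 2 (as happened here) forces the same fix to be repeated in step 4 by hand, which is exactly what the one-way flow avoids.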
Data that was added by hand. Noting this down in case it needs to be redone... which would not be fun! Beware that some info was taken from fish cards and some from fiss cards:

- `sampling_method`, `ef_seconds`, `enclosure`, `voltage`, `frequency` for all sites

`198215`
- Dale Creek all ef sites.

`123377` - Thompson Creek
- `ds` sites with haul pass = 2: temp, conductivity, turbidity added from fish card
- `us` sites: temp, conductivity, turbidity added from fiss card
- `ds_ef1`: added site length from fiss card

`8478` - Trib to McDonell Lake
- `ds` sites: temp, conductivity added from fiss card
- `us` site: fiss card labelled `8547_us` but I'm pretty sure it should be `8478_us` because there is a fish card with data labelled `8478_us` (and no fish card labelled `8547_us`); temp, conductivity, turbidity added from fiss card
- `ds_ef1`: widths added from fiss card to step 4, then avg width added to site width in step 2

`198217_ds` sites - trib to Skeena
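For the `ds_ef1` width fix above, the single site width pasted into step 2 is just the mean of the individual channel widths recorded in step 4. A tiny sketch with made-up numbers:

```r
# Hypothetical sketch: average the individual fiss-card channel widths from
# step 4 to get the single site width value that step 2 expects.
widths_m <- c(2.4, 3.1, 2.8, 2.6)         # example widths from the fiss card
site_width_m <- round(mean(widths_m), 1)  # value to paste into step 2
```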
This is great! Thank you.
Next time, update/add data in step_4_stream_site_data spreadsheet from fiss and fish cards, then when you run the scripts in extract_inputs.R to populate step_2_fish_coll_data you will have all the correct data.
Another (and potentially preferred option from my perspective) would be to add the data to `data_field/2023/form_fiss_site_2023.gpkg` in QGIS, track the changes in version control in the repo via `fpr_sp_gpkg_backup`, export to csv with `scripts/fiss_site_tidy.R` (template file is here https://github.com/NewGraphEnvironment/dff-2022/blob/master/scripts/fiss_site_tidy.R but likely now needs updating) and copy/paste over `step_2_fish_coll_data`.
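That gpkg-first workflow keeps the geopackage as the single source of truth. A rough sketch of the export step, assuming `sf` for reading — the helper name and output path below are hypothetical, and the real tidy logic lives in `fiss_site_tidy.R` (with `fpr_sp_gpkg_backup()` from the `fpr` package handling the version-controlled backup; its signature is not reproduced here):

```r
# Rough sketch (not the real fiss_site_tidy.R): read the edited field form
# from the geopackage, drop the geometry, and write a flat csv that can be
# copy/pasted over step_2_fish_coll_data.
library(sf)

tidy_fiss_sites <- function(gpkg_path, csv_path) {
  sites <- sf::st_read(gpkg_path, quiet = TRUE)  # QGIS-edited field form
  flat  <- sf::st_drop_geometry(sites)           # geometry not needed in the csv
  utils::write.csv(flat, csv_path, row.names = FALSE)
  invisible(flat)
}

# tidy_fiss_sites("data_field/2023/form_fiss_site_2023.gpkg",
#                 "step_2_fish_coll_data.csv")  # hypothetical output path
```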
Yes I agree I think that is the best way to do it (see slack message).
Logging ad hoc changes required to build reporting objects here, as am doing so in the `main` fish submission spreadsheet. Related to https://github.com/NewGraphEnvironment/fish_passage_skeena_2023_reporting/issues/102 (`main`) - https://github.com/NewGraphEnvironment/fish_passage_skeena_2023_reporting/issues/99

`enclosure` - first-letter non-capitalization in the first row of the raw FDS template is inconsistent with the drop-down menus for the remaining rows. Not sure why or if it matters much, but `open` vs `Open` in rows where it is not in the drop-down menu could glitch the data dump to provincial databases. May as well correct.
some documentation of this process here. https://github.com/NewGraphEnvironment/fish_passage_template_reporting/issues/8
Sites were only enclosed at the Thompson downstream sites, which also have a second pass on the following day. We tested our efficiency by seeing how many of our captured fish had tags the next day (i.e. we put the fish back into our enclosed sites after the first pass and fished them again the next day).
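One way to summarise that check: in a closed (enclosed) site with equal catchability, the fraction of day-two fish carrying day-one tags roughly estimates the first-pass capture efficiency. The numbers below are made up for illustration:

```r
# Hypothetical numbers: estimate first-pass capture efficiency from the
# proportion of second-day fish carrying first-day tags. Assumes a closed
# site (fish returned after pass one) and equal catchability.
day2_catch  <- 40  # total fish caught on the second day (made up)
day2_tagged <- 28  # of those, fish carrying a day-one tag (made up)

efficiency <- day2_tagged / day2_catch  # fraction of fish the first pass caught
```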
NOTE - for Thompson, `h/p` should not be 2 the second day since the fish were put back in. Also - the downstream sites were enclosed. `temperature`, `conductivity` and `turbidity` were likely different the second day too - the card will tell us - should be recorded.