NewGraphEnvironment / fish_passage_fraser_2023_reporting

https://newgraphenvironment.github.io/fish_passage_fraser_2023_reporting/

Submit Phase 1 data #108

Open NewGraphEnvironment opened 5 days ago

NewGraphEnvironment commented 5 days ago

Unfortunately we need a Windows machine to QA the final spreadsheet, package, and submit (from what I can tell anyway).

I have a machine, so Lucy, if you are able to get the spreadsheet ready and the photos to ~/Library/CloudStorage/OneDrive-Personal/Projects/submissions/PSCIS/2023/fraser, I can take it from there.

I will link here to the csv exports from the two projects and the script used to generate them.

NewGraphEnvironment commented 5 days ago

Just changed the name of the export submission (as above) to match the purpose of the last linked script (scripts/05_pscis_export_to_template.R). Here is the latest version of the script that gathers all the photos into one place: https://github.com/NewGraphEnvironment/fish_passage_skeena_2023_reporting/blob/main/scripts/01_prep_inputs/0140-pscis-export-submission.R

Of course it will need to be run twice to get photos from both QGIS projects, vs just the one as in the Skeena.
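Something like this sketch could handle the double run; export_pscis_photos() and the two photo directories are hypothetical stand-ins for however the script ends up parameterized:

# hypothetical sketch - run the photo export once per QGIS project.
# export_pscis_photos() stands in for the linked script refactored to
# take the photo directory as an argument.
library(fs)
library(purrr)

dirs_photos <- c(
  fs::path_wd('data/photos_project_1'),  # stand-in for project 1 photos
  fs::path_wd('data/photos_project_2')   # stand-in for project 2 photos
)

purrr::walk(dirs_photos, ~ export_pscis_photos(path_photos = .x))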

NewGraphEnvironment commented 5 days ago

Actually - here is the last time an export was done. It was simplified with fs calls and comes from just the reassessments section of fish_passage_moti_2022_reporting.

d$pscis_crossing_id will be d$my_crossing_reference since it is Phase 1 vs reassessments ;>
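For the Phase 1 run that should be a one-line change where the folder names get built, against the d object in the script below:

folderstocopy <- d$my_crossing_reference %>% as.character()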

## Reassessments-----------------------------------------------------------------
# this was updated July 2024 using Skeena 2023 as template
library(fpr)
library(tidyverse)
library(fs)

# PSCIS Submissions -------------

# helper to pull list elements by index (used below to flag folders with no photos)
tfpr_filter_list <- function(idx){
  filestocopy_list[idx]
}

# helper to swap the local photo directory prefix for the target submission directory.
# fixed = TRUE treats the path as a literal string rather than a regex
tfpr_photo_change_name <- function(filenames_to_change = filestocopy_list){
  gsub(filenames_to_change, pattern = path, replacement = targetdir, fixed = TRUE)
}

name_repo <- 'fish_passage_moti_2022_reporting'
name_pdf <- 'Moti2022.pdf'
url_github <- 'https://github.com/NewGraphEnvironment/'
url_gitpages <- 'https://newgraphenvironment.github.io/'
name_submission <- 'pscis_reassessments.xlsm'

# need to add photos to local machine to upload to PSCIS
targetdir <- fs::path('/Users/airvine/Library/CloudStorage/OneDrive-Personal/Projects/PSCIS/2022/reassessments', name_repo)

d <- fpr::fpr_import_pscis(workbook_name = name_submission)

folderstocopy <- d$pscis_crossing_id %>% as.character()

path <- fs::path_wd('data/photos/')

path_to_photos <- fs::path(path, folderstocopy)

# here we transfer just the photos with labels over into the PSCIS directory where we will upload from to the gov interface

fs::dir_create(targetdir)

folderstocreate <- fs::path(targetdir, folderstocopy)

##create the folders
fs::dir_create(folderstocreate)

filestocopy_list <- path_to_photos %>%
  purrr::map(fpr::fpr_photo_paths_to_copy) %>%
  purrr::set_names(basename(folderstocreate))

## review the empty_files object to see which folders have no photos to copy over
empty_idx <- which(!lengths(filestocopy_list))
empty_files <- empty_idx %>% tfpr_filter_list()

##rename long names if necessary

photo_sort_tracking <- path_to_photos %>%
  purrr::map(fpr::fpr_photo_document_all) %>%
  purrr::set_names(folderstocopy) %>%
  bind_rows(.id = 'folder') %>%
  mutate(photo_name = str_squish(str_extract(value, "[^/]*$")),
         photo_name_length = stringr::str_length(photo_name))

### back up a csv that records the new location and name of the original JPG photos.
## Not ideal, because some sorting was done by hand without adding the camera name to the file name, but it is a start on reproducibility nonetheless

##burn to csv
photo_sort_tracking %>%
  readr::write_csv(file = 'data/photos/photo_sort_tracking_reassessments.csv')

filestopaste_list <- filestocopy_list %>%
  map(tfpr_photo_change_name)

##!!!!!!!!!!!!!!!copy over the photos!!!!!!!!!!!!!!!!!!!!!!!
mapply(fs::file_copy,
       path =  filestocopy_list,
       new_path = filestopaste_list)

##also move over the pscis file
fs::file_copy(path = fs::path('data', name_submission),
              new_path = fs::path(targetdir, name_submission),
              overwrite = TRUE)

# make a little readme for the pdf for upload to EcoCat and other details
writeLines(
  paste(
    "Online interactive report is located at: ",
    paste0(url_gitpages, name_repo),
    "",
    "A versioned pdf of the report can be downloaded from: ",
    paste0(url_github, name_repo, "/raw/main/docs/", name_pdf),
    "",
    "Raw data is available here: ",
    paste0(url_github, name_repo, "/tree/main/data"),
    "",
    "All scripts to produce online interactive reporting and pdf are located at: ",
    paste0(url_github, name_repo),
    sep = "\n"
  ),
  fs::path(targetdir, 'readme.txt')
)

## in the future we will need to add the copy calls to move the directory off of OneDrive to
# the Windows machine used for project submission. Don't want to set up the machine right now
# so will do it by hand. What an insane pain
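For what it's worth, once the machine is set up the copy call could be as simple as this; the Windows-side destination is a made-up placeholder:

# sketch only - destination is a placeholder, not a real submission-machine path
fs::dir_copy(
  path = targetdir,
  new_path = 'Z:/PSCIS/2022/reassessments',
  overwrite = TRUE
)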
lucy-schick commented 5 days ago

cool. I'll get on it

lucy-schick commented 5 days ago

The csvs have the time attached to the date, so we will have to fix that and reburn to csv.

[screenshot: csv preview showing dates with times attached]
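A minimal sketch of what that fix and reburn could look like, assuming the column is named date_time_start as in the screenshot; the csv path is a placeholder:

# sketch only - the csv path is a placeholder
library(readr)
library(dplyr)

readr::read_csv('data/pscis_export.csv') %>%
  # strip the time component off the datetime column
  dplyr::mutate(date = as.Date(date_time_start)) %>%
  readr::write_csv('data/pscis_export.csv')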

lucy-schick commented 5 days ago

Never mind! Found the date column! Don't use date_time_start, use date.

[screenshot: csv preview showing the date column]